Installation

Prerequisites

  1. SQL Server database, on-premises or in Azure.
  2. Visual Studio (preferably 2019*) with the Data storage and processing workload. (If Visual Studio is already installed, open the Visual Studio Installer to modify workloads.)
  3. SQL Server Management Studio (SSMS)
    * VS Code may work as well.

Installation of BETL

  1. Clone the BETL solution from GitHub using Visual Studio (https://github.com/basvdberg/BETL-Core.git).
  2. Build the solution.
  3. Create an empty database with a name of your choosing using SSMS (see the example after this list).
  4. Publish the sqldb_betl project to the newly created database. You can do this in two ways:
    1. Right-click the project and choose Publish. (If you don’t see a Publish option, make sure that you are in the solution view and not the folder view by clicking the icon next to the home icon in Solution Explorer.)
    2. Alternatively, double-click the stored publish profile sqldb_betl.publish.xml. Make sure that you adjust these settings to your environment.
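
For step 3, a minimal sketch of creating the empty target database in SSMS; the database name sqldb_betl is only an example, use whatever name you chose:

-- run against the master database of your SQL Server instance
-- (sqldb_betl is an example name; pick your own)
CREATE DATABASE sqldb_betl;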

Note: publishing might give some dependency-related warnings, caused by the order in which the DDL is generated. Please ignore them, for example:

The module xxxx depends on the missing object yyy. The module will still be created; however, it cannot run successfully 
until the object exists.

Getting started using BETL in Azure

For the getting started guide, you will need the following resources:

  • Azure SQL database containing BETL
  • Azure SQL source database (e.g. AdventureWorks).
  • Azure SQL target database. For this example, let’s call it the raw data warehouse (rdw).
  • Azure Data Factory
  • Azure Key Vault (for storing secrets, for example connection strings). You can skip the Key Vault for this getting started guide.

You can create these resources manually or by using Azure DevOps. I wrote an article about CI/CD using Azure DevOps: Setup ADF and Key Vault using Azure DevOps. There you can find the Bicep code that I used to create the resources (https://github.com/basvdberg/cicd_azure_devops_adf_and_sql). For this getting started guide, I assume that you have the resources listed above.

Manual installation

  1. Create a new Azure Data Factory resource using the portal, with the name <adf-name>.
  2. Create a SQL Server resource and add three databases using the portal. For the AW database, specify that you want to use the AdventureWorks sample data.
  3. Grant this data factory permission on the SQL Server databases using its managed identity:
-- database master
CREATE USER [<adf-name>] FROM EXTERNAL PROVIDER;

-- database sqldb_betl
CREATE USER [<adf-name>] FROM EXTERNAL PROVIDER;
ALTER ROLE db_owner ADD MEMBER [<adf-name>];

-- database sqldb_aw
CREATE USER [<adf-name>] FROM EXTERNAL PROVIDER;
ALTER ROLE db_owner ADD MEMBER [<adf-name>];

-- database sqldb_rdw
CREATE USER [<adf-name>] FROM EXTERNAL PROVIDER;
ALTER ROLE db_owner ADD MEMBER [<adf-name>];