Friday, July 15, 2022

Azure Data Factory - Run SSIS packages in ADF as File System (Project Model) using SSIS-IR



In this video we use the Project Deployment Model: we take the .ispac file that is generated after the build and upload it, along with the package (.dtsx), to an Azure File Share.

If you want to learn about the File System (Package) approach,

please visit: https://youtu.be/LzvoUwriio8




Azure Data Factory - Run SSIS packages in ADF as File System (Package Model) using SSIS-IR



In this video we use the Package Deployment Model, storing our configuration file (.dtsConfig) and package (.dtsx) in an Azure File Share.

Using Azure Data Factory, we call the package as File System (Package) and supply the configuration file along with it.

ADF reads the configuration file and acts on the content/variables we defined in it.
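A .dtsConfig file is plain XML. As a minimal sketch of its shape (the variable name User::SourceFolder and the path below are hypothetical, just for illustration):

```xml
<?xml version="1.0"?>
<DTSConfiguration>
  <!-- Each Configuration entry overrides one package property at run time. -->
  <!-- Here: set the value of a (hypothetical) package variable User::SourceFolder. -->
  <Configuration ConfiguredType="Property"
                 Path="\Package.Variables[User::SourceFolder].Properties[Value]"
                 ValueType="String">
    <ConfiguredValue>\\myfileshare\input</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```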




Thursday, July 14, 2022

Azure Data Factory - Lift & Shift SSIS -Execute SSIS package in ADF using Project deployment Model (SSISDB)

Azure Data Factory - Lift & Shift SSIS - Execute SSIS package in ADF using Project Deployment Model (SSISDB) by creating an SSIS Integration Runtime


In this video we use the Project Deployment Model. We deploy our project to SSISDB, which is hosted on an Azure SQL Database server.

We then execute the packages hosted in SSISDB from Azure Data Factory.



Wednesday, July 13, 2022

Azure Data Factory - Copying Today's files with date and timestamp in name of the file

Azure Data Factory - Incremental data copy: copy files that are created/modified today

This helps us pick up files that were added or modified on today's date.


There are multiple approaches to getting today's files.

If you want only non-empty files that were created or modified today,

please follow the link below:

https://youtu.be/LOo9JC-HtLk


If you want to copy today's files with a date and timestamp in the file name, and copy based on that name, please follow the link below:

https://youtu.be/u67xQ1u6NjU
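The naming convention behind the timestamp approach can be sketched in Python (function names here are ours; in ADF itself this is done with an expression such as @concat(prefix, '_', formatDateTime(utcNow(), 'yyyyMMdd_HHmmss'), '.csv')):

```python
from datetime import datetime, timezone
from typing import Optional

def todays_file_name(prefix: str, extension: str,
                     now: Optional[datetime] = None) -> str:
    """Build a file name stamped with today's date and time,
    e.g. sales_20220715_093000.csv."""
    now = now or datetime.now(timezone.utc)
    return f"{prefix}_{now:%Y%m%d_%H%M%S}{extension}"

def is_todays_file(name: str, prefix: str,
                   today: Optional[datetime] = None) -> bool:
    """Return True if the file name carries today's date stamp."""
    today = today or datetime.now(timezone.utc)
    return name.startswith(f"{prefix}_{today:%Y%m%d}")
```

In the pipeline itself the same selection is done against the file share with Get Metadata/Filter activities; the sketch above only mirrors the naming logic.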




Azure Data Factory - Copy files which are not empty and last modified is today to ADLS container



This approach helps us copy only the files that are not empty.


The expression is @greaterOrEquals(activity('Lookup1').output.count, 2)

This count includes the header row. A file can contain a header yet still be empty, with no data rows. The expression above therefore treats a file as non-empty only when the row count is 2 or more.
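The logic of this check can be mirrored in plain Python (a sketch of the condition, not ADF itself; the file names below are made up):

```python
def is_non_empty_file(lookup_count: int) -> bool:
    """Mirrors @greaterOrEquals(activity('Lookup1').output.count, 2):
    the Lookup count includes the header row, so a file with real data
    has at least 2 rows."""
    return lookup_count >= 2

# Row counts as a Lookup over each file might report them
row_counts = {"sales.csv": 5, "header_only.csv": 1, "orders.csv": 2}
non_empty = [name for name, count in row_counts.items()
             if is_non_empty_file(count)]
# non_empty == ["sales.csv", "orders.csv"]
```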


For the True condition:

Add a Copy activity.

Source: the same dataset used by the Lookup activity.

FileName parameter: pass @item().name as its value.

Sink: a dataset that points to a folder,

with copy behavior set to Preserve hierarchy.

For the False condition:

just add a Wait activity.
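Put together, the If Condition activity in the pipeline JSON looks roughly like this (activity and dataset names are placeholders, and the exact source/sink and store-settings types depend on your datasets; treat this as a sketch, not exact ADF JSON):

```json
{
  "name": "If_NonEmptyFile",
  "type": "IfCondition",
  "typeProperties": {
    "expression": {
      "value": "@greaterOrEquals(activity('Lookup1').output.count, 2)",
      "type": "Expression"
    },
    "ifTrueActivities": [
      {
        "name": "CopyNonEmptyFile",
        "type": "Copy",
        "inputs": [
          {
            "referenceName": "SourceFileDataset",
            "type": "DatasetReference",
            "parameters": { "FileName": "@item().name" }
          }
        ],
        "outputs": [
          { "referenceName": "SinkFolderDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": {
            "type": "DelimitedTextSink",
            "storeSettings": {
              "type": "AzureBlobFSWriteSettings",
              "copyBehavior": "PreserveHierarchy"
            }
          }
        }
      }
    ],
    "ifFalseActivities": [
      { "name": "Wait1", "type": "Wait", "typeProperties": { "waitTimeInSeconds": 1 } }
    ]
  }
}
```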




Tuesday, July 12, 2022

Azure Data Factory - Read Files From On Premise File System to Azure Blob Storage - Practical Demo

To access an on-premises file system, we need to set up a Self-Hosted Integration Runtime.

The default Azure Integration Runtime's scope is limited to the Azure environment; to access files/resources outside Azure, we need to configure a Self-Hosted Integration Runtime.


Here, to simulate an on-premises environment, we used a Windows 10 virtual machine and installed the Self-Hosted IR on it.



