Hi everyone, welcome back to another video in the ADF tutorial series. This video is about creating your first pipeline in Microsoft Azure Data Factory.

Here I am on the Microsoft Azure portal. The first step is to go to the dashboard and click on the Data Factory resource. I'll open my Data Factory resource and click Launch Studio; it opens a new page and loads Azure Data Factory Studio for me. You can see my data factory is now open, so I'll click the Author icon, which opens the authoring page.

All right, let's create our first pipeline. I'll click these three dots and choose New pipeline. On the right-hand side we have the Properties panel, where we can enter a pipeline name. I'll name mine PL_ingest_WS_sales_to_data_lake, where PL stands for pipeline and WS stands for web store. Let's also add a description, as that's good practice: "Ingest web store online sales data into the data lake." We don't need to change anything in the Annotations section.

The next step is to create the sales datasets. I like to be a little organized, so I'm going to create a folder that reflects the source system: I'll click here, choose New folder, name it Web Store, and click Create to create the folder.

Now let's create a new dataset. I'll click the folder's three dots and choose New dataset; you can see there are plenty of options here. I'm going to pick Azure Blob Storage and click Continue. There are also plenty of file formats to choose from, such as Avro, Binary, CSV, and Excel. I'll pick JSON, because my data is in JSON format, and click Continue. I'll name the dataset ABS_online_sales_json.

With the dataset named, I'll click on Linked service and create a new one. I'll call it LS_ABS_web_store, where LS stands for linked service and ABS stands for Azure Blob Storage. I'll give it a description too, which is a good thing to do: "Connection to the web store Azure Blob Storage account."

The next thing you see below is the integration runtime. An integration runtime is basically a process execution container: it provides the compute resources and infrastructure required to connect to your data stores. At this stage we're not going to create a custom integration runtime; the default one is fine for the moment. For authentication we'll also keep the default, connecting through the Azure Active Directory account we're signed in with; we'll explore other authentication methods a bit later.

In the connection settings I'll select my subscription, Azure subscription 1, which is my free Azure subscription, and pick the storage account name, which at this point is my blob storage account, sewebstoredata001. It loads the account key, and from here I can test the connection. You can see we got a success message, so I'll click Create and it creates the new linked service for me. It was created successfully.
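As an aside, everything we just did in the Studio UI can also be expressed in code. Here is a minimal sketch (not the video's method) of creating the same Blob Storage linked service and JSON dataset with the azure-mgmt-datafactory Python SDK. The subscription ID, resource group, factory name, account key, and the container name are placeholders or assumptions; only the object names and the sales.json file come from the video.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLinkedService, AzureBlobStorageLocation,
    DatasetResource, JsonDataset,
    LinkedServiceReference, LinkedServiceResource,
)

credential = DefaultAzureCredential()
adf = DataFactoryManagementClient(credential, "<subscription-id>")
RG, FACTORY = "rg-data-engineering", "adf-web-store"  # hypothetical names

# Linked service to the web store blob account. This sketch uses
# account-key auth via a connection string; in the video the Studio UI
# loads the key after the storage account is selected.
ls = LinkedServiceResource(
    properties=AzureBlobStorageLinkedService(
        description="Connection to the web store Azure Blob Storage account",
        connection_string=(
            "DefaultEndpointsProtocol=https;"
            "AccountName=sewebstoredata001;AccountKey=<account-key>"
        ),
    )
)
adf.linked_services.create_or_update(RG, FACTORY, "LS_ABS_web_store", ls)

# JSON dataset pointing at the sales file; the container name is an assumption.
ds = DatasetResource(
    properties=JsonDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="LS_ABS_web_store"
        ),
        location=AzureBlobStorageLocation(
            container="sales1", file_name="sales.json"
        ),
    )
)
adf.datasets.create_or_update(RG, FACTORY, "ABS_online_sales_json", ds)
```

One difference worth noting: unlike the Studio, the SDK has no draft state, so create_or_update writes the objects straight into the factory.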
Back in the Studio, let's point the dataset at the online sales file. To do that, I'll simply select the browse (folder) icon, navigate into sales1, pick sales.json, and click OK. It fetches the file path for me, which takes a moment.

Since we've successfully created our online sales dataset, it's time to go back to the pipeline. Within the canvas I need to add the component that will move the data, which is located under Move & transform. I'll drag the Copy data activity onto the canvas, just like that. I'll name it Copy web store online sales data and, again, give it a description, which is a good habit: "Copy online sales data from the web store and ingest it into the data lake."

Now, if we have a look at the timeout, it's set to 12 hours. That is a very long time to wait for a single activity to run, if you think about it; just imagine a pipeline running for 12 hours. Let's change it to 10 minutes by setting it to 0.00:10:00.

I'll then move on to the next tab, Source, and select my source dataset, ABS_online_sales_json, which is fine.

The next thing we need to do is set up a sink dataset, so I'll select the Sink tab and create a new one: I'll click New, and this time it will be Azure Data Lake Storage Gen2. I'll click Continue and, once again, choose JSON. When you're ingesting data into the raw area of a data lake, it's always recommended to keep exactly the same format as the source, so JSON it is. I'll click Continue and, in the properties, name it ADLS_online_sales_json.

Let's create a new linked service for this one as well. I'll click New and name it LS_ADLS_data_engineering_dl, since it will essentially be our data engineering data lake, with the description "Connection to the data engineering data lake." I'll leave everything else as it is, select my Azure subscription, and choose the storage account name, which is the data lake storage account in this scenario. It loads the key, and once again I'll test the connection to see whether it succeeds. The connection is successful, so I'll click Create and it creates the new linked service for me.

From here I'll choose where I want to save the data: I'll browse to web-development, then Web Store, Raw, Online Sales (that's where my data will be saved) and click OK.

We don't need to specify anything further, so what we can do now is validate the pipeline. I'll click Validate, and the pipeline is validated with no errors found, so I'll click Close. Now I can publish the pipeline. Publishing is the act of saving our objects to the data factory; you can see there are three pending changes. I'll click Publish all and then Publish. It deploys the changes to the factory, and the publishing completes.
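Continuing the code aside, here is a sketch of the sink side and the pipeline itself with the same Python SDK: the ADLS Gen2 linked service, the sink dataset, and a Copy activity with the 10-minute timeout. Again, the subscription, resource group, factory, URL, and key are placeholders; the object names, file system, and folder path mirror the video. With the SDK, create_or_update plays the role of the Publish all step.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    ActivityPolicy, AzureBlobFSLinkedService, AzureBlobFSLocation,
    CopyActivity, DatasetReference, DatasetResource, JsonDataset,
    JsonSink, JsonSource, LinkedServiceReference, LinkedServiceResource,
    PipelineResource,
)

credential = DefaultAzureCredential()
adf = DataFactoryManagementClient(credential, "<subscription-id>")
RG, FACTORY = "rg-data-engineering", "adf-web-store"  # hypothetical names

# ADLS Gen2 linked service for the data engineering data lake.
lake_ls = LinkedServiceResource(
    properties=AzureBlobFSLinkedService(
        description="Connection to the data engineering data lake",
        url="https://<datalake-account>.dfs.core.windows.net",
        account_key="<account-key>",
    )
)
adf.linked_services.create_or_update(
    RG, FACTORY, "LS_ADLS_data_engineering_dl", lake_ls
)

# Sink dataset: same JSON format, landing in the raw zone of the lake.
sink_ds = DatasetResource(
    properties=JsonDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference",
            reference_name="LS_ADLS_data_engineering_dl",
        ),
        location=AzureBlobFSLocation(
            file_system="web-development",
            folder_path="Web Store/Raw/Online Sales",
            file_name="sales.json",
        ),
    )
)
adf.datasets.create_or_update(RG, FACTORY, "ADLS_online_sales_json", sink_ds)

# Copy activity, with the timeout reduced from the 12-hour default
# to the 10 minutes set in the video (format is d.hh:mm:ss).
copy = CopyActivity(
    name="Copy web store online sales data",
    description="Copy online sales data from the web store and ingest it into the data lake",
    inputs=[DatasetReference(type="DatasetReference",
                             reference_name="ABS_online_sales_json")],
    outputs=[DatasetReference(type="DatasetReference",
                              reference_name="ADLS_online_sales_json")],
    source=JsonSource(),
    sink=JsonSink(),
    policy=ActivityPolicy(timeout="0.00:10:00"),
)

# Deploying the pipeline; this is the SDK's equivalent of Publish all.
pipeline = PipelineResource(
    description="Ingest web store online sales data into the data lake",
    activities=[copy],
)
adf.pipelines.create_or_update(
    RG, FACTORY, "PL_ingest_WS_sales_to_data_lake", pipeline
)
```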
Now I'll trigger the pipeline to see whether it runs correctly. I'll select Trigger now; it notes that the pipeline will run using the last published configuration, and I'll click OK.

Next I'll monitor the pipeline to see whether it worked. I'll go to the Monitor section and click on the pipeline to see the activity runs. You can see the status is Succeeded, and the run finished well within the 10-minute timeout we set.

Finally, let's look at the run details. I'll click on Details, and you can see we read 68.3 MB of data, transferred successfully from the Azure Blob Storage account, in the region selected here, to the Azure Data Lake Storage Gen2 account.

To confirm the data transferred successfully, I'll close this window and go to my storage account. I'll click on the dashboard, open my Data Lake Storage Gen2 account, open the container, and navigate to the folder where the data should have landed. You can see my sales.json file is present, so we have successfully copied our online sales data from the web store into our Data Lake Storage Gen2 account.

We just created our first Azure Data Factory pipeline and copied our data from Blob Storage to the Azure Data Lake Storage account. That's it from me; I'll catch you in the next one. Goodbye and have a good day. Thank you.
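As a closing aside, the trigger-monitor-verify flow above can also be scripted. This minimal sketch triggers the pipeline, polls the run status, and then lists the landing folder in the lake to confirm sales.json arrived. The subscription ID, resource group, factory name, and account URL are placeholders; the pipeline name and the container/folder path mirror the video.

```python
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.storage.filedatalake import DataLakeServiceClient

credential = DefaultAzureCredential()
adf = DataFactoryManagementClient(credential, "<subscription-id>")
RG, FACTORY = "rg-data-engineering", "adf-web-store"  # hypothetical names

# Trigger the pipeline (equivalent to Trigger now in the Studio UI).
run = adf.pipelines.create_run(RG, FACTORY, "PL_ingest_WS_sales_to_data_lake")

# Poll until the run leaves the Queued/InProgress states.
while True:
    pipeline_run = adf.pipeline_runs.get(RG, FACTORY, run.run_id)
    if pipeline_run.status not in ("Queued", "InProgress"):
        break
    time.sleep(15)
print(pipeline_run.status)  # expect "Succeeded"

# Verify the file landed in the raw zone of the data lake.
lake = DataLakeServiceClient(
    account_url="https://<datalake-account>.dfs.core.windows.net",
    credential=credential,
)
fs = lake.get_file_system_client("web-development")
for path in fs.get_paths(path="Web Store/Raw/Online Sales"):
    print(path.name)  # should include .../sales.json
```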