Transcript for: Microsoft Azure Full Course (edureka)
The Azure platform is constantly evolving and expanding, numerous industries are utilizing Azure development, and the demand for Azure professionals is soaring: around 80 percent of businesses are expected to move their workloads to the cloud in the next few years, so there is rapid growth potential in an Azure career.

Hello everyone, and welcome to this session. You are currently watching an edureka Microsoft Azure full course video. By the end of this video you will have a thorough understanding of Microsoft Azure, all the way from theory to the practical applications required to master it. If you love watching videos like these, then subscribe to edureka's YouTube channel and click the bell button to never miss an update from us. Also, if you want to learn more about Azure after watching this session and wish to obtain edureka's Microsoft Azure certification, please see the link in the description below.

Now let's begin with our agenda, a brief overview of what we will cover in this Microsoft Azure full course video. We will start with the introduction to Azure, where we'll learn what Azure is and why we should learn it. Then we will learn how to create a free Azure account. After that it's time to dive deep into the different services of Azure: we'll start with Azure compute services, followed by Azure identity and access management. After this we will learn some Azure networking and storage services. Once this is done, we will understand Azure web services and Azure database services. We will also see Azure analytics and Azure machine learning, and at last we will see Azure IoT and Azure integration. We truly hope that this session assists you in getting a job in the industry; to that end, we will look at how to start a career in Microsoft Azure, where we will see the certifications in Azure, followed by some essential Microsoft Azure interview questions with answers. So stick till the end. Now let's get started with our first topic: Introduction to Microsoft Azure.

Let's kick things off by talking about why businesses, and everybody for that matter, need cloud computing. Here's an example. Meet John, an entrepreneur who has a brilliant idea for an app which solves a problem and has a great user experience. So what does John need? He will need developers to build this app, but in terms of infrastructure he will also need servers, storage devices, a dedicated network, security systems, and of course a dedicated IT operations team to monitor the whole infrastructure and make sure everything is working properly.

John figured that there would be four major problems with this setup, so let's take a look. The first problem is that owning his own infrastructure would require a huge amount of money, and that huge upfront investment would also greatly increase the risk if the app fails. The second problem is that the infrastructure would take too long to set up: he would have to buy all the components required, then hire IT technicians to install everything and connect it all up so that everything is talking to each other and working nicely. That means more money and more time. Next, he realized that even after the expensive setup he would have to hire an IT operations team to work around the clock for the upkeep
of the infrastructure. This would include things like resolving issues with the hardware or software, replacing broken pieces of the infrastructure (so he would need spare parts on hand), switching to backup servers if a server goes down, preventative maintenance, and so on: more money, and more headaches when issues crop up. Another problem has to do with the inability of the infrastructure to scale. What if the traffic on the service increases or decreases abruptly? If the traffic increases, he will have to scale the infrastructure to meet that demand, which means expanding it: buying more components, probably renting more office space, hiring more IT staff to monitor the expanded infrastructure, and so on. But what if there's a sudden drop in demand? All of this extra infrastructure and these resources will sit idle. And what about the extra people he hired? He may have to let them go, which is terrible.

Well, let me tell you, John was smart; he did not opt for this setup. He opted for cloud computing, because he had been studying up on it and found that all the big and small companies at the top of their domains use cloud computing. This research was a real eye-opener, as he found that all the issues he would face with his own infrastructure could be solved by using the cloud. So obviously he opted for the cloud: there was almost no initial setup cost required, and it could be done in a few minutes or hours using a single computer; you don't need a dedicated team to take care of the physical machines, because that is taken care of by the cloud provider; and finally, to John's relief, infrastructure in the cloud scales automatically as the demand changes. Isn't that amazing?

So why don't we find out more about cloud computing in this next section: what is cloud computing? Cloud computing is the delivery of computing services (servers, storage, databases, networking, software, analytics, intelligence, and so on) over the internet, which we colloquially call "the cloud". Let's understand that: John takes a computer that is connected to the internet, then he accesses resources on the cloud platform. The resources John needs are running on physical machines in data centers owned by cloud providers. So, like I said before, all John needs is a computer connected to the internet; the remainder of the infrastructure is provided by the cloud provider, and he can access the resources he needs from the data center through the cloud platform.

How does this work? First John goes to the cloud provider he likes the most: Amazon Web Services, Microsoft Azure, and Google Cloud are the top cloud providers. After going to the cloud provider's website, he signs up for an account, then signs up for the services as per his requirements and configures the resources his app will run on. And the amazing thing is that he only pays for the services that he uses, and for the amount of time those services were up. Isn't that amazing? Of course it is.

So let's now talk about what Azure is. Azure is a cloud platform provided by Microsoft, from which you can rent resources and services. You grab a computer that is connected to the internet, go to the Azure portal in your web browser, sign up with Microsoft Azure, and access the resources you need. So now
it's time to find out which cloud platform John chose. It's kind of a no-brainer, but here's what John says: "The developers of my app and I all use Windows OS and other Microsoft products, so we wanted a cloud service that is best compatible with them." Can you guess which one this is? Of course, it is Microsoft Azure, because it is the most compatible with Microsoft products.

Before we talk about the services, why don't we find out some interesting things about Azure. Like every other cloud platform, Azure provides five distinct types of services: infrastructure as a service, platform as a service, software as a service, containers as a service, and functions as a service. Apart from that, Azure has 22 main categories of products, which we will take a look at later on. Azure operates globally in 60-plus regions and provides its services to 140 countries and counting. Just like the other major cloud platforms, Azure provides pay-as-you-go plans and a free plan: there are no upfront fees for signing up with Azure, and you only pay for the time your services are running. The free basic plan is surprisingly good and is valid for 12 months, so you can tinker around and see if you like it. And what about the languages supported on Microsoft Azure? They are C#, F#, Java, TypeScript, and Python.

Now let's move on to the next section, where we discuss the services offered on Azure. In the previous section I mentioned that Azure has 22 categories of products, but what I did not mention is that it has over 600 services. Obviously we are not going to have enough time to discuss all 600 services, so here's your homework: go to Azure's website, check out each of these categories, and read up on them. If you have any questions, come back to this video and put them in the comment section, and we will be more than happy to answer them for you. I hope you got a picture of all the categories of services provided by Microsoft Azure.

So let's now go back to John. John needed three services: compute, networking, and storage. Let's take a look at each one of these in a little bit of detail. The first product he's going to need to build his infrastructure in the cloud is compute. He can use this to deploy and manage virtual machines, containers, and batch jobs, as well as support remote application access. Compute resources created within the cloud can be configured with either public IP addresses or private IP addresses, depending on whether the resource needs to be accessible to the outside world. Some of the services offered within the compute product are Virtual Machines, Containers, Kubernetes Service, Cloud Services, and Mobile Apps.

Let's move on to the next one, which is networking. After setting up the virtual machines using the compute product, John will require networking in order to connect those virtual machines so they can talk to each other and form a network. This product gives you the ability to set up virtual networks, dedicated connections, and gateways, and it also gives you services for traffic management and diagnosis, load balancing, DNS hosting, and network protection against attacks like DDoS.
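To make that concrete, here is a minimal sketch of renting compute with the Azure CLI, assuming the az CLI is installed and you are logged in; the resource names here are hypothetical, and image aliases vary by CLI version:

    # Create a resource group to hold everything (name and region are arbitrary)
    az group create --name demo-rg --location eastus

    # Create a Linux VM; Azure also provisions the supporting networking
    # pieces (virtual network, NIC, public IP) automatically
    az vm create \
      --resource-group demo-rg \
      --name demo-vm \
      --image Ubuntu2204 \
      --admin-username azureuser \
      --generate-ssh-keys

The same pay-as-you-go idea applies here: the VM bills only while it exists.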
Next, John obviously needs storage, so that his virtual machines can store customer data, plus backup storage for the whole infrastructure. There are various services offered within storage, like Azure Disk Storage, Blob Storage, Azure Backup, Queue Storage, and many more. This category of services provides scalable cloud storage for structured and unstructured data; it also supports big data projects, persistent storage, and archival storage. Similarly, there are 19 more categories that provide services catering to all the needs of any organization or company.

Which brings us to our next section: companies using Azure. Hundreds of big companies use Azure, but these are the ones that I picked for you, because each one uses Azure for a different purpose. eBay has been using Azure for app development and hosting since 2010, which is the same year Microsoft announced Azure to the world. Boeing mainly uses Azure for its data analytics services, for things like crew planning, maintenance optimization, fuel optimization, and crew training, by building predictive models on the platform. BMW uses Azure's internet of things to make the manufacturing processes within their factories more efficient by connecting all the different machines, sensors, and other devices. Samsung, on the other hand, has used Azure for its Smart TV infrastructure; the entire Smart TV infrastructure is actually on the Azure platform, and by choosing Azure, Samsung was able to achieve a significant reduction in costs and increased capacity to meet its rapidly growing customer base.

With that, we have come to the end of this video. I hope it piqued your curiosity, and if it did and you would like to learn more about Azure, I'm going to leave links to edureka's YouTube playlist on Azure and edureka's blog series on Azure in the description, along with a link to the Azure architect certification, which is going to be very helpful if you want to be certified in Azure.

Looks like you have all the services listed there on the left side, and this is the dashboard. Whatever services you launch, you can pin here for quick access; it is just like the desktop on your computer, which has all the shortcuts, and that is what the dashboard is actually used for. Having said that, let's see how we can launch an App Service in Azure. First you will click on App Services; once you have clicked on it, you will reach this page, click on Create App Services, and then click on Web App, and that is it. Let me show you: click on App Services, click on Create App Services, scroll down, click on Web App, and after this you will reach a screen where you will see that you have four options: you can code your website in .NET, PHP, Node.js, or Python. If it is any of these four languages, you don't have to do anything; you just upload your code into the web app that you'll be creating, and your app will be deployed automatically, without installing any software or doing any configuration, just like that.
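As an aside, the same web app can be created without clicking through the portal. A minimal CLI sketch, reusing the hypothetical demo-rg resource group from earlier (names and tiers are just examples):

    # App Service plan: the set of machines the web app runs on (B1 = Basic tier)
    az appservice plan create --resource-group demo-rg --name service-1 --sku B1

    # The web app itself, hosted on that plan
    az webapp create --resource-group demo-rg --plan service-1 --name edureka-01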
Okay, so once you've reached this page, you will click on Create, and you will reach a screen where you have to give your application a name; let us name it edureka-01. Let's see if everything is fine: the subscription is pay-as-you-go, and the Resource Group is something you can create new or use existing. A resource group is exactly what it sounds like, a group of all your resources: if you are creating a storage account, a database account, and an app service, all of these can be clubbed under one group, which is called a resource group, and any change you want to apply across the group you can apply to the resource group. We'll discuss in detail what a resource group is in the further modules, but for now all you should understand is that a group of resources is called a resource group, and if you have an application utilizing three or four resources, it's better to put those resources under one group.

Then you have the App Service plan, which is basically what kind of computers you use. Let's create a new plan so that you understand it better. The pricing tier is the main part you have to select: give the plan a name, say service-1, select what kind of plan you need (let me select the Basic plan for now), click on Select, and then click on OK. So that was the App Service plan. You can also click on Application Insights, which basically gives you monitoring tools; for now let's not go into it, we'll discuss it later. Now let's click on Create. I am ticking "Pin to dashboard", which will create a shortcut to this application on my dashboard; it comes in handy. And that's it, you click on Create, and my application is now being created.

While my web app is deploying, let me go back to my slides and jump to our next service, which is Blob storage. Like I said, Blob storage is just like a file system; you need a file system to store your files, and that is what the blob service is all about. Let's see how we can create a blob storage instance in Azure. From the dashboard you will click on Storage Accounts, then click on Add, and that is it, nothing much required. Let's go ahead and do this: I'll click on Storage Accounts, click on Add, and start entering the values. Let's give it the name edureka-01; that's taken, so let's name it azureedu instead, and this name is available. The performance should be Standard, because this is a demo; replication is not required, so I'll choose locally redundant storage; storage service encryption is not required; secure transfer is not required; the subscription is pay-as-you-go; for Resource Group, let's select edureka-01, because this is the resource group I selected earlier; then pin it to the dashboard and click on Create. So my web app has been deployed already, and my storage account is also being deployed.
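For reference, a hedged CLI equivalent of that storage account, with the same hypothetical names and the Standard performance and locally redundant replication chosen above:

    # Standard_LRS = standard performance with locally redundant storage
    az storage account create \
      --resource-group demo-rg \
      --name azureedu \
      --location eastus \
      --sku Standard_LRS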
So this was about it; now let's look at MySQL. For MySQL, we will click on New, click on Databases, and then click on Azure Database for MySQL; that is about it. Let's go and do that: I'll click on New, search for Azure Database for MySQL, here it is, I'll click on it and click on Create. I have to enter the server name; let's enter edureka-01. It's available, good. The resource group should be the same, so let's select edureka-01. For the server admin login name, let's give edureka, and for the password, edureka1234; confirm the password. It is asking me for special characters, and you also cannot have the login name inside your password, so let me change the password to edu1234! (with an exclamation mark), and the same in the confirmation. This is accepted now. The location is Central US, the version is 5.7, and for the pricing tier let me see if there is something smaller; I'll take this, click on OK, pin it to the dashboard, and click on Create. It's pretty simple; you'll see when you do the hands-on yourself. The pricing tier can be the minimum if you are using it for a demo, or if you're building a global-scale application, you can choose the pricing tier according to that. While it is deploying, let's move ahead.

Let's come to the auto scaling part now. Basically, we have to configure our web app to auto scale as and when required, but first let's understand the types of scaling. There are two types: horizontal scaling and vertical scaling. Horizontal scaling is when you increase the number of servers: say you have one i7 server, the usage goes up, so you take two i7 servers, and if it goes up again, you take three i7 servers. That is what horizontal scaling is all about. Vertical scaling is when you increase the capacity or configuration of your system: say you were using an i3 system, the traffic increases, so now you use an i5 system, it increases again, so now you use an i7 system. You still have only one machine, but you are increasing its configuration. These are the two types of scaling that exist. Besides auto scaling, the other way is manual scaling: you can manually scale the number of instances or the configuration of your system by going into the service and changing the pricing tier, and you are set. Auto scaling I'm going to show you in a few slides. Also, like I said, when you are auto scaling, load balancing is automatically attached: when you auto scale a web app, you don't have to configure the load balancer; it is configured and set up automatically for you. It doesn't make sense to use auto scaling without a load balancer, and that is why Azure has automated the process of attaching one when you're using auto scaling.
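A hedged CLI illustration of the difference between the two kinds of scaling, reusing the hypothetical plan from earlier; the SKU names are examples:

    # Vertical scaling (scale up): same instance count, bigger machines
    az appservice plan update --resource-group demo-rg --name service-1 --sku S1

    # Horizontal scaling (scale out): same machine size, more instances
    az appservice plan update --resource-group demo-rg --name service-1 --number-of-workers 3

Note the move to S1 (Standard): as we'll hit in a moment, autoscale is not supported on the Basic tier.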
Moving ahead, let's see how we can enable this on our instance. First you will click on App Services, go to the particular instance that you have launched, go to Scale out, and click on Enable autoscale. So let's do that: I'll go to App Services, here is my instance, I'll click on edureka-01 and scroll down, and as you can see I have two options, Scale up and Scale out. When I click on Scale up, I see the pricing tiers, so I can increase the configuration of my system; but this is not what I want. I want Scale out, which increases the number of instances running. However, it says autoscale is not supported for the Basic tier of web apps, so let me change my configuration. If I go to Scale out now, you can see that I have the option to enable autoscale. I will click on Enable autoscale and give it a name, say edureka-auto. So in my autoscale settings the name is edureka-auto and the resource group is edureka-01. Now you have two options: "Scale based on a metric", which scales based on statistics like CPU usage or memory usage, and "Scale to a specific instance count", where, for example, whenever the traffic increases, you jump from, say, one instance to five instances. We'll go with "Scale based on a metric" and scale out and scale in our instances based on a metric.

So let's add a rule, as in: what kind of metric do we want to monitor and scale according to? The time aggregation is not something we should be worried about. These are all the metrics you can monitor and scale on: memory percentage, HTTP queue length, data in, data out, and so on. Let's keep it simple and scale according to CPU percentage. The time grain is not something to worry about either. Now the operator: should it be "greater than" or "greater than or equal to"? Let's keep it at greater than or equal to, so the rule fires whenever the CPU goes greater than or equal to 70 percent for 10 minutes. You can try to set this duration to, say, two minutes, but it says it should be between 5 and 720 minutes, so let's keep it at five. For the action, let's see what options we have: you can increase by percentage, increase the count to a value (as in, if you had one instance, jump to ten), and likewise decrease the count or percentage. When you are auto scaling, you have to set two rules: one to increase and, obviously, one to decrease, so that when the traffic goes down you reduce your instances as well. For now we are setting the increase rule, so let's use "Increase count by". How many instances should it add? I think one is a fair number.

And then cooldown: say you scaled out a minute ago, and the CPU usage is still up. Rather than immediately scaling again, you wait a few minutes and watch the metrics, because your CPU usage doesn't go down in a second; once your new server has been deployed, you wait for the traffic to be transferred to that server as well, and then you see the metrics go down. For that we have the cooldown minutes; the default is five, so let's keep it at five. And that's it, there's nothing else to configure; let's click on Add. As you can see, this rule has been set. Now let's add one more rule, to scale in: the metric stays CPU percentage, and the operator should be "less than", so
whenever the threshold is less than 50 percent, decrease the count by one instance, and let the cooldown stay the same. The cooldown logic applies here as well: say you remove an instance and your CPU is still low; it takes time for the traffic to get redistributed to the servers that remain, so we keep the cooldown minutes. We'll click on Add now. So we have added the auto scaling rules. If you think about it, this is a very complex process, but Azure gives you the simplicity to do it in a very plain-English manner: you just say, like I said, if the CPU is above 70 percent, increase the number of servers by one, and the same is the case with scale in, whenever your CPU is less than 50 percent, decrease the count by one. If you come here without knowing anything about auto scaling, you can actually read off the screen what we have configured. Let's save this now. So this is how you auto scale your web app instance. Since we are doing a demo and I can't really demonstrate a load spike to you, we don't require the autoscale settings for my application, so I'll just discard them for now; I simply showed you how you can auto scale.
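For completeness, here's a hedged sketch of those same two rules with the Azure CLI. Autoscale attaches to the App Service plan rather than the web app itself, and the metric name (CpuPercentage) and condition syntax below are my assumptions from the CLI's documented format:

    # Autoscale profile on the plan: 1 to 5 instances, default 1
    az monitor autoscale create \
      --resource-group demo-rg \
      --resource service-1 \
      --resource-type Microsoft.Web/serverfarms \
      --name edureka-auto \
      --min-count 1 --max-count 5 --count 1

    # Scale out: CPU >= 70% averaged over 10 minutes, add one instance, 5 min cooldown
    az monitor autoscale rule create \
      --resource-group demo-rg --autoscale-name edureka-auto \
      --condition "CpuPercentage >= 70 avg 10m" \
      --scale out 1 --cooldown 5

    # Scale in: CPU < 50%, remove one instance
    az monitor autoscale rule create \
      --resource-group demo-rg --autoscale-name edureka-auto \
      --condition "CpuPercentage < 50 avg 10m" \
      --scale in 1 --cooldown 5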
Now let's go back and see what's left. We have launched each and every service: the App Service, the Blob storage, and Azure Database for MySQL. Now let's deploy our website. Let me show you how the website looks on my localhost: it is showing an error, because as of now my code is not connected to my Azure instances. So let's go to the dashboard and first configure our instances, starting with the database.

To configure your database, the first thing you have to do is connect to it, and the way to connect is from the command prompt. For that, you have to click on Connection security: you cannot connect to your database just like that; you have to add your IP address to the firewall rules of the database, and only when your IP address is listed there can you connect. Let me first show you that you cannot connect as of now. I will launch the command prompt and go to my MySQL installation. I type the command mysql, then -h and the server name, which I paste in; then the port number with -P (don't forget, it's a capital P; lowercase -p is for the password); and then -u and the username, which I paste in. I hit Enter and type the password, which is edu1234!. And as you can see: client with this IP address is not allowed to access the server. So now I'll go to Connection security, add my IP, and click on Save; it says the connection security was updated successfully. Let's go back to the command prompt, execute the same command again, type in the password, and now, as you can see, I have successfully connected to my database.

Now I can create a database here called edureka. Let me clear the screen first... oh, I forgot, I'm working on Windows, and in Windows you cannot clear the MySQL screen; if you were on Linux you could have just typed Ctrl+L. So, as I was saying: create database edureka, and that is it. Then "use edureka", and I'll create a table called image, with one column called name. If I type "show tables", it shows me the table name, which is image. So this is done: first I created a database called edureka on my MySQL server hosted on Azure, and then a table called image with one column called name. My MySQL has been configured. Let me go back to the overview; under databases it shows there's a database called edureka, which I just created. You might have to add the web app's IP address as well; we'll do that later.
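That whole database setup can also be scripted. A hedged sketch, using a placeholder client IP and the single-server flavor of Azure Database for MySQL used in this demo (newer accounts would use flexible-server, with slightly different flags):

    # Let my machine through the server firewall (203.0.113.7 is a placeholder IP)
    az mysql server firewall-rule create \
      --resource-group demo-rg --server-name edureka-01 \
      --name allow-my-ip \
      --start-ip-address 203.0.113.7 --end-ip-address 203.0.113.7

    # Connect (note the user@servername login form and capital -P for the port)
    # and create the schema; the password is prompted for on the terminal
    mysql -h edureka-01.mysql.database.azure.com -P 3306 -u edureka@edureka-01 -p <<'SQL'
    CREATE DATABASE edureka;
    USE edureka;
    CREATE TABLE image (name VARCHAR(255));
    SHOW TABLES;
    SQL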
Now let's go to our web app. This is your web app, and this is the URL for your website; when we click the URL, you see a welcome page. Now what you have to do is upload your code over here, and you can do this using GitHub, or using FTP. I know most of you are from a non-technical background, so you might not know what GitHub is; for now I'll use an FTP client called FileZilla for transferring my files. From the next session onwards I'll be using GitHub, and there is a short GitHub tutorial video in your LMS which is enough for the demonstration, so you can go through that before the next session.

To connect over FTP, you have to create deployment credentials. Let my deployment username be hr1, and let me set a password. Let's click on Save... it seems this username is not available, so let me change it to hr12 and click on Save; okay, the error is gone, so we have successfully set the deployment credentials. Now I'll go back to my web app, and I can connect: in FileZilla I enter the FTP hostname from the web app, the username (sorry, not just hr12; the full deployment username shown on the overview), and the password, and click on Quickconnect. As you can see, we have successfully logged into our FTP channel. From here you'll go into /site, then /wwwroot, and then you copy all your website files over. Let's do that; my files are here, and I can just drag and drop them, and the process starts. It might take a while, so let's configure the other services while my website is being copied. Mind you, I might have to re-upload part of my code to the web app afterwards, because there are some things I have to change in it: I'm creating new services now, so I'll have to change the services' addresses in my code and then update the particular files I change in the web app.

So now I'll go to my storage account and create a blob container. Let the container name be hello, and the access type be Container; I click on OK, and I have successfully created the container. If you go into its properties, you will see its URL. Now let me show you the code for my website; this is it. I'll have to change the URL in a few places, so let me do that: it gets changed here, and my container name is hello, so that is okay. One more thing: whenever you are going to connect to your storage account, or rather your blob storage, you have to go to Access keys. Here is your key, and this is the connection string that you have to include in your code. I've copied it, and I'll paste it into the connection string in my code, Ctrl+V. And over here you have to remove the EndpointSuffix part; it is not required, and if you leave it in, it will not work. I don't think anything else is required... oh yes, I have to change the database credentials as well. The storage account part is set, you don't have to change anything else there; for MySQL, the server name has to be changed, so let's copy the server name and put it in as the hostname; our login name goes here; the password is edu1234!, which is right; the DB name is edureka, which is right; the table is image, which is right; and the field name is name. Everything seems fine. There's one more place I have to change it, so let's do that as well: copy this, paste it here, copy the username, paste it here, and everything seems fine now.
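Two of those copy-paste steps have hedged CLI equivalents, if you prefer not to dig through the portal blades (same hypothetical names as before):

    # The connection string the website code needs
    # (the demo strips the EndpointSuffix part before using it)
    az storage account show-connection-string \
      --resource-group demo-rg --name azureedu --output tsv

    # The public container the site uploads images into
    az storage container create \
      --account-name azureedu --name hello --public-access container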
Now, when I try to run the code, it will run on my localhost but it will not run on my web app, and I'll tell you why: I was getting these warnings. Basically, you have to disable the "Enforce SSL connection" setting on your MySQL server, and that would be it. Let me save that... okay, it says successfully updated; let's go back and check whether it's working. Now I'll choose a file and upload this particular image, Desert. I click on "Upload image", it takes some time to upload, and it says well done, blob upload complete, and as you can see my image is loading. Let's check whether we have it in the storage account: we go to azureedu, click on Blobs, go into the hello container, refresh it, and there is a file here named 1496742468. Now let me connect to my MySQL again: mysql -h, then the hostname, which I copy and paste; the port number, 3306; the username, which I paste in; and then the password, which is edu1234!. I'm connected to my database, I use the database I created, select from the table, and I'll show you the record. Let's compare: the blob is 1496742468, and the record says 1496742468, so the same file that was uploaded to storage has been recorded in the database. What my website is doing is fetching the file name from my database, then accessing the file in my blob storage, and displaying it as its background. Since there is only one image, it is not showing a slideshow yet, so let me choose one more file, the Koala image, and click on "Upload image". Now you can see that's the image being loaded, and if I refresh, I can see two files in the container, and in my database as well, two records. So this website seems to work fine: it is connecting to my storage account and to my database on Azure.

Now let's check whether my files have finished uploading over FTP. This might take some time, so let's wait, because everything else is set; we are just waiting for the files to be transferred, and then I'll show you how it works over there. Alright, my transfers have finished, so let's check whether my web app is working. I'll go to the dashboard, go to my web app, and click on this link. Now, like I said, you have to add the IP address of the web app in MySQL as well, so when I go here, I see an error. Also, like I said, since the index file has been updated, I delete the index file from the server here and copy the updated index file across again. Now when I refresh, it says this IP address is not allowed to access the server, so we'll copy the IP address, go to our database's Connection security, add a rule for the web app with that start IP and end IP, get rid of the spaces, and click on Save.
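A hedged CLI sketch of these last two server-side tweaks; note that a web app can present several outbound IPs, so you may need one firewall rule per address, and 40.0.0.1 below is just a placeholder:

    # Disable the 'Enforce SSL connection' setting the PHP warnings pointed at
    az mysql server update --resource-group demo-rg --name edureka-01 \
      --ssl-enforcement Disabled

    # See which outbound IPs the web app can present to MySQL
    az webapp show --resource-group demo-rg --name edureka-01 \
      --query outboundIpAddresses --output tsv

    # Whitelist an outbound IP on the database firewall (repeat per address)
    az mysql server firewall-rule create \
      --resource-group demo-rg --server-name edureka-01 --name allow-webapp \
      --start-ip-address 40.0.0.1 --end-ip-address 40.0.0.1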
Why are we doing this? Because we got an error here that this IP address is not validated with MySQL, so we have added this IP address over there. It says the security settings were updated successfully; let's refresh and check. Awesome, my website is working now, so I have successfully deployed my website in the web app. And I didn't have to configure anything on a server; I just transferred my files, and my website is up and ready, and this is the address being used to get to the website.

So this was the demo; let me get back to my slides and recap what I did. I configured my App Service with the code; I configured MySQL with the IP address of my own computer (to configure it) and the IP address of my App Service (so that the app servers can communicate with MySQL); I configured my blob storage and put its address in my PHP code; and I configured the hostname and password of the MySQL service in my PHP code as well.

Now let's talk about Azure certification. An Azure certification is a level of Microsoft cloud expertise that an individual obtains by passing one or more certification exams; it validates an individual's cloud expertise and skills. Let us move on to our next topic and see some of the benefits of Azure certification. First, with an Azure certification validating your cloud skills in a selected domain, you earn credibility, and your present or future employer will know for sure that you have worked on Azure and have the skills you mention. The next advantage is higher salary packages: Business Wire has estimated that Azure certifications have raised wages by 20 to 40 percent, and payscale.com has reported that, based on the position and job description, those accredited by Microsoft Azure can earn a salary of 128,000 US dollars per annum. With an Azure certification you can also pursue a wide range of career options: you can become a cloud architect, a developer, a solution architect, or even an AI engineer, and in addition, the certification enables you to work in different industries at different locations. The next advantage is that it provides proof of commitment: to get an Azure certification you have to commit yourself to it, sign up for a course, study, and only then pass the exam, which implies that you can commit your time and resources towards achieving your goal and that you're dedicated to improving your career objectives in the long term. These were some of the benefits of Azure certification.

Now let us move on to our next topic and see the types of Azure certification, starting with the different certifications based on job roles. First, we have the developer, who designs, builds, and tests cloud solutions. Second, the system administrator, who implements, monitors, and maintains Microsoft solutions. Third, the solution architect, who has expertise in compute, network, storage, and security. Fourth, the data engineer, who designs and implements the management, monitoring, security, and privacy of data using the full stack of data services. Fifth, the data scientist, who applies machine learning techniques to train, evaluate, and deploy models that solve business problems. Sixth, the AI engineer, who uses cognitive services, machine learning, and knowledge mining to architect and implement Microsoft AI solutions. Seventh, the DevOps engineer, who combines people, process, and technology to continuously deliver valuable products and services to the company.
Eighth, the security engineer, who implements security controls and threat protection, manages identity and access, and protects data, applications, and networks. And last, the functional consultant, who leverages Microsoft Dynamics 365 and Microsoft Power Platform to anticipate and plan for customer needs.

Microsoft Azure certifications have four levels: first the fundamentals level, second the associate level, third expert, and then specialty, where a specialty certification is based on a particular domain. Now let us move on to our next topic and look at some of the major job-role-based certifications. Before we get into those, I would like to talk about a fundamentals certification: Microsoft Certified Azure Fundamentals, or AZ-900. This certification is intended for candidates who are just beginning to work with cloud-based solutions and services, or who are new to the Azure platform. It provides an opportunity to prove your knowledge of cloud concepts, Azure services, Azure workloads, security and privacy in Azure, as well as Azure pricing and support. Candidates who take this certification should be familiar with general technology concepts, including networking, storage, compute, application support, and application development. AZ-900 is a fundamentals certification which can be used to prepare for any other Azure role-based or specialty certification, but it is not a prerequisite for any of them.

Now for some of the major role-based Azure certifications. First, the Azure Administrator Associate, or AZ-104. Candidates for this certification should have subject-matter expertise in implementing, managing, and monitoring an organization's Microsoft Azure environment. The responsibilities include implementing, managing, and monitoring identity, governance, storage, compute, and virtual networks in a cloud environment; they also have to monitor and adjust resources wherever and whenever needed, and they often serve as part of a larger team dedicated to implementing an organization's cloud infrastructure. For this certification, a candidate should have at least six months of hands-on experience administering Azure, along with a strong understanding of core Azure services, Azure workloads, security, and governance.

The next certification is the Azure Developer Associate, or AZ-204. Candidates for this certification should have subject-matter expertise in designing, building, testing, and maintaining cloud applications and services on Microsoft Azure. The responsibilities include participating in all phases of cloud development, from requirements definition and design, to development, deployment, and maintenance, to performance tuning and monitoring. Azure developers should work with cloud solution architects, cloud administrators, and clients to implement solutions. A candidate appearing for this certification should have at least one to two years of professional development experience, including experience with Microsoft Azure.

Next we have an expert-level certification, the Azure Solutions Architect Expert, or AZ-303 and AZ-304. Candidates for this certification should have subject-matter expertise in designing and implementing solutions that run on Microsoft Azure, including aspects like compute, network, storage, and security. The responsibilities include advising stakeholders and translating business requirements into secure, scalable, and reliable cloud solutions.
They should partner with cloud administrators, cloud DBAs, and clients to implement solutions. A candidate appearing for this certification should have advanced experience and knowledge of IT operations, and should know how decisions in each area affect the overall solution. These were some of the major role-based certifications.

Now let us move on to our next topic and see an overview of the exam guide. First, the exam format and question types: Microsoft continuously introduces innovative testing technologies and question types, so your exam might contain multiple choice, repeated answer choices, short answer, mark for review, drag and drop, labs, and so on. As for time limits: for associate and expert job-role exams you have 130 minutes to complete the exam; for fundamentals exams you have 60 minutes; and for associate and expert job-role exams that contain a lab you have 150 minutes. Talking about the cost, a fundamentals exam costs 99 US dollars, while associate and expert-level exams cost 165 US dollars. You can write the certification exams in English, Chinese, Korean, or Japanese, and the fundamentals certification can also be written in Spanish, German, or French.

Now let us move on to the last and most important topic: how can you prepare for the certification? First, start with the basics. If you're not 100 percent sure which certification to start with, I would recommend starting small by taking the AZ-900 Azure Fundamentals exam. This will help you understand how Microsoft exams work while not going too deep into the technology, and having experience taking a Microsoft exam will also help you focus on the actual topics and not on the testing process. The next step is to understand the exam content. The first thing to do after you decide which exam you're going to write is to see what you're asked during the exam: every Microsoft exam page lists the skills measured in the exam, and this list is very accurate and helps you focus on and study the right content. The page will also have some training and courses to prepare for the exam, and you should also understand the different question types asked, which will help you in the certification. The next step is to take hands-on learning courses on Microsoft Learn, for free. Microsoft Learn is a free learning platform for a lot of different Microsoft technologies; it provides various learning paths depending on your job role or the skills you're looking for, and most of the learning paths give you hands-on learning opportunities, so that you can develop practical skills through interactive training. The next, and most important, step is hands-on experience. The best way to learn and pass the Microsoft Azure exams is to have real hands-on experience working with the technology. While Microsoft Learn gives you some free hands-on learning modules, there is also the Azure free account, which provides you with 12 months of free Azure services, so you can work with various Azure services for free. The next step which will help you prepare for the certification is reading the Microsoft documentation; it will help you better understand the topics which might appear in the exam. As I mentioned before, read the skills measured on the exam page,
look up the specific Microsoft documentation pages, read through them, and then try things out in the tutorials. You can also refer to some books from Microsoft Press; you can find the Microsoft Azure documentation and Microsoft Press titles on the official Microsoft Azure certification page. After this step, you can also take a practice exam: some of the Azure certifications have practice exams along with them, which are very similar to the actual certification examination.

Now, about the Azure free account. You basically get many things for one year, which include 200 US dollars of credit for the first 30 days, popular services free for 12 months, more than 40 other services that are free all the time, 750 hours of virtual machines (which includes Windows and Linux), and the ability to create up to a 250 GB SQL database. Apart from this, you get other services like free dev/test labs, free load balancers, free batch jobs, a free SQL Server, and so on.

Now we'll see how to create an Azure free account. For this you require a phone number, a credit card, and a GitHub or Microsoft account. The best way is to create a Hotmail or Outlook email ID, then use that same email ID and password to sign up for the Azure free account. So firstly you sign in to your Microsoft account using your Outlook, Hotmail, or other ID, and after that you follow a couple of steps: you first fill out "About you", then you go through verification by phone, then verification by card, and finally you sign up. In the first step, "About you", you fill in your basic details: first name, last name, email ID, and phone number. The next step is identity verification by phone: you give your phone number, receive an OTP, type in the one-time password, and your phone number is verified. Once you are verified, you move on to identity verification by card: here you fill in all your credit card details, the cardholder name, the credit card number, its expiry and CVV, and your address. Once all that is done, they charge you around 2 rupees, just for your verification, and once it is deducted, you move to the next step, which is accepting the agreements: you check-mark all the agreements and finally sign up. Once all this is done, you are ready to start with Azure. That is how you create your Azure free account; it's very simple and super easy.

So now, can we create an Azure free account without a credit card? The answer is yes, you can, but for this you need to have a school account, or rather the required school email ID, so you basically log in as a student. In this case, you receive only 100 dollars of credit, and you also get various other services, like 750 hours of a Windows virtual machine, the ability to create up to a 250 GB SQL database, free machine learning workspaces, the SQL Server Developer Edition for free, and lots of AI and machine learning features. This is how you create an Azure free account without a credit card.

Now let's move on to the next topic: can we delete a credit card? Many of us have this doubt. Here in this picture you can see that I have a payment method, and the delete option is disabled. Why is this?
In this case you cannot delete the credit card, because it is the only payment method. If you really want to delete a credit card's details, you need to add one more payment method: click on the Add icon and add your new payment method, and once it is added, the delete option gets enabled, and you get access to delete either of your credit cards' details. That is how you delete a credit card's details.

Next, how to cancel a subscription. It is very easy to cancel an Azure subscription: go to the Cost Management section, then to the "Go to subscription" option, where you get all the details of your current subscription, and where you will also find the "Cancel subscription" option. Once you click on Cancel subscription, it will ask you to type the subscription name and to provide a reason for why you are canceling, so give your details accordingly and click on Cancel subscription. This is how you cancel your Azure subscription.

Now, think about how easy it is to actually carry out your business or computing processes on Microsoft Azure. What makes it this easy for you to use all these services on this platform? One of the major reasons is its portal. The Azure portal actually helps you do a lot of things that simplify your work in cloud computing, so let's go ahead and take a look at what this Azure portal does. Before I start, I would like to inform you that we will open Azure's portal and then understand what this portal is, what services it helps you with, and how it actually ties together all the stuff that the cloud has to offer. To do that, let me quickly switch to the console, and we can get started.

So what I've done is I've gone ahead and switched into the Microsoft Azure portal. For people who are completely new to cloud computing and might ask what this Azure portal is and how to get to it: the link is portal.azure.com. If you are completely new to this page, you will be required to sign in and create an account. As you can see, I already have an account here; it is a fresh account, and the reason I've chosen a fresh account is that I wanted this dashboard to be clean, so that we can go through certain pointers; however, there are certain applications running which will help us understand some pointers which otherwise we couldn't, so this is a perfect account for me. Apart from that, if you need an account, you would have to go ahead and create one: visit portal.azure.com, and it will ask you to sign in or sign up and enter certain details, like your email ID, contact details, and why you want to use this account. One pointer you need to consider: Microsoft Azure will ask you to enter your credit or debit card details. Now, do not worry, they do not charge you unnecessarily.
you unnecessarily what happens is this is a free tier account where Microsoft Azure would give you certain credit limit maybe 12 000 15 000 something like that I do not remember the exact number but it is somewhere around 10 to 15 000 in that range 10 to 15 000 Indian rupees because I am based in India right now so I'm talking about Indian rupees when you talk about a ten dollars I think it is somewhere around 200 So based on where your account exists in that local currency you would be given certain amount which you can spend for one month and use all these services for free so yes you have sufficient time to explore all the resources what Microsoft Azure does is it asks you to enter your credit card or debit card details that is once you exceed these limits you might be charged because all these Services have a certain cap on that don't mistake it for the the fact that you'd be charged because these limits are fairly high and probably you won't be charged in any way but still you would be wanting to get through the documentation as in how much would you be charged but before that I would suggest that you actually go ahead and enter in all these details create an account and once you have an account probably you can go through the pricing models as well to avoid getting build in case if you actually go ahead and exceed these limits which is very less likely to happen so that being said we have an account if you do create an account and sign in this is what your interface would look like and this is something we call it as the Microsoft Azure portal so let us try to understand what does this portal has to offer what are the components that are there now as you can see on the screen on your left hand side you have all the services that are listed now these are the most commonly used Services when I say Services most of them are domains as in these domains have more services under them so domains and the services that you can see here on this part there is an option called as create a resource using which you can actually create these resources as as well then you have this dashboard it is blue in color for now let me see whether the color changes yep as you can see the color has changed so if you double click on this dashboard the color would change so you can probably customize it to meet the needs for now let me just stick to the standard version which is this so you have the freedom of changing the color and all the stuff that happens the applications you launch you can actually go ahead and have a picture of these things on your dashboard how do you do that probably you can choose first of all this dashboard you want to use if you have multiple dashboards you can do that as you can see this is One dashboard which has certain things pinned here you can unpin those things and you can pin in all your applications to this dashboard you can click on those applications and you have an option of pinning them to the dashboard next thing that you can do here is when you actually launch an application there is this notification panel where you would be notified as in what has happened if you've launched an instance maybe and you want your virtual machine or your application that is running on top of it to be on your dashboard here you would be given an option do you want to pin it to the dashboard and you can say yes and immediately it gets pinned on the dashboard now what benefit does this particular thing serve I mean what would I get if I actually go ahead and pin it here the thing is based on 
The thing is, it depends on your business model. This is a fairly simple demo, so we do not have too many applications to talk about, but as your business grows you might be dealing with a number of servers and a number of applications, and you might want particular information readily accessible to you. So you can pin those things on your dashboard, and you get stats there; if you need certain insights, you can have a plugin for analytics which gives you reports, and it readily integrates with Power BI, a very powerful data visualization tool, using which you can get all the insights you need. So yes, this is what the dashboard is. You have all the options here: create a new dashboard, upload files, download files, edit it, share the stuff that is on your dashboard, maximize the screen size, clone dashboards, and delete them. So when you talk about dashboards, you have all these features. Now this is your Cloud Shell, guys. What does Cloud Shell do? As I've already mentioned, you have your portal, through which you can launch applications and do everything right away, but at times you might need a command prompt — you might need to script certain things. How do you do that? You click on Cloud Shell and you can work through the command-line medium as well; Cloud Shell is the command line provided by Microsoft Azure. Apart from that, you have settings, where you can go ahead and modify things — as you can see, there's the theme, which you can change here, and if there are other features you want to explore, you can go through all the settings, remove certain settings, manage private dashboards and so on; that boils down to your requirement. If you are unable to figure certain things out on your own, you can check what support has to offer, and there's feedback too: you can submit feedback on whether you're happy or not. And this is your account notification area: click on this link for changing your password and things like that; all of those are taken care of here. So guys, this is pretty much the dashboard. You can click on these services and just explore them; if you have a particular requirement you can type it here — say I type "Azure bot service" — and any service or documentation related to it pops up, and you can redirect yourself to those services from here as well. So yep, this is pretty much the interface, the portal I wanted you to see. But let's not keep it this basic; what I'm going to do now is create something. If I click on this create option, it gives me some choices — for example, it says Windows Server VM, which means it would launch a virtual machine, a Windows Server 2016 VM. Now let me tell you what a VM is, for people who are completely new. This might be basic for people who have some experience with cloud computing, and they might wonder why I'm talking about these small things, but we probably have people around who are completely new, so for them I'll get into the details of certain basic topics as well. So when you talk about a virtual machine, here is the idea.
Suppose I am running a particular application and it is supported by a particular operating system — say Windows in this case — so I want my architecture to support that operating system. But with larger applications, you might not always work in one environment; your environment might change. What cloud computing does is readily let you launch virtual machines that can deal with different environments, and yet all these virtual machines can sit under one network — they can stay connected under one VPC, or one virtual network, basically. So yes, you can run multiple environments and have multiple applications running on different platforms, and Virtual Machines, offered by Microsoft Azure, is one service that lets you do that. Then you have something called a web app, which lets you launch web applications; it is more of a platform-as-a-service kind of offering. So what is a platform-as-a-service application? Here, a complete platform is provided to you by a vendor: you just enter certain details and it does everything for you, launching virtual machines underneath. I talked about having different environments, right? When you create a virtual machine yourself on Microsoft Azure, you have to specify details like the security group, the IAM user, what network it lies under, and then you have to manually install software and run your applications on top of it. But when you use platform as a service, all of that platform is launched for you already. The VM I talked about is infrastructure as a service, where just the infrastructure is given to you and you do everything on top of it; in PaaS, your vendor does everything for you. They launch the instance in the background, and everything I mentioned — assigning security groups, IAM users, assigning it a network — is taken care of by the vendor. Even the platform you choose is set up by the vendor; all you do is put your data there. So this is platform as a service, and in Microsoft Azure, web apps are a PaaS service. When you create a web app, all you have to do is enter a name — "sample app" for me — select the subscription (for now it is the company's account that I'm using), it creates a resource group on its own as you can see, and then all I have to say is create. It has selected an operating system, and whether it uses code or a Docker image is up to it; it takes care of everything that is happening. My application is launched right away — as you can see, I have been notified already that the deployment is in progress. It might take half a minute, or a minute and a half, for this deployment to complete; all the processes I talked about are happening in the background right now, and once they complete I can go ahead and open my application. This is a very basic application, guys, and as I've already mentioned, when you have a dashboard it asks whether you want to pin your application to the dashboard. Let's hope it happens quickly — see, it says "pin to dashboard", I say yes, and there you go, it has been pinned to the dashboard. I can delete it, unpin it, do whatever I want; I can click on it and the application opens for me. So this is what the interface gives me: certain options to work with.
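As a code-level counterpart to what the portal just did, here is a minimal sketch of the same PaaS flow in Python with the Azure SDK — assuming the azure-identity and azure-mgmt-web packages are installed; the subscription ID, resource group, plan name and app name below are placeholders, not the ones from the demo:

# Minimal sketch: create an App Service plan and a web app, the PaaS flow
# shown in the portal demo. All names and the subscription ID are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.web import WebSiteManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
web_client = WebSiteManagementClient(DefaultAzureCredential(), subscription_id)

rg, location = "sample-rg", "centralus"

# The App Service plan is the managed platform the web app runs on
plan = web_client.app_service_plans.begin_create_or_update(
    rg, "sample-plan",
    {"location": location, "sku": {"name": "F1", "tier": "Free"}},
).result()

# The web app itself; Azure provisions and wires up everything underneath
app = web_client.web_apps.begin_create_or_update(
    rg, "sample-app-demo",
    {"location": location, "server_farm_id": plan.id},
).result()
print(app.default_host_name)  # e.g. sample-app-demo.azurewebsites.net

The point of the sketch is the same one the portal makes: you supply a name, a location and a plan, and the vendor handles the instance, networking and runtime underneath.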
Guys, I can check the activity log for everything that has happened in this application. If I click on this icon, it gives me information about the operation — this is the operation name, "write sites", so it has written to the site. Access control shows who gets to control this application in particular. If you need a quick-start guide, you can come here; deployment slots, you can take a look at those as well. Application settings — click on this and you get all the settings, and if you want to make a change you can do that too: the version, what runtime you are using, and so on. Configuration preview: click on this and further details are shown, such as what framework you are using. The authentication process, guys — if you click on this icon it gives details about that as well. If I scroll down, there are quite a few other things I can look at; if you are interested, just come here and explore what is happening. So this is pretty much it — the application has been launched. If you take a look at it, it gives you details: which resource group you're using, what the URL is, and where it is located — it is located in Central US, guys. See, I'm based in India right now and I've launched an application at a data center based in the USA; that is what you can do. If you click on this link, it tells you that your application is up and ready. So yes, a simple application has been created. This interface, Microsoft Azure's portal, is fairly easy and pleasant to use, and there is a lot you can do; I've covered only certain basic things, so I would suggest you go ahead and explore all these resources. If you want to take a look at the services, click on "all services" and it gives you details about all the services and resources; click here and you have all the resources available to you. You can refine your search as well, and decide which resource group you want to view your data in. For now there are quite a few resource groups running, because this is an account that belongs to the company; this one is my resource group, and I can select which resource groups to use to shortlist this data. Here's the information about all the resources: we can modify them, delete them, work on them, and do everything that is required. If you go back to the dashboard, you are taken here; if you open the app, you have all the information again — diagnostics, Application Insights; you can install plugins that give you these insights as well. One more thing you need to know: certain applications might charge you in the longer run, so once your usage is done you will want to get rid of those resources. You can just delete them: type the name of the app — it is "sample app" for me, if I'm not wrong — there you go, you can delete this application and it is removed for you. It's fairly simple, guys; the application has been deleted as well. So yes, you are free to explore this portal, and I hope you now have plenty of things to try. As far as this session goes, my aim was to introduce you to these basic things, and I feel I've done that.
In case you feel I missed out on certain things, put those in the comment section, and next time we have a session on this we can discuss them as well. Now, why does one need an Azure virtual machine? The first thing is, of course, cost saving — the pricing criteria. We can shut down and stop virtual machines when they're not in use; a stopped, deallocated virtual machine does not incur any charge, and restarting the virtual machine maintains its state, since that state is kept on persistent disks. Second is boosted scalability: we can scale the virtual machines up and down, or in and out, and an autoscale feature is there as well, driven by metrics — for example, scale out to five instances when CPU utilization is greater than 70% for more than five minutes. Third is alerts: we can trigger actions and alerts based on metrics of the computing resources consumed by the virtual machine. Fourth, we get a greater amount of control: with Azure virtual machines, developers have more control over the development environment, which is very helpful when building a highly secure architecture for a complex solution; developers can choose the operating system, networking, storage connections, etc., to build a sandbox solution. Lastly, easy diagnostics: an Azure virtual machine provides the facility to troubleshoot issues with options like remote debugging, event logs, IIS logs, application logs, etc. Going ahead, let's understand in brief what an Azure virtual machine is, and for that we will first understand what a virtual machine is. A virtual machine is a computer file, typically called an image, which behaves like an actual computer; it is a single file which contains everything, and it can run Windows, Linux, etc. This gives you the flexibility to run multiple machines on one physical computer, where each system can have a different operating system. Each of these virtual machines provides its own virtual hardware, which includes CPUs, memory, hard drives, network interfaces and other such devices. Now let's understand what an Azure virtual machine is. Azure Virtual Machine is one of the services provided by Azure to create your own instances; it can be used in various ways, like developing and testing applications in the cloud, and also as an extended data center. Remember that when you are running an Azure virtual machine you pay for the compute time on a per-minute basis; the pricing is based on the size, the operating system, and any licensed software installed on it. To avoid charges when you are not using it, change its state to stopped — that is, deallocated. Now let's look at some key features of the Azure virtual network that these virtual machines live in. First is isolation: each of these networks works independently; when creating a virtual network you can divide it into segments, and you can configure the virtual network to use your own DNS servers. Second is virtual network connectivity: virtual networks can be connected to each other, enabling resources in any virtual network to communicate with resources in any other virtual network. Third is internet communication: by default, when you launch any instance into an Azure virtual network, it can access the internet as and when needed, and you can enable inbound access to specific resources. Fourth is on-premises connectivity: a virtual network can be connected to an on-premises network, enabling resources to communicate with each other.
Then there is Azure resource communication: resources which fall under an Azure virtual network can communicate with each other using private IP addresses, irrespective of whether the resources are in different subnets, and Azure provides default routing between subnets and on-premises networks, so you don't have to configure and manage routes yourself. The next one is routing: Azure's default routing can optionally be overridden by configuring your own routes, or by propagating BGP routes through a network gateway. And the last one is traffic filtering: network traffic to resources in a virtual network can be filtered by source IP address and port. So now let's understand the Azure virtual machine architecture. Azure was initially named Project Red Dog by Microsoft, and the brain behind Project Red Dog is Mr. David Cutler. In the initial days it was a custom version of Windows, and the goal was to boot from a virtual hard disk. For that, let's first understand the main components of the Azure virtual machine. First is the fabric controller, one of the most important components in the Azure virtual machine architecture — one of the pillars on which the architecture works. The fabric controller basically owns all the resources: placement of the nodes, load balancing, scalability as per requirement, update and patch management — everything is owned by this component. As the user creates an instance of a virtual machine, all the tasks like provisioning and deprovisioning, along with the supervision of all the nodes, are taken care of by the fabric controller. In brief, we can say it is the backbone of the Azure virtual machine system. Next is patch management. The patch management workflow is a major advantage of the Azure system, as this module makes Azure easy to manage and keeps it updated across all the nodes. Hosts in the Azure system are image-based, therefore there is no need to update all the nodes one by one like in a legacy system; Azure delivers an updated VHD in one place and the entire system of nodes gets updated. A typical update takes place every four to six weeks, which ensures security and system stabilization, and Microsoft ensures the updates are tested and verified before they are rolled out to the nodes. Third is partitioning. The Azure fabric controller has a couple of partition types: update domains and fault domains. These partitions help the Azure virtual machine maintain its availability and resilience; the partitioning helps maintain zero downtime, minimum failures and constant availability. First are update domains: this is the strategy that ensures servers can be upgraded without being taken down. The Azure system distributes instances into a few update domains, where every update domain works as a logical unit; one update domain gets updated at a time, and the upgrade process completes when all the update domains are processed. Azure makes sure the upgrade completes with minimum impact on running services. Then there are fault domains: a fault domain is, in essence, a set of hardware sharing a single point of failure. The fabric controller ensures that an isolated hardware failure does not take the service down by distributing the instances across multiple fault domains; this prevents the whole system malfunctioning due to any single component failing. Then there are Azure compute stamps: Azure divides things into multiple stamps, and each stamp has its individual fabric controller. Azure has two kinds of stamps: compute stamps and storage stamps.
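To make the update-domain and fault-domain idea concrete, here is a minimal sketch of creating an availability set with explicit domain counts, assuming the azure-identity and azure-mgmt-compute packages; the subscription ID and names are placeholders:

# Minimal sketch: an availability set makes the update-domain / fault-domain
# partitioning concrete. All names and the subscription ID are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

avset = compute.availability_sets.create_or_update(
    "sample-rg", "sample-avset",
    {
        "location": "centralindia",
        # VMs placed in this set are spread across these logical partitions:
        "platform_update_domain_count": 5,  # one update domain is patched at a time
        "platform_fault_domain_count": 2,   # isolates single hardware failures
        "sku": {"name": "Aligned"},         # required when the VMs use managed disks
    },
)
print(avset.platform_update_domain_count, avset.platform_fault_domain_count)

Any VMs you then create with this availability set referenced get distributed across those update and fault domains automatically.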
The fabric controller itself is also replicated: Azure generally keeps five replicas of every controller, all in synchronization to replicate the state across those five instances. One of them is primary, and information is passed to the primary by the control plane; the primary instance then makes sure the same instruction is passed through to the secondary instances. Lastly, there are some other virtual machine components: apart from these, there are three other important parts of an Azure virtual machine — compute, storage and networking. Microsoft Azure also has the capability to monitor the health of the virtual machines in place, and if any issue is found there is an option to auto-recover; this service is known as service healing. The fabric controller has the capability to detect failure, so detecting failure and triggering auto-recovery is carried out by the fabric controller. So this was all about the main components of an Azure virtual machine; now let's understand the architecture of running a Windows virtual machine on Azure. Provisioning a virtual machine in Azure requires some additional components besides the virtual machine itself, including the networking and storage resources we just talked about. Let's go through this step by step. First is the resource group: a resource group is a logical container that holds related Azure resources; in general, group resources based on their lifetime and who will manage them. Then there is the virtual machine: you can provision a virtual machine from a list of published images, or from a custom managed image or a virtual hard disk — a VHD file — uploaded to Azure Blob storage. Azure offers many different virtual machine sizes; if you are moving an existing workload to Azure, start with the size that's the closest match to your on-premises server, then measure the performance of your actual workload in terms of CPU, memory and disk input/output operations per second, and adjust the size as needed. Generally, choose an Azure region that is closest to your internal users or customers; not all virtual machine sizes are available in all regions, so keep that in mind. Then there are disks: for the best disk input/output performance we recommend Premium storage, which stores data on solid-state drives (SSDs). Cost is based on the capacity of the provisioned disk, and IOPS and throughput also depend on disk size, so when you provision, consider all three factors: capacity, IOPS and throughput. We also recommend using managed disks: managed disks simplify disk management by handling the storage for you. Managed disks do not require a storage account; you simply specify the size and type of disk, and it is deployed as a highly available resource. The operating system disk is a virtual hard drive stored in Azure Storage, so it persists even when the host machine is down. We also recommend creating one or more data disks, which are persistent virtual hard disks used for application data; when possible, install applications on a data disk, not the operating system disk. Some legacy applications might need to install components on the C drive; in that case, you can resize the operating system disk using PowerShell. The virtual machine is also created with a temporary disk: this disk is stored on a physical drive on the host machine, it is not saved in Azure Storage, and it may be deleted during reboots and other virtual machine lifecycle events, so use this disk only for temporary data such as page or swap files.
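Since the advice above is to keep application data on a persistent data disk, here is a minimal sketch of creating a managed data disk and attaching it to an existing VM, again assuming azure-identity and azure-mgmt-compute; all names are placeholders:

# Minimal sketch: create a managed data disk and attach it to an existing VM,
# keeping application data off the OS and temporary disks. Names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg = "sample-rg"

# Managed disk: you only specify size and type; Azure handles the storage
disk = compute.disks.begin_create_or_update(
    rg, "sample-data-disk",
    {
        "location": "centralindia",
        "sku": {"name": "StandardSSD_LRS"},
        "disk_size_gb": 64,
        "creation_data": {"create_option": "Empty"},
    },
).result()

# Attach it to an existing VM as a data disk on LUN 0
vm = compute.virtual_machines.get(rg, "sample-vm")
vm.storage_profile.data_disks.append({
    "lun": 0,
    "name": disk.name,
    "create_option": "Attach",
    "managed_disk": {"id": disk.id},
})
compute.virtual_machines.begin_create_or_update(rg, vm.name, vm).result()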
Then there is the network. The networking components include the following resources. Virtual network: every virtual machine is deployed into a virtual network that can be segmented into multiple subnets. Network interface, NIC for short: the NIC enables the virtual machine to communicate with the virtual network; if you need multiple NICs for your VM, be aware that a maximum number of NICs is defined for each virtual machine size. Then there's the public IP address: a public IP address is needed to communicate with the virtual machine, for example via remote desktop (RDP); the public IP address can be dynamic or static, and the default is dynamic. Then there is the network security group, or NSG, which is used to allow or deny network traffic to virtual machines; NSGs can be associated either with subnets or with individual virtual machine instances. Then there are certain operations. That includes diagnostics: enable monitoring and diagnostics, including basic health metrics, diagnostics infrastructure logs and boot diagnostics; boot diagnostics can help you diagnose boot failure if your virtual machine gets into a non-bootable state. Create an Azure storage account to store the logs; a standard locally redundant storage account is sufficient for diagnostic logs. Then there is availability: your virtual machine may be affected by planned maintenance or unplanned downtime; you can use virtual machine reboot logs to determine whether a reboot was caused by planned maintenance, and for higher availability, deploy multiple virtual machines in an availability set. Then there are backups: to protect against accidental data loss, use the Azure Backup service to back up your virtual machines to geo-redundant storage; Azure Backup provides application-consistent backups. Then there is stopping a virtual machine: Azure makes a distinction between stopped and deallocated states. You are charged when the virtual machine status is stopped, but not when the virtual machine is deallocated. In the Azure portal, the Stop button deallocates the virtual machine; if you shut down through the operating system while logged in, the virtual machine is stopped but not deallocated, so you will still be charged. Then there is deleting a virtual machine: if you delete a virtual machine, the virtual hard drives are not deleted, which means you can safely delete the virtual machine without losing data; however, you will still be charged for storage. To delete the virtual hard drive, delete the file from Blob storage. To prevent accidental deletion, use a resource lock to lock the entire resource group, or lock individual resources such as the virtual machine. So this is all about the major components and how they work in synchronization with each other to form a virtual machine — this is how the architecture of an Azure virtual machine looks and works.
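Before we move to the demo, the stopped-versus-deallocated distinction above is easy to demonstrate in code. A minimal sketch, assuming azure-identity and azure-mgmt-compute, with placeholder names — begin_power_off leaves the VM billable, while begin_deallocate releases the compute:

# Minimal sketch: stopped vs deallocated in code. Names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg, vm = "sample-rg", "sample-vm"

# Stopped: the OS is shut down but the hardware stays reserved -> still billed
compute.virtual_machines.begin_power_off(rg, vm).result()

# Deallocated: compute resources are released -> compute billing stops
compute.virtual_machines.begin_deallocate(rg, vm).result()

# Restarting later preserves whatever state lives on the persistent managed disks
compute.virtual_machines.begin_start(rg, vm).result()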
Now that we have a theoretical as well as an architectural understanding of the Azure virtual machine, let's try our hands at creating one and connecting to it. Just go to the Azure portal — here you see all the services you have recently used — search for virtual machines, and go to Virtual Machines. You can see there's no virtual machine created right now, so just go to Create. For the subscription we have Azure Labs, and then we have to give a resource group: if you already have one, select it, or create a new one from here. For the region, I'm selecting the one nearest to me and to my consumers — always keep in mind, while selecting the region, to pick one near to you and also near your consumers. Then come the availability options as per your requirements, security type standard, and the image: Windows Server 2019 Datacenter, generation 2. Azure Spot — no, not required. For size, I can go with B1 or select any other size — you can look through all the sizes and pick any one of them; I can use B2s as well, which has two virtual CPUs and 4 GB of memory. Then I will create a username — "azureadmin" is good for me — and I will give a password as well; it should be alphanumeric, with special characters and at least one capital letter, so keep that in mind while creating the password. Passwords matched, so let's go to the next step: public inbound ports — RDP 3389 has to be selected if it's not already. Now let's go to the disks and select whichever is available for you, like SSD or standard HDD — the price also depends on which option you go with; I'm going with standard SSD right now. Next is networking: nothing is required here, though if you want to change anything as per your requirement, you can. Management: auto-shutdown, if you require it — that depends on you; right now I don't need it. Go to the next one, extensions: I am selecting an extension here for myself; again, select whichever is required for you. Go next, then tags — if you want to add any tags you can, it's up to you — and then just go to Review and Create. Here you will actually see the price; right now I'm not seeing it because I have a company-based account, but if you have a personal account you will see the total price of the virtual machine you're going to host. Here we can check all the information, the services and sub-services we have selected, and then create it. So the virtual machine is getting deployed — you can see the deployment to the resource group is in progress — and now the deployment is complete, so you can go to your resource from here. Let's see what was created: we can see a virtual machine, a public IP address, a network security group created for security purposes, then a network interface, then a storage disk, as well as the virtual network — as we understood from the virtual machine architecture, all these things are required. So let's go to the virtual machine now. You can see the status — it is running right now — and the location; you can see it's showing Central India, and you can see the public IP address as well. You can copy the public IP address and connect directly through your Remote Desktop Connection by pasting it there, or you can just go to Connect, go to RDP, and download the RDP file. One thing to remember: you need to copy the IP for that.
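As an aside before we connect: the provisioning flow we just clicked through can also be expressed in code. A minimal sketch, assuming azure-identity and azure-mgmt-compute, with a NIC that has already been created separately; the names, password and image SKU below are placeholders and may vary:

# Minimal sketch of the portal walkthrough: create a Windows Server VM.
# Assumes a NIC already exists; all names and IDs are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

vm = compute.virtual_machines.begin_create_or_update(
    "sample-rg", "sample-vm",
    {
        "location": "centralindia",
        "hardware_profile": {"vm_size": "Standard_B2s"},  # 2 vCPUs, 4 GB RAM
        "os_profile": {
            "computer_name": "sample-vm",
            "admin_username": "azureadmin",
            "admin_password": "<a-Strong-P@ssw0rd>",  # placeholder
        },
        "storage_profile": {
            "image_reference": {  # Windows Server 2019 Datacenter image
                "publisher": "MicrosoftWindowsServer",
                "offer": "WindowsServer",
                "sku": "2019-Datacenter",
                "version": "latest",
            },
            "os_disk": {
                "create_option": "FromImage",
                "managed_disk": {"storage_account_type": "StandardSSD_LRS"},
            },
        },
        "network_profile": {"network_interfaces": [
            {"id": "<existing-nic-resource-id>"}  # created beforehand
        ]},
    },
).result()
print(vm.provisioning_state)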
With the IP copied, let's open the RDP file and go to Connect. Since the RDP file was downloaded for that same IP, we don't need to paste the IP; we can just give the username we set while creating the virtual machine, and then the password credentials. So we have created the virtual machine, and now we have connected to it as well. Here is Server Manager, which has opened, and this is your newly created virtual machine. You can see the dashboard and all the options like "add roles and features" — whatever you want to add — or you can set up a local server; given all the requirements, you can configure a local server as well. Come to This PC, and here you can see all the storage you have: the temporary storage and the main program storage. So this is how we connect to the virtual machine; I hope you have understood it. Now let's stop this session: I have closed the virtual machine, and I can even stop it from here — if I go here and stop it, the billing cycle stops, you won't be charged for running the virtual machine, and through this we can save on cost. It is in progress, it is getting stopped, it takes a little while — see, the status is "deallocating", and deallocating means it's getting stopped; you can see it has successfully stopped. After that you can even delete the virtual machine. The virtual machine is getting deleted, and with this, the billing that was still running because of the virtual machine ends. You can see the virtual machine is deleted, but the other things are still there; if you want to remove the resource group as well, you can delete it completely — you just have to type the resource group name and hit delete, and all the services under it will be deleted along with the resource group. Come to Home and you can see: if I go to the VM resource, the resource is not found, because it got deleted, and there's no resource group anymore. So this is all about the virtual machine: how to create it, how to delete it, and the concepts behind it. Next are the pricing options. For pricing, you can simply search for "Azure virtual machine pricing" and see the pricing structure, how the pricing works for your virtual machine. You can select whatever operating system you want — right now I'm going with Windows OS only — and a category; right now I'm choosing general purpose only, and the pricing gets updated. You can see the B series, the Av2 series, the standard D2a series — all the standard sizes are there, and the pricing differs on that basis. Now if I choose the VM series — Windows first, then the category, general purpose, then the VM series, suppose the B series, within the region — you can then see the prices: for B1s, because we chose the B series, that is the price in dollars. Similarly, if you go with a different series — suppose a v5 series — you can see the prices change. This is how you can look at the prices and choose as per your requirement.
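To recap the cleanup steps from the demo in code, here is a minimal sketch, assuming azure-identity, azure-mgmt-compute and azure-mgmt-resource, with placeholder names. Note again that deleting the VM alone does not delete its disks or network pieces, which is why the resource group delete matters:

# Minimal sketch: deallocate, delete the VM, then delete the whole resource
# group so nothing keeps billing. All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient
from azure.mgmt.resource import ResourceManagementClient

cred, sub = DefaultAzureCredential(), "<subscription-id>"
compute = ComputeManagementClient(cred, sub)
resources = ResourceManagementClient(cred, sub)

compute.virtual_machines.begin_deallocate("sample-rg", "sample-vm").result()
compute.virtual_machines.begin_delete("sample-rg", "sample-vm").result()

# Deleting the VM leaves disks, NIC, public IP, etc. behind; deleting the
# resource group removes everything under it in one shot.
resources.resource_groups.begin_delete("sample-rg").result()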
Now let's see what Azure Kubernetes Service is. I'm not going to get into the details of this service right away, but just to give you an introduction: Azure Kubernetes Service is a fully managed container management service. It is provided by Microsoft Azure, obviously, a very popular cloud service provider — recent stats state it is one of the leading public cloud providers in the market. So Azure Kubernetes Service is managed by Microsoft Azure, and what it offers is a serverless continuous integration and deployment experience. What does that mean, and what can we do with it? We'll discuss all of that as we get into the details of Azure Kubernetes Service; we need to understand quite a few other concepts before we get to that point. But to summarize a few of its benefits: Azure Kubernetes Service offers end-to-end deployment, scalability and availability. How do these things work? We'll discuss that, but first, let's try to understand the evolution of software development and deployment and how it has changed with time. We all know how applications were built and how they've progressed. If you talk about the popular ways in which software development and deployment have evolved, we think of the waterfall model, the agile model, and DevOps — DevOps being the current way in which we not only develop but also maintain, monitor and deploy applications. Let's go through these one by one and see how they evolved. When you compare the waterfall model, the agile model and DevOps, there are three things that set them apart: the application architecture, the deployment or packaging, and finally the application infrastructure. Starting with application architecture: the waterfall model is known for its monolithic architecture — something we've moved away from; the current generation is much more into the microservices model handled by DevOps. With waterfall we had monolithic applications, which means everything was inside a single application — your servers, your application logic, your management tools, everything in one piece — and these architectures were called monolithic. With time came the agile model, where we had multi-tier applications: some functionalities of your application were handed over to another tier, and hence we had multiple tiers — for example, your application tier in one place and your database tier in another. And with DevOps, this went to an even more finely divided architecture, where we have microservices: granular parts of your application that can function independently. Now why did we need these changes — why did monolithic applications not suffice? Monolithic applications were bulky and very rigid, not easy to modify: if you had to change something, you had to change the entire thing. Agile gave you more flexibility with multiple tiers: if one of the tiers was not functioning right, we could work on that tier alone, and it would not affect the remaining functionality.
But with DevOps, or microservices in particular, the application is broken down into granular chunks: the pieces together perform as a whole, but you can think of each one as an individual part that you can operate on individually, and make changes to individually, without affecting the functioning of the entire application. When we talk about deployment and packaging, these approaches host applications in different ways. The waterfall model runs on physical servers: you have a physical server. With agile, you had virtual servers; cloud computing does, to some extent, support virtualization, but back in those days virtual servers were a little different, so the agile model relied on a deployment model based on virtual servers. Microservices, on the other hand, are based more on container-style deployment. Similarly for application infrastructure: for the waterfall model we had data centers hosting all these monolithic applications; with agile we mostly moved to hosted platforms; and with current microservices, or container infrastructure, we talk about cloud hosting. That is why we've put together this session, where we'll understand how containers work on top of the cloud — our Kubernetes orchestration tool, in this case, will be working on the Microsoft Azure cloud platform. So I believe the evolution of development and deployment is clear to you all; let's move on to the next bits. Before we understand how Kubernetes works, and why we are talking about containers, we need to understand how containers are different from virtual machines, because virtual machines are something people have been using for a long time and many have migrated to containers — both have their benefits, and there are reasons for those benefits. So let us understand how a container is different from a virtual machine; more than a comparison, it is just about seeing how they differ. This image should give you an understanding: on the left-hand side you have the container, and on the right-hand side you have the virtual machine. What a virtual machine does is let you host guest operating systems on top of your server. To give you an example, on a cloud platform like Microsoft Azure you have Azure Virtual Machines, which lets you spawn instances, and these virtual machines can have different guest OSes — say you launch one virtual machine with a Linux guest OS and another with Windows. So we are talking about two different virtual machines, or entities, which function on the same server but have different operating systems.
That means if I want an application to run on Windows, I can do that using my VM that is Windows in nature, and similarly, if I want something to run on top of Linux, I can use the virtual machine that is Linux in nature. That is what virtual machines offer you. But there's a catch: with a virtual machine, you have a server, on top of that server you have your host operating system, and on it you run a hypervisor. The hypervisor is a kind of middleware, a layer that supports everything running on top of it, including the guest operating systems we're talking about. Here is the problem: as you can see, the virtual machines on the left-hand side each have separate guest OSes, their own bins and libraries, and on top of that the applications that are running. What is the disadvantage? Each time a virtual machine is operating, there has to be a guest OS inside it, so there's a deep dependency, and that makes it difficult to break the virtual machine out of the existing infrastructure or architecture and run it somewhere else — it's a complicated process. But with containers, this is where the difference lies. Say I have a Docker engine in this case — the Docker engine is a tool that provides you containers, that supports spawning containers for you, in simple words. There are other container tools as well, and we are not here to discuss Docker specifically, but Docker is one of the most popular container tools, which is why you see Docker here. So look at the container picture on the left-hand side: there's the server, the host OS, and the Docker engine on top — the operating system here is common. When there is one common OS, all the containers running on top of it run on that same OS, so you do not have a dependency where the guest OS requirements must also be packed inside your container — your container is more independent of the operating system. The only concern is this: with containers, if you have one Docker engine running on a single host OS, you can run only the containers supported by that OS; if my OS is Windows, I cannot expect to run Linux containers there — just as an example. So there's a drawback there, but the benefit is that since your applications, your bins and your library files are independent, you can package them in a container and move them across easily, and run them on other Docker engines as well. And since you have this freedom — you're not dependent on a guest OS inside the container — you get certain benefits: your containers are faster, more reliable, and at the same time more efficient. So that is what containers are and how they differ from virtual machines, and I hope by now it's clear how these two things compare.
Now let us try to understand what Kubernetes is, because many of us know it is a container orchestration tool and we always try to compare it with Docker, which may not be the best comparison to make. Why is that? Let's understand what Kubernetes is, why we need it, and whether or not it is the same as Docker. Here are a few things that tell us why we need Kubernetes; how it differs from Docker is something I'll cover in this bit itself. Kubernetes is used for container communication — and this is where it is different from Docker. Docker is something that lets you set up containers and work with them; Kubernetes is something that lets you manage those containers, be they Docker containers or other kinds of containers. Now, why do we need something like Kubernetes to manage containers? First, containers cannot communicate with each other on their own; they need help to do that, and this is where a tool like Kubernetes comes into the picture. Second, containers can be deployed as individual applications, but just because it is easy to deploy applications thanks to containers, you cannot simply deploy your containers any way you like — there has to be a process behind it. You need to understand what containers to spawn, what to schedule, and what to deploy, and for that, Kubernetes again comes into the picture. The third bit is careful container management. What does that mean? It is similar to what we discussed about deployment: we also need to understand how container management works. At the start, when I talked about Azure Kubernetes Service, you saw certain pointers — some highlighted features on the first slide — that said it helps you with better deployment, among other things. What services like Kubernetes do is ensure that your deployment is happening properly, that you can deploy multiple applications, and that you can track whether these applications are up and functioning — whether one of them is dead and needs attention. You get that kind of management with Kubernetes. It also enables auto scaling. To give you an example, assume you have an application that runs on three servers, and the application does very well; in that case you might need to scale it. Is it easy to just go ahead and scale? No, because there's a chance you now have to manage not just three servers but a hundred. In that case, an orchestration tool like Kubernetes helps you ensure that all your containers, running across those hundred servers, are functioning properly.
This is where auto scaling comes into the picture: Kubernetes gives you the ability to auto scale your applications in situations where manual intervention isn't practical. So these are some of the reasons Kubernetes came into the picture: yes, we had containers, but these were the drawbacks containers had, and this is what something like Kubernetes lets you do. Now let's define what Kubernetes is — I've already explained most of it, so there aren't many new pointers left. Kubernetes is an open-source container management tool which automates container deployment, container scaling, and container load balancing. All the activities we discussed — managing multiple servers, ensuring all the containers work together and can communicate with each other — basically all the container management, all the orchestration, happens through Kubernetes; that is what it does. One of the benefits it offers is that it works brilliantly with all cloud setups, be they public, hybrid or on-premises, and one of those popular cloud platforms, as I've mentioned, is Microsoft Azure, which we'll discuss as we move further. What else should you know about Kubernetes? Here are some facts that might interest you. Kubernetes was developed by Google, and Google, as we know, brings in a big customer base and a big community — that in itself is a big thing, because it means Kubernetes has a huge community, and that means there's huge support and extensive documentation for you to work with. It was developed — entirely built — using a programming language called Golang. To summarize, you can think of Kubernetes as a way to put n number of containers into one logical unit, so that you can simplify managing those containers and how they work. So that is Kubernetes in a nutshell; let's now understand how the architecture of Kubernetes works and which components you should know about. This is what the basic architecture looks like: you have your APIs, your Kubernetes master, your image registry, and then your nodes, which basically hold your applications. Let's go through these one by one. Starting with the APIs, the CLI and the UI: the first thing you need to do is set up a Kubernetes cluster. A cluster is nothing but your central management point, through which you manage all your applications, or your different container-hosted applications. First you create this cluster — something you can do using the CLI — and then you, and others, interact with the cluster through an API server. The API server is what handles the authentication and identification of the people or resources interacting with your Kubernetes master, or your Kubernetes cluster.
Now, to start with the Kubernetes cluster: what you can see here is your Kubernetes master. The Kubernetes master is what manages your containers, and what you see on the left-hand side, the nodes, are your worker nodes. This is where your applications actually run: the worker nodes run your applications, while the Kubernetes master manages those applications and directs how they run. To understand this better, let's take a look at this diagram. What you see is a Kubernetes master, and inside it there are quite a few things — we'll discuss those — but on screen are some key components you need to know. Inside your Kubernetes master you have two important pieces: the replication controller and the service. A replication controller is a resource owned by the master; what it does is ensure that the requested number of pods is always running on the nodes. So the replication controller tells your worker nodes to keep the pods running — what pods are and what they do, we'll discuss in a moment; it's not very complicated. Apart from that, you have the service, which is again an object on your Kubernetes master that provides load balancing across replicated groups of pods. Now let me make sense of these two terms by first explaining nodes and pods. The nodes are your worker nodes, which hold your applications, and a node can hold more than one application. Within a worker node we organize things into pods: pods are sub-entities, logical collections of containers which need to interact with each other for the application to keep running, and inside these pods you have your containers. So that is what the infrastructure looks like: you have pods holding a logical set of containers that together make up a particular application, and you can have multiple such pods. What you see here is Docker, which tells you that the worker node runs a container engine — it could be another container tool, it need not be Docker, but as I mentioned, Docker is a popular tool, so you see it here. So what you see is a Docker-based node with multiple pods, and these pods are responsible for running your application — and who controls all these applications? Your Kubernetes master. Now assume a situation where you need to scale up a particular node: for that you have something like a load balancer, which helps you balance the load — distributing it, or, when you need scaling, helping you scale your applications as well — and then you have your replication controller, which ensures your application keeps running.
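In today's Kubernetes, the replication-controller role is usually filled by a Deployment object, and the load-balancing role by a Service object. Here is a minimal sketch using the official Python kubernetes client, assuming a working kubeconfig; the image and names are placeholders:

# Minimal sketch: a Deployment keeps the requested number of pod replicas
# running, and a Service load-balances across them. Names are placeholders.
from kubernetes import client, config

config.load_kube_config()  # reads ~/.kube/config

deployment = {
    "apiVersion": "apps/v1", "kind": "Deployment",
    "metadata": {"name": "sample-app"},
    "spec": {
        "replicas": 3,  # the controller keeps 3 pods running at all times
        "selector": {"matchLabels": {"app": "sample-app"}},
        "template": {
            "metadata": {"labels": {"app": "sample-app"}},
            "spec": {"containers": [{"name": "web", "image": "nginx:1.25",
                                     "ports": [{"containerPort": 80}]}]},
        },
    },
}
service = {
    "apiVersion": "v1", "kind": "Service",
    "metadata": {"name": "sample-app-svc"},
    "spec": {"selector": {"app": "sample-app"},
             "ports": [{"port": 80, "targetPort": 80}],
             "type": "LoadBalancer"},
}

client.AppsV1Api().create_namespaced_deployment("default", deployment)
client.CoreV1Api().create_namespaced_service("default", service)

If one of the three pods dies, the controller notices the count dropped below the requested replicas and starts a replacement — exactly the "keep the pods running" behavior described above.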
What the replication controller does is tell your node that this application has to run — and who makes it run? There's something called the kubelet. Inside every node there is another entity called the kubelet, and the kubelet is responsible for actually running your application. So this is how the Kubernetes architecture looks and functions. Now that we've understood what Kubernetes is and how it works, let's move on to Azure Kubernetes Service. To give you a brief: as I've already mentioned, it is a Kubernetes service hosted on the Azure cloud platform. Let me go back to slide one and explain it again, so we can understand how Azure Kubernetes works — we've already seen this slide, "what is Azure Kubernetes Service", but now it should make a lot more sense. What does Azure Kubernetes Service offer you? To start with, Azure is a popular cloud service provider, which means that with Microsoft Azure you have various services that help you with software development, planning and management — be it administrative practices, development practices or architectural practices related to software, all of these can be hosted on a public cloud, which in this case is Microsoft Azure. It gives you all the services needed to carry out those actions, where you can rent resources, follow a pay-as-you-go model, and use these services to suit your requirement. With a platform like Microsoft Azure, the possibilities for dealing with applications are plenty — you can do so many things — and that gives developers and application owners options to work with their applications in numerous ways and simplifies their complications. A platform like Microsoft Azure helps you orchestrate a lot of activities: managing your servers and infrastructure, load balancing, auto scaling — all of these can be controlled through the cloud platform, and that gives developers and application owners an added benefit. That is why these cloud platforms are nowadays covering different domains — big data, DevOps, analytics — all of these can be implemented on cloud platforms, because those platforms already own various markets of software development and application hosting. That is why having something like Kubernetes run on top of Azure is a benefit: some of the complications you'd find difficult to handle with Kubernetes alone can be managed by Microsoft Azure, and that is why you get these added benefits — end-to-end deployment, availability, scalability. As for some of the concrete benefits it offers: you pay only for the nodes that you use — last time I checked, Microsoft Azure did not charge you for your master node; you only get charged for the worker nodes you use. Your cluster upgrades are easier too, so you do not have to worry about the patching part and things like that.
Integration with various Azure and other tools and services also becomes easier. Not just that — you can enforce rules under Azure Policy across multiple clusters to keep them uniform if you need to; you can scale Kubernetes and add nodes using the autoscaler that Microsoft Azure supports; and you can expand your scale even further by scheduling your containers on Azure Container Instances. That is what Azure Kubernetes Service provides you with. Let us now move further in this Azure Kubernetes Service tutorial and look at some other important pointers, like how the architecture of Azure Kubernetes Service works. To understand the working of AKS, look at this diagram on the screen; it's fairly simple and shows what Azure Kubernetes Service does. We've already seen how the Kubernetes architecture works — we know what components it has — and this is a more detailed view. You again have a Kubernetes master — a master node, or control plane, you could call it — and its functionality is the same: controlling the deployment and management of your containers. Azure manages this master for you: it handles quite a few activities, while as a customer you manage only the applications; that's the support you get from Microsoft Azure. In the control plane you have the API server: it is how the underlying Kubernetes APIs are exposed, and this component provides the interaction point for your management tools, be it kubectl or the Kubernetes dashboard. Similarly, you have etcd: etcd maintains the state of your Kubernetes cluster and holds all the configuration; it is highly available and key-value based in nature. You also have a scheduler: the scheduler helps create and scale your applications by determining which nodes can run which workload, and so on. And finally you have the controller manager, which is an important bit again: it oversees a number of smaller controllers, checking whether they are performing their actions, and it also handles replicating your pods and node operations. So this is what the control plane does, and it is managed by Azure. What is the benefit of having Azure manage it? Since Azure is managing it, you do not have to worry about the configuration and setup — your cluster's master is set up by Microsoft Azure, and that is what it simplifies. The other bit is the customer-managed part: your kubelet, your kube-proxy, your containers and the container runtime. To understand this better, this diagram should draw a clearer picture for you: when you talk about the customer-managed part, there are quite a few things you need to know.
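Before we look at those pieces, creating an AKS cluster makes this split visible in code: you only describe the worker node pool, and Azure stands up the managed control plane. A minimal sketch, assuming azure-identity and azure-mgmt-containerservice, with placeholder names:

# Minimal sketch: create an AKS cluster. Azure manages the control plane;
# you define (and pay for) the worker node pool. Names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.containerservice import ContainerServiceClient

aks = ContainerServiceClient(DefaultAzureCredential(), "<subscription-id>")

cluster = aks.managed_clusters.begin_create_or_update(
    "sample-rg", "sample-aks",
    {
        "location": "centralindia",
        "dns_prefix": "sampleaks",
        "identity": {"type": "SystemAssigned"},
        "agent_pool_profiles": [{        # the customer-managed worker nodes
            "name": "nodepool1",
            "mode": "System",
            "count": 2,                  # you are billed for these nodes only
            "vm_size": "Standard_B2s",
            "enable_auto_scaling": True, # cluster autoscaler, as mentioned above
            "min_count": 2, "max_count": 5,
        }],
    },
).result()
print(cluster.fqdn)

Notice there is no API server, etcd or scheduler in the request — those are exactly the parts Azure provisions and operates for you.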
The kubelet basically lets you run your pods; your pods are run by the kubelet, that is what it does. There are other things you need to know here as well. You have the Azure virtual network interface, which takes care of all the network interface requirements you might otherwise have to handle yourself, and similarly you have kube-proxy: kube-proxy works together with that virtual network interface, or basically your Azure virtual network, and routes traffic where it needs to go. All of these reside on virtual machines: what Azure does is launch virtual machines for you, on top of which your containers reside. These containers, again, are nothing but your applications running for you, and these are the services that help you implement and control those applications. You can scale up your virtual machines to meet the requirements of your application in different ways. So this is how the working of AKS is: this is how the master node works, and this is how your customer-managed nodes, or worker nodes, work in AKS. We'll see these as we get into the demo part, but before that, let's look at a use case of how vehicle safety is supported with AKS. When we say vehicle safety, we are talking about a use case put forward by Bosch. Bosch, as we all know, is a very popular company with thousands of employees working across the globe to meet different technological needs of the world. In this particular use case, what they looked at was the number of accidents that happen due to wrong-way driving, and whether they could find a solution to help overcome this problem. So what were the problems they were facing? Because of wrong-way driving there can be a lot of accidents and unplanned casualties, but avoiding wrong-way driving using technology is a very tedious task. The reason is this: I could set up an application that notifies me of wrong-way driving, but for that to happen I need a lot of data in real time, and when I'm driving, my only way of fetching data is through cell phones and GPS. But there are problems with these: cell-phone and GPS position data is only accurate to a radius of around 5 meters, and there are limits on the range over which this data can be exchanged. You can transfer data to larger audiences, we see that through WhatsApp and the like, but with GPS-style data we are talking about many data points arriving at once, and dealing with them in parallel can be a difficult task. Because of that, what Bosch felt was that this was a nearly impossible task: getting the last piece of information out of the noisy sensor data is a problem, and developing a highly scalable and ultra-flexible service to process this data in real time can also be a tedious task. What they also figured out was that to solve this problem they needed this data in real time and within a very
short duration; if they could do that, they could actually prevent accidents and these wrong-way driving incidents. That is why they decided to work with Microsoft Azure, which was already their supporting partner, and they went ahead and accommodated Azure Kubernetes Service in their infrastructure. What this did was ensure that Bosch got repeatable, manageable clusters of containers, and since they work in such short durations, with latencies not in minutes but in seconds, it offered a simple form of service for them to work with. What that ensured was that the average time to calculate whether a driver is going the wrong way came down to around 60 milliseconds, well under a second. That is what Azure Kubernetes Service enabled them to do: the major problem they had, the time constraint, is something it helped them solve. Because of that they could come up with this wrong-way driving solution, where they used Microsoft Azure to host the data and Azure Kubernetes Service to help with simple provisioning and scaling of applications, ensuring that the real-time data is made available to the drivers within a span of 60 milliseconds, which again is a big achievement. So that is what AKS, or Azure Kubernetes Service, had to offer. If you go to Microsoft Azure's website there are many good use cases that talk about AKS, so you might want to take a look at those as well. As far as this session goes, we'll take a look at the demo and understand how we can set up an Azure Kubernetes cluster and a lot more. So guys, let's get into the demo part. This is how the Azure portal looks; I've signed into the Azure portal. For people who are new to Microsoft Azure: Azure gives you a portal where you can carry out all the cloud-related activities that concern Microsoft Azure. If you want to practice from scratch, it also offers a free-tier account where a number of services are available free of cost; or rather, in Azure's case it works a little differently from AWS's free services. With Azure you get around 200 dollars of credit that you can spend on creating resources, valid for the first 30 days, along with a set of popular services that stay free for 12 months; after that they expire. So if you do not have an account, just create a free-tier account, fill in the details, and you should have this Azure portal for your usage. In my case this is a paid account, the company's account, so it's called pay-as-you-go; in your case it would be a free-tier account, but the functionality would be similar for you as well. The only difference is that with a paid account you have more services available and you can spawn and use larger instances, because you're paying for them; under the free tier there are certain restrictions and limits, but you don't have to worry about paying. So do go ahead and register on Microsoft Azure's portal. Now, coming back to the demo: what we are going to do is create a Kubernetes cluster, and on top of that we are going to deploy an application, a basic pre-built app
by Microsoft Azure. We are not going to create a new app; I'll be sharing the code for the application, so do not worry about that. We are just going to take that pre-built code and deploy it on the Kubernetes cluster that we create. The main thing to note here is how easily the cluster is created: we are not concerned with the orchestration process at all, everything is taken care of by Microsoft Azure, and that is what we are going to look into. So let's not waste any time and quickly get an instance, or rather a cluster, in place. To do that, we click on this icon and say "Create a resource". Here you can see all the service domains on offer, so I'll click on the Containers category and then on Kubernetes Service. Once you do that, you're brought to this page with all the details: AKS manages your hosted Kubernetes environment, making it quick and easy to manage containerized applications, the main point being that you don't need container orchestration expertise. It also eliminates the burden of ongoing operations and maintenance: provisioning, upgrading, and scaling your resources happen on demand, so you don't have to worry about those things. Let's first create a resource group. For people who do not know what resource groups are: a resource group is nothing but a small container, or a folder, where you put all the things needed for your resource to work. In this case our resource is Kubernetes, so quite a few related nitty-gritties will live in that resource group. So let's create one: I'll click on "Create new" and give it a simple name I can remember, say RG-for-demo. Then I'll name the cluster; let's call it kube-cluster. Okay, that's in place. Where do I want it? If you're on a pay-as-you-go model there won't be any restrictions here; on the free tier there might be some. So just select any region that works for you; I'll pick this one. It then asks how many nodes to create; let's create one, since this is a very simple demo and we do not need more. We've discussed what nodes are, so that is what we're doing: creating one node where our application will run. Before reviewing, let's check the remaining settings: under node pools and authentication you have two options. You can let the cloud platform manage the identity and create resources accordingly (a managed identity), or you can select a service principal, the older way, where an Azure Active Directory application is used. There's no rule here; either one works. I'm choosing this one just because I thought of showing it to you; it does not have much impact on our demo. I'll say "Review + create". It now reviews all the requirements, and if they're in place it will go ahead and create the cluster and put everything in place. It says we're good to create, so I'll say Create.
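For reference, the same cluster can be created from the command line; a minimal hedged sketch, reusing the names from the portal demo (RG-for-demo, kube-cluster), with the region being an illustrative choice:

```bash
# Create the resource group, then a one-node AKS cluster with a managed
# identity; names mirror the portal demo, the region is illustrative
az group create --name RG-for-demo --location westus
az aks create \
  --resource-group RG-for-demo \
  --name kube-cluster \
  --node-count 1 \
  --enable-managed-identity \
  --generate-ssh-keys
```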
Let's refresh and see whether our deployment is done. It's still deploying, so let's wait for this deployment to finish, and once it's done we'll get things in place. After this the process is very simple: we'll put in a certain set of commands and the job will be done, so you won't have to wait long after that. Let's refresh again and see whether it's in place; normally it takes a couple of minutes, though it can take longer, but it's a very easy deployment process. Let me give you an idea of what happens while the deployment runs: first, Azure launches a Linux instance to start with; once the Linux instance is in place, the resources go on top of it, a cluster is set up on that instance with all the internals we need, and on top of that cluster our application will run. In this case it will probably also set up a cache, which could be Redis; we'll have to look into whether it's that one, and then we can implement everything on top of it. But we don't have to get into those complications; it's a fairly simple process for us. Let's refresh once more. Okay, it's done: "Your deployment is complete" is what it says, which means we are good to use our resource. It also suggests other things you can do on top of it, and we are going to do one of those, so do not worry. Just go to the resource. In this resource you can see all the information you need: where it is hosted, what kind of instance it is, where it is located, everything; one node pool is what you have. Here I have all the information to connect to it; in your case I hope this happens faster than what you see on my screen. You can see Cloud Shell is up and running, so let's now get the credentials and merge our cluster into the current context so that we can start using it. How do we do that? We use the az aks command, and in this case we say get-credentials. Next we need the resource group details, so I say --resource-group and give the resource group name; if we scroll up, the name we used is RG-for-demo. We also need the name of the cluster, which is kube-cluster. You can do it the other way around as well; you don't have to follow this order, you can give the cluster details first and then the resource group. I hit Enter, and "Merged kube-cluster as current context" in this path is what it says. Now let's get the details of the nodes. In this case we know there's only one node, but you could be working with multiple nodes, and in that case you use this command: kubectl, which is the client-side interactive tool for talking to your Kubernetes cluster (that is what kubectl is all about), followed by get nodes. If there are multiple nodes it lists multiple nodes; if there's only one node it shows just the one.
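The two commands just described, together, look like this (names assume the demo's resource group and cluster):

```bash
# Fetch credentials and merge the cluster into the local kubeconfig,
# then list the worker nodes
az aks get-credentials --resource-group RG-for-demo --name kube-cluster
kubectl get nodes
```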
There you go: you can see the one node we have, with all its details; it says it's been running for five minutes, because that is when we started the node. The next thing is to write the YAML file that holds the details of our application. How do we do that? We use nano; the application is related to voting, so I'll use that name: nano azure-vote.yaml. Now, guys, you'll see I had already pasted a piece of code here, because I tried this demo before; just to be safe I'll remove that code and paste it again. Basically this file holds all the details: which API version is being used, what the container port is, how much memory the containers get, which calls are being made, all of that is here. Do not worry, I'll share the link to this code in the description; if not, you can just search for the Azure voting-app demo or "getting started with AKS" in your browser and you'll find it, it's available everywhere, and you'll get it on GitHub as well. I do not own this code, so I will not claim any rights over it; it is published code and somebody else's work, so do check out the details online, and I'll also put the details in the description. So, everything is removed; I paste my piece of code here, and the same code is in place again. Now, how do I save and exit? It's Ctrl+X in my case, and when it asks whether to save the changes I say yes, which saves my code. I hit Enter and it takes me back to the cluster shell. The next step is to apply this YAML file, and this is how we do it: kubectl apply -f azure-vote.yaml. I hit Enter, and there you go: it has created the voting application, its front end is created and its back end is created. Now we need to see how it looks, so we need the details: if we want to watch this app working, we need its external address, and for that we use the get service command. So: get service, and the service name is azure-vote-front, because this time I want the front end, with --watch, and hit Enter. I basically made a blunder here: this command has to be called through kubectl, so I'll type the whole command again quickly, kubectl get service azure-vote-front --watch, and there you go, you have your external IP. At times, if your system is very slow, which is the case for me, it normally takes a while for the external IP to show up (the age shown here is only 85 seconds), but luckily we can see it already, so I'm just going to copy it.
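Put together, the deployment and watch steps are (the service name matches Microsoft's public voting-app sample referenced in the demo):

```bash
# Deploy the sample manifest, then watch the front-end service until
# an external IP is assigned by the Azure load balancer
kubectl apply -f azure-vote.yaml
kubectl get service azure-vote-front --watch
```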
I'll paste it in the browser, and this way we should be able to see the application we've created. You can see you can vote here: let me vote for dogs once, twice, thrice, and I can also just reset it. So guys, this is how you actually create a cluster. A suggestion here: delete all the resources afterwards, delete this one, delete the resource group, and also any Azure Active Directory entries, so that you do not get charged. If you want to see what we did here and how the performance went, Azure lets you track all of that as well: if you want the insights, click here and you'll have all those details. There you go: if I refresh, you get all the details, how long we used the cluster, when usage started, and so on. So that's it; I believe this demo is clear to you, and you got to know what we worked on, how we used these things, and how your Kubernetes cluster basically works. We've created a simple Kubernetes cluster and explored it; ensure that when you go back you delete the resources: go to the resources and delete the resource group, the cluster, and the Active Directory entries, if any. [Music] What are ARM templates? As you can see in the definition, an ARM template is a block of code that defines the infrastructure and the configuration of your project. It is a way of deploying infrastructure as code to the Azure environment: you create a project, define the objects you want, specify their properties, names, and types, and express all of it as a JSON file. Once you upload this file to Azure, the ARM API understands the whole code in the JSON file; the JSON file is then checked into source control and managed like any other code file. What ARM templates give you is the ability to roll out changes within the Azure environment: a template can contain the contents of an entire resource group, managing the whole group, or it can contain one or more resources assigned to that particular resource group. When a template is deployed, you have the option of using either complete or incremental mode. If you go with complete mode, Resource Manager deletes any objects in the resource group that do not appear within the template; so whatever you declare in the template is exactly the set of objects you'll see in that resource group afterwards. Incremental deployment, on the other hand, uses the template to add resources to an existing resource group. The benefit is that you don't lose any infrastructure that is missing from the template; the downside is that you have to clean up old resources yourself. Whatever new resources you add to the existing resource group won't delete the existing ones, so after adding new resources you need to filter out the previous ones as well. In this sense the ideal deployment mode is complete, but it does mean that you need a good automated deployment pipeline, with at least one test environment where you can validate the template, so that it doesn't rip the heart out of your production, that is, it should not ruin what you run in production.
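To make the two modes concrete, here is a hedged CLI sketch; the template file name and resource group are illustrative placeholders, not from the session:

```bash
# Incremental mode (the default): only adds/updates what the template declares
az deployment group create \
  --resource-group demo-rg \
  --template-file azuredeploy.json \
  --mode Incremental

# Complete mode: additionally deletes resources in the group that the
# template does not mention
az deployment group create \
  --resource-group demo-rg \
  --template-file azuredeploy.json \
  --mode Complete
```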
Now that we know what ARM templates are, let us see why to choose ARM templates: how are they useful to us, and what makes them special? There are a few advantages that show how ARM templates are useful in the Azure environment, and we'll look at them one by one. First is declarative syntax. What this means is that an ARM template allows you to create and deploy your entire Azure infrastructure declaratively: you declare all your resources in the JSON file, your needs, your properties, everything, and Azure deploys it exactly as declared. For example, you can deploy not only virtual machines but also network infrastructure, storage systems, and any other resources you need within the Azure environment, and you don't have to go one by one, first creating a virtual machine and then going to the storage account; you can create it all at once. Next is repeatable results. You can repeatedly deploy your infrastructure throughout the development life cycle and have confidence that your resources are deployed in a consistent manner. It's not that you can only deploy once: you can deploy more than one time, and you get consistent results out of it; whenever you deploy, you get the same result. The templates are idempotent, which means you can deploy the same template many times and get the same resource types in the same state. So if you have created a template, you can deploy it repeatedly and every deployment gives the same outcome; you can develop one template that represents the desired state rather than developing lots of separate templates to represent updates. Next is orchestration. This means you don't have to worry about the complexities of ordering operations: Resource Manager orchestrates the deployment of interdependent resources so they are created in the correct order. You don't have to think about which resource should be deployed first and which second; Resource Manager reads the code and works out the order itself. When possible, Resource Manager deploys resources in parallel, so your deployment finishes faster than a serial deployment; it multitasks, deploying multiple resources at once so you get faster results. And you don't need many commands for it: you deploy the template through one command rather than through multiple imperative commands; you just run one command and all the other tasks are done for you. Next is modular files. You can break your templates into smaller, reusable components and link them together at deployment time; you can also nest one template inside another. So you can break these templates down into parts and then link them, building a proper chain of how your deployment should proceed.
Next, you can create any Azure resource: as we said, you can immediately use new Azure resources and features in your templates. As soon as a resource provider introduces a new resource, you can deploy it through these templates; you don't have to wait for tools or modules to be updated before using the new services. If you want a particular service included in your template, the template language can address that service and make it ready to deploy. Next is testing. You can make sure your template follows recommended guidelines by testing it with the ARM template test toolkit; a sketch of running it follows below. What it does is help you check whether your whole project follows the requirements and best practices. The test kit is a PowerShell script that you can download from GitHub, and the main benefit of this toolkit is that it makes it easier for you to develop expertise with the template language itself. Everything is within the ARM ecosystem; you don't need a separate testing toolkit for it. Alright, one of our folks, named Rohan, has asked: is there any other toolkit available? As I said, Rohan, the testing support sits alongside Resource Manager itself: ARM has its own toolkit, and when you deploy a template through the ARM API, the template is validated as part of the deployment process. So you don't need an extra toolkit, because ARM is an all-in-one thing: it manages your services and your resources, whatever you deploy into Azure, and once a template is deployed you can also verify that the particular resource is working smoothly. I hope that is clear.
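Here is that sketch: a minimal, hedged example of running the toolkit, assuming the arm-ttk module has been cloned from its GitHub repository and that your templates sit in a local folder:

```powershell
# Import the arm-ttk module (assumed cloned from GitHub) and run its
# built-in best-practice tests against the templates in a folder
Import-Module ./arm-ttk/arm-ttk.psd1
Test-AzTemplate -TemplatePath ./my-template-folder
```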
Moving ahead, we have deployment blueprints. Before we see the advantage of these blueprints, let us understand what blueprints basically are. Just as blueprints allow an engineer or an architect to sketch a project's design and parameters, Azure Blueprints enables cloud architects or central information technology groups to define a repeatable set of Azure resources that implements and adheres to an organization's standards, patterns, and requirements. Blueprints make it possible for development teams to rapidly build and stand up new environments, with the confidence that they are building within organizational compliance, using a set of built-in components such as networking, which speeds up development and delivery. It is basically a declarative way to orchestrate the deployment of various resource templates and other artifacts such as role assignments, policy assignments, ARM templates themselves, and resource groups. The advantage of deploying blueprints is that you can meet regulatory and compliance standards: these blueprints include pre-built templates for various architectures, which helps you organize your work as well as speed up your deployment rate. Next is tracking deployments. In the Azure portal you can review the deployment history and get information about a template deployment: you can see the template that was deployed, the parameter values passed in, and any output values; other infrastructure-as-code services aren't tracked through the portal like this. So you can track the deployment history, get to know which template was deployed, what the parameters under that particular template were, and what output came from it; you keep track of each and every deployment. Next is exportable code: you can get a template for an existing resource group by either exporting the current state of the resource group or viewing the template used for a particular deployment, and viewing this exported template is a helpful way to learn the template syntax. Lastly, policy as code: Azure Policy, as we know, is a policy-as-code framework to automate governance, and if you use Azure policies, policy remediation is performed on non-compliant resources when they are deployed through a template. So these are some of the key benefits of ARM templates that make them so useful within the Azure environment. Now let us understand what templates actually are. Basically, a template is a JavaScript Object Notation, or JSON, file that defines the infrastructure and configuration of your project. As we discussed before, to create a particular object, or to run a particular resource, you specify which services you want to use, along with their parameters, properties, and names, all in a JSON file, and then you upload it; that JSON file is nothing but a template that is run by ARM, the Azure Resource Manager. The template uses declarative syntax, which lets you state what you intend to deploy without having to write the sequence of programming commands to create it: in the template you specify the resources to deploy and the properties for those resources. Now that we have understood what templates are, let us look at template design. How you define templates and resource groups is entirely up to you and how you want to manage your solution. For example, you can deploy a three-tier application through a single template, as you can see in this diagram: in this one template you deploy three kinds of resources, virtual machines, app services, and a SQL database, and it can all be done within one resource group. You don't have to define your entire infrastructure in a single template, though; often it makes sense to divide your deployment requirements into a set of targeted, purpose-specific templates, which you can then easily reuse for different solutions. In this other diagram you can see a three-tier solution deployed through a parent template in nested form: under the parent template there are sub-templates, one for the virtual machine, a second for the app services, and a third for the SQL database. So you can build it in a nested format; it depends on you how you design your templates, and you can create different resource groups or add everything into a single resource group.
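As a rough illustration of the nesting just described, here is a hedged sketch of a parent template that embeds one nested deployment; the resource names and API versions are assumptions for illustration, not from the session:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.Resources/deployments",
      "apiVersion": "2021-04-01",
      "name": "nestedStorageTier",
      "properties": {
        "mode": "Incremental",
        "template": {
          "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
          "contentVersion": "1.0.0.0",
          "resources": [
            {
              "type": "Microsoft.Storage/storageAccounts",
              "apiVersion": "2022-09-01",
              "name": "armdemostorage123",
              "location": "[resourceGroup().location]",
              "sku": { "name": "Standard_LRS" },
              "kind": "StorageV2"
            }
          ]
        }
      }
    }
  ]
}
```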
Now let us understand how ARM templates work. As you can see in this diagram, ARM templates work at a hierarchical scope level. As we know, there are different ways to create resources, the Azure portal, Azure PowerShell, and the Azure CLI, and apart from the Azure environment we also have third-party clients, that is, REST clients. These are the different ways through which you can create your resources and your resource groups. Whenever you create a resource group, or opt for a new service in the Azure environment, what happens in the back end is that a JSON file gets created, and that is nothing but an ARM template. As we discussed, an ARM template is all about the JSON format and syntax: whatever resources you create, or whatever services you opt for within the Azure environment, get expressed in the back end as a JSON file, and that JSON file then goes to the Azure Resource Manager API. What the Azure Resource Manager API does is extract that file and understand the requirements in that particular template. Since opting for any resource or service from Azure always requires a subscription, it checks whether you are subscribed, and based on your subscription it allows the particular resources to be deployed. Once your subscription is authenticated, resource providers come in: in the ARM template we specify which services we want to opt for, or which resources we require within the Azure environment, so the request goes to a resource provider, within which multiple objects, that is our services, can be provisioned, for example virtual machines, app services, storage accounts, and SQL. This is how ARM templates, and the whole Azure Resource Manager, work. In a gist: whatever resource you try to create in the Azure portal, through PowerShell, through the CLI, or from any third-party application has a template behind it in the back end, which is then extracted by the ARM, the Azure Resource Manager API, which processes that template for deployment based on your subscriptions. Now we know how templates work: they use JavaScript Object Notation syntax, which also includes some advanced capabilities, and within this syntax we have different components. The first component is parameters. What are parameters? Parameters allow you to pass different values to the ARM template for use during deployment; common examples include the names of resources, or which Azure region to host them in. What they do is make your templates more dynamic and usable across different environments. Parameters require a name and a type: you can use strings, arrays, objects, integers, booleans, or a secure string for something like a password. Optionally, parameters can also contain a description of how to use them, and you can include default values so that you do not need to provide one at runtime; you can also configure a parameter with a set of allowed values, which is helpful when you want to limit which SKUs or locations a person can deploy resources to. The next component is functions. In an ARM template, functions allow you to create complicated expressions that you don't want to repeat throughout the template. Much like in other programming languages, where you call on many different functions as you code, you have a whole list of built-in functions available to you.
Here you can decide when to run them, pass information to them, and expect a return value back. For example, say you need a unique name for a resource: instead of copying and pasting the same code everywhere to generate the unique name, you can create a user-defined function that builds it, for instance a function called uniqueName, with a parameter named prefix, that returns a unique name using the resource group ID. Note the namespace value here: this value can be anything you want, because user-defined functions require a distinct namespace value to avoid naming conflicts with the regular template functions. Next is variables. Variables are not much different in ARM templates from what you find in other programming languages: variables contain values that are used repeatedly throughout the template. Like functions, you can use variables to build complicated expressions so that you don't have to repeat them through the template, and like parameters, variables have the usual data types such as strings, objects, and integers. You define variables using the colon as the assignment operator in JSON; for example, instead of passing the Azure region as a parameter, you can define it as a variable: you just write down the location and specify whichever region you need. Next are resources. The resources section defines the Azure resources to deploy within the particular template, and a resource can be anything, from something as small as a network security group all the way up to a virtual machine, a storage account, functions, or any other service within the Azure environment. When you specify a resource, you give a couple of pieces of information for each part: you specify the name of the resource, and then the type of resource to deploy, referring to the high-level family of the resource, for example Microsoft.Compute, Microsoft.Storage, or Microsoft.Network; this helps map the resources to deploy more precisely. You also specify the API version, which pins down the configuration schema of that particular resource, and you can specify dependencies: dependencies determine the order in which Azure should deploy the resources. For example, if an ARM template deploys a virtual machine and a virtual network, the virtual network must exist first, before the virtual machine is created. Then you fill in the properties: the properties contain the configuration information for the deployment. And last is the output: the outputs section defines values and information returned from the deployment, which is helpful for data that Azure generates dynamically, such as a public IP address. These are the key components used in ARM templates, and they will help you work with Azure Resource Manager.
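Pulling those components together, here is a hedged, minimal template sketch; the storage prefix, variable value, and API version are illustrative assumptions, and the uniqueName function follows the pattern just described:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storagePrefix": {
      "type": "string",
      "maxLength": 11,
      "metadata": { "description": "Prefix for the storage account name" }
    }
  },
  "variables": {
    "location": "westus"
  },
  "functions": [
    {
      "namespace": "demo",
      "members": {
        "uniqueName": {
          "parameters": [ { "name": "prefix", "type": "string" } ],
          "output": {
            "type": "string",
            "value": "[concat(toLower(parameters('prefix')), uniqueString(resourceGroup().id))]"
          }
        }
      }
    }
  ],
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2022-09-01",
      "name": "[demo.uniqueName(parameters('storagePrefix'))]",
      "location": "[variables('location')]",
      "sku": { "name": "Standard_LRS" },
      "kind": "StorageV2"
    }
  ],
  "outputs": {
    "storageAccountName": {
      "type": "string",
      "value": "[demo.uniqueName(parameters('storagePrefix'))]"
    }
  }
}
```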
Now that we have an overview of ARM templates and all the key concepts related to them, let's have a quick demo and see how they actually work. For that, let's quickly move to the Azure portal and sign in. This is our Azure portal. As we discussed, there are different ways to work: you can use the portal itself to create your services; as you can see, there are different services you can opt for, like creating a resource group, a service bus, virtual machines, application gateways, etc. You also have the CLI, that is, Cloud Shell, so you can run commands and create your resource group that way. Let's see how these templates get created when we create a resource group or any resource through the Azure portal. Let's quickly create a resource group first; as you can see, we don't have any resource group yet. We'll give it a name; since this is about ARM, let's name it arm-demo123, and you can choose any region, it's up to you based on your requirements. I'll take West US because that region is freer. Let's review and create; the validation has passed, so let's create. As you can see here, the resource group has been created, and the arm-demo group now shows up, so let's go inside. Now that we have created a resource group, if we go into its properties, that is, we click on this resource group and go to "Export template", you can see the template that has been generated. This is a snippet of code showing whatever object you have created, and under that, the parameters and the variables you would add; this happens in the back end whenever you create any resource or resource group. One of our folks, Tejaswini, has a doubt about how these templates get created. As discussed before, whenever you create anything in Azure, no matter whether it is a resource group or a service you are using, and whether you create it through the Azure portal, PowerShell, or the CLI, a JSON file is automatically created in the back end, and that JSON file becomes the template. It is formed in the back-end process so it can go to the ARM API, the Azure Resource Manager API, which then processes the rest of the deployment accordingly. So it is formed automatically: whenever you create any resource, or apply for and utilize any service, a template is created, that template goes to the ARM, and from there the whole deployment process starts. I hope you got an answer to your question. So, as we have seen, we created a resource group, and if we go under its overview and into "Export template", we find the template, the JSON file; you can even download it or send it for deployment directly. Now what we'll do is create a resource within this resource group. You can pick any resource; I'll take a storage account. Let's create a storage account: our resource group is arm-demo123, now let's name our storage account and change our location. The performance tier will be Standard, because our subscription is free and this is only for demo purposes, so we'll keep everything as it is and just review and create. Here, as you see, it says it is initializing the template deployment to the resource group; as I said, whatever resource you create, it automatically creates a template of that resource within that resource group. Once it gets deployed we'll see the overview, the inputs and outputs, and the template.
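Incidentally, the same "Export template" view can be fetched from the CLI; a hedged sketch, assuming the demo's resource group name:

```bash
# Export the current state of the resource group as an ARM template
# (the group name assumes the demo's arm-demo123)
az group export --name arm-demo123 > exported-template.json
```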
We'll just get into this, and you see: this is the ARM template of our storage account. This is how your ARM templates are created. [Music] Let's look at what Azure Active Directory exactly is, starting with the definition. The definition we have is this: it is Microsoft's multi-tenant, cloud-based directory and identity management service that combines core directory services, application access management, and identity protection into a single solution. There are many terms in there and many things to understand, so let me simplify this definition with an example so you understand it better. Think of it this way: suppose I am a service vendor and I reside on a cloud, meaning I have a particular application that runs on cloud. This service is used by quite a few customers, plus there are quite a few organizations I have to interact with. In this case all these customers and organizations have to communicate with me, and how do they do it? They would go ahead and create accounts, or have some user IDs, through which they can communicate with me. This is fine if the number is countable and manageable. But suppose we had a situation with a huge number of people, constantly increasing, say somewhere around one or two thousand people: managing that many logins and credentials can be a huge problem. Let me give you an example of how this happens and what problems you normally face. Suppose I have ten organizations: giving access to all ten can be a huge problem, because these organizations might need different kinds of access, and based on that I have to set up different security protocols as well. If certain organizations have an easier protocol, what if they get access to data that other organizations are supposed to keep to themselves? That would be a huge problem. Apart from that, I might have any number of customers, and keeping track of so many customers is again a problem when it comes to creating credentials. So all this is a huge burden, and what Microsoft Azure did was create something called Azure Active Directory. It acts as a middleware that takes care of all the sign-ins and related concerns. Users get a single sign-on process, meaning they sign in only once, and then they have access to the applications I provide them. This intermediary, the active directory, federates all the responsibilities of handling access: whichever way I set the rules, it incorporates those rules and accordingly gives access to all the users, thus simplifying all the complexities I would otherwise face. So this is what Azure Active Directory is: it simplifies the sign-in and user authentication, or identification, processes. As we move further we'll discuss quite a few other terms and you'll get a clearer picture of what I'm saying; meanwhile, bear with me and let's move on to the other points we need to talk about. Yes, I did miss out on one point: Microsoft Azure also gives you
a platform where developers can build applications with a lot more ease; again, as we move further we'll understand this point as well. [Music] So what is the exact difference between Windows AD and Azure AD? Let's try to understand that too. When you talk about Windows AD, the classic Active Directory, these are the layers it has to take care of: Domain Services, Lightweight Directory Services, Federation Services, Certificate Services, and Rights Management Services as well. That is a lot to take care of. Azure's active directory combines all these layers into two. First you have Windows Azure Active Directory, which takes care of all the services that revolve around identity: when you talk about identity management, this is the part that handles it, that is your WAAD. Then you have the other part, where you actually communicate with other organizations; I gave you the example of ten different organizations needing ten different things, and federating all those organizations is what Windows Azure Access Control Services takes care of. Both these so-called active directories serve more or less similar purposes, but the approach they take is completely different. The classic Active Directory has a more layered approach, where every service gets its own layer and its own way of being handled, whereas Microsoft's Azure Active Directory simplifies things: the first layer takes care of most things, and the rest is handled by Windows Azure Access Control Service. Also, the classic Active Directory uses LDAP for its communications, while Azure Active Directory uses REST APIs; again, the approach is completely different.
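As a side note on the REST-based approach just mentioned, here is a hedged sketch of querying directory users through Microsoft Graph, Azure AD's REST surface; the access token is assumed to have been acquired already:

```bash
# List directory users over REST via Microsoft Graph; $TOKEN is an assumed,
# already-acquired OAuth access token for the tenant
curl -H "Authorization: Bearer $TOKEN" \
     "https://graph.microsoft.com/v1.0/users"
```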
[Music] So which audiences does this directory cater to? First and foremost, we have IT admins. For IT admins, Azure Active Directory provides single sign-on across applications: there are quite a few SaaS (software as a service) applications and various on-premise applications to which you get a single sign-on, so you do not have to log in every now and then. I've worked in quite a few places where you have to log in again and again, and it's a lot of trouble remembering so many passwords and logins; that is something Azure takes care of, and single sign-on is very convenient. Apart from that, it ensures strong identification, with quite a few processes behind it; I won't discuss those in detail, but when you talk about identification, Microsoft Azure ensures that it happens in a very good manner, and it automates quite a few processes, again easing things up. It also caters to developers: since sign-on becomes easier, application developers can focus on building applications, and since they have access to so many organizations and resources, application development definitely becomes easier. Next, online customers: people who have been working in this space for a while might know things like Office 365, or the CRM services. You had access to all these things through your Windows Active Directory, but Azure Active Directory also gives you access to them; that means if you are using, or have an account on, any one of these, you can have access to all these services, or rather to all the active directory services Azure provides. So it caters to the needs of various online customers as well. [Music] Let us try to understand the next point, Azure Active Directory editions. For that, I'm going to switch to Microsoft's web page: I'll switch to my browser, move to their website, and talk about these points there. Actually, what I'll do is first discuss the last point on the agenda, and before we take a look at the demo we'll cover the editions, so that we can then switch directly into the demo part. So let's move further, understand the next point, and then come back to this one. [Music] So what are tenants, basically? When you talk about a tenant, it is nothing but an organization. I just mentioned that a particular application might cater to tens of organizations: all those organizations are treated as tenants, and these tenants can have access to a particular active directory, or to more than one active directory. As we get into the demo I'll talk about how to create multiple directories as well; yes, we can have more than one active directory, and we'll discuss this in the demo, but for now just follow the points I'm making. What happens here is this: a tenant, first and foremost, is an organization, and it is a dedicated instance of the Azure Active Directory service. These are isolated instances: as I've mentioned, if we have five or ten organizations, you'll have an isolated instance for each of them, ensuring that they stay apart and that their services and protocols are maintained separately. This is where Azure Active Directory steps in: it takes care of all these things, it ensures that nothing is ambiguous or intermixed, everything stays separate, and each and every organization gets serviced equally well. As we move further we'll create users, and then I'll show the differences, what a tenant is, how you create domains, and all those things; so in the demo part you'll understand these topics with a little more clarity. [Music] Now for the demo part; but before that, as I've already mentioned, let me quickly switch to the Microsoft Azure website and discuss the editions you can choose from, and then we can jump directly into the demo. Okay guys, this is a Microsoft doc that talks about choosing an edition. These are the options you have; we'll finish this quickly and then switch to the demo. Microsoft Azure gives you various options: first and foremost, you have three paid options to pick from, and of these, the first is the Basic option, Active Directory Basic; then you have Premium P1; and then one more, Premium P2. All
of these provide you with different capabilities. First and foremost, your main job is access: identity management, security, single sign-on, and so on. Some of these services come with the Basic account and also with the free account. For people who are completely new to this session and to Microsoft Azure, let me tell you that you get a free sign-up for a Microsoft account: you can create your account there and avail yourself of these services free of charge for a certain duration. So yes, you do not have to pay anything there; you have a free account with access to some of these services, but if you need advanced services, you have to pay, and for that you have three options: Basic, Premium P1, and Premium P2. Let's try to understand these one by one, what they are and what they have to offer. If you scroll down and take a look, you have Azure Active Directory Basic. Basic is designed for task workers, people focused on a particular cloud application. It takes care of the essentials: single sign-on, SLAs ensuring 99.9 percent availability, and the features you see here, such as self-service password resets, along with access to things like application proxies. I won't get into the details of what proxies are, but people who are admins, or who have worked in these domains, will understand what these things mean. So you have access to all of that under the Basic option. Apart from that, you have Premium P1, which is for people who want to scale up. When you try to scale up, you deal with quite a few things, and terms like IAM come into the picture; IAM is nothing but identity and access management, a very important concept when you talk about active directories. So P1 provides you with those facilities as well: identity protection, security in the cloud, and so on; everything is taken care of in this particular tier. When you talk about Premium P2, if I scroll down, this is what you have: it is designed for more advanced protection, which means you get all the services provided in Basic and P1, plus some additional services that ensure more security; it focuses particularly on privileged identity management. Again, this is something you can read and understand easily, but to give you the basic difference: the first option gives you basic active directory service access, the Premium P1 tier is focused on scaling up, and P2 focuses more on advanced security. These are the three editions you can choose from, so if you belong to a particular organization and want to use these services, you can read through all of this and take a decision accordingly. Now what I'm going to do is quickly switch to the demo part; for that I need to go ahead
and open my Microsoft Azure account, so let's do that. Well, my internet is a bit slow today, so it might take a little longer than normal. I can click here on Portal, and there you go: it asks me to sign in. I'll be using a dummy account today for this demo; I wanted to give you a view of quite a few directories, and that is why I went ahead and created an account with certain active directories set up. This is how the Azure portal looks, for people who are completely new: you have your dashboard here, and you can create quite a few things, virtual machines, data factories, and so on. People who want to know about those can refer to the other videos in this series. As far as this session goes, we are here to talk about active directory, so let's head into it and understand how you create active directories and everything around them. Now, how do you navigate to a particular active directory? If you scroll down here, you'll see an Active Directory entry; people who have a Microsoft Azure account will have an active directory by default. You just come here and click on it, and a dashboard opens up. This is how it looks: you have an overview, getting started, users and groups which you can manage and monitor, devices you can connect, and your various app registrations. As I've told you, you can manage multiple applications as well: which applications a particular user should have access to, which domains they can reach, which devices are configured; all of this can be controlled and managed from here. That is what your so-called active directory does. Now let's move further and create some users. How do we do that? I can click on this icon here and it gives the list of users that already exist; as you can see there are quite a few users here. This is a demo account, so we went ahead and created users so that you could have a look at them; this one, Chris Pratt, is something I created yesterday. So how do you create an account? To start, you click on "New user" and this window opens up. Now, let me go back and show you something first. First and foremost, you need to give a username, or the name of the person for whom you want to create a user; apart from that, you need a particular domain name for a domain service. How do we get that? These domain services have to be registered with your Azure Active Directory. I have these accounts here, so I can use one of them to create a user; suppose I want the user to be associated with this ID, so I select this domain-service extension. So, again, click on "New user". What name should I give this user? I am a huge cricket fan, and I recently watched England's match, so let's pick the name of one of the players who
Sam Billings it is. Again, this is where I'll create the user: I enter the name and give it the domain details, that is, edureka.tk. This domain is already configured, which is why I can use it and create an account. If I used an ID that was not registered with this Azure account, I wouldn't have been able to create this user; it would have given me an error. As we move further we'll take a look at that as well, but for now let's create a legitimate user. It verifies whether the username is proper or not; the display name you can set any way you want, but your username has to be legitimate and valid. So I have these details entered, configuration is not required, and properties can stay default. If I have to assign him a role, I can click here; you can see the name is verified as well. Let's make him a Global Admin, maybe. You are also given a password here; if you click 'Show', it shows you the password, and I would suggest that you note it down, because you will be required to log in later and you might need this password. I say OK here, and I create the user. It might take a while, because at times certain things take a while, but in this case it has happened pretty quickly. As you can see, we have gone ahead and created a user; his name was Sam Billings, if I'm not wrong. So there you have this account, and if you click on it and open it, you can access that account and enter other details: which applications are there under this user, which applications you want to assign, which devices you want to configure, and so on. If you scroll down, you have some other options as well, sign-ins and audit logs; I won't get into the details of those, but you can assign all of these to this particular user too. So this is what the user looks like, and you can actually go ahead and log into the Azure account through this user profile as well. Let's do that: I'm going to open an incognito window and log in as this user. Now, if I try to log in, I have to enter the details. I have actually tried logging in before, but I've forgotten the credential details, so let me just quickly switch to the other window, copy this email ID, and switch back here. So this is the email ID, I hit next, and my password was... I hope it is right. Yes. When you log in for the first time, it asks you to enter the current password and then set a new password, and then you re-enter the password. There you go. When you sign in, you enter this portal as a fresh user; see, I'm a completely new user, and it asks whether I want to start a tour, but I don't want to do that, so I'll just say 'maybe later'. There you go: you have your fresh dashboard, there is nothing pinned here, and everything is completely new. So you have entered as a completely new user, and this is the active directory I'm assigned to, my previous active directory; as you can see, this is what we have here. So yes, as a user I have certain privileges and I can access this portal. This is something I wanted you all to see; there are quite a few other things which we are going to take a look at.
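As a side note, the same user creation can be scripted with the Azure CLI. Here is a minimal sketch, assuming the Azure CLI is installed and logged in; the display name, UPN, and password are placeholders mirroring the portal demo, and the domain must already be verified in the tenant:

```bash
# Sketch: create an Azure AD user from the CLI (values are placeholders
# mirroring the portal demo; the UPN domain must be verified in the tenant)
az ad user create \
  --display-name "Sam Billings" \
  --user-principal-name "sambillings@edureka.tk" \
  --password "<initial-password>" \
  --force-change-password-next-sign-in true

# Confirm the account exists
az ad user list --query "[].userPrincipalName" -o table
```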
But for now, let me just log out and close this tab. I'm back here. The other thing I want you to understand is this: if I come here, you can see we have certain users, and if you take a look at their email IDs, these are quite long email IDs, right? What happens is, when you go ahead and register your domain service with your Microsoft Azure account and then create users, you would not want such long usernames, say, for example, vishal@microsoft-something-something; that can get long, and it is complicated to handle or manage. So instead, you can provide them with pseudo identities, or pseudo IDs, so that the process becomes easier and simpler to handle. Let's try that and see how we can do it: can we just assign a particular domain name, or domain service, when we go ahead and create a new user? In order to add a particular domain, what you have to do is go ahead and, where is my active directory, here it is, scroll down, and you can see 'Custom domain names', where you can actually add domain names. But there are certain catches to it; let's try to understand those. It asks me to enter a custom domain name, so I'll say 'demodomain', maybe, and give it some extension. Now, let me tell you that this is a demo practice and Azure won't accept this particular domain name; I'll tell you why, but first let's just try to add this domain. The domain name is added, but as you can see, it says: to use demodomain with Azure AD, create a new TXT record with your domain name registrar using the info below. So if I select TXT, I need to copy this part and actually go ahead and add it to that domain's DNS records. I won't be doing that here, because that is not something we are discussing, and for it we would need an actual domain, which I do not have with me right now. If you do go ahead and try to add a particular domain name, you need to own that domain. Suppose I have an organization with a website or a domain name, say xyz.com; I need to make sure that I have access to that domain, and then I need to attach this TXT record to it, to confirm to Microsoft Azure that yes, I do have access to that particular domain. Only then can I use it with my Microsoft Azure. Now, if I click here on 'Verify', it will give me an error, which I'm very sure of; see: could not find the DNS record for this domain, DNS changes may take up to 72 hours to propagate. That means I have 72 hours to go ahead and add this TXT record to that domain, so that I can verify that this domain is legitimate. But in this case it isn't; I just took something for the demo purpose, for reference's sake, as an example of a domain I might have or could use. So this is how you add a particular domain, and after that you can create a user on it as well: once you register the domain, you just follow the same process of creating a user that we did in the previous case.
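To make the verification step concrete: the portal hands you a TXT value that you add at your DNS registrar, and you can check propagation yourself before clicking Verify. A small sketch; the MS=... value and domain below are made-up examples, the real value comes from the custom domain blade:

```bash
# Example TXT record to add at your registrar (value is illustrative only;
# Azure shows you the real one on the custom domain blade):
#   Name: @    Type: TXT    Value: MS=ms12345678    TTL: 3600

# Check whether the record has propagated before clicking Verify
dig +short TXT yourdomain.com
# or, on Windows:
nslookup -type=TXT yourdomain.com
```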
That previous case worked because the domain was registered: when I used edureka.tk, it was registered with my Azure account, so I could go ahead and create that particular user. In this case I cannot. But yes, if you do go ahead and register a particular domain, make sure that it is valid and in use, and then you can register it by following this process. That is how you do all these things. Now, one more point to consider: when you have a domain which is not registered and you create a user on it, Microsoft Azure will let you create that user, but the access that user has is that of a guest user, because Microsoft isn't sure that the domain you just entered is actually registered, or something that you can actually use. Now, let me quickly go back to my active directory and see if there are any other points I need to discuss with you, or anything we have messed up on. What I'm going to talk about next is creating an active directory, or rather: can we create multiple active directories? If you ask me, I would say yes, definitely, you can create multiple directories. As you see here, if I go to a particular directory, I have an option called 'Switch directory'. If I click here, I have certain options from which I can pick a default directory. In my case, I have quite a few directories which I can choose from, but I want to give you all a demo of how to create one, because these existing ones were created for practice or usage purposes. So let's go ahead and create a fresh one. How do we do that? If you just scroll down, you have certain options here; we had an option of creating a new directory, so let me see where that option is. Yes, this option: 'Create a directory'. Let's start by giving it a name, say 'edureka1234'... no, not that, let's just call it 'edureka'. Is it available? Yes. And what should the domain name be? Say 'edureka121'. For the country, well, since I'm from India, let's stick to India, and I say Create. It might take a couple of minutes to create this directory, so meanwhile, bear with me. And there you go, you have your directory here. You can just click on this to manage your directories; as you can see, it's a completely new directory, fresh and ready to use. If you click on any of these blades, you won't find anything in them yet. In my previous active directory I had so many users; if you come here, you'll see there is just one user, the main admin, and no other user whatsoever. That means this is a fresh directory. And as you can see, if you just go back to the directory option, you have an option of resetting your directory as well, and you can do that too, but I do not want to do that for now. I can just click on this and go back; see, here's the option, you can click on it and switch the directory. So yes, you can use multiple directories, and you can have multiple users for these directories as well.
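Incidentally, each directory is its own tenant, and the switch shown in the portal can be scripted as well. A small sketch, with a placeholder tenant ID:

```bash
# List the tenants (directories) your account can see
az account tenant list -o table

# Sign in against a specific directory, analogous to 'Switch directory'
# in the portal (the tenant ID below is a placeholder)
az login --tenant 00000000-0000-0000-0000-000000000000
```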
Now, I can just go ahead and create users for this directory as well, but I won't be doing that. Since I do not need this directory, I'm just going to go ahead and delete it. So I click on this icon; I do not have permissions, probably, so I click through and ensure that permission is granted, I say yes, and I save the changes. It might take half a minute to update these properties, or longer if the internet is slow. There, the changes have been updated. Now, if I refresh, the access is given to me. I don't want to delete the dashboard; I just want to delete my active directory. In this case I do not have access to the domain services, because of which I'm not able to delete this account, but normally you have an option where you can actually go ahead and delete the directory, so you can do that as well. So that was the demo on Active Directory: how you create a user, how you create a particular domain, a domain service basically, and how you create multiple active directories. [Music] What is Azure Load Balancer? Azure Load Balancer can be defined as a cloud-based service that allows a set of machines to perform as one single machine to serve user requests. It allows you to distribute traffic to your backend virtual machines, and it also provides high availability for your application as a whole. Azure Load Balancer is a fully managed service. To be more precise, it is a layer 4 load balancer that provides high availability by distributing incoming traffic among healthy VMs, that is, virtual machines. Now that we know what Azure Load Balancer is, let's get to know why we use load balancers. With Azure Load Balancer, you can scale your applications and create highly available services. It supports both inbound and outbound scenarios, and it scales up to millions of flows for all TCP and UDP applications. There are many key scenarios where a load balancer helps you accomplish your tasks, so we'll quickly have a look at them. A load balancer load-balances internal and external traffic to Azure virtual machines. It also increases availability by distributing resources within and across zones. If we talk about configuration, Azure Load Balancer can configure outbound connectivity for Azure virtual machines, and it can load-balance services on multiple ports, multiple IP addresses, or both. A few more scenarios: Azure Load Balancer has the capability to move internal and external load balancer resources across Azure regions. It balances TCP and UDP flows on all ports simultaneously using HA ports. The load balancer uses health probes to monitor load-balanced resources, and it supports load balancing for IPv6. So these were some of the key scenarios which make Azure Load Balancer useful and in demand. Now let's look at the Azure load balancer types. There are three types of load balancer. First is Azure Load Balancer: as we discussed before, it is a layer 4 load balancer, which means it operates at layer 4 of the Open Systems Interconnection (OSI) model. It is a single point of contact for clients, and it distributes the inbound flows that arrive at the load balancer's front end to the backend pool instances, according to the configured load-balancing rules and health probes. The backend pool instances can be Azure virtual
machines or instances in a virtual machine scale set. Next is Application Gateway. Azure Application Gateway is a web traffic load balancer that enables you to manage traffic to your web applications. Traditional load balancers operate at the transport layer and route traffic based on source and destination IP address and port; Application Gateway, in contrast, can make routing decisions based on additional attributes of an HTTP request. For example, if the incoming URL requests an image file, you can route that traffic to a specific set of servers configured for images; if the incoming URL requests a video file, that traffic can be routed to another pool that is optimized for videos. This type of routing is known as application-layer load balancing. The third type of load balancer is Traffic Manager. Azure Traffic Manager is a DNS-based traffic load balancer. It allows you to distribute traffic to your public-facing applications across the global Azure regions. It also provides your public endpoints with high availability and quick responsiveness. Traffic Manager uses DNS to direct client requests to the appropriate service endpoint based on the traffic-routing method, and it provides health monitoring for every endpoint; these endpoints can be any internet-facing services hosted inside or outside Azure. So what does Traffic Manager do? It provides a range of traffic-routing methods and endpoint-monitoring options to suit different application needs, along with automatic failover, and it is resilient to failures, including the failure of an entire Azure region. Now that we have talked about each type of load balancer, let's quickly compare them. There are different parameters on which the three types of load balancers are compared. In terms of service, Azure Load Balancer is a network load balancer; Application Gateway is mostly related to web applications, so it is known as a web traffic load balancer; and Traffic Manager is a DNS-based load balancer. If we talk about network protocols, Azure Load Balancer works at layer 4, which covers TCP and UDP; Application Gateway is a layer 7 load balancer, which covers HTTP and HTTPS; and Traffic Manager also sits at layer 7, working through DNS. If we talk about their types, Azure Load Balancer comes as internal and public, Application Gateway comes as Standard and WAF, and Traffic Manager has no sub-types. Going ahead with routing: Azure Load Balancer uses hash-based routing with source-IP affinity; Application Gateway is path-based; and Traffic Manager offers performance, weighted, priority, geographic, multi-value, and subnet routing methods. The recommended traffic for Azure Load Balancer is non-HTTP(S); for Application Gateway it is HTTP(S); and for Traffic Manager it is non-HTTP(S). Talking about endpoints: in Azure Load Balancer we have NIC IP addresses; in Application Gateway we have IP addresses, FQDNs, virtual machines, virtual machine scale sets, and App Services; and in Traffic Manager we have cloud services, App Services, App Service slots, and public IP addresses. If we talk about global reach, Azure Load Balancer and Traffic Manager can serve globally (Load Balancer through cross-region load balancing, Traffic Manager by design), whereas Application Gateway is bound to a region. At last, if we talk about redundancy, Azure Load Balancer is zone-redundant and zonal, Application Gateway is also zone-redundant, whereas Traffic Manager is resilient to regional failures. So these were the important parameters which compare all three types of load balancers.
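Before moving to the portal, here is roughly what standing up a basic layer 4 Azure Load Balancer looks like from the CLI. This is a hedged sketch with placeholder names, separate from the Application Gateway and Traffic Manager demo that follows:

```bash
# Sketch: a public layer 4 load balancer with one rule forwarding
# TCP 80 to a backend pool (all names are placeholders)
az network lb create \
  --resource-group demo-rg --name demo-lb --sku Standard \
  --public-ip-address demo-lb-ip \
  --frontend-ip-name demo-frontend --backend-pool-name demo-backend

# Health probe so only healthy VMs receive traffic
az network lb probe create \
  --resource-group demo-rg --lb-name demo-lb \
  --name tcp-probe --protocol tcp --port 80

# Load-balancing rule: distribute TCP 80 across the backend pool
az network lb rule create \
  --resource-group demo-rg --lb-name demo-lb \
  --name http-rule --protocol tcp \
  --frontend-port 80 --backend-port 80 \
  --frontend-ip-name demo-frontend \
  --backend-pool-name demo-backend --probe-name tcp-probe
```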
Now let's quickly move on to our hands-on part. For this, we need to first log into our Azure portal. This is our sign-in page, so let's quickly sign in. It asks whether to stay signed in; that is up to you, if you want to stay signed in for a long time you can, so just click yes. And here we enter our main dashboard; this is how the Azure dashboard looks. Here you can see many of the services provided by Azure, and there are resources; currently I have no resources, so the first and foremost thing we need to do is create a resource. Let's quickly create one. We'll come and check over here, and as you can see there are different options; it asks what kind of resource you want to create. There are popular Azure services listed, and based on that you can create resources. Here you can see different options like Windows Server, Ubuntu Server, and Windows 10 Pro; there are many options here. For us, we'll be going with a virtual machine, so let's move on to virtual machines. Here you can see 'Virtual machine' is given, so let's create one. Now, to deploy a virtual machine, we first need to fill in the basic details. As you can see, the first and foremost thing it asks for is the resource group. What do we mean by a resource group? It is basically a group of resources that you are going to deploy onto the server. Let's quickly create a new resource group, as we haven't created any yet; let's name it 'azure-demo' and click OK. So this is the resource group name, and now it asks for the virtual machine name too, so let's name that as well: 'azure-vm1', that sounds good. It also asks for the region; by default it has put Central India, so let's keep it as it is. Now it asks for the type of OS; there are different options you can choose, like Ubuntu, Windows 10, and Oracle, so there are many operating system options. I'll just go with the Ubuntu Server itself. Next comes the size: there are different sizes, with different prices accordingly, and it depends on which size you want; I will keep the default here as well. Now it asks for the authentication type. There are two options: an SSH public key, or a password. Let's quickly set a password. Let's give a username, say 'edureka-demo', that sounds good, and now let's give a password. All right, the value must be within 12 to 72 characters, so let's do it accordingly. All right, this sounds good. Now it asks for the inbound ports; as shown here, SSH (22), which means port 22 will be the one allowed through the firewall, so let's keep it as it is. Now let's move on to the next step. Here it asks about the OS disk type; there are different types of disks, like Premium SSD, Standard SSD, and Standard HDD, so depending upon your requirement you can choose any one of them. I'll be choosing the Standard SSD, and the rest we keep as it is. Let's move on to the next tab. Here you can see it asks for a virtual network; it has automatically set up a virtual network for me, and it has also created a default subnet, so we won't be making any changes over here. Let's quickly move ahead. Now it asks what type of monitoring we want to enable; here we are not interested in having any monitoring, so let's just disable the monitoring and move on to the next options. Let's skip these last two tabs as well and go to review and
create. Here you basically need to review all the information you have given in the options, and once you are done reviewing it, just click on Create. Here you can see it is getting ready for deployment. All right, your virtual machine is being created; you can see the status showing that deployment is in progress, so let's just wait for it, and you can follow the status here. All right, here you can see your deployment is complete. Let's quickly check whether it has been created successfully: go to Virtual machines, and yes, here you can see 'azure-vm1', our first virtual machine, has been created successfully. Here you get your subscription, the resource group under which it has been created, the location, the operating system, and the size. Now let's also check our resource group. Quickly move on to Resource groups, and here you can see 'azure-demo', which we created; just click on it, and here you can see the different types of resources which have been created, including the virtual machine which we launched and deployed. Now let's quickly create one more virtual machine. Move on to Virtual machines and create a new Azure virtual machine. Here we'll take a US resource group; let's click on it, and let's name the machine 'azure-vm1-us', as it is on a different server, and the region will be West US. Let's keep all the other details as they are. Now let's change the authentication type to password, give it a username and password, and then move on to the next step. Here we'll be selecting the Standard SSD, and the rest stays the same as before. As I told you earlier, the virtual network and the subnets are created automatically, so let's quickly move on, disable the monitoring, skip the rest, and review and create. All right, let's just create it. As you can see, the deployment is in progress; it takes a couple of minutes, and here you can see the status of your virtual machine. All right, here you can see your deployment is complete, so let's quickly move on to our resource group and check whether it shows up there: just go to 'azure-demo-us'. Now that we have created one virtual machine in the US as well, let's create one more in 'azure-demo'. So: virtual machine, select 'azure-demo', name it 'azure-vm2', change the authentication type, move on to the next step, change the disk type to Standard SSD, then disable the monitoring, and review and create. All right, here you can see the deployment is in progress. Now let's go to our Indian server: go to Resource groups, and this is our 'azure-demo'. As we come over here, let's try to configure the first server, which is the first Azure virtual machine. As you can see over here, this is the virtual machine's overview, where you can find the IP address of this virtual machine. We'll just click on copy, and now we need to open our terminal. Since I'm working on Windows, there is an application named PuTTY, so I'll just open that; those of you who are using Ubuntu can directly open your terminal instead. Let's open PuTTY.
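For reference, the same VMs could be created non-interactively. A sketch of the CLI equivalent of the portal steps above; the image alias, size, and password are assumptions standing in for whatever you picked in the portal:

```bash
# Sketch: CLI equivalent of the portal VM creation above
# (image alias, size, and credentials are placeholder assumptions)
az group create --name azure-demo --location centralindia

az vm create \
  --resource-group azure-demo \
  --name azure-vm1 \
  --image Ubuntu2204 \
  --size Standard_B1s \
  --admin-username edureka-demo \
  --admin-password '<12-72-char-password>' \
  --storage-sku StandardSSD_LRS
```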
All right, here what you need to do is just paste your IP address, and your port is 22, so let's keep it as it is and just open it. It asks for some acceptance, so let's just accept and log in. We had set the username 'edureka-demo', so let's give the password. All right, as you can see, we are logged in now. Now that we are connected to our first server, the first Indian server, let's get connected with the other server we created as well. Quickly move to our Azure portal, go back to 'azure-demo', and select the second one. We'll do the same thing: copy the IP address, go to PuTTY again, open it, type in the IP address, open, accept, and when it asks for authentication again, give the password. So now you are logged into both the servers; as you can see, this is our server 2 and this is server 1. Now that we have logged into both servers, we need to configure them: we need to install a piece of software, namely Apache 2. Before that, let's quickly update our servers, with 'sudo apt-get update'. Now that both the servers are updated, let us install the software, Apache 2, with 'sudo apt-get install apache2'. All right, it has been installed on server 2; let's do the same thing on server 1 as well: 'sudo apt-get install apache2'. Now that we have installed Apache 2 on both servers, let's check one of them and see whether it's working. Go back to the portal, open a new tab, and put the IP address over there. As you can see, the site can't be reached; this is because we haven't given access to port 80. Basically, what happens is that whenever we enter an IP address, the browser talks to port 80, but in Azure we had only given access to port 22. So what we need to do is go back to our Azure portal, go to Networking, and add an inbound port rule; here we need to give access to port 80.
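Pulling the server-side commands together: the update and install run on each VM, the CLI alternative to the portal's inbound-port step, and a one-shot version of the index.html replacement that the demo performs a little later. A sketch; the resource names match this demo:

```bash
# On each VM, via SSH: refresh package lists and install Apache 2
sudo apt-get update
sudo apt-get install -y apache2

# From your workstation: CLI alternative to the portal's
# 'Add inbound port rule' step (opens port 80 on the VM's NSG)
az vm open-port --resource-group azure-demo --name azure-vm1 --port 80

# One-shot version of the index.html replacement done later in the demo
cat <<'EOF' | sudo tee /var/www/html/index.html
<html>
  <body><h1>Welcome to Indian server 1</h1></body>
</html>
EOF
```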
So here is the destination port: let's just put 80, allow it, and add it. It is being created on the server... and it has been created successfully. Now let's come back here and reload the page. It may take some time, because I have only just given the access to port 80, so it may take some time to load as well. All right, as you can see over here, this is your Apache 2 default page. Now what we need to do is replace the file; as you can see, it says to replace the file located at /var/www/html/index.html. We'll just copy this path, go to our virtual machine 1, and cd into that directory. Now what we need to do is remove the existing index.html and create a new index.html: 'sudo rm index.html', then 'sudo nano index.html'. Nano here is just like Notepad on our Windows; on Ubuntu it is known as nano. What we'll do here is just write a little HTML code, so let's quickly type some: an html tag, a heading saying 'Welcome to Indian server 1', just as an example, and then close the body and the html tags. All right, now we'll just save it, go back, and exit. Let's quickly move on to our website and reload it; as you can see over here, it now says 'Welcome to Indian server 1'. I have done the same with server 2 as well. Now we have to create the application gateway, so let's quickly go to Application Gateway, and as you can see we don't have any, so let's create a new one. As usual, we need to give our resource group; we'll be selecting the resource group of India, with Central India as the region, and let's name the gateway 'ag-server-in'. Now we'll go ahead. We also need to give the virtual network: we'll use the one in 'azure-demo', and within that we'll be selecting its subnet. Here you can see this error: the subnet must only have application gateways. So let's just create a new subnet: click on 'Manage subnet configuration'. Here we have already been given the default IP range, so let's create a new subnet with some of that range. Here we can change the address range, but I guess there is some error, so let's go to the address space and see what the limit is: it is 10.0.0.0, and your range goes up to 10.0.255.255. So let's go back to the subnets, add a subnet, name it 'ag-server-in', and save it. Here you can see your new subnet has been created. Let's go back to our application gateway and see if we get that option now. All right, here we have got the option; we'll just click on it and move on to the next step. Now we come to the front end. Here we need to provide the IP address, so we'll create a new public IP address; let's name it 'ag-server-in' as well, and go next, to the backend. Here we don't find any backend pool, so we need to add a new one: let's name it 'ag-pool-in', and let's pick the virtual machines which we had created. Here you can see both the virtual machines, so let's add both of them and click on Add. Now we go ahead with the configuration: we have the front ends, we have the backend pools, and now we need to add the routing rules. So let's add a routing rule. Here we need to name the rule, so let's name it 'first-rule-in', and set the priority; the priority range is 1 to 20000, so I'll keep it at 1.
Next it asks for the listener name, so let's name that as well. Then we need to select the front-end IP, which is the public one, and the rest will stay default. Now it should go to the backend target: here we'll choose the backend pool, 'ag-pool-in', and then we need to create the backend HTTP settings. We'll go here and call them 'http-india', and the rest stays the same. Now we'll just click on Add, and yes, your routing rule has been created. Let's move on to the next step and click on Review, and then Create. As you can see, my application gateway for India is being deployed; it is still in progress, so it may take a while to deploy. And as you can see, your application gateway has now been deployed successfully. Let's go to this application gateway: quickly move on to Application Gateways, and let's see whether it has been created. Just click on it, and we'll just copy this: what we need to do is copy the frontend IP address, paste it over here in the browser, and try to reload. As you can see: 'Welcome to Indian server 2', and if we just refresh it a couple of times, it also shows 'Welcome to Indian server 1'. So this is how an application gateway runs and balances the load between the two servers. Now we'll quickly move on to our next one, Traffic Manager. For this, we need to open the Traffic Manager profiles, and once we come over here, let's create a new traffic manager. Here we need to specify the traffic manager's name, so let's give it a good name, like 'global-traffic-manager'; okay, that sounds good. Now we'll just choose the resource group: we'll be using the Indian one, and let the region be Central India as well, and we'll just hit Create. As you can see, the global traffic manager has been created successfully. Now let's look at the configuration part; let's quickly check it, and it looks pretty good. Next we need to add the endpoints, so let's quickly add them. For this we need to click on Add, and since we are using the Indian server, let's name the endpoint 'ag-india'. We'll choose the resource type as public IP address, and then point it to the 'ag-server-in' IP. Here it shows that no DNS name is configured, so let's quickly resolve this error as well. For this we need to go to Public IP addresses, so let's quickly go there, and here you can see this 'ag-server-in'; we'll just click on it, and now what we need to do is go to Configuration and set the DNS name. Let's name it 'ag-india'; its suffix will be centralindia.cloudapp.azure.com. Let's quickly save it; it has been saved successfully. Let's just close this and redo the endpoint: we'll click Add, type in 'ag-india', select public IP address, and select 'ag-server-in', so just click on this. We can also give the geographical region for which we want this endpoint to serve traffic, so let's quickly give a region; I'll choose Asia, and India. Once that is done, we'll just quickly add it. As you can see, 'ag-india' has been enabled. Now we'll go to the global traffic manager, go to the overview, copy this DNS link, and try to open it. As you can see, it is running successfully; now let's refresh it once again, and yes, both the servers are being served through this traffic manager successfully. So this is how we use Traffic Manager for load balancing between the two servers.
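Both pieces of this demo have CLI counterparts too. A heavily simplified, hedged sketch; every name is a placeholder, the --servers IPs stand in for the two VMs' private addresses, and the Traffic Manager DNS label must be globally unique:

```bash
# Sketch: application gateway fronting two backend VMs
# (the subnet must be dedicated to the gateway, as the portal demo showed)
az network application-gateway create \
  --resource-group azure-demo --name ag-server-in \
  --sku Standard_v2 --capacity 1 \
  --vnet-name azure-demo-vnet --subnet ag-server-in \
  --public-ip-address ag-server-in-ip \
  --frontend-port 80 --http-settings-port 80 \
  --servers 10.0.0.4 10.0.0.5    # placeholder backend private IPs

# Sketch: traffic manager profile plus one endpoint
az network traffic-manager profile create \
  --resource-group azure-demo --name global-traffic-manager \
  --routing-method Performance \
  --unique-dns-name edureka-gtm-demo    # placeholder DNS label

az network traffic-manager endpoint create \
  --resource-group azure-demo --profile-name global-traffic-manager \
  --name ag-india --type azureEndpoints \
  --target-resource-id "<public-ip-resource-id>"   # IP needs a DNS label
```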
[Music] What is an Azure Firewall? According to the definition, Azure Firewall is a cloud-native and intelligent network firewall security service that provides best-of-breed threat protection for your cloud workloads running in Azure. That means Azure Firewall is a managed, cloud-based network security service that protects your Azure virtual network resources. It is a fully stateful firewall-as-a-service with built-in high availability and unrestricted cloud scalability, so you can centrally create, enforce, and log application and network connectivity policies across subscriptions and virtual networks. This service is fully integrated with Azure Monitor for logging and analytics. Now you must be thinking: why do we need Azure Firewall? Do we actually need a firewall in the cloud? The answer is yes, you do need a firewall when you're using cloud computing. Cloud security does offer you better protections, but it is not sufficient for modern day-to-day business needs; if you are constantly accessing the internet for any kind of work, a cloud firewall is much needed. And as we discussed, Azure Firewall is a cloud-native security solution for the Azure environment. It provides traffic inspection, filtering, and monitoring in order to protect from DDoS attacks, to perform basic traffic monitoring and access control, and to catch intrusions. So, in short, it is highly recommended to use Azure Firewall, and if you upgrade to Azure Firewall Premium, which we'll touch on further, it provides additional features for organizations with greater cloud security needs. Now that we know what Azure Firewall is and why we need it, let us see its key characteristics. The key characteristics of Azure Firewall: it is a fully managed, cloud-based firewall service, provided as platform-as-a-service as well as firewall-as-a-service. It has built-in high availability, which means no additional load balancers are required and there is nothing you need to configure. It is highly scalable, because Azure Firewall can scale up as much as you need to accommodate changing network traffic flows, so you don't need to budget for your traffic peaks. It supports FQDN tags, that is, fully qualified domain name tags, to make it easy for you to allow well-known Azure service network traffic through your firewall; for example, say you want to allow Windows Update network traffic through your firewall, you can create an application rule and include the Windows Update tag. Next, it has inbound and outbound traffic filtering rules. For inbound traffic it has DNAT support, which means inbound network traffic to your firewall's public IP address is translated and filtered to the private IP addresses on your virtual networks. If we talk about outbound traffic, through SNAT support, all outbound virtual network IP addresses are translated to the Azure Firewall public IP, and here you can identify and allow traffic originating from your virtual network to remote internet destinations. And last but not least, Azure Firewall is fully integrated with Azure Monitor for logging and analytics, which means all events are integrated with Azure Monitor, allowing you to archive logs to a storage account, stream events to your Event Hub, or send them to Log Analytics. So these were the key characteristics of Azure Firewall. Now let us see what the different types of firewall rules are. There are three different rule types in Azure Firewall: the NAT rule, the network rule, and the application rule. We'll understand each one of them in turn. Let us start with the NAT rule.
A NAT firewall, also known as a network address translation firewall, operates on a router to protect a private network: it works by only allowing internet traffic to pass through if a device on the private network requested it. NAT, network address translation, is the process of mapping one Internet Protocol address to another by changing the header of the IP packet while in transit through a router. This helps to improve security and decreases the number of IP addresses an organization needs. So what does a NAT rule do, basically? It operates on the router to protect the private network, letting internet traffic through only when a device on the private network requested it; by doing this, it protects the identity of the network and does not expose internal IP addresses to the internet. Next, a network rule allows or denies inbound, outbound, or east-west traffic based on the network layer and the transport layer. You use a network rule when you want to filter traffic based on IP addresses, ports, or protocols. Lastly, we have the application rule. Application rules allow or deny inbound, outbound, or east-west traffic based on the application layer, so you use an application rule when you want to filter traffic based on fully qualified domain names and HTTP or HTTPS protocols. So these were the three Azure Firewall rule types, which help you specify which traffic is allowed or denied in your network. Now let us understand how Azure Firewall actually works. Azure Firewall offers enough features to provide optimized control over inbound and outbound network traffic. It eliminates the need for load balancer configuration because of its built-in high availability. Moreover, it allows restrictions on outbound traffic by specifying FQDNs. You can create your own rules using Azure Firewall to filter network traffic based on source IP, destination IP, port, or protocol, and these rules then show an allow or deny status. It also enables a threat intelligence feature that can identify malicious IP addresses and irrelevant traffic. Now that we have understood what the firewall is and how it actually works, let us have a quick demo on Azure Firewall. Before we start, let us see what we are going to do in this particular demo. First we'll create a resource group; then we'll create the virtual network, and once our virtual network is created, we'll add a virtual machine to it. Once our setup is done, we'll create a firewall and add a route table, and lastly we'll apply some rules on the firewall and see how the firewall actually works. So let us quickly move on to our Azure portal. As I said, we need to first create a resource group, so let's quickly create one. We'll be creating a new resource group: go to Create, and now it asks for the basic details. Our subscription is there, and since we are creating a new resource group, let's give it a name, 'firewall-demo', and now review and create. All right, as you can see, our resource group has been created successfully. Now, in this resource group we're going to create a virtual network, and we'll try to add some subnets as well. Let's quickly move back to our home page and select Virtual networks. All right, we'll go to Virtual networks, and now we'll create a new virtual network.
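Of the three rule types described above, the walkthrough below exercises NAT and application rules in the portal; for completeness, here is a sketch of the remaining kind, a network rule, from the CLI. It assumes the azure-firewall extension and uses this demo's resource names, with the rest as placeholder values:

```bash
# Azure Firewall commands live in a CLI extension
az extension add --name azure-firewall

# Sketch: a network rule (layer 3/4 filter) allowing outbound DNS
az network firewall network-rule create \
  --resource-group firewall-demo --firewall-name firewall-demo1 \
  --collection-name net-rule-1 --priority 900 --action Allow \
  --name allow-dns --protocols UDP \
  --source-addresses '*' \
  --destination-addresses '*' --destination-ports 53
```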
We'll go to Create, and now we'll select our resource group, that is, 'firewall-demo', and give the network a name, 'firewall-vnet', and we will change the region; it is up to you, you can choose any of the regions as per your requirement. Now we'll go to the IP addresses tab. Here we'll change the IP address range, so let's quickly set the address space. All right, now that we have changed the IP address range, let's quickly add some subnets. Let's add a new subnet and give it a name, 'server1-subnet', and here also we'll provide a specific range. All right, our subnet name and IP address range are done; the rest we'll leave as default, and let's quickly add it. Our first subnet is created; now let's quickly create one more, name it 'server2-subnet', and provide its subnet address range. All right, now that we have changed the IP address range and created the subnets, let's see the security tab. Here we'll keep everything as default, we don't have to make any changes over here, same as with the tags, and now let's quickly review and create. All right, our validation has passed, so let's create. As you can see, the virtual network deployment is in progress, so let's wait for it; it will take a couple of minutes and then it will get created. All right, as you can see, our virtual network has been created, so we'll just go to the resource and have an overview of it. If we look at the topology, you can see we created the firewall virtual network, and under it we created two subnets, server 1 and server 2. Now let's quickly create a virtual machine: search for 'virtual machine', and let's quickly create one. We'll select our resource group over here, and you can name your virtual machine based on your requirement; I'll just give it a simple name, 'vm-fw-demo', I hope that looks great, and we'll select West US. As I said, you can choose whatever suits your requirement. Now we'll change the image: here I'll be using Windows Server 2022, and as you can see there are a number of different sizes available, so based on your requirement you can choose your size; I'll just select the best fit out of them. Now we have to create an administrator username, so let's name it 'edureka123', and let's set a password to it. All right, it must be between 12 and 123 characters long, so let's do it accordingly. All right, I guess this looks great. Now it asks about the public inbound ports. We won't be allowing any: by default, whenever you are creating a virtual machine, as you can see here, the inbound port is RDP, but since we are creating our own rules and our own firewall, we won't be allowing that, so let's just set it to none. Next, I guess everything looks great; we don't have to make any changes in networking. As you can see, our virtual network is given, and the server1 subnet is chosen, so we'll keep the defaults; for the public IP address they have created one, but we'll just set it to none, and the rest looks perfect. Let's just go next; here also we won't require any changes, so we'll quickly skip these parts and review and create. All right, it shows that validation failed, required information is missing; let's go to Basics and see what it is. Okay, all right, fixed.
Now let's quickly review and create. Here it shows the deployment is in progress. Once it gets deployed, we'll change the IP address of the virtual machine: by default it is set as dynamic, and we will change it to static. All right, as you can see, our deployment is complete, so let's go to the resource. As you can see, there is no public IP address, and we do have our private IP address. Now what we need to do is go to Networking, select our network interface, go to IP configurations, select the ipconfig, and switch the assignment to static. Since we'll be using the firewall, the IP address won't change dynamically; it will stay static, meaning it remains constant, so we won't have to look up a new IP address every day. And let's save it now. All right, it has been saved successfully. Our next step is to create a firewall, so let's quickly create one. We'll choose our existing resource group, 'firewall-demo', and let's name the firewall 'firewall-demo1'; I hope that looks fine, and I'll keep the region the same as it is, so there's no change there. Now let's set the firewall tier: let's make it Standard, and we'll choose our virtual network, 'firewall-vnet'. All right, okay: it says the network must have a subnet named 'AzureFirewallSubnet'. So let's quickly move back to home, go to our virtual network, 'firewall-vnet', and go to Subnets. We need to free up address space, so I'll just delete one subnet, the second one, server 2, and create a new subnet named 'AzureFirewallSubnet'. Let's add the subnet, name it 'AzureFirewallSubnet', set the range, and just save it. I hope that's great. All right, now let's quickly move back to home and go to Firewalls, and check whether it goes through correctly now. We'll fill in the same resource group, give the name 'firewall-demo1', use the Standard tier, and select the same existing virtual network. All right, fine, the error is gone now. Here it asks for the public IP address; we'll be creating a new public IP address, so let's add a new one, 'firewall-public-ip', okay. I'll go next, and we'll just review and create. While the firewall is being created, note this: as you can see, the virtual machine hasn't been connected to anything; even if you try to open it, it won't open, because there is no connection between the firewall and the virtual machine yet. So what we'll do, once our firewall gets created, is add a rule so that our virtual machine is reachable. Let's see whether our firewall has been created. Okay, our Azure Firewall has been created.
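A minimal CLI sketch of the same firewall setup; the dedicated subnet must be named exactly AzureFirewallSubnet, as the portal error above pointed out. The address prefix is a placeholder, and the --tier flag is an assumption based on recent versions of the extension:

```bash
# The firewall requires a dedicated subnet with this exact name
az network vnet subnet create \
  --resource-group firewall-demo --vnet-name firewall-vnet \
  --name AzureFirewallSubnet --address-prefixes 192.168.1.0/24

# Standard-SKU static public IP for the firewall
az network public-ip create \
  --resource-group firewall-demo --name firewall-public-ip \
  --sku Standard --allocation-method Static

# Create the firewall and attach the public IP (azure-firewall extension)
az network firewall create \
  --resource-group firewall-demo --name firewall-demo1 --tier Standard
az network firewall ip-config create \
  --resource-group firewall-demo --firewall-name firewall-demo1 \
  --name fw-config --public-ip-address firewall-public-ip \
  --vnet-name firewall-vnet
```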
So let us go over here and open it. Now we'll go to Rules. We'll be adding the first rule type, a NAT rule, so let's quickly add one. Let's give the rule collection a name, 'nat-rule-1', and a priority; I'll give it 1000. Now the rule itself: let's name it 'rdp-rule', as we need to allow RDP access. Our protocol will be TCP; our source type is IP address, and the source should be any IP address, so let's give it a star. Our destination address will be our firewall's public IP address, so we need to first copy it: let's quickly go to our firewall's overview, and here you can see the firewall's public IP; we'll just copy this IP address, go back to our rule, and paste it in. Our destination port is 3389, our translated address is 192.168.2.4, the virtual machine's private IP, and then lastly comes the translated port, which is the same, 3389. Now everything is set, so let's add it. All right, once it gets updated we should get access to our virtual machine and be able to open it. As you can see, our NAT rule has been created, so let's quickly move on to Remote Desktop, paste the firewall's public IP address, and try to connect. As you can see, it asks for the username and password, so let's give the username. All right, as you can see, we have entered the virtual machine. So this is our server, which is ready, and now we'll go to the local server settings, switch Internet Explorer Enhanced Security off, and open Internet Explorer, and here we'll try to open a web page. Let's try to open google.com; as you can see, google.com has been opened successfully. If you want, you can try other websites, so let's search for something else, like bbc.com, and see whether it opens. As you can see, the BBC home page has opened. So through the NAT rule we were able to access the virtual machine and even browse some of the websites. Now let's do one more thing, that is, adding the route table. We'll just quickly minimize this and search for 'route tables', and now we'll create a route table over here. Let's quickly give the resource group, the region is West US, let's name it 'route-table-fw', go next, and just review and create. As you can see, our route table deployment is in progress; once the deployment is done, we'll be configuring it. All right, our deployment is complete, so we shall go to the resource, and now we'll go to Routes and add a route. Let's name it 'my-route'. The destination will be an IP address prefix, and here we'll send everything through the firewall by setting the address prefix to 0.0.0.0/0; our next hop type will be Virtual Appliance, and our next hop address is the firewall's private IP address. We'll just quickly add it; as you can see, our route has been created. Now we'll go to Subnets, click Associate, select the virtual network, and in this select the server1 subnet, and click on OK. Now let's go back to our virtual machine and go to Internet Explorer. Here we'll retype google.com, so let's hit Enter. As you can see, permission has been denied to open google.com. Now let's check the BBC home page as well: let's try bbc.com and see whether it opens. Here also you can see it has been denied; we can't see the home page, just as with Google. So what we need to do is go back to our firewall: we'll go to Firewalls, then to Rules. We had added the NAT rule; now we need to add the application rule.
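Before adding that application rule, here is the CLI form of the two things just configured, the DNAT rule and the route table. A sketch; the bracketed values are placeholders for the firewall's public IP, the VM's private IP, and the firewall's private IP:

```bash
# DNAT rule: RDP arriving at the firewall's public IP is translated
# to the VM's private IP (bracketed values are placeholders)
az network firewall nat-rule create \
  --resource-group firewall-demo --firewall-name firewall-demo1 \
  --collection-name nat-rule-1 --priority 1000 --action Dnat \
  --name rdp-rule --protocols TCP \
  --source-addresses '*' \
  --destination-addresses <firewall-public-ip> --destination-ports 3389 \
  --translated-address <vm-private-ip> --translated-port 3389

# Route table: send all outbound traffic from the server subnet
# through the firewall (next hop is the firewall's private IP)
az network route-table create \
  --resource-group firewall-demo --name route-table-fw
az network route-table route create \
  --resource-group firewall-demo --route-table-name route-table-fw \
  --name my-route --address-prefix 0.0.0.0/0 \
  --next-hop-type VirtualAppliance \
  --next-hop-ip-address <firewall-private-ip>
az network vnet subnet update \
  --resource-group firewall-demo --vnet-name firewall-vnet \
  --name server1-subnet --route-table route-table-fw
```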
So we'll just add it: we'll give a name to our application rule collection, 'app-rule-1', the priority will be 1000, and the action will be Allow. Now for the rule itself: let's name it 'allow-sites', the source will be star, the protocol and ports will be http and https, and our target FQDNs will be *.bbc.com and *.google.com. I'll just add this; now let us wait for it to get updated. All right, as you can see, our rule has been updated successfully. Now let's go back to our virtual machine, just try to refresh, and see whether it is working. As you can see, google.com has been opened successfully. Now let us try BBC as well, bbc.com, and see whether it opens. Here our BBC home page has also been opened successfully. So this is how the firewall works: here we created the resource group, then we created the virtual network and, under that, the subnets; then we saw how to create the virtual machine, and once our setup was ready, we created a firewall, added a route table and some rules on the firewall, and saw how the firewall actually works. [Music] Why do we actually need storage? For this, let's take an example, a use case, and understand the need for storage in today's era. For example, say we have an image-processing application, and for this application we have given users a website as the interface. Around a million people can access my website and put in their requests to process their images. Now, we don't want the processing of the images to happen on the server which is hosting my website; I want it to happen on some other server, a backend server, so for that we have some backend servers. My requests for processing will come in from the website servers, so I need a place where I'll be storing all the jobs, one which can be accessed by the backend servers as well; for that I need an entity into which I can put all the jobs which are to be done by the backend servers. Now, obviously, all the jobs cannot be done simultaneously by the backend servers, right? Say, like I said, there are a million people accessing your website at once and they put in a million requests; your backend servers cannot process all the requests at once, so they will do it one by one, and they can do that with the help of this entity: they will pick up a job, do that job, then go back to the entity, pick up another job, and so on. So now, when you have all the jobs in this entity, the jobs can be distributed evenly to the backend servers. Once your backend servers have processed an image, applying whatever operations were requested, the result has to be stored somewhere, because you have to store the end result. You will store all the properties, like the name and the location of the image, in a database. But here comes the catch: you cannot sensibly store an image itself in a database. I mean, you can actually do that, but when you look at the data that an image contains, it is all randomized; there is no structure in the data that an image has, or, for that matter, any video file or any kind of file. That is the reason we need a separate entity to store this kind of data.
Think of the processing that would be required to query this kind of data in the case where you had stored your images in the database: a lot of processing would be needed. We wanted the processing burden to become less, and hence we wanted an entity which can store any kind of file, be it images, be it video files, and so on; and that is where storage came in. So let's discuss the first case, where we had to store the jobs. With a storage service in place, we can now put, say, 10,000 jobs per second into the storage service without overburdening any of the servers, be it the backend servers or the website servers. The processing time has drastically reduced, and the jobs are now listed in a queue; a queue is actually a service which is offered by storage. So what the backend servers will do is take up a job from the queue and execute it, and once they have executed it, that job is deleted and the next job is picked up. Next, for the part where we had to store images: any kind of file can be stored on the storage service offered by the cloud. And it's not only limited to the cloud; if you think about it, on your own local computer or on your mobile, you store pictures or video files inside a file system. It is not a database, it is a file system, and it can contain all the objects that you want to store; you do not store your objects inside a database. So this is why storage is needed. Let's go ahead and understand the difference between storage and a database. Storage is basically needed whenever you have objects: when you have music files, video files, or images, in all these kinds of cases you use a storage service. But when you have something related to the metadata of a file, for example, when you store a file in storage, you need to record the location of that particular file and its properties; all these things, the properties and any location column you want to add, are structured, not randomized, and hence they can be stored inside a database, be it SQL or NoSQL. So this is the main difference, and this is how you'll decide between using storage and a database. Moving along, guys: now that we have understood what a database is, what a storage service is, and when to use each, let's move on to the topic of the day, which is Azure Storage. So what is Azure Storage? Azure Storage is a service from Azure that you use whenever you want to store something on the cloud, and since the cloud provider we are using is Azure, we'll be using the storage service from Azure, which is called the Azure Storage service. Now, when you begin to use the Azure Storage service, first of all you should have a storage account, which you can create in the Azure management portal. Let me show you how you can do that; let me quickly jump to my browser so that I can show you my Azure dashboard. Guys, this is how my dashboard actually looks; as you can see, on my dashboard I have all these services listed, and what I'm interested in today is the storage accounts.
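Before the portal walkthrough, here is a hedged CLI sketch of the same pieces: a storage account (the --sku comment lists the replication options discussed next) and the job queue from the image-processing scenario above. Account and queue names are placeholders, and account names must be globally unique:

```bash
# Sketch: storage account plus the job queue from the scenario above
# (names are placeholders; account names must be globally unique,
# lowercase letters and digits only)
az storage account create \
  --resource-group demo-rg \
  --name livedemo232336 \
  --sku Standard_LRS   # or Standard_ZRS / Standard_GRS / Standard_RAGRS

# A queue to hold the image-processing jobs
az storage queue create --name jobs --account-name livedemo232336

# Website servers enqueue work; backend servers dequeue and delete it
az storage message put \
  --queue-name jobs --account-name livedemo232336 \
  --content "process-image-42"   # placeholder job payload
az storage message get --queue-name jobs --account-name livedemo232336
```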
So I'll click on Storage accounts. I have some deployed already; I'll click on Add to create a new storage account, which brings up this page. First you enter the name of the storage account, and that name has to be unique, so let me enter something like livedemo2336. That seems to be available. Then comes the account kind: should it be a blob-only storage account or a general-purpose one? We don't want to restrict our account to blob storage only, so we'll choose general purpose.

Then comes the replication part: how do you want your data to be replicated? There are quite a few good options here. The first is called locally redundant storage (LRS). Understand it like this: there are regions, and inside a region there are zones. For example, take the US; inside the US you have cities like Chicago and New York. Say New York and Chicago are two zones, and the region is the US. With locally redundant storage, your data is replicated within the single data center where your storage account was deployed, say the one in New York, so that if one server crashes you still have your storage account on another server on the same premises. When we choose zone-redundant storage (ZRS), whatever is in the New York data center is replicated to the Chicago data center as well, so that if one zone goes down, say the New York data center, Chicago is still up and your storage account can still be used. With geo-redundant storage (GRS) we go one level further, across regions: for example, whatever you have deployed in the US region is replicated to the India region as well, so your data survives even a region-wide outage. And then we have read-access geo-redundant storage (RA-GRS), which is a very interesting option where you additionally get read access to that secondary region. For example, say your main deployment is in New York, a zone inside the US region: with RA-GRS your data is replicated to India as well, and when a failover happens, that is, when your New York servers are down and you are redirected to the India region, you can read the data but you cannot write to it. This makes RA-GRS useful for applications that mostly fetch data and have little to write; do note, though, that the always-available read access to the secondary means RA-GRS costs a little more than plain GRS.
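If you prefer scripting this instead of clicking through the portal, here is a minimal sketch with the Azure CLI (assuming you have it installed and are logged in with az login; the names and region are just the ones from this demo, and the --sku value is where you pick the replication option we just discussed):

  # Resource group to hold everything for this demo
  az group create --name live-demo-1 --location westindia

  # General-purpose v2 account; --sku selects the replication:
  # Standard_LRS, Standard_ZRS, Standard_GRS or Standard_RAGRS
  az storage account create \
    --name livedemo2336 \
    --resource-group live-demo-1 \
    --location westindia \
    --kind StorageV2 \
    --sku Standard_LRS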
Since ours is a demo today, I'll select locally redundant storage. Then comes the subscription: you can choose pay-as-you-go, or the free tier if you have just created a new Azure account. And then comes the resource group. A resource group is nothing but a group of resources; it is basically created so you can manage your resources more efficiently. In a large use case where you're deploying a host of different services, including them all inside one group makes them easier to manage, because if you want to delete all the resources you just delete the group, and all the resources and their dependencies are deleted automatically. That is how a resource group is helpful. So we'll create a new resource group today, call it live-demo-1, pin the account to the dashboard, and click Create. Pinning to the dashboard basically gives you a shortcut, much like a desktop shortcut. While the account is being created, let me show you the dashboard: it gives you shortcuts to all the services you want. My storage account, livedemo2336, is being created, and once it's done I can access it quickly from here rather than navigating to Storage accounts every time. So while this finishes, that is how you create a storage account; let me come back to my slides.

So that is what Azure Storage is; let's move on to discuss the components of Azure Storage. Up to now we have created a storage account, but inside a storage account you have a host of different services which you can use according to your use case. The first service is called the Blob service. What is a blob service? It is nothing but a file-system-like service into which you can upload any kind of file. For those of you who know AWS: AWS has a service called S3, and this is exactly like S3; nothing really changes except that the name of the service is Blob. Inside Blob you can upload any kind of file, and depending on the permissions you grant, that file can then be accessed by anyone on the planet. For example, say you have created a website that shows some images: those images, rather than sitting on the website's own server, could sit in Blob storage and be accessed directly through the link of each object you upload. Having said that, guys, let me quickly show you how to create a blob container. Back on my dashboard you can see the storage account has now been created, and if I go into it I get a screen where I have to choose a service. I'll click on Blobs, because that is what I want to create, and open it in the same tab.
I'm then shown a screen telling me that nothing has been added yet: there are no containers. So what are containers? Containers are nothing but the folders you have inside Blob storage. You cannot store anything in the root directory, that is, you cannot upload a file at the top level; you have to have a folder inside which you upload your files, and those folders are called containers, so don't get confused by the nomenclature. Inside this I will create a container called live-demo, with the access type set to Blob. What the access types mean is this: if it's Private, nothing inside can be accessed anonymously; if it's Blob, all the files inside this particular container can be accessed individually; and if it's Container, then even the contents of the container, including any folder you create inside it, can be listed and accessed. We'll select Blob and click OK, and it hardly takes a second to create a container in the Blob service. So we now have a container called live-demo, and if you go inside it, there is nothing in it as of now.

Now, I have actually created a website that can interact with the Blob service, and I'll show you how it looks, but before that let me show you the Queue service; we'll cover queues first and then come back and discuss blobs. So, blobs, like I said: once you have created a container you can upload any kind of file into it from a website, or read files from it on a website. Let's move on and discuss queues. What are queues? A queue is exactly like the data structure: whatever information goes in first is the first to come out as well. You use queues to list jobs. In the use case we discussed, we had an image-processing application with millions of users accessing it, and since millions of jobs cannot all be executed at once, they are listed inside a queue so that the server can fetch the jobs at its own pace and execute them. That is what a queue is all about, as simple as that. The way you create queues in the Azure dashboard is similar: I go to my storage account, livedemo2336, where alongside the Blob service there is a Queues option. I'll go into Queues, and as you can see there are no queues created as of now, so I'll create a new queue and call it hello123. My queue has now been created. Now, if I go to my website for queues, this is how it looks when I want to push some data into the queue. This is a sample website, guys, that I have created, and for it to know how to interact with your queue service you need a thing called a connection string: every storage account you create has a unique connection string that you have to include in your code. If I click on Access keys in the storage account's pane, I find that connection string along with a key.
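The demo site's PHP isn't shown in full on screen, but based on this description, a minimal sketch of its send and process pages, using the legacy Azure Storage PHP SDK (the microsoft/azure-storage-queue Composer package; the account name and the key placeholder are just from this demo), would be:

  <?php
  require_once 'vendor/autoload.php';
  use MicrosoftAzure\Storage\Queue\QueueRestProxy;

  // The connection string copied from the storage account's Access keys blade
  $connectionString = "DefaultEndpointsProtocol=https;AccountName=livedemo2336;AccountKey=<your-key>";
  $queueClient = QueueRestProxy::createQueueService($connectionString);

  // send.php - push a message onto the hello123 queue
  $queueClient->createMessage("hello123", "Hello World");

  // process.php - fetch the next message, display it, then delete it;
  // deleting is what makes the message disappear from the queue
  $result = $queueClient->listMessages("hello123");
  foreach ($result->getQueueMessages() as $message) {
      echo "Message received: " . $message->getMessageText();
      $queueClient->deleteMessage(
          "hello123",
          $message->getMessageId(),
          $message->getPopReceipt()
      );
  }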
So you have to have the connection string included in your code so that it can interact with the particular service. I'll go to my queue code: this is the connection string I have to specify, so I copy and paste it in, and this endpoint part has to be removed because it is not required in the connection string. Once you do that you save it, and my queue name, hello123, has already been specified there. Now I go back to my website, which is already wired up to send messages to my queue. If I type a message saying "hello world" and click Send a message, it actually sends that message to my queue, and if I go to the queue in the portal you can see a message has been added which says hello world. Now, to process the queue, that is, to receive the message, I have to make the same change in the processing PHP as well: I'll go to my process PHP for the queue, change the connection string and the endpoint, and now that code can interact with the queue and fetch the messages from there. If I open process.php, it fetches the message from the queue and displays it: as you can see, the message received is "hello world". If I enter any other message, say "edureka is the best", send it to the queue, and then process it, I get the same message back; and if I refresh the queue view you can see that the message has been processed and has been deleted from the queue. As and when a message is processed, it is automatically deleted from the queue. So this is how my queues work.

Let me quickly show you how blobs work. This is my blob-upload website, and again I have to change the connection string: I'll go to my blob code and change the connection string to point at my current storage account, and my container name is already specified as live-demo, the container I created. Now I save the code, choose a file, and say I upload the desert file. What the code does is rename the file automatically according to the system time, so that there's no clash when two or three files are uploaded with the same name. As of now, if we go into our blob container there are no files, but the moment I upload the file from this website, my file gets listed there; as you can see, it has been successfully added. And now if I go to the link of this file I'm able to download it, and once it's downloaded and I open it, it is the same file that I just uploaded, now accessible at that particular link by everyone in the world. So this is how cool Blob is.
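Again as a hedged sketch rather than the presenter's exact code, the blob side, creating the live-demo container with the "Blob" access level and uploading a file under a timestamped name, might look like this with the microsoft/azure-storage-blob package:

  <?php
  require_once 'vendor/autoload.php';
  use MicrosoftAzure\Storage\Blob\BlobRestProxy;
  use MicrosoftAzure\Storage\Blob\Models\CreateContainerOptions;
  use MicrosoftAzure\Storage\Blob\Models\PublicAccessType;

  $connectionString = "DefaultEndpointsProtocol=https;AccountName=livedemo2336;AccountKey=<your-key>";
  $blobClient = BlobRestProxy::createBlobService($connectionString);

  // "Blob" access level: individual blobs are publicly readable,
  // but the container's contents cannot be listed anonymously
  $options = new CreateContainerOptions();
  $options->setPublicAccess(PublicAccessType::BLOBS_ONLY);
  $blobClient->createContainer("live-demo", $options);

  // Prefix the name with the system time so that two uploads
  // with the same file name never clash
  $blobName = time() . "-" . basename($_FILES["image"]["name"]);
  $blobClient->createBlockBlob(
      "live-demo",
      $blobName,
      fopen($_FILES["image"]["tmp_name"], "r")
  );

  // The file is now publicly downloadable at:
  // https://livedemo2336.blob.core.windows.net/live-demo/<blobName>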
Now I'll use blobs and queues together; let me show you how. What happens now is that the image I just uploaded has also had its name added to the queue I created: if you look at the queue, it lists the file name we just uploaded. I'll process that image like this: the file name is fetched from the queue, the link of that image is constructed, and that image goes into the background of the website. For that, I first have to change the code, obviously, so I'll go to my blob-process website and change the connection string to the one we are using right now. I also have to change the link that accesses the file, because the storage account has changed. Let me grab the right link: I go inside Blobs and into the container, and everything from the start of the link up to the container name stays the same; only the file name has to be appended, and that file name is what I fetch from the queue. So the code fetches the file name from the queue and changes the background of the website accordingly. If I open process.php now, it fetches the file name from the queue and changes the background; since I uploaded the desert file, it shows the desert background.

Now let me show you how queues are really used in practice. Say I upload three or four files; I'll upload this flower image, and mind you, the desert message we just processed will already have been deleted from the queue, which I'll verify in a moment. The upload might take some time because the files are fairly large. OK, it gave me an error, so let me upload it again: sometimes when the file size is large it throws an error, and there's nothing you can do about it except retry. Once it's uploaded I can see it here: I now have three files, it says the blob upload is complete, and if I refresh I can see all three. Now if I go to process.php, it will not show me the earlier file but the most recent one I just uploaded: it fetches the file name from the queue and puts that image in the background. The network is a little slow, bear with me; I'm getting a strange error here just because the connection is flaky, but ignore that, you can see the new image in the background. So let me go back to my storage account: we have now discussed queues and blobs.
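Gluing the two services together the way the demo does, the upload page enqueues the blob name and the process page turns it into a background image, comes down to a few lines (same hedged assumptions and placeholder key as the sketches above):

  <?php
  require_once 'vendor/autoload.php';
  use MicrosoftAzure\Storage\Queue\QueueRestProxy;

  $connectionString = "DefaultEndpointsProtocol=https;AccountName=livedemo2336;AccountKey=<your-key>";
  $queueClient = QueueRestProxy::createQueueService($connectionString);

  // upload.php - after createBlockBlob() succeeds, record the blob name
  // $queueClient->createMessage("hello123", $blobName);

  // process.php - pop the most recent name and use its public URL
  $messages = $queueClient->listMessages("hello123")->getQueueMessages();
  if (!empty($messages)) {
      $name = $messages[0]->getMessageText();
      $url  = "https://livedemo2336.blob.core.windows.net/live-demo/" . rawurlencode($name);
      echo "<body style=\"background-image:url('$url')\"></body>";
      $queueClient->deleteMessage("hello123",
          $messages[0]->getMessageId(), $messages[0]->getPopReceipt());
  }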
Back to my slides: we've discussed what blobs are and what queues are; now let me go to the File service, which is the best thing I've found in Azure. With the File service it's exactly like blobs in that you can upload any kind of file, but you can also mount it as a drive on your computer and use it as if it were an extra drive in your own machine. You also get an authentication protocol with it, SMB 3.0, the protocol used by servers whenever there is a file transaction, so that authentication comes along whenever you use the File service. Let me show you how to mount it. First of all, this mounting process is not available in Windows 7; it is only available in Windows 10 and above (or a modern Windows Server). So what I've done is deploy a virtual machine in Azure running Windows Server 2012, and what we'll be doing is mapping the file share we create in our storage account onto this particular system. I'll connect to the VM, and it asks me for the username and password; I specify the password, click OK, accept the prompt, and I'm in. If I go to This PC right now, you can see there is no extra drive listed; we'll be creating a network drive here onto which we can upload any kind of file.

The way to do that is to first go to your Azure dashboard and create a file share there. We'll go to our storage account, livedemo2336, and then to Files. As of now, no share is listed here, so I'll create a file share and name it mydrive. The quota is basically how many gigabytes you want the drive to have; it can go to terabytes, but since I'm doing a demo, let me create a 100 GB drive and click OK. My share has been successfully created. I'll go into it, click Connect, and I'm given the command to connect to it, which I'll copy and paste into Notepad so I can use it on the VM I just deployed. If you look at this command, guys, it contains the address of your storage account's file endpoint, the file share we just created, mydrive, a username of the form AZURE\livedemo2336, and, as the password, the storage account key shown there. Now I have to plug everything I just mentioned into the server. The way you map the network drive is this: go to This PC, right-click it, and click Map network drive. It asks which drive letter we want to allocate, say K:, and in the folder field you put the address of the server, which I'll copy across and save there. Then I click Finish, and if everything goes well it asks for the username, AZURE\livedemo2336, and the password.
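For reference, the connect command the portal generates is essentially a single net use command of this shape (assuming the share is named mydrive; the storage account key is the password), which maps the share without going through the dialog:

  net use K: \\livedemo2336.file.core.windows.net\mydrive /user:AZURE\livedemo2336 <storage-account-key>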
The password, like I said, is your key, so I copy it, paste it into the password field, tick Remember my credentials, and click OK. If everything authenticates, it opens the share: as you can see, I am inside my drive right now, and if I go to This PC I can see that a drive has been added. In this drive the total size is 100 GB and the free space is 100 GB, which is exactly the quota we assigned in the file share while creating it. Now if I want to copy any file here I can do that easily; I'll copy this particular file and paste it in, and a shortcut as well, since I don't have many files on this particular server to copy over. As you can see, the files have been copied here, and they have actually been uploaded to the Azure account as well. Let me show you on the dashboard whether these files are visible: this is my share, and if I refresh it, it lists the files I just uploaded. So basically, when you create a network drive like this on your own computer, it is as if you were using a local drive, provided your internet connection is good. The two files we added to the directory on the server are visible here, and you can view or download them from here as well; anyone can download them if they have the link. So we are done with the File service.

Let's come back to the slides and discuss our last component of the day, which is Tables. Tables is again an amazing service from Azure. It is like NoSQL, but you could say it is a child of NoSQL: you cannot do complex queries on it. The advantage of using Tables is for data whose structure changes dynamically. What I mean by that is this: say you have created a website with a form that accepts three things, your name, your mobile number, and your location, and as usual it uploads them into the database. But what if tomorrow a use case comes along where you want to add one more field, say one asking for a credit card number? If you were using traditional systems, you would have to go to your database and add one more column, and then go and change your user interface and your PHP or whatever scripting language you're using. With Tables, the thing is that you don't have to change anything in the back end: the table automatically adjusts to the data you're ingesting and creates the new field, in our example the credit card number, on its own. So let me show you how to use the Table service. We come back to our dashboard and go to the storage account, where again all the services are listed; we select Tables, go inside, and create a new table. Let's name this table newtable and click OK.
My table has now been created. The way you upload data to this table is the same as before: you copy the connection string, call the API, and upload your data; but to view the data in the table you will need Visual Studio. Here is one table I created earlier, and I'll show you how to attach it, but first let me add some values to this new table, which has nothing inside it because I just created it. What I'll do is show you the website through which I'll be uploading the data into the table. The way I do that is to first change the connection string, which can be found under Access keys: I copy it, come back to my table code, paste it in, and remove the endpoint. Then I change the table name to newtable, and nothing else has to change. Our code is done, so I come back to the page and refresh it.

Now there are two things you have to understand about a table: one is called the partition key and one is called the row key. Whatever records you store in your table are stored across different nodes, which are basically different servers. Each partition is identified by a partition key: say there are server 1, server 2, server 3, and server 4, and I want my particular type of data to be stored on server 1, then the partition key is what identifies that partition. Inside a partition, every row has to be identified by a unique identifier, and that is where the row key comes in. The row key must be unique for each and every record you put under one partition key; if you change to a different partition, the row key can repeat a value used in the previous one, but when you are creating records inside one particular partition, the row key value has to be different each time.

Having said that, guys, we have this website and we have created our new table. What I'll do now is map this new table in my Visual Studio. The way to do that is to open Visual Studio and go to Server Explorer, where your Azure services are listed. To map my particular storage account there, I right-click the Storage node and click Attach External Storage. It asks for the account name and the account key: I go to my storage account, copy the name, paste it into the account name field, then copy the key and paste it into the key field, tick Remember account key so that I don't have to enter it again and again, and click OK. This adds my storage account in Visual Studio.
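Programmatically, inserting such an entity is a short sketch with the Azure Storage PHP SDK (the microsoft/azure-storage-table package; again the key is a placeholder). Note how columns are just properties added on the fly, which is exactly the schemaless behaviour described above:

  <?php
  require_once 'vendor/autoload.php';
  use MicrosoftAzure\Storage\Table\TableRestProxy;
  use MicrosoftAzure\Storage\Table\Models\Entity;
  use MicrosoftAzure\Storage\Table\Models\EdmType;

  $connectionString = "DefaultEndpointsProtocol=https;AccountName=livedemo2336;AccountKey=<your-key>";
  $tableClient = TableRestProxy::createTableService($connectionString);

  $entity = new Entity();
  $entity->setPartitionKey("TaskSeattle"); // which partition (server) the row lives on
  $entity->setRowKey("1");                 // must be unique within the partition

  // Columns are not declared anywhere up front; adding a new
  // property here simply creates a new column in the table
  $entity->addProperty("Name", EdmType::STRING, "Hemant");
  $entity->addProperty("MobileNo", EdmType::STRING, "123456789");

  $tableClient->insertEntity("newtable", $entity);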
So here is my storage account, livedemo2336, now attached, and inside it the table I created, newtable. If I click on newtable, you can see nothing has been added as of now. So from my website I'll add a new value. I've already set the partition key to a default, TaskSeattle; all I do is enter a row key, which has to be unique, and since there are no records yet I'll enter 1. Then I enter the columns I want: this row of fields holds the column names and this one the values for those columns. For example, I want a Name column with the value Hemant inside it, and a MobileNo column with, say, 123456789, and now I click Upload data. This uploads the record to the table I just created, so if I go back to Visual Studio and refresh, I can see that a new record has been entered in which the row key is 1, the name is Hemant, and the mobile number is what I typed. Now say I want to add one more field asking for my credit card number. As I said, the row key has to be different this time. I enter the name, and since the Name column has already been specified, entering the value Hemant here will not create a new column; only a column that is new to the data gets created. The mobile number is again, say, 5678, and now there is the credit card number, whatever that may be. If I upload the data you can see that a credit card number column has now been added: for the first record there is nothing in that field, but the second record has the value I added. Also, guys, you can specify the fields in any order; there is no specific order you have to follow. I can enter the name first, then the credit card number, then the mobile, and upload the data. One thing to be careful of: column names are case-sensitive, so watch your syntax; as you can see, I made a mistake here by typing a small "c" and it created one more column. But the name went into the same Name column, so in whatever sequence you add your record, each value lands in its respective column automatically. So this is how Tables can be used.

[Music] What is Azure Import/Export? Azure Import/Export provides a way for organizations to export data from Azure Storage to an on-premises location. The service offers a secure, reliable, and cost-effective method to export large amounts of data, and you can also use Azure Import/Export to import data into Azure Storage from an on-premises location.
However, I recommend that you use an Azure Data Box device when you are in a region where the Azure Data Box family is supported: importing data by using one of the Data Box products is easier than using Azure Import/Export. When you are moving large amounts of data between locations, speed and reliability are fundamental requirements. Even the fastest networks have bandwidth limitations: if you need to transfer tens of terabytes of data between remote sites, the operation could take several days, and if the transfer fails at some point, you don't want to have to start the whole process again from the beginning. The Azure Import/Export service is designed to address exactly these kinds of issues.

Now let's understand the components of Azure Import/Export. The first one is the Azure Import/Export service itself: an Azure service used to migrate large quantities of data between an on-premises location and an Azure storage account. Using the service, you send and receive physical disks containing your data between your on-premises location and an Azure data center; you ship data stored on your own disk drives, which can be Serial ATA (SATA) hard disk drives (HDDs) or solid-state drives (SSDs). The service is ideally suited to situations where you must upload or download large amounts of data but your network backbone doesn't have sufficient capacity or reliability to support large-scale transfers. You typically use it to migrate large amounts of data from on-premises to Azure as a one-time task; you can also use it for backing up your on-premises data into Azure Storage, for recovering large amounts of data you previously stored in Azure Storage, and for distributing data from Azure Storage to customer sites.

The second component is the WAImportExport tool. If you are importing data into Azure Storage, your data must be written to disk in a specific format, and you use the WAImportExport drive-preparation tool to do this. This tool checks a drive and prepares a journal file that is then used by an import job when the data is being imported into Azure. WAImportExport is a command-line tool that performs the following tasks: it prepares disk drives to be shipped to the Azure data center; it formats each drive and checks it for errors before data is copied onto it; it encrypts the data on the drive; it quickly scans the data and determines how many physical drives are required to hold the data being transferred; and it creates the journal files used for import and export operations. A journal file contains information about the drive serial number, the encryption key, and the storage account, and each drive you prepare with the tool has a single journal file. There are two versions of this tool: version 1 supports import and export of data to or from Azure Blob storage, and version 2 supports import of data into Azure Files. You can download the appropriate version from the Microsoft Download Center; remember that the WAImportExport tool is compatible only with 64-bit Windows operating systems.

The third component is the disks themselves. You can ship solid-state drives or hard disk drives to the Azure data center: when creating an import job, you ship disk drives containing your data, and when creating an export job, you ship empty drives to the Azure data center.
Alternatively, you can request a data disk from the Azure portal: Microsoft will then ship you a disk directly, encrypted, with the encryption key sent separately to your address; you load your data onto it and send it back to the Azure data center.

Now that you have understood what Azure Import/Export is, and also its components, let's understand how it works. The Azure Import/Export service is used to securely import large amounts of data into Azure Blob storage and Azure Files by shipping disk drives to an Azure data center. The service can also be used to transfer data from Azure Blob storage onto disk drives that are then shipped to your on-premises site. Data from one or more disk drives can be imported either to Azure Blob storage or to Azure Files. You can supply your own disk drives and transfer data with the Azure Import/Export service, or you can use disk drives supplied by Microsoft.

If you want to transfer data using drives supplied by Microsoft, you can use Azure Data Box Disk to import data into Azure. Microsoft ships up to five encrypted solid-state disk drives (SSDs), with a 40 TB total capacity per order, to your data center through a regional carrier. You quickly configure the disk drives, copy data to them over a USB 3.0 connection, and ship them back to Azure. You can see how this works in the flowchart: first you order an Azure Data Box Disk from the Azure portal and receive it, then you fill it with data and return it to the Azure data center; from there, the process moves forward at the data center, and all the data from the disk is copied onto cloud storage. That's how it gets copied, and you can then see and access all your data from the Azure portal, through the cloud.

Now let's have a look at Azure Storage in this context. The Azure Storage platform is Microsoft's cloud storage solution for modern data storage scenarios. The core storage services offer a massively scalable object store for data objects, file system services for the cloud, a messaging store for reliable messaging, and a NoSQL store. The Azure Storage platform includes the following data services. The first is the Azure Queue service, used to store and retrieve messages: queue messages can be up to 64 KB in size, a queue can contain millions of messages, and queues are generally used to store lists of messages to be processed asynchronously. Then there is Azure Table storage, a service that stores non-relational structured data (also known as structured NoSQL data) in the cloud, providing a key/attribute store with a schemaless design. Then there are Blob storage and File storage, which we use for the purposes of Import/Export. Azure Blob storage is Microsoft's object storage solution for the cloud, optimized for storing massive amounts of unstructured data, that is, data that doesn't adhere to a particular data model or definition, such as text or binary data. And then we have Azure Files, that is, file storage: Azure Files enables you to set up highly available network file shares that can be accessed by using the standard Server Message Block (SMB) protocol, which means multiple virtual machines can share the same files with both read and write access.
You can also read the files using the REST interface or the storage client libraries. One thing that distinguishes Azure Files from files on a corporate file share is that you can access the files from anywhere in the world using a URL that points to the file and includes a shared access signature (SAS) token; you can generate SAS tokens that allow specific access to a private asset for a specific amount of time.

Now let's get an architectural understanding of Azure Import/Export. To use it, you create a job that specifies the data you want to import or export; you then either prepare the disks yourself or request disks from the Azure data center to use for the transfer. For an import job, you write your data to these disks and ship them to an Azure data center, and Microsoft uploads the data for you. For an export job, you prepare a set of blank disks and ship them to an Azure data center, or you can request disks from the data center (the same Azure Data Box Disk I explained before); Microsoft then copies the data onto the disks and ships them back to you. In this scenario, you have decided that using the Azure Import/Export service meets your requirements, so let's understand how to import data to Azure.

Before you import data into Azure Storage, you must have the following items: an active Azure subscription; a minimum of one Azure storage account; a system running a supported version of Windows, with BitLocker enabled on that system; and the correct version of the WAImportExport tool installed, version 1 to import data into Azure Blob storage or version 2 to import into Azure Files, so remember that distinction. You can download the tool from the Microsoft Download Center, and I will also show you where when we move on to the demo. You also need an active account with a shipping carrier such as FedEx or DHL for shipping drives to an Azure data center.

You can see the diagram that summarizes the steps involved to import data. You complete the following steps. First, determine the data to be imported, the number of drives you need, and the destination blob location for your data in Azure Storage. Second, either request Azure Data Box Disks from the Azure data center, use the WAImportExport tool to copy data onto the disks, and encrypt them with BitLocker, or prepare the disks yourself: connect each disk to your Windows system over a SATA connector, create a single NTFS volume on each disk, enable BitLocker on that volume, and copy the data onto the encrypted disk using a tool like Robocopy. Then open a command prompt, go to the folder where you installed the WAImportExport tool, and run the first command to retrieve the BitLocker key for the drive; the drive letter in that command is nothing mysterious, it's just the letter you see against each disk, like the C in "Local Disk (C:)", or D, or F. Then prepare the disk by running the WAImportExport command itself; this command can take a significant amount of time to complete, depending on the amount of data.
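Reconstructing those two commands from the description (hedged: the flag spellings follow the v1 tool's documented syntax, so verify them against the version you download; the journal file, session ID, key, paths, and container are placeholders):

  rem First command - read the BitLocker recovery key for drive D:
  manage-bde -protectors -get D:

  rem Second command - prepare the drive for the import job
  WAImportExport.exe PrepImport /j:FirstDrive.jrn /id:session#1 ^
    /t:D /bk:<BitLocker-key> ^
    /srcdir:C:\data\to-import /dstdir:importcontainer/ ^
    /BlobType:BlockBlob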
In that second command you keep the defaults and fill in what varies: the journal file name, which is generated when you use the WAImportExport tool; a session number, the session ID; the same drive letter as before, that is, the drive whose contents you want to upload to the cloud, just as I explained with the C, D, F drive letters; the BitLocker key, which is also obtained via the tool; the source directory; and the destination container name. You also select the blob type, whether it is a block blob or a page blob, while defaults such as skip-write and enable-content-MD5 can be left as they are; you don't have to change those. So that's how the drive is prepared.

The next step is to create the job. Create an import job by using the Azure portal or the REST API, and provide the following information: a name for the job; the storage account into which to import the data; the shipping address of the Azure data center for that storage account; the return address where Microsoft should send the drives back to; the list of drives that contain the data for the job; and the BitLocker key used to encrypt the data on each drive. Then use a supported carrier like DHL or FedEx to ship the drives to the Azure data center; the carrier provides a tracking number, and you add this tracking number to the import job.

Then you check the job status as the Azure data center works through its tasks, and it keeps you updated. First, the disks are received at the Azure data center: Microsoft updates the import job to indicate that the disks have arrived, and you can track the job status from your import job page in the Azure portal. Then the data is transferred: Microsoft copies the data from the disks into the specified storage account, which can take some time depending on the volume of data and the number of disks; there is no SLA for this process, but it should complete within 7 to 10 days after receipt of the disks. Microsoft updates the status of the job to indicate the data is being transferred, and when the transfer is complete, changes the status to indicate that the data is now available in Azure Storage. Also, if you chose to send your own disks, then after the data has been uploaded to Azure Storage, Microsoft packages your disks and updates the job status, then sends the disks back to you using your selected carrier and changes the status to indicate that the job is now completed. Your remaining tasks are to receive your encrypted disks from the Azure data center, and finally to validate the data in Azure Storage, verifying that it has been copied to your storage account.

Now you need to understand how to create an export job. You can use the Import/Export service to export data from Azure Blob storage only, remember this; you can't export data that's stored in Azure Files. You must have the following items to support the export process: an active Azure subscription, and an Azure storage account holding your data in Azure Blob storage.
You also need a system running a supported version of Windows with BitLocker enabled; WAImportExport version 1 downloaded and installed from the Microsoft Download Center; an active account with a shipping carrier like FedEx or DHL for shipping drives to an Azure data center; and a set of disks that you can send to an Azure data center, onto which the data from Azure Storage will be copied. As I have told you, the second option, other than sending your own drives, is to request a Data Box Disk from the Azure data center, if it is available in your region.

Now you can see the diagram that summarizes the export process. To export data, you complete the following steps. The first is to determine the data to be exported, the number of drives you need, and the source blobs or container paths for your data in Blob storage. The second is to request an Azure Data Box Disk from the Azure data center, or to send disks yourself to the Azure data center in your nearby region, providing the return address and carrier account number for shipping the drives back to you. The third is to create the job: create an export job by using the Azure portal or the REST API, providing a name for the job, the storage account that holds the blobs to export, the blobs in the account that contain the data to export, and the shipping carrier and return shipping information for the disks. Then send your disks to the Azure data center identified by the export job; the data center is assigned based on the location of your storage account. You can check the number of disks required for the export job by using the PreviewExport argument of the WAImportExport command, providing the details of your export job as parameters in this command.

Then you check the job status, which is updated as the Azure data center processes the job. The first stage is receipt: when the data center receives the disks, Microsoft updates the status of the job to indicate that the disks have arrived; this process is almost the same as for import. The second is data transfer: Microsoft copies the data from Azure Blob storage onto your disks, and the status of the job changes to show the data is in the process of being transferred; when the transfer is complete, the job status is updated again, and it can take several days to transfer the data to your disks depending on the size of the export job. Then comes packaging: Microsoft prepares the disks for shipping, and the drives are encrypted with BitLocker. Fourth is shipping: Microsoft sends the disks back to the return address specified in the export job and updates the status of the job once more. Then, on your end, you receive and unlock the disks: when they arrive you can mount them and use them locally; the data is encrypted, and you can find the BitLocker keys for each drive with the export-job details in the Azure portal. This is what importing and exporting data with Azure Import/Export look like; this is the architectural understanding of it, and I hope you have understood it.

Let's see the region availability. The Azure Import/Export service supports copying data to and from all Azure storage accounts. You can ship disks to one of the listed locations, which are all the supported shipping locations; if your storage account is in an Azure location that is not specified there, an alternate shipping location is provided when you create the job. Now that you have understood how the Azure Import/Export service works, let's
understand when to use Azure Import/Export. The Azure Import/Export service is one of several options for transferring data in and out of Azure Storage, so here we will explore in greater detail when you should use it and when other tools might be more suitable.

The first scenario is offline transfer of massive data. The Azure Import/Export service is an offline solution, designed to handle more data than would be feasible to transmit over a network connection; using it, you take responsibility for preparing and shipping the necessary hardware. Microsoft provides an alternative solution in the form of the Azure Data Box family, which uses Microsoft-supplied devices to transfer data from your on-premises location into Azure Storage. A Data Box device is a proprietary, tamper-proof network appliance: you connect the device to your own internal network to move data onto it, then ship the device back to Microsoft, which uploads the data from the device into Azure Storage. Azure Data Box is the recommended solution for handling very large import or export jobs when the organization is located in a region where Data Box is supported; it's an easier process than using the Import/Export service.

The second scenario is online transfer of massive data. The Import/Export service doesn't provide an online option; if you need an online method to transfer massive amounts of data, you can use Azure Stack Edge or Azure Data Box Gateway. Azure Stack Edge is a physical network appliance that you install on-premises, and the device connects to your storage account in the cloud; Data Box Gateway is a virtual network appliance. Both of these products support data transfer from an on-premises location to Azure.

The third scenario is online transfer of smaller data volumes. If you are looking to import or export more moderate volumes of data to and from Azure Blob storage, consider using other tools like AzCopy or Azure Storage Explorer. AzCopy, that is, Azure copy, is a simple but powerful command-line tool that lets you copy files to and from Azure storage accounts. With AzCopy you can upload, download, and copy files to Azure Blob storage or Azure Files, copy files between storage accounts, even between storage accounts in different regions, and transfer data online across a network. To access storage with AzCopy, you must provide the appropriate Azure credentials for the storage account or use a shared access signature (SAS). AzCopy is the ideal tool for copying small to moderate amounts of data as fast as possible and with the least cost and effort; for large data sets, network bandwidth may limit the speed at which you can upload or download data to and from Azure Storage, and you can create scripts that use AzCopy. Then there is Azure Storage Explorer, a free tool that you can use to connect to Azure Storage and view data through a graphical user interface: you can upload and download data held in blobs, files, and tables, and you can also examine queues and manipulate queued messages. Storage Explorer is an interactive tool; it isn't suitable for moving more than the smallest amounts of data and it can't easily be scripted, and a version of Azure Storage Explorer is provided right inside the Azure portal. You can also use Azure PowerShell and the Azure CLI to upload and download data; these interfaces are programmatic, enabling you to produce complex scripted solutions that can incorporate transformations, merges, and filtering as the data is transferred.
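Going back to AzCopy for a second, as an illustration (a sketch, with the account, container, and SAS token as placeholders), a typical AzCopy v10 upload of a whole local folder looks like this:

  azcopy copy "C:\local\data" "https://livedemo2336.blob.core.windows.net/live-demo?<SAS-token>" --recursive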
Next we have the security considerations for Azure Import/Export. The data on the drive is encrypted using AES 256-bit BitLocker drive encryption, and this encryption protects your data while it is in transit. For import jobs, drives are encrypted in one of two ways: first, you specify the encryption option in the dataset file while running the WAImportExport tool during drive preparation; or second, you enable BitLocker encryption manually on the drive and specify the encryption key in the drive set when running the WAImportExport command line during preparation. The BitLocker encryption key can be further protected by using an external key protector, also known as a Microsoft-managed key, or a customer-managed key. For export jobs, after your data is copied to the drives, the service encrypts each drive using BitLocker before shipping it back to you; the encryption key is provided to you via the Azure portal, and the drive is then unlocked using the WAImportExport tool with that key.

Next is the pricing, and there are four pricing segments. The first is the drive handling fee: there's a handling fee for each drive processed as part of your import or export job, a flat fee of about 80 dollars per storage device, as shown on the pricing page, plus the return shipping fee for the devices, which brings us to the next segment. The second is the shipping cost: when you ship drives to Azure, you pay the shipping cost to the shipping carrier, and when Microsoft returns the drives to you, the shipping cost is charged to the carrier account you provided at the time of job creation. The third is the transaction cost: standard storage transaction charges apply during import as well as export of data, and standard egress charges are also applicable, along with storage transaction charges, when data is exported from Azure Storage. You can see here that I have chosen US dollars as the currency, and these are the pricing options: for data transfer there are different charges for inter-region and for intercontinental transfers, and for internet egress routed via Microsoft's global network versus routed via the routing preference of a transit ISP network there are again different prices, with different plans for specific amounts of data per month. Then we have the prices for the Data Box Disk which, as I've explained, we prefer to request from the Azure data centers: there is a set price you pay when you request it, and it varies by region. For the East US region in dollars, for example, the service fee is 250 and each extra day is 15. Different prices also apply across Data Box Disk, Data Box Heavy, and Data Box Gateway. I hope you have understood this.

Now that you have a theoretical and architectural understanding of the Azure Import/Export service, let's see a practical implementation of it by trying our hands on the Azure portal. You can simply go to the Azure portal; this is how
the Azure portal looks. The first step is to request a data box: as I have told you, if you are not going to send your own disk, you can request a Data Box from Microsoft and it will be shipped to whichever address you provide. This is how the Microsoft Azure portal looks, and if you don't have an account on Microsoft Azure, create one; it's a very good platform to have an account on. There's a 12-month free trial we'll get through, and there is also a credit of about 14,500 INR if you are in India; there may be different plans in the US and other countries. So do make your account.

First let's understand how to request an Azure Data Box Disk, because, as I have explained, if you are not going to send your own disk, you can request one from your nearby region. Let's go to Azure Data Box: type "data box", and we can create one from here. For the transfer type, choose Import to Azure; then the subscription, Free Trial; then a resource group. I have a demo resource group already, but if you don't have one, create one. Then select the source country and the destination Azure region; remember the availability regions I told you about, these are the availability regions for Data Box as well. You could just apply from here, but since I have a free trial account I cannot actually move forward past this step; you need a pay-as-you-go subscription for it. You can see the different options, 40 TB, 100 TB, 1000 TB, while sending your own disk is a separate path; because mine is a free trial there is limited access, so I cannot order it from here. But that's how you request an Azure Data Box.

Now let's start with the Import/Export service. You can just go to Import/export jobs and create the job from there; it would show any existing job, and there's no job yet, so we create one from here. What you do is choose the subscription, Free Trial, choose the resource group, give the job a name like importdemo, and choose Import to Azure or Export from Azure, whichever you want to do; right now we are going with import. Next come the job details. Remember the WAImportExport tool I explained, which generates the journal file when you select the disk: you can get it from here, under the drive-preparation link; it gets downloaded, and you extract it and run the application on your system. So let's assume we have created and downloaded the journal file. Now let's select the destination Azure region: our region is West India, so we choose West India, and then a storage account. I don't have a storage account for this right now, so what I can do is create one: go to the portal again, to Storage accounts, and create a storage account. You choose the resource group, then give the storage account a name, say importexportstore; then select the region, and I'll choose the India region listed here. Performance and the advanced settings are fine at their defaults, so just review and create, and that's it, click Create. It is getting deployed, and there, it's been created. You can go to the resource group demo
You can go to the resource; it will be showing here with its name, importexportstore, or if it's not showing, refresh once and it will appear. Go to storage accounts, refresh, and yeah, now the storage account importexportstore is there. Open it from here; all the information is given, but that's it, we just had to create it, there is no other use of it right now. So back in the import job, refresh once if it's not showing, choose it, and continue with the importdemo job, import to Azure. Then the extra details, as I have explained: upload the journal file, which you create with the WAImportExport tool by selecting the disk you want to import into the Azure cloud. Then choose the region, West India, and the storage account, importexportstore; it is there. Then we can go to shipping, where you give the address details. If you have selected the option of sending your own disk, then Microsoft will provide an address where you can ship it, and you have to give your return address also so that they can ship the disk back to you. And if you have chosen the Data Box option, then you just have to give your address; they will send you the device themselves, and then you return it back to them. Okay, that's it; then tags, and review and create. Right now we haven't created a journal file, so it will not move forward, but you can see the steps here. The first step is to determine the data to be imported, the number of drives required, and the destination blob location. The second is to download the WAImportExport tool and create a journal file. Then you create the import/export job, as shown here. Then you ship the drives to the provided address, whatever address is provided, if you are sending your own disks; otherwise it's a different step. Then the drives are processed at the Azure data center, and whatever progress is made gets updated in the same import/export job you created here. Then the drives are shipped back to you, if you sent your own drives. Now, what's different in an export job? The only difference is that you don't have to use the WAImportExport tool to encrypt the drive with BitLocker and prepare your data; that's what the tool is used for on import. So you can just go to Basics again, and instead of import we choose export from Azure. You can export everything or, if you want to choose something specific, you can select it yourself. Then next, shipping: the shipping details are the same, you give all this information, all the remaining steps are the same, and you just go to tags, then review and create, and complete it. That's it. I hope you have understood this; it's not that difficult, it's actually quite an easy process. Now let's understand some of the use cases for the Azure Import/Export service. Consider using Azure Import/Export when uploading or downloading data over the network is too slow, or when
getting additional network bandwidth is cost-prohibitive. Use the service in the following scenarios. The first use case is data migration to the cloud: move large amounts of data to Azure quickly and cost-effectively; it's very simple in that. The second thing it provides is cloud backup: take backups of your on-premises data to store in Azure storage. And the last one is data recovery: you can recover large amounts of data stored in storage and have it delivered to your on-premises location. [Music] What is Azure App Service? The Azure App Service is an HTTP-based service for hosting web applications, REST APIs, and mobile backends. It is a platform as a service where you can develop in your favorite language, be it .NET, .NET Core, Java, Ruby, Node.js, PHP, or Python. With Azure App Service you pay for the Azure compute resources you use, and the compute resources you use are determined by the App Service plan that you run your apps on. So in simple words, Azure App Service is a web-based service that hosts web applications, REST APIs, and mobile backends. Now let's see why we use Azure App Service. Azure App Service is a fully managed platform-as-a-service offering for developers, and here are some key features of Azure App Service. First, multiple languages and frameworks: the App Service has first-class support for ASP.NET, ASP.NET Core, Java, Ruby, Node.js, PHP, and Python, and you can also run PowerShell and other scripts or executables as background services. Moving forward, managed production environment: here the App Service automatically patches and maintains the OS and language frameworks for you; you can spend time writing great apps and let Azure worry about the platform. Next is containerization and Docker: you can dockerize your app and host a custom Windows or Linux container in App Service; you can also run multi-container apps with Docker Compose and migrate your Docker skills directly to App Service. Next is DevOps optimization: here you can set up continuous integration and deployment with Azure DevOps, GitHub, Bitbucket, Docker Hub, or Azure Container Registry; you can also promote updates through test and staging environments and manage your apps in App Service by using Azure PowerShell or the cross-platform command-line interface. Next is global scale with high availability: here you can scale up or scale out, manually or automatically; you can also host your apps anywhere in Microsoft's global data center infrastructure, and the App Service SLA promises high availability. Moving ahead, serverless code: here you can run a code snippet or script on demand without having to explicitly provision or manage infrastructure, and pay only for the compute time your code actually uses. Next is application templates: here you can choose from an extensive list of application templates in the Azure Marketplace, such as WordPress, Joomla, and Drupal. Last but not least, security and compliance: App Service is ISO, SOC, and PCI compliant; you can authenticate users with Azure Active Directory, Google, Facebook, Twitter, or a Microsoft account, and you can also create IP address restrictions and manage the service identities. At last, connections to SaaS platforms and on-premises data: here you can choose from more than 50 connectors for enterprise systems such as SAP, for SaaS services, and for internet services, and access on-premises data using hybrid connections and Azure virtual networks. So these were the key features which make Azure App Service so much in demand.
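Just to make the "fully managed" point concrete, here is a hypothetical one-liner that shows how little infrastructure work App Service needs; az webapp up creates the resource group, the plan, and the app, and deploys the code in the current folder. The app name is a placeholder, and valid runtime strings vary by CLI version, so check az webapp list-runtimes:

    # from the folder containing your app code: create everything and deploy in one step
    az webapp up --name my-appservice-demo --runtime "DOTNETCORE:6.0" --sku F1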
Let's quickly have a look at the App Service application types. There are four App Service application types: web app, API app, logic app, and mobile app. Let's quickly look at each one of them. First is the web app: Azure web apps are used for hosting websites and web applications. It lets us quickly build, deploy, and scale enterprise-grade web, mobile, and API apps running on any platform, and it helps us meet rigorous performance, scalability, security, and compliance requirements while using a fully managed platform that performs the infrastructure maintenance. Next is the API app, which is used for hosting RESTful APIs: the API app features make it easy to develop, host, and consume APIs, in the cloud or on-premises. The advantage of hosting APIs in an Azure API app is that we get enterprise-grade security, simple access control, automatic SDK generation, and seamless integration with logic apps. Next is the logic app: logic apps are used for business process automation, system integration, and sharing data across clouds. And last is the mobile app: mobile apps are used for hosting mobile app backends, where we can deploy our mobile backend services on Azure using Azure Mobile Apps. By implementing our mobile backend service on Azure, our mobile backend will be able to communicate with different Azure services, and we can take advantage of the various features that are provided by Azure Mobile Apps. So these were the four App Service application types. Let's quickly move on to our next topic, the App Service plan. To create an App Service you need an App Service plan; without an App Service plan you cannot create an App Service. The App Service plan consists of many components: the App Service is the baseline, but the plan defines the actual compute resources. An App Service plan can have multiple App Services, and in each of these App Services we have different deployment slots. So what are deployment slots? Deployment slots allow your app to run as different instances called slots. Slots are different environments exposed via publicly available endpoints; one app instance is always mapped to the production slot, and you can swap the instances assigned to the slots on demand. When you create an App Service plan in a certain region, a set of compute resources is created for that plan in that region, so whatever apps you put into this App Service plan run on these compute resources as defined by your App Service plan. Each App Service plan defines the operating system, whether Windows or Linux; the region, which can be any region of your choice, whether West US, East US, Central India, etc.; the size of the VM instances, where you can choose small, medium, or large; and of course the pricing tier, where there are different types of pricing tiers such as free, shared, basic, standard, premium, premium version 2, premium version 3, isolated, isolated version 2, etc. We'll look at these pricing tiers further on; you can also see a small plan-and-slots sketch on the command line just below.
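Here is a rough command-line sketch of the plan-and-slots relationship just described; all names are placeholders, and note that deployment slots need at least the Standard tier, which is why the plan below uses the S1 SKU:

    # create a Standard-tier plan, an app in it, and a staging slot, then swap staging into production
    az appservice plan create --name demo-plan --resource-group demo-rg --sku S1
    az webapp create --name my-appservice-demo --plan demo-plan --resource-group demo-rg
    az webapp deployment slot create --name my-appservice-demo --resource-group demo-rg --slot staging
    az webapp deployment slot swap --name my-appservice-demo --resource-group demo-rg --slot staging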
So let's quickly have a small demo on Azure App Services. For this we need to move on to Visual Studio Code. The first thing we need to do is create a folder; I have already created one, called azure app services, so I'll select this folder, and as you can see, the folder is open here. Within this folder we'll be creating a .NET program, so let's open our terminal and create our .NET program with dotnet new mvc. Here you can see the web application is being created, all the folders are available over here, and the program has been built successfully. Now let's run this program: we'll just hit the command dotnet run. All right, here you can see it is building, so let's wait for a few seconds. Yeah, here you find the localhost URL where your program is running; just click on this link, and here you can see the program has run successfully on localhost. Now what we need to do is deploy it into App Service, so how do we do that? Let's go back to Visual Studio Code. For this you need to have the Azure extensions: just go to the Extensions tab and select Azure Account and Azure App Service. You need to install these two extensions to avail all the services; I have already installed them, so everything is already set up. Let's quickly move on to the Azure tab, and here in the resources you can see our account, where we can utilize all the services. As you can see, there are many services here. Now what we need to do is click on create resource and select create App Service web app, so we'll just click on this and give a unique name to our web app; let's name it azurewebappdemo13. Hit enter, and they will ask what language stack it is: it is .NET 6, which we just used. They also ask about the pricing tier, so let's take free, and here our web application gets created. As you can see, the web app has been created successfully, and here you can see our azurewebappdemo13 web application. So now what we need to do is deploy: let's just right-click and choose deploy to web app. We'll click over here and browse, and as you can see, the folder is right here, so I don't have to go hunting through folders to find my Azure App Service demo. I'll just click on this; they ask permission for the configuration, so just add the configuration, and then they ask whether you are sure you want to deploy azurewebappdemo13. Just hit deploy, and as you can see, all the files are being published to your Azure application; all your packages and programs are being stored in the Azure App Service. Let's wait till it gets completed. All right, our deployment is successful, so now we'll just browse the website to check whether it has come up on Azure or not. This is the link, we'll just follow it, and yeah, here you can see it is azurewebappdemo13.azurewebsites.net, so it has been deployed successfully. This is how you deploy into Azure App Service. Now let's quickly move on to the Azure portal and check whether we can see it over there as well. This is our Azure portal; let's go to all services, and here you can see we have the web application azurewebappdemo13.
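By the way, if you prefer the plain command line to the VS Code extension, the same publish-and-deploy flow can be done with a zip deployment. A minimal sketch assuming the demo's app name and a hypothetical resource group:

    # publish the app, zip the output, and push it to the existing web app
    dotnet publish -c Release -o ./publish
    cd publish && zip -r ../app.zip . && cd ..
    az webapp deployment source config-zip --resource-group demo-rg --name azurewebappdemo13 --src app.zip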
So let's just go into it and open the overview. All right, as you can see over here, all the details are given, and you can see all your metrics based on whatever traffic there is. Let's quickly move down and see scale up and scale out, that is, our App Service plan. As we had chosen the free option, here you can see this is shared compute, and you can upgrade your tier based on your requirements. As you can see, there are different pricing tiers here, including recommended tiers, and based on the tier you get your features: the higher the tier, the more upgraded features you can avail in Azure App Service. You can change it anytime you need; currently we have it for free, so we have no extra features enabled. Let's also check our App Service plan: over here you can get to App Service plans, and here you can see our plan, so just view it. You get all the details over here: the pricing tier is free, the operating system is Windows, and one application is running, that is, the one application created in our Azure App Service. If you go to scale up here, it's the same thing; here also you can upgrade your tier based on whatever requirements you have. This is how you work with Azure App Services, and it's pretty easy: you can do it through the Azure portal, or you can do it with Visual Studio Code like we just did. You can feed in any of your desired programs or web applications and put them directly into App Service; through Visual Studio Code you just need to hit some commands and you can directly deploy your applications to the Azure portal. So this was all about the demo, and lastly we come to our pricing tiers. The pricing tier of an App Service plan determines what App Service features you get and how much you pay for the plan. The pricing tiers available to your App Service plan depend on the operating system selected at creation time. There are a few categories of pricing tiers: shared compute, dedicated compute, and isolated. In shared compute we have free and shared, two base tiers that run an app on the same Azure VM as other App Service apps, including apps of other customers; these tiers allocate CPU quotas to each app that runs on the shared resources, and the resources cannot scale out. Next is dedicated compute: in dedicated compute we have the basic, standard, premium, premium version 2, and premium version 3 tiers, which run apps on dedicated Azure VMs; only apps in the same App Service plan share the same compute resources, and the higher the tier, the more VM instances are available to you for scale-out. And lastly we have isolated: in this we have two pricing tiers, isolated and isolated version 2. These two tiers run on dedicated Azure virtual networks, providing network isolation on top of the compute isolation for your apps, and they also provide maximum scale-out capabilities. [Music] Now let's have an introduction to databases. First, let's understand why we need a database. The various reasons a database is important are, first of all, that it manages large amounts of data: a database stores and manages a large amount of data on a daily basis, which would simply not be possible using any other tool such as a spreadsheet. Second is accuracy: a database is pretty accurate, as it has all sorts of built-in constraints, checks, etc., which means that the information available in a database is guaranteed to be correct in most cases. Third, it's easy to update data in a database: this is done using the various data manipulation languages available, one of these languages being SQL. Fourth is security of data: databases have various methods to ensure the security of data; there are user logins required before accessing a database, and various access specifiers, which allow only authorized users to access the database. Fifth is data integrity: this is ensured in databases by using various constraints on the data; data integrity in databases makes sure that data is accurate and consistent. And the last is that it's easy to search data: it is very easy to access and search data in a database, which is done using a dedicated query language that allows searching for any data in the database and performing computations on it. Now that you have understood the need for a database, let's briefly understand what it actually is. A database is an organized collection of structured information or data, typically stored electronically in a computer system. A database is usually controlled by a database management system; together, the data, the database management system, and the applications that are associated with them are referred to as a database system, often shortened to just a database. Data within the most common types of databases in operation today is typically modeled in rows and columns in a series of tables to make processing and data querying efficient. The data can then be easily accessed, managed, modified, updated, controlled, and organized. Most databases use structured query language (SQL) for writing and querying data. Databases are used to support internal operations of organizations and to underpin online interactions with customers and suppliers. Databases are also used to hold administrative information and more specialized data, such as engineering data or economic models. Examples include computerized library systems, flight reservation systems, computerized parts inventory systems, and many content management systems that store websites as collections of web pages in a database. Now that you have an understanding of Microsoft Azure as well as of databases, let's take a look at the different types of databases in Azure. First is the relational database: a relational database is a type of database that stores and provides access to data points that are related to one another. Relational databases are based on the relational model, an intuitive, straightforward way of representing data in tables. In a relational database, each row in a table is a record with a unique ID called the key; the columns of the table hold attributes of the data, and each record usually has a value for each attribute, making it easy to establish the relationships among data points. In a relational database, all data is stored
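The portal's scale-up blade maps to a one-line CLI change as well; a rough sketch with placeholder names (SKU codes such as F1, S1, or P1V2 correspond to the tiers shown in the portal):

    # inspect the plan's current SKU, then move it to a higher tier
    az appservice plan show --name demo-plan --resource-group demo-rg --query sku
    az appservice plan update --name demo-plan --resource-group demo-rg --sku P1V2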
and accessed via relations. Relations that store data are called base relations, and in implementations they are called tables. Other relations do not store data but are computed by applying relational operations to other relations; these relations are sometimes called derived relations, and in implementations they are called views or queries. Derived relations are convenient in that they act as a single relation, even though they may grab information from several relations. Each relation or table has a primary key, this being a consequence of a relation being a set; a primary key uniquely specifies a tuple within a table. So this is all about relational databases. Second, we have the non-relational database, also known as the NoSQL database. A NoSQL or non-relational database provides a mechanism for the storage and retrieval of data that is modeled in means other than the tabular relations used in relational databases. Non-relational databases are increasingly used in big data and real-time web applications, and they are also sometimes called "not only SQL" databases to emphasize that they may support SQL-like query languages or sit alongside SQL databases. The prominent non-relational database provided by Azure is Cosmos DB, which I will explain further in this video. And third is the in-memory database: an in-memory database, IMDB in short form, also called a main memory database system (MMDB) or memory-resident database, is a database management system that primarily relies on main memory for computer data storage. It is contrasted with database management systems that employ a disk storage mechanism. In-memory databases are faster than disk-optimized databases, because disk access is slower than memory access, the internal optimization algorithms are simpler, and fewer CPU instructions are executed. Accessing data in memory eliminates seek time when querying the data, which provides faster and more predictable performance than disk. A potential technical hurdle with in-memory data storage is the volatility of RAM: specifically, in the event of a power loss, intentional or otherwise, data stored in volatile RAM is lost. With the introduction of non-volatile random access memory technology, in-memory databases will be able to run at full speed and maintain data in the event of a power failure. Let's now understand the architecture of the database services provided by Azure. It looks like a complex architecture, but I will explain it quite easily. The basic fundamental building block available in Azure is the SQL database. Microsoft offers SQL Server and SQL Database on Azure in many ways: we can deploy a single database, or we can deploy multiple databases as part of a shared elastic pool; you can see the elastic pools and single databases here. Microsoft also introduced the managed instance, which is targeted towards on-premises customers: if we have some SQL databases within our on-premises data center and we want to migrate them into Azure without any complex configuration or ambiguity, then we can use a managed instance, because it is mainly targeted towards on-premises customers who want to lift and shift their on-premises databases into Azure with the least effort and optimized cost. We can also take advantage of the licensing we have within our on-premises data center. Microsoft will be responsible for maintenance, patching, and related
services. But in case we want to go for infrastructure as a service for SQL Server, then we can deploy SQL Server on an Azure virtual machine: if the data has a dependency on the underlying platform and we want to log into the SQL Server itself, in that case we can use SQL Server on a virtual machine. We can also deploy a SQL Data Warehouse on the cloud, and Azure offers many other database services for different types of databases, such as MySQL, MariaDB, and PostgreSQL. Once we deploy the database into Azure, we need to migrate the data into it or replicate the data into it. Then we have the Azure database services for data migration, the services available in Azure which we can use to migrate data from our on-premises SQL Server into Azure. The first one is the Azure Database Migration Service: it is used to migrate the data from an existing SQL Server and database within the on-premises data center into Azure. Then we have Azure SQL Data Sync: if we want to replicate the data from our on-premises database into Azure, then we can use Azure SQL Data Sync. Then we have the SQL Stretch Database: it is used to migrate old data into Azure. SQL Stretch Database is a bit different from the other database offerings; it works as a hybrid database, because it divides the data into different types, hot and cold, so hot data will be kept in the on-premises data center and cold data in Azure. Then we have Data Factory: Azure Data Factory is used for ETL, meaning extraction, transformation, and loading. Using Data Factory we can extract the data from our on-premises data center, do some conversion, and load it into the Azure SQL database. Data Factory is an ETL tool offered on the cloud which we can use to connect to different databases, extract the data, transform it, and load it into a destination. Then there's Azure Security Center: all the databases that exist in Azure need to be secured, and we also need to accept connections only from known origins. For this purpose, all these database services come with firewall rules, where we can configure from which particular IP addresses we want to allow connections; we can define those firewall rules to limit the number of connections and also reduce the attack surface (there's a small CLI sketch of this below). Now let's talk about Cosmos DB. Cosmos DB is the NoSQL database that is available in Azure, and it is designed to be globally scalable and also highly available with extremely low latency; Microsoft guarantees latency in terms of reads and writes with Cosmos DB. For example, if you have any application such as IoT or gaming where you get a lot of data from different users spread across the globe, then you will go for Cosmos DB, because Cosmos DB is designed to be globally scalable and highly available, so users will experience low latency. Finally, there are two more things. One, we need to secure all the services, and for that purpose we can integrate all these services with Azure Active Directory and manage the users from Azure Active Directory. And two, to monitor all these services we can use the Security Center; there are individual monitoring tools too, but Azure Security Center will keep on monitoring all these services and provide recommendations if something is wrong. I hope the architecture of Azure database services is now clear to you.
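Here is the kind of firewall rule just mentioned, expressed as a hypothetical Azure CLI call; the server name, resource group, and the allowed IP are all placeholders:

    # allow a single known IP address to reach the logical SQL server
    az sql server firewall-rule create --resource-group demo-rg --server demo-sql-server --name AllowOfficeIp --start-ip-address 203.0.113.10 --end-ip-address 203.0.113.10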
Now let's move forward to briefly understand the database services provided by Azure. The first is Azure SQL Database. SQL Database is the flagship database product for Microsoft in this space: it is a general-purpose relational database that supports structures such as relational data, JSON, spatial data, and XML. The Azure platform fully manages every Azure SQL database and guarantees no data loss and a high percentage of data availability. Azure automatically handles patching, backups, replication, failure detection, underlying potential hardware, software, or network failures, deploying bug fixes, failovers, database upgrades, and other maintenance tasks. There are three ways we can implement our SQL database. First is the managed instance: this is primarily targeted towards on-premises customers; in case we already have a SQL Server instance in our on-premises data center and we want to migrate it into Azure with minimum changes to our application and maximum compatibility, then we will go for the managed instance. Second is the single database: we can deploy a single database on Azure with its own set of resources, managed via a logical server. Then we have the elastic pool: we can deploy a pool of databases with a shared set of resources, managed by a logical server. We can also deploy SQL as infrastructure as a service, meaning we use SQL Server on an Azure virtual machine, but in that case we are responsible for managing the SQL Server on that particular Azure virtual machine. Then we have the purchasing models: there are two ways we can purchase SQL Server on Azure. First is the vCore purchasing model, the virtual core purchasing model: the vCore purchasing model enables us to independently scale compute and storage resources, match on-premises performance, and optimize price. It also allows us to choose the generation of hardware, and it allows us to use Azure Hybrid Benefit for SQL Server to gain cost savings; it is best for customers who value flexibility, control, and transparency. Second is the DTU model: it is based on a bundled measure of compute, storage, and input/output resources. Sizes of the compute are expressed in terms of database transaction units (DTUs) for single databases and elastic database transaction units (eDTUs) for elastic pools. This model is best for customers who want simple, pre-configured resource options. The second database service is Azure Cosmos DB. Azure Cosmos DB is a NoSQL data store; it is different from the traditional relational database, where we have a table, the table has a fixed number of columns, and each row in the table should adhere to the schema of the table. In a NoSQL database you don't define any schema at all for the table, and each item or row within the table can have different values or a different schema itself. So now let's understand the Cosmos database structure. The first thing in the structure is the database: we can create one or more Azure Cosmos databases under our account; a database is analogous to a namespace, and it is the unit of management for a set of Azure Cosmos containers. The second is the Cosmos account: the Azure Cosmos account is the basic unit of global distribution and high availability; for globally distributing our data and throughput across multiple Azure regions, we can add or remove Azure regions from our Azure Cosmos account at any time. The third is the container: an Azure Cosmos container is the unit of scalability for both provisioned throughput and storage of items; a container is horizontally partitioned and then replicated across multiple regions.
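The account, database, and container hierarchy maps directly onto the CLI; a minimal sketch with placeholder names (the partition key path and the 400 RU/s throughput are illustrative choices, not requirements):

    # an account, then a database inside it, then a partitioned container inside that
    az cosmosdb create --name demo-cosmos-acct --resource-group demo-rg --locations regionName=centralindia failoverPriority=0
    az cosmosdb sql database create --account-name demo-cosmos-acct --resource-group demo-rg --name demodb
    az cosmosdb sql container create --account-name demo-cosmos-acct --resource-group demo-rg --database-name demodb --name items --partition-key-path "/userId" --throughput 400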
Then let's understand the types of consistency under Cosmos DB. Azure Cosmos DB approaches data consistency as a spectrum of choices instead of two extremes: strong consistency and eventual consistency are at the ends, but there are many consistency choices along the spectrum. The consistency levels are region-agnostic: the consistency level of our Azure Cosmos account is guaranteed for all read operations, regardless of the region from which the reads and writes are served, the number of regions associated with the Azure Cosmos account, or whether our account is configured with a single or multiple write regions. Then there are request units: we pay for the throughput we provision and the storage we consume on an hourly basis with Azure Cosmos DB (remember, DB means database, so Cosmos DB means Cosmos database). The cost of all database operations is normalized by Azure Cosmos DB and is expressed in terms of request units: the price to read a 1 KB item is one request unit, and all other database operations are similarly assigned a cost in terms of request units. For example, if you provision 400 request units per second, you can serve roughly four hundred 1 KB point reads every second. The number of request units consumed will depend on the type of operation, item size, data consistency, query patterns, etc. For the management and planning of capacity, Azure Cosmos DB ensures that the number of request units for a given database operation over a given data set is deterministic. The third database service is Azure Data Factory. Azure Data Factory is a cloud-based data integration service that allows us to create data-driven workflows in the cloud for orchestrating and automating data movement and data transformation. Data Factory is a perfect ETL tool on the cloud: it is designed to deliver the extraction, transformation, and loading process within the cloud. The ETL process generally involves four steps. The first one is connect and collect: we can use the copy activity in a data pipeline to move data from both on-premises and cloud source data stores. Second is transform: once the data is present in a centralized data store in the cloud, we process or transform the collected data by using compute services such as HDInsight Hadoop, Spark, Data Lake Analytics, and machine learning. Third is publish: after the raw data is refined into a business-ready consumable form, it loads the data into Azure Data Warehouse, Azure SQL Database, Azure Cosmos DB, etc. Fourth is monitor: Azure Data Factory has built-in support for pipeline monitoring via Azure Monitor, API, PowerShell, Log Analytics, and health panels on the Azure portal. Then there are the components of Data Factory. Data Factory is composed of six key elements, and all these components work together to provide the platform on which you can compose a data-driven workflow with the structure to move and transform data. The first one is the pipeline: a data factory can have one or more pipelines; a pipeline is a logical grouping of activities that performs a unit of work, and the activities in a pipeline perform the task together. For example, a pipeline can contain a group of activities that ingest data from an Azure blob and then run a Hive query on an HDInsight cluster to partition the data. Second is the activity: it represents a processing step in a pipeline; for example, we might use a copy activity to copy data from one data store to another data store. Then we have datasets: a dataset represents a data
structure within the data stores, which points to or references the data we want to use in our activities as inputs or outputs. Then there are linked services: these are like connection strings, which define the connection information needed for Data Factory to connect to external resources; a linked service can be a link to a data store or to a compute resource. Then we have triggers: a trigger represents the unit of processing that determines when a pipeline execution needs to be kicked off; we can schedule these activities to be performed at some point in time, and we can use a trigger to kick off an activity. The last one is control flow: it is an orchestration of pipeline activities that includes chaining activities in a sequence, branching, defining parameters at the pipeline level, and passing arguments while invoking the pipeline on demand or from a trigger; we can use control flow to sequence certain activities and also define what parameters need to be passed for each of those activities. I hope you have now understood the major services of Azure databases. So now let's have a look at some of the use cases for Azure database services. First, let's see the use cases for SQL Database. The first one is the developer or test environment: an important use case for replicating or migrating data to SQL hosted on Azure is for developer and test environments. Before deploying to the production environment, it is pertinent that the data is tested against developer and test environments, so Azure SQL Database can act as a target for such environments, and the live production environment can be replicated to the developer or test environment using a database copy. The second is business continuity: one of the most important use cases for SQL on Azure is using it as a DR target to maintain business continuity. Azure SQL databases can provide an SLA of up to 99.99% by maintaining several copies of the data. This provides business continuity, as it allows you to use geo-replicated copies of the data or active geo-redundant copies as failover points in case of outages at data centers or in regions. Besides SQL databases, you can also use availability groups to fulfill business continuity demands: not only can you use availability groups in Azure SQL virtual machines, but you can also use Azure SQL virtual machine instances as a target for high availability and disaster recovery. The third one is scaling out read-only workloads: apart from providing BC or DR capabilities, active geo-replication can also be used to offload read-only workloads, such as reporting jobs, to secondary copies; you can also extend an on-premises SQL Server instance using readable Always On replicas. And the fourth one is backup and restore: Azure SQL databases are backed up automatically on a regular basis, and there are no storage costs for up to 200% of the maximum provisioned database storage. You can restore backups to any point in time going back over a retention period, which is determined by the Azure SQL service tier in use (there's a quick CLI sketch of such a restore below). On-premises SQL Server databases and transaction logs can also be backed up directly to Azure using the backup-to-URL feature and stored in Azure storage, and Azure SQL databases can also be stored on local storage by exporting them to BACPAC files.
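That point-in-time restore looks roughly like this on the command line; a hypothetical sketch where the restored copy lands in a new database (names and the timestamp are placeholders):

    # restore mydb as it existed at the given UTC time into a new database
    az sql db restore --resource-group demo-rg --server demo-sql-server --name mydb --dest-name mydb-restored --time "2024-01-15T13:10:00Z"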
The next one is advanced analytics: another important reason for hosting SQL in Azure is to make use of Azure's advanced analytics platforms, such as Azure Storage blobs and Azure Data Lake Store. A common scenario with advanced analytics is when users reference data from various data sources, use Azure Data Lake Store as the staging area, perform transformation activities using Hive or Spark, and finally load the data into Azure Data Warehouse for BI and reporting (BI means business intelligence). Now let's see the use cases for Cosmos DB. First of all, it is used in IoT and telematics. IoT use cases commonly share some patterns in how they ingest, process, and store data: first, these systems need to ingest bursts of data from device sensors in various locales; next, these systems process and analyze streaming data to derive real-time insights; the data is then archived to cold storage for batch analytics. Microsoft Azure offers rich services that can be applied to IoT use cases, including Azure Cosmos DB, Azure Event Hubs, Azure Stream Analytics, Azure Notification Hubs, Azure Machine Learning, Azure HDInsight, and Power BI. Bursts of data can be ingested by Azure Event Hubs, as it offers high-throughput data ingestion with low latency; data that needs to be processed for real-time insight can be funneled to Azure Stream Analytics for real-time analytics; and data can be loaded into Azure Cosmos DB for ad hoc querying. Once the data is loaded into Azure Cosmos DB, it is ready to be queried; in addition, new data and changes to existing data can be read from the change feed. The change feed is a persistent, append-only log that stores changes to Cosmos containers in sequential order. Then, all data or just changes to data in Azure Cosmos DB can be used as reference data as part of real-time analytics. In addition, data can be further refined and processed by connecting Azure Cosmos DB data to HDInsight for Pig, Hive, or MapReduce jobs. For a sample IoT solution using Azure Cosmos DB, Event Hubs, and Storm, see the hdinsight-storm-examples repository on GitHub. Okay, then we have retail and marketing. Azure Cosmos DB is used extensively in Microsoft's own e-commerce platforms that run the Windows Store and Xbox Live. It is also used in the retail industry for storing catalog data and for event sourcing in order-processing pipelines. Catalog data storage scenarios involve storing and querying a set of attributes for entities such as people, places, and products. Some examples of catalog data are user accounts, product catalogs, IoT device registries, and bill-of-materials systems. Attributes for this data may vary and can change over time to fit application requirements. Consider the example of a product catalog for an automotive parts supplier: every part may have its own attributes in addition to the common attributes that all parts share, and furthermore, attributes for a specific part can change the following year when a new model is released. Azure Cosmos DB supports flexible schemas and hierarchical data, and thus it is well suited for storing product catalog data. Azure Cosmos DB is also often used for event sourcing, to power event-driven architectures using its change feed functionality: the change feed provides downstream microservices the ability to reliably and incrementally read inserts and updates made to an Azure Cosmos database. This functionality can be leveraged to provide a persistent event store as a message broker for state-changing events and to drive order-processing workflows between many microservices. In addition, data stored in Azure Cosmos DB can be integrated with HDInsight for big data analytics via Apache Spark
jobs. The third one is gaming: the database tier is a crucial component of gaming applications. Modern games perform graphical processing on mobile or console clients but rely on the cloud to deliver customized and personalized content like in-game stats, social media integration, and high-score leaderboards. Games often require single-millisecond latencies for reads and writes to provide an engaging in-game experience, and a game database needs to be fast and be able to handle massive spikes in request rates during new game launches and feature updates. Azure Cosmos DB is used by games like The Walking Dead: No Man's Land by Next Games, and Halo 5: Guardians. Azure Cosmos DB provides a number of benefits to game developers. Azure Cosmos DB allows performance to be scaled up or down elastically; this allows games to handle updating profiles and stats for dozens to millions of simultaneous gamers by making a single API call. Azure Cosmos DB supports millisecond reads and writes to help avoid any lags during gameplay. The Azure Cosmos DB automatic indexing allows for filtering against multiple different properties in real time, for example locating players by their internal player IDs or their Game Center, Facebook, or Google IDs, or querying based on player membership in a guild; this is possible without building complex indexing or sharding infrastructure. Social features, including in-game chat messages, player guild memberships, challenges completed, high-score leaderboards, and social graphs, are easier to implement with a flexible schema. And Azure Cosmos DB, as a managed platform as a service, requires minimal setup and management work, allowing rapid iteration and reduced time to market. The last one is web and mobile applications: Azure Cosmos DB is commonly used within mobile and web applications and is well suited for modeling social interactions, integrating with third-party services, and building rich personalized experiences; the Cosmos DB SDKs can be used to build rich iOS and Android applications using the popular Xamarin framework. Under web applications, first we have social applications and then we have personalization. In social applications, a common use for Azure Cosmos DB is storing and querying user-generated content (UGC) for web, mobile, and social media applications. Some examples of user-generated content are chat sessions, tweets, blog posts, ratings, and comments. Often, the UGC in social media applications is a blend of free-form text, properties, tags, and relationships that are not bounded by a rigid structure. Content such as chats, comments, and posts can be stored in Cosmos DB without requiring transformations or complex object-to-relational mapping layers, and data properties can be added or modified easily to match requirements as developers iterate over the application code, thus promoting rapid development. Applications that integrate with third-party social networks must respond to changing schemas from those networks; as data is automatically indexed by default in Cosmos DB, the data is ready to be queried at any time, hence these applications have the flexibility to retrieve projections per their respective needs. The second thing in web and mobile applications is personalization: nowadays, modern applications come with complex views and experiences. These are typically dynamic, catering to user preferences or moods and branding needs, hence applications need to be able to retrieve personalization settings effectively to render
UI elements and experiences quickly. JSON, a format supported by Cosmos DB, is an effective format to represent UI layout data, as it is not only lightweight but can also easily be interpreted by JavaScript. Cosmos DB offers tunable consistency levels that allow fast reads with low-latency writes; hence, storing UI layout data, including personalized settings, as JSON documents in Cosmos DB is an effective means to get this data across the wire. So these are the use cases for Cosmos DB. Now that you have a theoretical understanding of Azure database services, let's see a simple deployment of a database service on Microsoft Azure. Simply type Microsoft Azure on Google; what you can do is create a free account on Microsoft Azure, where you get 12 months of free services and free credits (around 14,500 rupees if you are in India) for using the services. We can just directly open the console from here, so I'll sign in. For deploying a simple database service, what we are going to deploy today is a Cosmos database; I have explained to you what the Cosmos database, the NoSQL database, is. So just go to the portal, and you can create a resource from here; search for Cosmos. You can see here, as I have a free trial account, it is showing that I have 14,500 rupees of credit; this is the credit amount you get in a free trial. So you can create an Azure Cosmos DB from here, and choose which API you want to create, like Core (SQL). For the resource group, you can give a new one, or, as we have an existing Cosmos one, you can choose the existing one; or if you want to create a new one, create a new one. Then give any unique name for this, like demo_cosmos123... okay, it cannot contain an underscore, remember these things; so demo cosmos one-two-three-four... okay, that's not available, so I will add a five as well; yeah, it's available now. Then choose the location, whichever location you are located in or nearby; mine is Asia Pacific, Central India, so I've chosen this. The free trial discount is already there, already applied. Then you can just review and create; before it gets deployed it will show you the review. Remember that, based on the location we have selected, the creation time will differ. You can review all the information you have entered, and now you can create. The deployment is in progress; it will take a few minutes, and you can see how the resource is being created. You can check the details from here, or go to the portal again; it is already pinned here, or you can search for Azure Cosmos DB from here. Just click here, and it is showing that it is getting created; this is the one I created before, and this one is in progress, so let's refresh once. Okay, you can see the processing going on here... yeah, the deployment is complete. You can go to the resource from here; this is how it's been created. You can open the resource from here and go to the activity log, or the Data Explorer, where you can create a database and anything else, or you can see the default consistency and everything. So that's how the Cosmos database
has been deployed, and I hope you have understood this deployment. [Music] Why does one need Azure SQL Database? Azure SQL Database has become the preferred platform for the transition to the cloud, and there are at least five key reasons driving widespread adoption of the Azure cloud model. The first one is cost savings: the most obvious advantage of transitioning to Azure SQL Database is that you no longer need to invest heavily in high-priced on-site hardware; instead, the cost for the Azure database can be treated as an ongoing operating expense. This helps in budgeting as well as cash flow; also, you don't need to be concerned about replacement, depreciation, and other financial worries surrounding capital assets and expenditures. The second key reason is boosted scalability and performance: Azure SQL Database is a highly scalable and flexible model by design; unlike with on-site hardware, you can rapidly spin up additional instances to cover traffic spikes, seasonal flows, etc. Microsoft specially designed SQL Azure for cloud applications, giving it a performance edge over other database-as-a-service solutions in many factors. Also, Azure now fully supports both PostgreSQL and MySQL; hence, if there are pre-existing applications running on MySQL or PostgreSQL, and one wants to seamlessly preserve functionality in the cloud, then Azure can fully support the transition. Third is high security: the Azure SQL Database service, hosted behind a network firewall and other safeguards, is widely considered extremely secure for development environments as well as many production environments. The fourth reason is that time is on your side: traditionally, to host a SQL Server you require a lot of resources; for example, you need to buy and set up a physical server with the required hardware capacity, and once the server is set up, you need to ensure the required software is installed. Further, you also need to set up networking, handle failovers, do capacity planning, etc. With Azure SQL, you just need to deploy the service and everything is managed by Azure; hence you save not only monetary resources but also the time invested in setting things up, and you can have a solution ready with Azure SQL in a couple of minutes. The fifth and last reason is business continuity and disaster recovery: for those who would prefer to take incremental steps towards cloud adoption before moving the entire database to the cloud, Azure has two outstanding tools, ASR (Azure Site Recovery) and Azure Backup, that use the Azure cloud to fully protect your on-site installation from downtime and data loss. Now that you know why we need Azure SQL, let's briefly understand what it actually is. Azure SQL is a family of managed, secure, and intelligent products that use the SQL Server database engine in the Azure cloud. These products are, first, Azure SQL Database, which supports modern cloud applications on an intelligent, managed database service that includes serverless compute; second, Azure SQL Managed Instance, where you can modernize your existing SQL Server applications at scale with an intelligent, fully managed instance as a service, with almost 100% feature parity with the SQL Server database engine, best for most migrations to the cloud; and third, SQL Server on Azure Virtual Machines, where you can lift and shift your SQL Server workloads with ease and maintain 100% SQL Server compatibility and operating-system-level access. Azure SQL is built upon the familiar SQL Server engine, so you can migrate applications with ease and continue to use the tools, languages, and resources
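To make that "solution ready in a couple of minutes" point concrete, here is a hypothetical two-command provisioning of a logical server and a small database with the Azure CLI; the names, location, password, and the S0 service objective are placeholders:

    # create the logical server, then a small Standard-tier database on it
    az sql server create --name demo-sql-server --resource-group demo-rg --location centralindia --admin-user azureadmin --admin-password '<a-strong-password>'
    az sql db create --resource-group demo-rg --server demo-sql-server --name mydb --service-objective S0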
you are familiar with. Your skills and experience transfer to the cloud, so you can do even more with what you already have. Now that you have understood why we need Azure SQL and what it actually is, let's have a brief understanding of its architecture. There are four layers in the Azure SQL architecture, and we'll briefly understand each one of them. First is the client layer: to be able to access SQL Database, the client layer acts as an interface for applications. It includes SQL Server tools, Open Database Connectivity (ODBC), ADO.NET, and PHP (Hypertext Preprocessor). Tabular Data Stream (TDS) transfers data between applications and SQL databases and communicates with applications; hence ADO.NET and ODBC can connect to SQL without any additional requirements. The next layer in the architecture is the service layer, which sits between the platform and the client layers and acts as a gateway between the two, as you can see in the diagram. Provisioning, billing, and routing of connections come under this layer: it validates Microsoft Azure SQL Database requests and authenticates the user; it also establishes a connection between the client and a server and routes packets through this connection. Third is the platform layer: this layer has the systems that host the actual Azure SQL servers in the data center. Each SQL database is stored in one of the nodes and is replicated twice across two different physical servers; Azure SQL makes sure that consistent copies of the servers are kept within the Azure cloud, and it also ensures that the copies are synchronized when clients manipulate the data on them. The fourth and last layer is the infrastructure layer: this is the first layer from the bottom of the architecture and is responsible for the administration of the operating system and the physical hardware. So these were the layers in the architecture of Azure SQL. Now let's understand the different models and service tiers in Azure SQL. First are the deployment models: Azure SQL provides the following deployment options for a database. The first one is the managed instance: this is primarily targeted towards on-premises customers; in case you already have a SQL Server instance in your on-premises data center and you want to migrate it into Azure with minimum changes to your application and maximum compatibility, then you go for the managed instance. Second is the single database, which represents a fully managed, isolated database; you might use this option if you have modern cloud applications and microservices that need a single reliable data source. A single database is similar to a contained database in the SQL Server database engine. The third deployment model is the elastic pool, which is a collection of single databases with a shared set of resources, such as CPU or memory; single databases can be moved into and out of an elastic pool. The next kind of models are the purchasing models: SQL Database offers three types of purchasing models. The first one is the vCore-based purchasing model, which is new, and it offers a totally different approach to sizing your database. It is easier to translate on-premises workloads to a vCore-based model because the components are what we are used to: the vCore-based model lets you choose the number of vCores, the amount of memory, and the amount and speed of storage. The vCore-based purchasing model also allows you to use Azure Hybrid Benefit for SQL Server to gain cost savings. The second one is the DTU-based purchasing model, which offers a blend of compute, memory, and input/output resources in three service tiers, to
support light to heavy database workloads. Compute sizes within each tier provide a different mix of these resources, to which you can add additional storage resources. As you can see from the diagram, the DTU model offers a pre-configured and predefined amount of compute resources, while vCore is all about independent scalability, where you can tune a specific area such as the CPU core count and memory resources, something you cannot control at the same granular level when using the DTU-based model. The third purchasing model is the serverless model, which automatically scales compute based on workload demand and bills for the amount of compute used per second. The serverless compute tier also automatically pauses databases during inactive periods, when only storage is billed, and automatically resumes databases when activity returns. Now let's see the service tiers for Azure SQL Database. The first one is the general purpose or standard model: it is based on a separation of compute and storage. This architectural model depends on the high availability and reliability of Azure premium storage, which transparently copies database files and guarantees zero data loss if an underlying infrastructure failure happens. Second is the business critical or premium service tier model: it is based on a cluster of database engine processes; both the SQL database engine process and the underlying MDF and LDF files are placed on the same node with locally attached SSD storage, providing low latency to the workload. High availability is implemented using technologies similar to SQL Server Always On availability groups. Third is the hyperscale service tier model: it is the newest service tier in the vCore-based purchasing model. This tier is a highly scalable storage and compute performance tier that leverages the Azure architecture to scale out the storage for an Azure SQL database beyond the limits available for the general purpose and business critical service tiers. A command-line sketch of these purchasing options follows below.
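Here is a rough sketch of how the serverless option surfaces in the CLI; a hypothetical az sql db create call for a general-purpose serverless database that auto-pauses after an hour of inactivity (all names and capacity values are placeholders; check az sql db create --help on your CLI version):

    # serverless, general-purpose, Gen5 hardware, max 2 vCores, auto-pause after 60 minutes
    az sql db create --resource-group demo-rg --server demo-sql-server --name mydb-serverless --edition GeneralPurpose --compute-model Serverless --family Gen5 --capacity 2 --auto-pause-delay 60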
The second feature is availability capabilities: Azure SQL Database enables your business to continue operating during disruptions. In a traditional SQL Server environment you generally have at least two machines set up locally, holding exact, synchronously maintained copies of the data to protect against the failure of a single machine or component. That environment provides high availability, but it doesn't protect against a natural disaster destroying your data center. Third is built-in intelligence: with Azure SQL Database you get built-in intelligence that helps you dramatically reduce the cost of running and managing databases and that maximizes both the performance and the security of your application. Running millions of customer workloads around the clock, SQL Database collects and processes a massive amount of telemetry data, while fully respecting customer privacy; various algorithms continuously evaluate that telemetry so the service can learn and adapt to your application. The fourth feature is advanced security and compliance: SQL Database provides a range of built-in security and compliance features to help your application meet various requirements, including advanced threat protection, auditing for compliance and security, data encryption, data discovery and classification, and also Azure Active Directory integration and multi-factor authentication. The fifth feature is easy-to-use tools: SQL Database makes building and maintaining applications easier and more productive, allowing you to focus on what you do best, building great apps. You can manage and develop a SQL database using tools and skills you already have: SQL Database supports building applications with Python, Java, Node.js, PHP, Ruby, and .NET on macOS, Linux, and Windows, and it supports the same connection libraries as SQL Server. Now let's understand the server in Azure SQL Database. In Azure SQL Database, a server is a logical construct that acts as a central administrative point for a collection of databases. At the server level you can administer logins, firewall rules, auditing rules, threat detection policies, and auto-failover groups. A server can be in a different region than its resource group, and a server must exist before you can create a database in Azure SQL Database or a data warehouse database in Azure Synapse Analytics; all databases managed by a single server are created within the same region as the server. Remember that an Azure SQL Database server is distinct from a SQL Server instance, such as SQL Server on a virtual machine, which we will discuss later in this video. The next thing to understand is the single database in Azure SQL. The single database resource type creates a database in Azure SQL Database with its own set of resources and is managed via a server, which we just talked about. With a single database, each database is isolated and portable; each has its own service tier under the DTU-based or vCore-based purchasing model and a guaranteed compute size. Some of its features are, first, dynamic scalability: you can build your first app on a small single database at low cost, in the serverless compute tier or at a small compute size in the provisioned compute tier, and then change the compute or service tier manually or programmatically at any time to meet the needs of your solution.
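If you'd rather script that first small database than click through the portal, creating a single database with an explicit service objective is one statement run against the logical server's master database; the name and sizing below are assumptions for this sketch:

    -- Run while connected to the master database of the logical server.
    CREATE DATABASE demo_single_db
    (
        EDITION = 'GeneralPurpose',      -- vCore-based General Purpose tier
        SERVICE_OBJECTIVE = 'GP_Gen5_2', -- Gen5 hardware, 2 vCores
        MAXSIZE = 32 GB
    );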
Second is monitoring and alerting: you use the built-in performance monitoring and alerting tools together with the performance ratings, and with these tools you can quickly assess the impact of scaling up or down based on your current or projected performance needs. Additionally, SQL Database can emit metrics and resource logs for easier monitoring. Third is security: SQL Database provides a range of built-in security and compliance features to help your application meet various security and compliance requirements, and Azure SQL Database has been certified against a number of compliance standards as well. Next is the Azure SQL Database elastic pool. An elastic pool helps you manage and scale multiple SQL databases: elastic pools are a simple, cost-effective solution for managing and scaling multiple databases that have varying and unpredictable usage demands. The databases in an elastic pool sit on a single server and share a set number of resources at a set price. Elastic pools in Azure SQL Database enable software-as-a-service developers to optimize the price-performance of a group of databases within a prescribed budget, while still delivering performance elasticity for each database. SaaS developers build applications on top of large-scale data tiers consisting of multiple databases; a common pattern is to provision a single database for each customer, but different customers often have varying and unpredictable usage patterns, and it's difficult to predict the resource requirements of each individual database. Traditionally you had two options: over-provision resources based on peak usage and overpay, or under-provision to save cost at the expense of performance and customer satisfaction during peaks. Elastic pools solve this problem by ensuring that databases get the performance resources they need, when they need them, through a simple resource allocation mechanism within a predictable budget. So when should we go for an elastic pool? Pools are well suited to a large number of databases with a specific utilization pattern: for a given database, the pattern is low average utilization with relatively infrequent spikes. Conversely, multiple databases with persistent medium-to-high utilization should not be placed in the same elastic pool. The more databases you can add to a pool, the greater your savings become, and depending on your application's utilization pattern it's possible to see savings with even a few S2 or S3 databases. So let's assess database utilization patterns. This figure shows an example of a database that spends much of its time idle but periodically spikes with activity; this is the kind of utilization pattern that is suited to a pool. The chart illustrates DTU usage over a one-hour period from 12:00 to 1:00, where each data point has one-minute granularity. At 12:10 you can see that DB1 (database 1) peaks up to 90 DTUs, but its overall average usage is less than 5 DTUs. An S3 compute size is required to run this workload in a single database, but that leaves most of the resources unused during the periods of low activity; a pool allows those unused DTUs to be shared across multiple databases, and so reduces the DTUs needed and the overall cost.
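Moving a database in and out of a pool is also scriptable; a minimal sketch, assuming a pool named demo_pool already exists on the same logical server (both names are made up for illustration):

    -- Move an existing database into the pool, so it draws from the shared eDTUs.
    ALTER DATABASE demo_db
    MODIFY ( SERVICE_OBJECTIVE = ELASTIC_POOL ( name = demo_pool ) );

    -- Move it back out to a standalone S3 compute size if the pattern changes.
    ALTER DATABASE demo_db
    MODIFY ( SERVICE_OBJECTIVE = 'S3' );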
Building on the previous example, suppose there are additional databases with utilization patterns similar to DB1. In these two figures, the utilization of four databases and then of twenty databases is layered to illustrate the non-overlapping nature of their utilization over time under the DTU-based purchasing model. The aggregate DTU utilization across all twenty databases is illustrated by the black line in the figure: it shows that the aggregate utilization never exceeds 100 DTUs, which means the twenty databases can share 100 eDTUs over this time period. This results in a 20-times reduction in DTUs and roughly a 13-times price reduction compared to placing each of the databases in an S3 compute size as a single database. So that is the whole concept of the elastic pool, and with that we have a conceptual understanding of Azure SQL Database. Now that you have the theoretical understanding, let's try our hands on the Azure portal, deploy an Azure SQL database, and see how it works. You can go directly to the Azure portal, then go to SQL databases and create a SQL database. First create a resource group; I will create a new one, and the resource group is created. Then I give the database name. Next we have to create a server, because, as I explained, a SQL database needs a SQL server, so I create a new server: I give the server name, then an admin login, which I will set as azuredbadmin, then the password (and confirm it), and then I select the region, which for me is Asia Pacific, Central India.
Now, if you want to go for an elastic pool you can choose that, or if you want a single database only, you can choose that instead, and you can configure it from here: how many cores you want, what storage you want, everything is given. You can see the different purchasing models as well as the service tiers within them: the vCore-based purchasing model with its general purpose and business critical service tiers, and the DTU-based purchasing model with its basic, standard, and premium tiers. I can choose the basic one, which will be enough for this demo, and just apply it. Or, if I want to go with an elastic pool, I can create a new elastic pool here; it's been created, and I can configure it in a similar manner, since the same purchasing models and service tiers are offered. For this one also I will go for basic, which will be enough, and you can see the price it's costing here; just apply, and it's done. Everything is okay, so I go to review and create, check the price once more, and create it. The deployment is in progress; it takes a minute or two at most. I can go to the resource group from here, the one we created, named Azure SQL resource; right now nothing shows in this resource group because the deployment is still in progress, so let's wait for a while. Yes, it's completed, so let's check: the server has been created, and we can open the server. Here, first we have to change the firewall settings, so we go to show firewall settings. It is asking us to add the client IP, and it shows the client address ending in .158; for this demo we widen the rule to cover the whole range from 0 to 255 and save it. Then you can go back to the overview, and if you go back to the resource group and refresh, you can see the deployment has completed: the database has been created, the elastic pool has been created, and it shows that your deployment is complete.
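The same server-level firewall rule can be managed with T-SQL instead of the portal; a sketch, assuming you can already reach the server's master database from an allowed address (the rule name and IP below are made up, and a single-address range is much tighter than the wide range used in the demo):

    -- Run in the master database of the logical server.
    EXECUTE sp_set_firewall_rule
        @name             = N'demo_client_rule',
        @start_ip_address = '203.0.113.158',
        @end_ip_address   = '203.0.113.158';

    -- Remove the rule again when it is no longer needed.
    -- EXECUTE sp_delete_firewall_rule @name = N'demo_client_rule';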
Now open the database, and you can see some pretty cool features on the left side. Under compute and storage you can reconfigure your elastic pool settings, the service tier, the purchasing model, and whatever storage you want to select. Then there are connection strings, which is a very useful feature: the ADO.NET string is given here, and if you are working with Java then JDBC is also there, plus ODBC and PHP, everything is here. There are replicas as well, and similarly a lot of other features for security, visualization, Power BI, and Power Apps, which you can use as per your need. The next thing to do is connect to SQL Server Management Studio. If you don't have it, just search for SQL Server Management Studio, open the download page, and download it; I have already downloaded it, so all I have to do is open the studio. We copy the server name from the overview page with copy-to-clipboard and paste it into the connection dialog. SQL Server authentication is what we will use: the login is the azuredbadmin login and the password we created for the server, so remember those credentials, don't forget them. We connect, and we're in. Now you can go to New Query and write the query right here: create a persons table with first name, last name, and age columns, insert values into the table, and after that select from it with a SELECT statement. I execute the query (there was a small typo on one line, in "persons", so I fix it and execute again), and you can see it has been executed: the table has been created with first name, last name, and age, and under Tables you can even see dbo.Persons. You can do the same with Azure Data Studio as well; you can download Azure Data Studio, and I already have it. Start a new connection from there: copy the server name from the portal again, paste it, select SQL login, and give the same username and password we created for the SQL server. Then select the database; mine was named sqldb1. It connects, and we can go to the Persons table we just created, see the query for it, and choose edit data: you can see the table here, with the row and the age of 26 that our code inserted, so it's directly connected. I hope you have understood this; this was all about Azure SQL Database.
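The exact statements aren't legible in the recording, but a minimal reconstruction of the table the demo builds and queries would look like this; the table name, the inserted values, and the age of 26 that appears on screen are taken from the narration, so treat them as approximate:

    -- Create the demo table described in the walkthrough.
    CREATE TABLE dbo.Persons
    (
        FirstName NVARCHAR(50),
        LastName  NVARCHAR(50),
        Age       INT
    );

    -- Insert a row, then read the table back.
    INSERT INTO dbo.Persons (FirstName, LastName, Age)
    VALUES (N'John', N'Doe', 26);

    SELECT * FROM dbo.Persons;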
So now let's move forward: next we are going to talk about the second major product of the Azure SQL family, and that is Azure SQL Managed Instance. First let's understand what Azure SQL Managed Instance is. Azure SQL Managed Instance is an intelligent, scalable cloud database service that combines the broadest SQL Server database engine compatibility with all the benefits of a fully managed, evergreen platform as a service. SQL Managed Instance has near-100-percent compatibility with the latest SQL Server database engine, provides a native virtual network implementation that addresses common security concerns, and offers a business model favorable to existing SQL Server customers. It allows existing SQL Server customers to lift and shift their on-premises applications to the cloud with minimal application and database changes, while at the same time preserving all platform-as-a-service capabilities, such as automatic patching and version updates, automated backups, and high availability, which drastically reduces management overhead and TCO. Now let's understand why Azure SQL Managed Instance is required; for that, we need to understand the difference between Azure SQL Database, which is platform as a service, and SQL Managed Instance. As you may know, there is one more service in the Azure SQL family, SQL Server on Azure Virtual Machines, where we create a virtual machine instance and install SQL Server on it, and there is a feature gap between that full SQL Server and Azure SQL Database. I'll give you one example: when you install SQL Server on a virtual machine, there is the SQL Server Agent, which helps in executing your jobs, but that same feature is not in Azure SQL Database. There are other features besides SQL Server Agent, such as linked servers, SQL Server auditing, Service Broker, and Database Mail, which are supported by SQL Managed Instance but not by Azure SQL Database. That's where SQL Managed Instance is required: if someone is migrating from SQL Server on a virtual machine, which is infrastructure as a service, to platform as a service, then first of all they want all the features that were there on SQL Server to still be available, and second, they want less maintenance overhead. Less maintenance overhead means they don't want to create a virtual machine and install SQL Server all over again; the number of steps should be minimized. That's where Azure SQL Managed Instance comes into play. It has some cool features: it has near-100-percent compatibility with the latest SQL Server Enterprise Edition, so whenever you're using it you get the latest SQL Server capabilities, and other than that it has backup and high availability as well.
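One of those managed-instance-only features, linked servers, is worth seeing in T-SQL; a sketch, assuming a managed instance that can reach a remote SQL Server over the network (all of the names here are hypothetical):

    -- Register the remote server as a linked server on the managed instance.
    EXEC sp_addlinkedserver
        @server     = N'RemoteSql',
        @srvproduct = N'',
        @provider   = N'MSOLEDBSQL',
        @datasrc    = N'remote-sql.example.com';

    -- Query the remote table with a four-part name, as if it were local.
    SELECT TOP (10) *
    FROM RemoteSql.SalesDb.dbo.Customers;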
With the virtual network (VNet) implementation, it addresses common security concerns and grants network administrators full control over access and configuration using firewalls and security groups. From a security standpoint, this keeps the instance isolated: whatever you create through managed instance is deployed inside a virtual network, the SQL engine runs there, and only the apps in your private network can access your managed instance. So it is very secure because it is isolated, whereas Azure SQL Database, being a public service, can be accessed in multiple ways. That's where Azure SQL Managed Instance plays a great role, especially for migration. Next are the key features of Azure SQL Managed Instance. Managed instance provides several features that make this deployment option stand out from the single database or elastic pool deployments. These include, first, linked servers, which enable you to operate a distributed database: they let you read data from remote sources and execute commands against those sources from your database instance. For example, you can execute T-SQL statements that include tables from an outside SQL server or database, and you can configure many OLE DB data sources as linked servers, including Azure Cosmos DB, Microsoft Access, and Excel. Additionally, you can use linked servers to implement sharding without direct loading or the use of custom application code. The second feature is Service Broker, which provides native support for asynchronous messaging and queuing, so you can use it to build distributed applications and enable communication between separate databases. With Service Broker you can easily distribute workloads across databases, for example separating process-intensive back-end tasks from front-end processes; the feature manages the communication paths for you, letting you focus on development and testing without sacrificing data consistency. Third is Database Mail, a feature that enables you to send email messages from your Azure SQL managed instance: with it you can send query results, notify users of completed processes, and attach files from any resource on your network.
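As a small illustration of Database Mail, here is a sketch of sending a notification with query results attached; it assumes a mail profile named demo_profile has already been configured on the instance, and the recipient address, message, and query are all made up:

    -- Database Mail lives in the msdb system database.
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = N'demo_profile',
        @recipients   = N'ops@example.com',
        @subject      = N'Nightly load finished',
        @body         = N'The load completed; the row count is in the attached results.',
        @query        = N'SELECT COUNT(*) AS rows_loaded FROM dbo.Persons;';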
Now let's see the similarities between Azure SQL Database and Azure SQL Managed Instance. On management, both preserve all platform-as-a-service capabilities, which drastically reduces management overhead and total cost of ownership (TCO). On backups, both support automatic backups: full backups are taken every seven days, differential backups every twelve hours, and log backups every five to ten minutes, with backup retention of seven days by default and thirty-five days maximum. Third is availability: for Azure SQL Database, 99.99 to 99.995 percent availability is guaranteed for every database, and likewise for Azure SQL Managed Instance 99.99 percent availability is guaranteed for every database; in both cases this cannot be managed by the user. Then there is host accessibility: with both there is no direct control over the underlying compute, since both are fully managed SQL Server database engines based on the latest stable Enterprise Edition of SQL Server, deployed on standardized hardware and software that is owned, hosted, and maintained by Microsoft. Lastly, one more similarity is licensing: both use a license model with pay-as-you-go. Now let's look at the key differences between Azure SQL Database and Azure SQL Managed Instance. On the recovery model: Azure SQL Database can recover from automated backups only, whereas managed instance can recover both from automated backups and from full backups placed on Azure Blob storage. Second is active geo-replication: in SQL Database it is supported in all service tiers other than hyperscale, while in SQL Managed Instance it is not supported; the alternative solution is auto-failover groups. Next is autoscale: SQL Database supports it, but only in the serverless model, whereas in managed instance it's not supported and you need to choose reserved compute and storage. Next is automatic tuning, which is supported in SQL Database but not in SQL Managed Instance. Next is elastic jobs, which are supported in SQL Database but not in SQL Managed Instance, where SQL Server Agent can be used instead. Next is long-term backup retention, which is supported in SQL Database and keeps automatically taken backups for up to ten years; in managed instance it is not supported yet, but manual backups can work as a temporary substitute. Then there is the hyperscale architecture, which is supported in SQL Database but not in managed instance. Then there is SQL Server Profiler, which is not supported in SQL Database but is supported in managed instance; cross-database transactions, which are not supported in SQL Database but are supported in managed instance; and lastly Database Mail, which is not supported in SQL Database but is supported in SQL Managed Instance. With this we come to the end of the theory for the second major product, managed instance, so now let's try our hands on it by deploying it on the Azure portal and connecting through it. Let's go directly to the Azure portal. Here we go to resource groups; we created one earlier, and that is our resource group. Before the managed instance itself we will create a virtual network, as we went through in the explanation; I hope you remember that when I was talking about why we need a managed instance, I talked about the virtual network as well. So create the virtual network: the resource group and the name have to be given, everything is accepted, and you can create it. The deployment is in progress; it won't take much time, and the deployment is successful. Now we go back to our resource group, the Azure SQL resource group, and you can see the virtual network has been created. The next thing is to create the managed instance itself, so search for Azure SQL Managed Instance and create it: the resource group is selected, you give the managed instance a name, and for the virtual network I select the one I just created with the matching name; then we can configure it from here.
First I select the region, which is again Central India, and then I come to the compute configuration for the managed instance: the general purpose tier is shown, and I lower the storage and create it with 64 GB, since everything else is fine, and apply. Then I set the admin login, something like azuremiadmin, and a password of at least sixteen characters, remember that. Then review and create, and the deployment is in progress. Remember, though, that a managed instance takes at least two to three hours to get deployed, so we are going to have to wait for it; keep that in mind. So this was all about the deployment of Azure SQL Managed Instance. Now let's move ahead to a brief understanding of the third major product in the Azure SQL family, and that is SQL Server on Azure Virtual Machines. SQL Server on Azure Virtual Machines enables you to use full versions of SQL Server in the cloud without having to manage any on-premises hardware, and SQL Server virtual machines also simplify licensing costs when you pay as you go. Azure virtual machines run in many different geographic regions around the world and offer a variety of machine sizes, and the virtual machine image gallery allows you to create a SQL Server virtual machine with the right version, edition, and operating system; this makes virtual machines a good option for many different SQL Server workloads. Now the question arises: when should you use SQL Server on an Azure virtual machine? As we saw earlier, it is useful when you want to migrate your existing databases to the Azure cloud without doing much work. Another reason to select it is when you want to use your existing SQL Server licenses and keep control of your database and virtual machine. A third factor can be cost: cost can be a major reason if you run highly available systems with high traffic, where SQL Server on an Azure virtual machine can save you a significant amount compared to other options; Azure reservations can save up to 70 percent of your cost on SQL Server. Now let's look at the key features of SQL Server on an Azure virtual machine. The first is automated updates: SQL Server on Azure virtual machines can use automated patching to schedule a maintenance window for installing important Windows and SQL Server updates automatically. The second is automated backups: SQL Server on Azure virtual machines can take advantage of automated backup, which regularly creates backups of your database to blob storage, and you can also use this technique manually. Azure also offers an enterprise-class, fully managed backup solution for SQL Server running in Azure virtual machines; it supports Always On availability groups, long-term retention, point-in-time recovery, and central management and monitoring. The third feature is high availability: if you require high availability, consider configuring SQL Server availability groups; this involves multiple instances of SQL Server on Azure virtual machines in a virtual network, and you can configure your high-availability solution manually or use templates in the Azure portal for automatic configuration. The fourth key feature is performance: Azure virtual machines offer different machine sizes to meet various workload demands, and SQL Server virtual machines also provide automated storage configuration, optimized for your performance requirements. So now that you have a theoretical understanding of SQL Server on an Azure virtual machine, let's see how it works practically.
You can go directly to the Azure portal, then go to resource groups and create a resource group; say the new resource group is azure-server, and select your region as well, mine being Central India. After that, review and create, and the resource group is created. Then go into the resource group, click create, and just search for SQL Server 2019 on Windows Server 2019. We are going to select that image, and within it there are sub-options where you can pick the free one: I'm selecting the free SQL Server license, SQL Server 2019 Developer on Windows Server 2019. There are different plans, usage information, and support details; leave them at the defaults and just create it. Now you have to create the virtual machine. Give it a name, I'll name it sqlvm1, and select the region, Central India for me; everything else stays at the defaults, no extra infrastructure is required, and the image is fine. Then I select the size: there are different sizes you can pick from, but I'm going with a standard one, which will be enough for me. Then I give the username, which will be azureserveradmin, and a password, and do not forget this username and password, as we will need them when connecting through SSMS. For public inbound ports, RDP is selected already, so we move to the next step, disks, where we select standard SSD, which is better than HDD as it's faster. Then networking: our virtual network is the default one, so everything remains default; in management all the options also remain default, and the same in advanced. Then go directly to the SQL Server settings: here you can see the SQL connectivity option and the port, and under authentication you can enable SQL authentication; the login and password you set here will stay, because we are going to use them for connecting with SSMS. Azure Key Vault integration and the rest are fine as they are, so finally go to review and create, and create it from here. This deployment is going to take around twenty to twenty-five minutes, so we aren't going to wait for it; you can download the deployment details from here and look through them. What I did is create a virtual machine beforehand, in a different resource group, just for this demo, and I'll show you that one. When it was created, the VM sqlvm1 and nine resources in total were created, so you can just go to sqlvm1. This virtual machine is actually stopped right now, so I'll start it; note that when you first create a virtual machine it will automatically be running, and this one is stopped only because I stopped it. It's getting started, and you can see the virtual machine has started; after a refresh you can see the public IP address and everything right here. Now just go back to the resource group and you can see all the resources: here we have the IP address and the size as well, which is a Standard DS1v2 with one virtual CPU and 3.5 GB of memory. We will connect to the database engine through this IP later on.
Right now, we can just connect through RDP: download the RDP file, open it, connect, and put in the username and password you assigned, so I enter azureserveradmin and my password and connect. As you can see, the server is launched, so we can go and launch SSMS, Microsoft SQL Server Management Studio; it takes a little time when you launch it for the first time. Now you give the server name here, and for authentication you can use Windows authentication or SQL Server authentication to connect to the SQL Server; right now I'm using Windows authentication, and I just connect. You can see it got connected: this is the version, these are the system databases, and under security you can also see the logins, including the sqlvm1 accounts and the azureserveradmin login we chose. Let's check the version: open a new query, type a SELECT of @@VERSION, and execute it, and it shows the Microsoft SQL Server version string. After this you can come back to the portal and go to the resource group. Here you can see the total of nine resources that were created along with the virtual machine: the virtual network, the virtual machine, the SQL virtual machine resource, the public IP address, the network security group, the network interface, and three disks. You can go to the virtual network, which is connected through a subnet; there is a default subnet, and 250 IPs are available. Go back to the resources and open the network security group, sqlvm1-nsg: here you can see that you can connect through RDP on port 3389, and for connecting to SQL from your local system you can use port 1433. Coming back, you can see we have the resource group Azure SQL resource, which I had already made, and the azure-server resource group, which should now be updated; the deployment we left running is completed, as you can see. As I've already explained the whole process, suppose you don't want to keep your server running continuously: you can stop it. Go to your resource, go to your virtual machine, and, just so your bill doesn't run high and you keep your billing under control, stop the virtual machine whenever you are not using it. Just click on stop; you can see the virtual machine is stopping, and I'll stop the other one as well, because otherwise it will keep on billing and may cost a lot. Let's check the last one: it's stopped already, "successfully stopped virtual machine", so you can see the virtual machine is stopped. That's the process, and I hope you have understood SQL Server on an Azure virtual machine.
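For reference, the version check run in SSMS a moment ago is just a one-liner; the SERVERPROPERTY column is an extra I'm adding for convenience, not something shown on screen:

    -- Confirm the engine version and edition of the VM-hosted SQL Server.
    SELECT @@VERSION                 AS version_string,
           SERVERPROPERTY('Edition') AS edition;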
Now let's look at some of the use cases for SQL Server on Azure. The first one is developer or test environments: an important use case for replicating or migrating data to SQL hosted on Azure is developer and test environments. Before deploying to the production environment, it is pertinent that the data is tested against developer or test environments, and Azure SQL databases can act as a target for just such environments; the live production environment can be replicated to the developer or test environment using a database copy. Second is business continuity: one of the most important use cases for SQL on Azure is using it as a disaster recovery target to maintain business continuity. Azure SQL Database can provide an SLA of up to 99.99 percent by maintaining several copies of the data; this provides business continuity, as it allows you to restore geo-redundant copies of the data or use active geo-replicated copies as failover points in case of outages at data centers or in entire regions. Besides Azure SQL Database, you can also use availability groups to fulfill business continuity demands: not only can you use availability groups in Azure SQL virtual machines, you can also use Azure SQL virtual machine instances as a target for high availability and disaster recovery. Third is scaling out read-only workloads: apart from providing BC or DR capabilities, active geo-replication can also be used to offload read-only workloads, such as reporting jobs, to secondary copies, and you can also extend on-premises SQL Server instances using readable Always On replicas. Fourth is backup and restore: Azure SQL Database is backed up automatically on a regular basis, and there are no storage costs for backups up to 200 percent of the maximum provisioned database storage. You can restore backups to any point in time going back through the retention period, which is determined by the Azure SQL service tier in use. On-premises SQL Server databases and transaction logs can also be backed up directly to Azure, using the backup-to-URL feature, and stored in Azure Storage, and Azure SQL databases can also be stored on local storage by exporting them to BACPAC files. Fifth is advanced analytics: another important reason for hosting SQL in Azure is to make use of Azure's advanced analytics platforms, such as Azure Blob storage and Azure Data Lake Store. A common scenario in advanced analytics is when users reference data from various data sources: you can use Azure Data Lake Store as the staging area, perform transformation activities using tools such as Spark, and finally load the data into Azure SQL Data Warehouse for BI and reporting (BI meaning business intelligence).
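The backup-to-URL feature mentioned above is plain T-SQL on the SQL Server side; a sketch, assuming a SQL Server credential for the target container (for example a SAS-based credential) has already been created, and with made-up storage account and container names:

    -- Back the database up directly into an Azure Blob storage container.
    BACKUP DATABASE demo_db
    TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/demo_db.bak'
    WITH COMPRESSION, STATS = 10;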
[Music] So why do we need Azure Data Factory? First and foremost, we need to understand that the amount of data being generated these days is huge, and this data comes from different sources. When we move this data to the cloud, there are quite a few things that need to be taken care of. The data can come in any form, because we are talking about different sources, and these sources will transfer or channel the data in different ways and in different formats. So when you decide to bring this data to the cloud, or to one particular storage place, you need to make sure it is well and truly managed: you need to transform this data and get rid of the unnecessary parts. That is the after part, the processing part, but as far as moving the data is concerned, you need to be able to pick it up from different sources, bring it to one common place, store it, and, if required, transform it into something more meaningful. Now, this is something that can be done using a traditional data warehouse as well, but there are certain disadvantages: at times you're forced to build custom applications that deal with each of these processes individually, which is time consuming, and integrating all these sources can be a huge pain. How do we solve this problem? If there were a way to automate this process, or to create proper workflows, this burden would be taken care of, and this is where Data Factory steps in. It helps you automate this complete process, or rather, instead of saying automate, I would say it helps you orchestrate the process in a more manageable, organized manner. That is what is needed, and that is why we need something called Data Factory, which lets you automate all these processes. So what is Azure Data Factory? If I go by its definition, it is a cloud-based data integration service which lets you do quite a few operations: create data-driven workflows, basically, and orchestrate all the sources that are there. What exactly does it do? To name a few common operations, it helps you create pipelines, which you can think of as logical pipelines that support the various processes that happen; it lets you ingest data from different sources and then process it. When we talk about processing, it also supports analytics, which is very important these days, because the data being generated can help you take good business decisions, and this is where analytics comes into the picture. What Data Factory does is let you transform this data and make it ready for something like a data lake to use. A data lake is something that lets you use various analytical tools or methods: you have things like Azure HDInsight (Hadoop, Spark) and Azure Data Lake Analytics. As we move further I'll be talking about these terms to some extent, but for now just understand that these are platforms or tools which let you do various analytical operations. So that is what Data Factory lets you do: it lets you take in all the data, arrange it in a particular manner or order, and then supply it onward for processing and the other things that can be done with it. If we talk about the particular steps, what does it do exactly? First and foremost, it helps you connect and collect data: when I say connect, I'm talking about connecting to the various disparate sources from which the data can come; you connect to all of those sources using your Data Factory, and once you have the data you collect it and store it at a central place, say your data warehouse. Then comes the process of transforming and enriching it: by transforming I mean running various patterns on it, creating schemas, and all those things.
You can then actually go ahead and publish this data. We all know that Microsoft Azure supports various other tools as well: there is a popular data visualization tool called Power BI, which is very good when you talk about data integration and data visualization capabilities. You can connect Power BI to your Microsoft Azure and publish this data, that is, create various dashboards and all those things, which can be very insightful from a business intelligence perspective. And then you can monitor your data, meaning you can take a look at all the data you have and analyze it in real time as well. So these are the processes we are talking about: connect and collect the data, transform and enrich it, publish it, and monitor it. That is what a Data Factory lets you do. Now for the various concepts that surround this term; we have quite a few to discuss, namely pipelines, datasets, activities, and linked services, so let us understand them one by one. First, the pipeline. Think of it as a proper pipeline, something which acts as a carrier in which various processes take place, and each individual process in it is an activity. If you look at the slide, activities represent a processing step in a pipeline, which means your pipeline can have one or multiple activities. An activity can be anything: a single process like querying a particular dataset, or something like moving data from one source to another. Then you have datasets. Datasets are simply sources of data: say my data is stored at some location, and that location is my dataset; in simple words, it is a data structure that holds my data. Linked services are the connection information: say, for example, I need to move my data from a particular database to blob storage; there needs to be something that connects these two, some information that lets the database understand that it has to move the data to a particular data store. That is what a linked service is: information that tells the Data Factory that it needs to connect to these particular sources. So these are the concepts that form the central part of a Data Factory. Next, the difference between a data lake and a data warehouse. Why did I bring this topic up? Well, the data warehouse is the traditional approach to storing data and it is still widely used, so why do we need a data lake, and why am I comparing the two terms? Quite a few people confuse them, as in, what exactly is the difference? They are quite similar to each other, but there are slight differences which are important, and I felt we should discuss them; that is why I put this slide here. First of all, you need to understand that a data lake is complementary to a data warehouse: if you have your data in a data lake, it can be stored in a data warehouse as well, but certain rules need to be followed. With a data warehouse you can again use your data lake to bring in data, but the main difference is that in a data lake the data is detailed or raw data, meaning it can be in any form; you just take the data and dump it into your data lake. That is not the case with a data warehouse, where the data is filtered, summarized, and refined.
Now you might wonder: if all that refinement is happening at your data warehouse, isn't it better? In some situations, yes, it definitely is, but as I've already mentioned so many times in this video, the amount of data generated these days is huge and it can come from any source, so the warehouse might not be the best option to deal with it. Why? Your data lake has something called schema-on-read, whereas your data warehouse has a schema-on-write approach. From the data warehouse perspective, where you have schema-on-write, the data is written to the warehouse in a structured form, in a particular schema, so when I read that data I have to read it in that schema only. But with a data lake, you just dump your data unstructured, so when you use it you are free to define a schema in a number of ways, as it suits your needs. This is where the benefit lies: the data does not follow a particular schema, and you can just pick a schema for it as you read, which is a plus point. One more point: your data warehouse basically works in SQL, that is, it uses SQL to query your data, but with a data lake, even though the data comes from different sources, it can be queried using one single language, U-SQL, and that actually helps you reduce various barriers, because you are dealing with different data sources but a single language lets you access the data. So these were some of the concepts I felt you all should know: what a Data Factory is and what a data lake is. Again, the reason I talked about the data lake is that it is an important part of data warehousing and data integration, and all these topics form a base when you talk about analytics and processing data, which is why I wanted you to know these terminologies. Now, since we started our session talking about Data Factory, let us move back to it and try to visualize the concepts we talked about, the pipelines, activities, and all those things; let me quickly switch to my Azure portal and we can see how it works. Okay, there you go: I've logged into my Microsoft Azure portal. For people who are completely new to the term Microsoft Azure, you should take a look at the other videos if possible; I've created quite a few videos in this series, and they talk about various Azure basics, such as how you should use it and what you need to do. One of those things is creating a free account, which is very important. We are going to use our free account or subscribed account: Microsoft Azure lets you have a free account, you just need to register for it, and quite a few resources are made available to you; some credit is also given to you, meaning you would be charged for the resources, but since you have that credit, the services are effectively free for you to use. If you already have an account, that is not a problem. We are just going to take a look at the demo part: as far as this demo goes, I'm going to create a database.
For that I need something called SSMS, so let us understand what that is first; if you do not have it, you need to install it. How do you do that? Just go to Google and type SSMS; the first link you get, you open it, and you have SQL Server Management Studio. This is something we will be needing: you just come here, click on it, and it starts the setup download. I guess it's getting downloaded twice; I don't need that, since I've already installed it on my system, I'm doing this for you people. What you need to do is download this file; as you can see, it is somewhere around 800 MB, so download it, and once you do, install it on your system. The installation can be a lengthy process: when I say lengthy, I mean time consuming, not difficult. All you need to do is double-click the file, it gives you an option to install, you say yes, and it takes care of the rest. This is what gives you your own server tooling on your local device, and we will be using it in the later part, so I suggest you come to this website, download it if you do not have it, and install it as quickly as possible; it might take some ten to fifteen minutes depending on your processor and internet speed. Once you're done with that, we switch back to our Microsoft Azure account. What I'm going to do now is create a database here. You go to your portal and log into your account (my internet is fairly slow today, so it might take some time). This is how your Microsoft portal looks: you have quite a few things to select and choose from; you can click on create a resource, which gives you plenty of options, or instead you can come here, look for databases, and I select a SQL data warehouse. It asks for a database name, so give it a proper name; demoDB321 is what I'll call it, and yes, the name is acceptable. My subscription is my Azure 533 one, and I will need to create a resource group. For people who do not know what a resource group is, think of it as a place where you put all your resources; it is a group of resources. Again, if you want to understand more about it, we have a session on it which you can take a look at on YouTube. I say create a new one and call it demoRG, and let me pick a blank database. I need to configure a server for it, so I say create a new one; let's name it demoserver, with some numbers added, and yes, it is acceptable. You need to create login credentials for when you log into the server, so let those be, say, admindemo, and for the password, let it be strong, with a number and a symbol. The location is South India; that does not matter a lot, you can choose whichever location suits you from the options there. I stick with what is there and I say select.
Yes, so I have a demo server as well. The performance level depends on your need, as in how fast you want it to work; as far as my demo is concerned, I do not need it to be very fast, so let me keep it at this value. Based on the location where you've created your resource, the prices are shown: this is something I created in South India, so the price for me is in INR, Indian rupees, and it is 223.88 per hour. All I do is apply, say pin to dashboard, and say create. The deployment might take a little longer than normal because, as I've already mentioned, it depends on the internet speed, and even with faster internet it can take a while, so I'll pause the video for a minute and get back to the demo once the deployment is done. So yes guys, I've gone ahead and done it: we have our demoDB already; it took a while, I actually went back and had a cup of coffee, that's how slow my internet is today. Let's move further and do the next thing. As I've already mentioned, you need to install SSMS, and once installed, this is how it looks. Since we've already created a database, we can log into it: you give your credentials here, that is, the server name I mentioned, demoserver1100, plus the extension, .database.windows.net, and the login name, admindemo. I can go to options here and choose which database to connect to, which is our demoDB321, and I say connect. It wants me to sign into my Azure account; I hate this each time it happens, because I've already logged in, but I should have stayed signed in and I did not, and that is costing me now. One more thing: at times you may be logging in from a client PC where you are not an administrator, and in that case you need to change the firewall rule to make your system accessible here. And there you go, I have signed in, so we have our information here: this is the server I logged into, and if I click on databases, the schema is there; we have our database, demoDB321. Now we can actually add a table to it. I'm not going to write a completely new table by hand; instead I'll run a simple query which gives me a table with a couple of columns. So what query should I put in, what table should I create? I'll just pick up a sample query from the Microsoft Azure website: it lets you create a table with a given name, and these are the column names the table will have, so I'm just going to copy this piece of code; if you want, you can go to the Microsoft Azure website and grab this code as well, I'll share the link, do not worry. I close this, come here, click on New Query, and simply paste in the code. I'll just make a small change to the table name to get rid of ambiguity, appending a 2, maybe a 3, and I will be inserting two records into it; that is the change I make here, insert into this table, and to run it I say execute. Okay, this should not take this long, but let's see.
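The script isn't shown legibly on screen, but judging by the two records it inserts, it is very close to the emp table from Microsoft's Data Factory quickstart; a reconstruction follows, with the caveat that the exact table name in the video was tweaked with a 2 or a 3, so treat the name as approximate:

    -- Source table for the copy activity, in the style of the quickstart script.
    CREATE TABLE dbo.emp3
    (
        ID        INT IDENTITY(1,1) NOT NULL,
        FirstName VARCHAR(50),
        LastName  VARCHAR(50)
    );

    -- The two records the demo inserts.
    INSERT INTO dbo.emp3 (FirstName, LastName) VALUES ('John', 'Doe');
    INSERT INTO dbo.emp3 (FirstName, LastName) VALUES ('Jane', 'Doe');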
Once the query completes, these are the two records that will be there — John Doe and Jane Doe. There you go, "one row affected" is what it says. So we have logged into SSMS, connected to our SQL Server, and created a table there, and the table has these two records. What I'm going to do now is switch back to my Azure account and create the Azure Data Factory. How do we do that? The first thing you need to understand is that in order to hold these resources, you need a storage account; this account is, more or less, what lets you have your storage entities on Microsoft Azure. So let us create that account first — it is very easy. You come to the portal, scroll down to Storage accounts, and click Add; this window appears, and you need to enter the details. What details should I give for my storage account? Let's call it "SAforDemo" — okay, it's not accepting this name; it needs to be in lower case — so "safordemo", and there you go, it's taken. Resource Manager is what I stick to, and I pick v2. The location is South India. Now, I've talked about your data getting replicated three times — three copies are maintained — and these are the types of replication you have; I'm selecting locally-redundant storage, which is the most affordable one. It is not the most resilient, but since this is a small demo I can use it. Performance should be Standard, and you'll see the blob access tier defaults to Hot. For the resource group, let's use the one that's already there — our resource group was demoRG, if I'm not wrong — yes, let's select it. I pin it to the dashboard and say Create, and my storage account gets created. This should happen quickly, though the deployment again might take a while. So to recap what we've done: we've created a database, connected to a server, put a table there, and now I've created a storage account. Since I have a storage account, I can create containers, and then use Data Factory to move data between the various sources. As you can see it has been deployed — do I want to go there? Yes I do. When you come here, you see a tab called Access keys; you can come here and copy some of the data. This is the name of your storage account, which you will be needing — and there are quite a few names to remember while going through this demo: the database name, the server name, your server admin credentials, and the name of your storage account, which is this one. As for the key, I'm not sure whether we'll need it or not, but if we do, I'd still suggest you copy this first key — Ctrl+A, Ctrl+C — and paste it somewhere safe.
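As an aside, the account name and access key we just copied are exactly what the client SDKs authenticate with. A minimal sketch, assuming the azure-storage-blob package and the demo's names, of creating the same kind of container from code rather than from the portal:

```python
# Hypothetical sketch: authenticate with the storage account name and key
# from the Access keys blade, then create a blob container.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://safordemo.blob.core.windows.net",
    credential="<storage-account-key>",  # the first key copied above
)
container = service.create_container("demonewblob")  # same as the + sign in the portal
print([c.name for c in service.list_containers()])   # confirm it exists
```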
Now, when you move back to the overview page, you have the Blobs option, where we're going to create a container. You can click on it, and as you can see there's no container here yet, so you click the plus sign and you get the option to create a container. So what do we call this container? Again you need to use lower case — "demonewblob", maybe, is what I'll name it — and I'll let the public access level stay the way it is, say OK, and my blob container is created. I won't be adding a destination folder or anything here, because when you move your data, a destination folder is created automatically; so let's leave it the way it is. So I have a blob container with me; now I need to create a data factory. How do we do that? Come here and search for "data factory" directly — no, this is not where I want to be, so I close it, click Analytics, and here I have the Data Factory option. I click it and create a new data factory. What should I call it — "demoDF"? Okay, that name's not available, which is surprising; let me just try a few names and come back quickly. This name is accepted now. Use an existing resource group — which one did we have? It was demoRG, if I'm not wrong; there you go. V2 is the latest version we have, and do we need to bother about this setting? No. These are the regions available in the option; we select one, pin it to the dashboard, and I say Create, and I have a data factory with me. This might take a while, so let me pause again — no, I don't need to do that; it's created. So I have a data factory, and now I need to come here and click on it, because I need to move my data. When I click on it, I'm given quite a few options to deal with; I need to copy my data, so I click on the Copy Data icon. There you go — I need to give the task a name, so let's call it "copy from DB to blob", and I go to Next. I need to select a SQL Server source here, so I come here and scroll down — the interface might vary between portal accounts; this is what it looks like on my system. SQL Server, here it is; I select it and say "create a new connection". You'll have to scroll right to the bottom — SQL Server, SQL Server... did I miss it somewhere? Yeah, here it is. You select it, say Continue, and you need to give your new linked service a name — say "SQLServerDemoLink", maybe. I also need to create an integration runtime, so I come here, click Create New, choose Public Network, and say Next; what name should I give it? Let's call it "demointegration", maybe, and I say Finish. As you can see, what we've done here is fill in the details of my source — the place from which I'm going to pull the data. Now I need to give my server credentials again — and I'm telling you, you need to write everything down, because you'll be reusing all these names; a best practice is to note them all. In my case it was something like demoserver0011, if I'm not wrong — sorry, 1100 — and the extension is .database.windows.net; there you go. The database name was demodb321, authentication is SQL authentication, the username was admindemo if I'm not wrong, I scroll down, enter the password, and say Finish. So my source details are already in: this is my source data store, the place from which I'll be moving my data.
So I go to Next and then enter the details about the destination — but before that, I need to select a table. If you remember, we ran a small SQL query earlier; I do not remember the name of the table — was it emp3? Let me just refresh to see whether the information is there. I scroll down and here it is, so I select it, and it loads a preview saying these are the records present — these are the records we entered — and I say Next. For my destination I come here and select a blob storage: Create New, Azure Blob Storage, and I say Continue. I need to enter details here as well; let me call it "AzureStorageDemoLink" — let me just do this again quickly. We need to select a storage account name too, because we created a storage account for the demo — and I forgot its name; it was "safordemo", if I'm not wrong. I've selected that and I say Finish, then click Next. This is the service I wanted to connect to, so I say Next. Now I have to enter the remaining details, for which I need the name of my blob container, which I think I've also forgotten, so let me get that too: it was "demonewblob", and the file name is emp3, if I'm not wrong. I say Next, Next again, and Next — and there you can see the data has been deployed. That is, I've actually connected my SQL Server and moved the data to my blob storage. You can then edit the pipeline as you want, or just monitor it; it depends on your needs, and you can do quite a few things with it — for example, move this data into Power BI and implement various other things on top of it. When we created our blob container, you could actually keep that window open, because when you deploy the data you can see all the details of what has just happened and how. So yes, we have moved our data from a database to our blob storage. I actually, accidentally, went past the step where we see the deployment, but do not worry — we can go back to the dashboard and take a look. I have the storage account here; inside it we have our blobs; this was the container we moved the file into, and this is the file we moved. So our deployment has succeeded — we've moved our data from a database to our blob storage, and as far as this demo goes, that was our aim. I hope I threw sufficient light on these concepts: Data Factory, how you create a data factory, and how you use it as a pipeline.
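To double-check the pipeline output outside the portal, here is a minimal sketch along the same lines — again assuming azure-storage-blob and the demo's names — that lists the container and pulls down the file ADF wrote:

```python
# Hypothetical sketch: verify the Copy Data activity's output by listing
# the demo container and downloading the emp3 file ADF created there.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://safordemo.blob.core.windows.net",
    credential="<storage-account-key>",
)
container = service.get_container_client("demonewblob")
for blob in container.list_blobs():               # should show the emp3 output file
    print(blob.name, blob.size)

data = container.download_blob("emp3").readall()  # fetch the copied rows
print(data.decode("utf-8"))                       # John Doe and Jane Doe
```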
[Music] Before knowing what Azure Databricks is, we must first know what Databricks itself is. According to the definition, Databricks is a web-based platform for working with Apache Spark that provides automated cluster management and IPython-style notebooks. So basically Databricks, developed by the creators of Apache Spark, is a web-based platform that is also a one-stop product for all data requirements, like storage and analysis. It was originally founded to provide an alternative to the MapReduce system and to offer a just-in-time, cloud-based platform for big data processing clients. It can derive insights using Spark SQL, provide active connections to visualization tools such as Power BI, QlikView and Tableau, and also build predictive models using Spark ML; Databricks can also create interactive displays combining text and code. So, in short, it is an alternative to a MapReduce system. Databricks is now integrated with Microsoft Azure, Google Cloud Platform and Amazon Web Services, making it easy for businesses to manage colossal amounts of data and carry out machine learning tasks. Since Databricks is integrated with all three cloud platforms, today we'll discuss one of them: Azure Databricks. So let us understand what Azure Databricks is. The Azure Databricks lakehouse is a platform that offers a uniform collection of tools for building, deploying, sharing and supporting enterprise-grade data solutions at scale. It integrates with the cloud storage and security in your cloud account, and manages and deploys cloud infrastructure on your behalf. Azure Databricks supports Python, Scala, R, Java and SQL, as well as data science frameworks and libraries including TensorFlow, PyTorch and scikit-learn. Now that we know what Azure Databricks is, let us understand why we should use it. Customers use Azure Databricks to process, store, clean, share, analyze, model and monetize their datasets, with solutions ranging from Power BI to machine learning; you can use the Azure Databricks platform to build many different applications spanning data personas. Customers who fully embrace the lakehouse take full advantage of the unified platform to build and deploy the data engineering workflows, machine learning models and analytics dashboards that power innovation and insight across an organization. The Azure Databricks workspace provides user interfaces for many core data tasks, including the tools we'll discuss one by one. First is the optimized Spark engine: simple data processing on auto-scaling infrastructure, powered by a highly optimized Apache Spark, for up to 50x performance gains. Next is the machine learning runtime: one-click access to pre-configured machine learning environments, for augmented machine learning with state-of-the-art and popular frameworks such as PyTorch, TensorFlow and scikit-learn. We also have MLflow, which is used to track and share experiments, reproduce runs, and manage models collaboratively from a central repository. Here you can use your preferred language — including Python, Scala, R, Spark SQL and .NET — whether you use serverless or provisioned compute resources, so that you can quickly access and explore data, find and share new insights, and build models collaboratively with the language and tools of your choice. In Azure Databricks you have enterprise-grade security: effortless, native security that protects your data where it lives, and creates compliant, private and isolated analytics workspaces across thousands of users and datasets. Apart from that, it is also production-ready, which means you can run and scale your most mission-critical data workloads with confidence on a trusted data platform, with ecosystem integrations for CI/CD and monitoring. You also have collaborative notebooks, through which you can quickly access and explore data, find and share new insights, and build models collaboratively with the languages and tools of your choice.
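Of the tools above, MLflow is the easiest to show in code. A minimal sketch of the tracking flow, assuming the mlflow package; on Azure Databricks a tracking server is already hosted for you, so runs logged like this appear in the workspace UI:

```python
# Hypothetical sketch of MLflow experiment tracking: parameters and
# metrics logged inside a run are stored centrally and can be compared
# across runs later.
import mlflow

with mlflow.start_run(run_name="demo-run"):
    mlflow.log_param("model_type", "logistic_regression")  # a hyperparameter
    mlflow.log_metric("accuracy", 0.92)                    # an evaluation result
    mlflow.log_metric("accuracy", 0.94, step=2)            # metrics can be tracked per step
```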
And most importantly, it has Delta Lake, which brings data reliability and scalability to your existing data lake, with an open-source transactional storage layer designed for the full data life cycle. It has native integration with Azure services, meaning you can complete your end-to-end analytics and machine learning solution with deep integration with Azure services such as Azure Data Factory, Azure Data Lake Storage, Azure Machine Learning and Power BI. Last but not least, it has an interactive workspace, enabling seamless collaboration between data scientists, data engineers and business analysts. So these were the key features that make Azure Databricks unique. Till now we have got an idea of Databricks and Azure Databricks, and why we use it; now let us dive deeper by understanding how Azure Databricks actually works. As I said, Azure Databricks is structured to enable secure cross-functional team collaboration, while keeping a significant amount of backend services managed by Azure Databricks itself, so that you can stay focused on your data science, data analytics and data engineering tasks. It operates out of a control plane and a data plane, although architectures can vary depending on custom configuration — such as when you have deployed an Azure Databricks workspace into your own virtual network, which is also known as VNet injection. Now let us consider this architecture; it is the common structure and data flow of Azure Databricks, and it consists of a control plane and a data plane. So what is the control plane? The control plane includes the backend services that Azure Databricks manages in its own Azure account; notebook commands and many other workspace configurations are stored in the control plane and encrypted at rest. The data plane, on the other hand, is managed by your Azure account, and it is where your data resides — and also where data is processed. You can use Azure Databricks connectors so that your clusters can connect to external data sources outside your Azure account to ingest data, or for storage; you can also ingest data from external streaming sources, such as event data, streaming data, IoT data and more. Your data is stored at rest in your Azure account in the data plane and in your own data sources, not in the control plane, so you maintain control and ownership of your data. Job results likewise reside in storage in your own account. Interactive notebook results are stored in a combination of the control plane (partial results, for presentation in the UI) and your Azure storage; if you want interactive notebook results stored only in your cloud account storage, you can ask a Databricks representative to enable interactive notebook results in the customer account for your workspace. Note that some metadata about results, such as chart column names, continues to be stored in the control plane itself. So this is the basic architecture of Azure Databricks. To understand it in a more simplified manner, let us have a quick hands-on with Azure Databricks: here we'll be integrating Azure Databricks with Azure Blob Storage, another service provided by Microsoft Azure. As you can see, this is a small workflow of how we'll be working in the demo — and as we know, Azure Blob Storage and Azure Databricks are both services provided by Microsoft Azure.
Now, these are two separate services, but as long as you are using them in the same resource group, you can integrate the two. You might be wondering why we need to combine them. Microsoft Azure provides a multitude of services, and it is often beneficial to combine several of them to approach your use case: if we combine multiple services, we don't need to engage local hardware at all. For example, I'm currently using a laptop; it might be of lower configuration, or it might not have enough space to process a huge amount of data. In that case I would want the cloud service to handle all my use cases and my big data storage. So in this workflow, as we said, we will integrate Azure Databricks and Azure Blob Storage — that means we're going to combine them. What basically happens is that you interact with a coding notebook, which is nothing but the IPython/Jupyter-style notebook your Azure Databricks workspace creates. You type commands into the coding notebook; those commands are sent to the Databricks service; the Databricks service receives them and sends them on to your Azure cluster — whatever cluster you created after setting up Databricks. Then, depending on the authentication your cluster has for your blob storage account, the authentication commands are sent to the blob storage account, the data is fetched from the desired directory and brought back into the cluster, the cluster processes it, and whatever output comes out of that processing you see in your coding notebook. You can also store that output from the coding notebook back into your blob storage account. All of this is integrated easily and handled in a very simple manner. Now that we have understood the architecture, let us implement it hands-on. For this we quickly sign in to our Azure portal; once we sign in, we land on the Microsoft Azure dashboard — as you can see, all the Azure services are listed here, and once you create a resource group it gets highlighted too. Now let us quickly open Azure Databricks and create our Databricks workspace: we simply click Create, and it asks for basic details like your resource group, workspace name and region, so let's fill them in one by one. As we don't have a resource group yet, we'll create a new one — since we're working on a demo, let's name it "azuredatabricksdemo" and create it. Now that we have created the resource group, it asks for the workspace name; give it a unique name — you can keep it anything — so let's name it "azuredatabricks" and so on. Then you choose your region: you'll find different regions where you can create your Azure Databricks workspace, and it's up to you which one you pick; I'll keep West US as it is. After that we come to the pricing tier, where there are three types: Standard, Premium and Trial.
Since we're just here for practical knowledge, we will be using the Trial version, and we'll click Review + Create. Here you can check whether the details you filled in previously are correct; once validation passes, we can create the Databricks workspace. As you can see, it is initializing the deployment, so it may take a while. All right — the deployment is in progress, and once it's complete you can go to your resource, which lands you on your Databricks page. Now what we need to do is launch our workspace: it moves us to the main portal and signs us in to Azure Databricks, and this is the Databricks dashboard. Here you have different options like notebooks, data import, Partner Connect, transforming data and many other things; you can set up your workspace here — create a cluster, import data, build a data pipeline — as well. One thing I should tell you is that Databricks works on DBFS. What do we mean by DBFS? DBFS is the Databricks File System, a distributed file system mounted into an Azure Databricks workspace and available on Azure Databricks clusters. It allows you to interact with object storage using directory and file semantics instead of cloud-specific API commands, and it helps you mount cloud object storage locations so that you can map storage credentials to paths in the Azure Databricks workspace. It also simplifies the process of persisting files to object storage, allowing virtual machines and attached volume storage to be safely deleted on cluster termination. These are things you can make use of while working with Azure Databricks. Now let's get back to our Databricks. The first thing to be done here is to create a cluster, so we go to Create Cluster, then Create Compute. Here we have the cluster name, which we can edit based on our requirement — let us keep it "databricks cluster". After this we have the policy: a cluster policy defines limits on the attributes available during cluster creation, and there are different types — Unrestricted, Personal Compute, Power User Compute, Shared Compute; we'll keep it Unrestricted for the time being. There are two types of cluster mode, multi-node and single-node. In multi-node we can specify the minimum and maximum number of workers — the minimum can be two, or whatever you specify; these are standard limits which you can change according to your requirement. As we are only practicing, we'll just disable autoscaling and specify the number of workers ourselves — one worker is enough, as this is only for practice. We can also change the termination timing: if the cluster is inactive for the amount of time you specify, it shuts down. Apart from that we have the access mode; there are three types — Single User, Shared, and No Isolation Shared — and here we'll keep it as it is; single-user access is tied to your subscription. After that we come to performance, where we specify the runtime version — in our case Runtime 11.3 LTS, Scala 2.12 — and our worker type can be Standard.
There are other versions as well, but as of now we don't require them, so we'll go with the standard setup; we have already specified our workers, and the termination time is also set. Towards the right you can see the whole summary of the cluster you've been creating, so once you review it, just create the cluster. As you can see it is loading — it takes a while to create a cluster; in this section you see the status, so currently it is in a Pending state while it is being created, and we'll wait for a bit. Now, as you can see, our cluster has been created and it is in the Running state. After this, we go back to the main dashboard and create a new notebook. The cluster we just created is already selected automatically, and we need to specify a default language: there are four languages to choose from, and I'll be taking Scala for now. Give the notebook a simple name of your choice — let's name it "databricks notebook" — and create it; this starts very quickly. Here you just type your command and hit Enter, and it starts running. Now we'll see how to work with a file uploaded through an Azure storage service — our Azure Blob Storage — which we first need to integrate. Before that, we go back to our Azure portal, where we first create our storage account: as you can see, we open Storage accounts and create one, and it asks for details. As we had already created the resource group while creating Databricks, we select the same one, and then we give a storage account name — let us name it something like "databricksstorageaccount". For the region, provide whichever you'd like to choose; as I previously chose West US, I'll choose that again. When we come to performance, we can choose either of the two options given below, Standard or Premium; for now we'll go with Standard. As for redundancy, there are two types, as you can see: locally-redundant storage, which is the low-cost option with basic protection against server-rack and drive failures, recommended for non-critical scenarios; and geo-redundant storage, an intermediate option with failover capabilities in a secondary region, recommended for backup scenarios. "Locally" means replication happens within the region, not across the whole world; here we'll choose locally-redundant storage. We have now specified everything, so let us review — before creating, you need to check all your details once — and after reviewing, just create. Your storage account goes into the initialization stage, and once it gets deployed we'll start working with it. Our deployment is now complete, so we can go to the resource; all of the permissions are automatically managed within the same resource group, so we don't have to worry about any permission requirements.
Now that we have created our storage account, we need to create our container. We go to Containers, give it a name — it can be anything; let's name it "storageaccount1" — and just create it. As you can see, our container has been created; we quickly open it, and the container is empty, so we need to upload some files into it. We go to Upload, then Select File — this connects to my Windows files, and you can take any file here; for now I'll just take a normal CSV file and simply upload it. You can upload more than one file if you want, and you can also upload larger files, but that may cost you according to the size. Now we have a file in place and we have already created our notebook, so we have to integrate the blob storage with Databricks, and for that we need to run some code. The code looks a bit complex: first we need to create a token so that we can get access to the files. We'll just copy the whole snippet — there's nothing to memorize; this can be provided to you while you are practicing — and paste the whole command in. As you can see, we need to specify the container name and the storage account that we created, so we quickly go back to our Azure portal and fill in the details. The container name is given there — okay, where did it go? Let's go back — we copy that name and paste it here, and the same thing is to be done with our storage account: we go back to the storage account, copy its name, and paste it in as well. For the SAS token, we go back to our container and open Shared access signature, where we get our SAS token. A SAS token is generated for a limited period: you specify this when you create it, and it is valid from its start time until its time of expiry. We'll allow the Service, Container and Object resource types, leave the times as they are specified, and generate. Here you can see your SAS token; we simply copy it, go back to our Databricks notebook, and paste it in. We have just added our SAS token — let us verify whether it has been copied correctly; everything looks perfect, and the rest remains the same. What happens in this snippet is that we keep creating new variables: we create a URL by appending the container and the storage account, we specify our configuration, and then comes dbutils, where we specify the source, our mount point, and the config that maps to that particular token. This is basically how Databricks gets your sources from different services: it provides the data from wherever you want to extract it and shows it here.
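For reference, here is a minimal sketch of the kind of mount cell the demo copies, written as PySpark (the demo notebook uses Scala, but dbutils is identical from Python). The container, account and mount-point names follow the demo, the SAS token is the one generated above, and this is assumed to run inside a Databricks notebook, where dbutils and display are predefined:

```python
# Hypothetical sketch: mount an Azure Blob Storage container into DBFS
# using a SAS token, then list the mount to confirm the upload is visible.
container = "storageaccount1"
account   = "databricksstorageaccount"
sas_token = "<paste-SAS-token-here>"

dbutils.fs.mount(
    source=f"wasbs://{container}@{account}.blob.core.windows.net",
    mount_point="/mnt/staging",
    extra_configs={
        f"fs.azure.sas.{container}.{account}.blob.core.windows.net": sas_token
    },
)
display(dbutils.fs.ls("/mnt/staging"))  # the uploaded CSV should be listed
```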
Now you just hit Shift+Enter, and it runs the command. As we discussed in the architecture, it first communicates with Databricks, then Databricks communicates with our cluster, gathers all the sources, and shows the result. As you can see the command has run — okay, it shows an error here, so let's quickly solve that and run it again. It may take a while: as I said, it first connects with Databricks, communicates with it, then communicates with your cluster and gets all the resources from there. As you can see, it has picked up your container name, storage name and the token that we specified. Now what we need to do is check whether our file can be read from the storage account. First we specify a variable using spark.read, and then we give the format of the file — as we uploaded a CSV file, we specify csv — and then some options: we ask it to read the header, with the value true; we also specify inferSchema as true; and for efficient data handling we provide a mode, which can be FAILFAST. Then we give the file location: we paste our mount point, /mnt/staging, and then specify the file name — let's go to our container and check what it's called: python1.csv — we copy the name and paste it in. Shift+Enter — it shows an error; let's see what it is. Okay, we had specified a wrong command, so let's resolve it and run it again. All right, let us see how it looks: we do df.show, specify a row count, and hit Shift+Enter. As you can see, the columns are there — the decimals, the description, the percentages — the headers have been read, and the number of rows we asked for has been displayed. So this is how we integrate the two services and process data through Databricks.
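Putting that read cell together, a minimal sketch assuming the mount above and that the uploaded file is named python1.csv as in the demo (spark is predefined in the notebook):

```python
# Hypothetical sketch: read the mounted CSV into a DataFrame and show it.
df = (
    spark.read.format("csv")
    .option("header", "true")       # first line contains column names
    .option("inferSchema", "true")  # let Spark guess column types
    .option("mode", "FAILFAST")     # abort immediately on malformed rows
    .load("/mnt/staging/python1.csv")
)
df.show(10)  # print the first ten rows
```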
Now let us look at some of the popular use cases that put Azure Databricks in huge demand. Databricks isn't a catch-all solution for every business scenario, so here are the best use cases for Azure Databricks. First is database and mainframe modernization: data storage, collection and processing are incredibly important in modern businesses, and if you are looking to modernize your data lakes, or looking into mainframe modernization applications, then Azure Databricks has all the integrations you need. Next is the machine learning production pipeline: using the underlying power of MLflow, Databricks is a good choice if you need to get machine learning applications into production — getting data science out of development and into production is a common problem, and Azure Databricks can help you streamline that workflow. If we talk about big data processing, then Azure Databricks is one of the most cost-effective options for big data processing: in terms of performance versus cost it offers higher efficiency, so if your business needs the best performance for on-demand data processing, Databricks will likely be your best choice. Next is business intelligence integration: integrating business intelligence tools means you can open your data lake to analysts and engineers more easily — there is no need to create new pipelines when analysts need access to new data, and the data can be shared through SQL Analytics, Power BI and Tableau. If this is a bottleneck for your business, Databricks will help you enable your business intelligence team. Those are the four popular use cases for Azure Databricks; if your business fits one of them, it might be the solution for you. [Music] Azure Data Lake Storage — so what is Azure Data Lake Storage? Azure Data Lake Storage is a repository that stores large amounts of raw data in its natural format until it is needed by analytics applications. Why is it named a data lake? James Dixon, the chief technology officer of Pentaho, is the person generally credited with coining the term. As he described it, a data mart — a subset of a data warehouse — is like a bottle of water: cleansed, packaged and structured for easy consumption; a data lake, meanwhile, is more like a body of water in its natural state. Data flows from the streams (the source systems) into the lake, and users have access to the lake to examine it, take samples, or dive in. So a data lake is a centralized repository designed to store, process and secure large amounts of structured, semi-structured and unstructured data; it can store data in its native format and process any variety of it, ignoring size limits. Next is how to create a Data Lake Storage account, and for this we'll do a practical demo for better understanding. The first step in working with any Azure service is to sign in, so first we sign in — make sure you have an Azure account so that you have access to the different Azure services — and we'll stay signed in. Once you sign in, you enter this dashboard; here you can see different Azure services like storage accounts, monitors, virtual machines, SQL databases and SQL managed instances, and also your subscriptions and the different resource groups you have created. Let's quickly get started: first we go to Create a Resource. Here you see popular Azure services, like virtual machines and Cosmos DB; for us, it's Storage account. Once you click it, you enter "Create a storage account", where we first need a resource group. First of all, what do we mean by a resource group? A resource group is a container that holds related resources for an Azure solution; it can include all the resources for the solution, or only those resources you want to manage as a group. So let's create a new resource group and name it "demodatalake1". Once you've created the resource group, you come to the storage account — a storage account contains all of your Azure Storage data objects, including blobs, file shares, queues, tables and disks. Let's name it "marshmallow123". Once you have named your storage account, go to the Advanced section; here we will move directly to Data Lake Storage Gen2.
So what do we mean by Data Lake Storage Gen2? Data Lake Storage Gen2 is designed to deal with this variety and volume of data at exabyte scale, while securely handling hundreds of gigabytes of throughput; with it you can build both real-time and batch solutions. Here we enable the hierarchical namespace, and once we've enabled it we click Review + Create. It takes a few minutes to validate all your details, and once validation has passed, you just click Create. It's now getting ready to deploy — here you can see "marshmallow123" has been created and the deployment is in progress; once it's ready, we'll work with it. Here you can see your deployment is complete, so now we go to Go to resource. Here you can see your resource group details and the whole storage account details, and you can even see the capabilities enabled with it: Data Lake Storage, File service, Queue service, Table service, networking, security. After this we quickly go to Containers and create a new container for our storage account; we click Container and give it a name — make sure the name is in lower case, because upper case isn't accepted — let's keep it "demo" and create. You can see your container has been created; let's click it. There are no results here, as we haven't added anything yet, so let's upload. You can either upload from the Azure portal, or you can use Storage Explorer. First, let's see how to do it in the Azure portal: once you come here, you just click Select a file and take any file — we'll upload a picture, aws3.png; let's upload it, and there you can see aws3.png has been uploaded into your container. We can do the same through Storage Explorer; for that you need to download Storage Explorer — in my case I have already downloaded it, so let's quickly go into it. Here, too, make sure you are signed in, so that you can see all the containers and resource groups you have created. Here you can see my storage account, marshmallow123, which we created recently; under it, we go to Blob Containers and we can see "demo", with the image file we uploaded through the Azure portal. Now we'll upload a file and a folder both. Let's upload a file first: you click Upload, then Upload Files, and select a file — let's take a picture again, AWS 4 — and here you can see the image transferring from the local path into "demo"; your image file is uploaded. As we said, we can upload any kind of data, structured or unstructured, so let's check that by uploading a folder: you follow the same process and select your folder — any folder; I'll just take one of mine and upload it. Your folder gets uploaded here too, and once it's uploaded you can see the resources inside it: there were different files, text and images, and all of them are there — you can access them from here itself. You can see other operations as well: if you want to download any file or folder you can do that, or you can open one.
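The same upload can also be scripted. Here is a minimal sketch using the Data Lake SDK instead of the portal or Storage Explorer, assuming the azure-storage-file-datalake package; the account, container and file names follow the demo, and the key is a placeholder:

```python
# Hypothetical sketch: upload a local file into the "demo" container of an
# ADLS Gen2 account (note the dfs endpoint), then list the container.
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://marshmallow123.dfs.core.windows.net",
    credential="<storage-account-key>",
)
fs = service.get_file_system_client("demo")   # the container we created
file = fs.get_file_client("aws3.png")
with open("aws3.png", "rb") as data:
    file.upload_data(data, overwrite=True)    # upload, replacing if present

for path in fs.get_paths():                   # list what the container holds
    print(path.name)
```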
Let's open one of the files and see whether it opens or not — yes, there you go, you can see the file has been opened. Now, if you want to download a file, let's try: we download it, put it in Downloads, and it says your download is completed. Let's see whether we can find it: go to Files and then Downloads, and here it is — this is the file I downloaded through our Storage Explorer. So this is how you manage your data with Data Lake Storage. Apart from this, you can also give permissions to different users: if you want to give a particular user access to a file, you can grant them permission to read, write, or access the whole file or folder. For that, you just right-click the file or folder and choose Manage Access Control List. Once you come here, you can see there are different entries — super-user, owner and so on — and here I can add a user who should get access to this file: you search for the relevant person or user and give them access. Currently I don't have anyone to add, so I won't be able to show that, but this is how you add or give permission to different users. You can do the same with a folder, or with the whole container: if you want to share your containers with different people or users, you can easily share them — again, right-click, come to Manage Access Control, click Add, find the person you want to give access to, and then decide whether to permit them only read, only write, read and write, or all three; give them the permissions accordingly and click OK, and that particular user gets access to all your files and folders. So this is how we create and work with Azure Data Lake Storage.
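For completeness, a minimal sketch of the programmatic equivalent of those Manage Access Control List steps, again assuming azure-storage-file-datalake; the account, container, file and Azure AD object ID are placeholders from the demo:

```python
# Hypothetical sketch: grant a user read-only access to a file via POSIX
# ACLs, the code counterpart of "Manage Access Control List".
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://marshmallow123.dfs.core.windows.net",
    credential="<storage-account-key>",
)
file = service.get_file_system_client("demo").get_file_client("aws3.png")

current = file.get_access_control()   # dict with owner, group, permissions, acl
file.set_access_control(
    acl=current["acl"] + ",user:<user-object-id>:r--"  # append a read-only entry
)
```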
Now let us see the comparison between Azure Blob Storage and Azure Data Lake Storage. Here are some of the differences: Azure Data Lake Storage provides a hierarchical file system, whereas Azure Blob Storage is an object store with a flat namespace. Azure Data Lake Storage is storage optimized for big data analytics workloads, whereas Azure Blob Storage is basically a general-purpose object store for a wide variety of storage scenarios, which also include big data analytics. In Azure Data Lake Storage the REST APIs are available over HTTPS only, whereas in Blob Storage the REST API is available over HTTP as well as HTTPS. In Azure Data Lake Storage there are no limits on the account, but in Azure Blob Storage there are specific limits on container sizes and on the files in a blob. Those were the major points that differentiate Azure Blob Storage from Azure Data Lake Storage. At last we come to the use cases: there are many use cases for Data Lake Storage, of which we'll discuss four. First, business intelligence on data lake storage: Data Lake Storage dramatically improves the speed of ad-hoc queries, dashboards and reports; you can run existing BI tools on lower-cost data lakes without compromising performance or data quality, and it also avoids costly delays in adding new data sources and reports. Second is cloud data lake migration: here you can deploy new applications to the cloud using data lake storage such as S3 or ADLS, and you can also migrate away from older on-prem data lake environments that are expensive and difficult to maintain, while ensuring agility and flexibility. Next, data science on data lake storage: you can accelerate data science on data lake storage with simplified data exploration and feature engineering; it dramatically improves performance, making data scientists and engineers more efficient and resulting in higher-quality analytical models. At last, data architecture modernization: here you can avoid reliance on proprietary data warehouse infrastructure and the need to manage cubes, extracts and aggregation tables; you can run operational data warehouse queries on low-cost data lakes while offloading the data warehouse at your own pace. [Music] Let's talk about how machine learning actually works — and this is pretty much common to most of the algorithms you're going to implement. First and foremost, you need data. I've already made the point that data is central to machine learning: if you do not have data you cannot make any predictions, and the more data, the better for you. So the first part is having data. Once you have the data, the next thing to confirm is that the data is appropriate for machine learning, and this is where pre-processing steps in: pre-processing helps you process the data you have and prepare it for machine learning. Your data might not always be clean — there might be some missing values, some repetitive values which you do not want in your data when it is getting processed — so in this case we filter out this data, clean it, fill in certain values, predict certain values and put them in; once this data is up and ready, we pass it on and apply a particular machine learning algorithm to it. Now this, again, is a trial-and-error kind of method. It can seem simple — we have discussed all those machine learning algorithms, so to the naked eye, or the naked mind, we might think: okay, this is the kind of problem I'm dealing with, so this is the algorithm I should use. But that is not always the case: at times the data is misleading, and we are not sure which algorithm to use, what data to pass in, and how much data to pass. So what we do is first pick a particular algorithm, use it, implement it, test the values, then try out some other algorithms as well, and then come to a conclusion: okay, this is the best algorithm, and using it I've generated the model that best meets my needs. While doing that, there are quite a few processes that happen — training, testing, validating — where you pass in a certain amount of data to build and train the model, and keep some data behind, which you later pass in to test these models and see whether they are working properly or not. This is an iterative process, and it might take more than one iteration to actually settle on a particular choice. Once your algorithm is selected and your machine learning model is built, you can deploy this model into your environment, for real-time work, where it can predict on real-time data, or on the data you provide your algorithm. So this is how the whole process of machine learning works.
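To make that flow concrete, here is a minimal sketch of the pipeline just described in scikit-learn — clean, split, train, test — with a hypothetical CSV and column names, and assuming numeric features:

```python
# Hypothetical sketch: pre-process (deduplicate, impute missing values),
# split into train/test, fit a model, then evaluate on the held-back data.
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

df = pd.read_csv("data.csv").drop_duplicates()          # remove repetitive rows
X = SimpleImputer(strategy="mean").fit_transform(df.drop(columns="label"))
y = df["label"]

# hold some data back for testing, as described above
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3)

model = LogisticRegression().fit(X_train, y_train)      # train
print(accuracy_score(y_test, model.predict(X_test)))    # test
```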
Now, the processes I talked about — pre-processing, implementing various algorithms, training and testing your data — may look simple, and when you listen to them they seem pretty easy, but once you start implementing them it is fairly difficult. Ask any data scientist, and they will tell you that pre-processing is very difficult to deal with, and that mostly 60 to 70 percent of the work is done in these phases alone. So what if we had something that could help us here — wouldn't that be easy? If we could speed up this process of pre-processing, algorithm selection, and training and testing the data, instead of doing it all manually — can we do these things automatically? The answer is yes, and this is where Azure Machine Learning steps in. What Azure Machine Learning does is help you carry out the whole process: as you can see, we have something called ML Studio, and it focuses on your pre-processing, the application of algorithms, and the deployment processes. So for the bulk of the tasks that are repetitive, or that would require more effort to implement manually, it helps you automate or speed up the process. That is what ML Studio basically is: a studio, or a service, in the very popular cloud service provider Microsoft Azure, which lets you implement various machine learning algorithms and carries out the bulk of the processes you would otherwise not want to do by hand. Now, this is not something used to replace data scientists — you cannot do that, so no offense to any data scientists watching this video; it is more complementary to data scientists. You will still need statistical knowledge, and hence we talked about machine learning a little: even if you build models using Azure or any platform, you are required to have proper statistical acumen and knowledge of data science — something that helps you understand the output of the models you have built. So yes, some statistical knowledge always helps and cannot be replaced; but definitely, if you are working on machine learning and need to speed this process up or make it more efficient, Microsoft Azure, and ML Studio in particular, is a very important resource to have. What I'm going to do now is switch to the demo part, where we'll build a model so we can discuss some of the things I've talked about and also see how Microsoft Azure works in real time. So let me switch to the Azure portal we have at our disposal. Guys, I've gone ahead and switched into my Microsoft Azure portal. For people who are completely new to Microsoft Azure: you can avail certain services that Microsoft Azure offers free for one month. During this period you receive a certain credit — for people who have an account in the US region, up to $200 of usage of certain services. These services are chargeable, which is why the free credit is made available, and I believe it is more than enough for one month's practice. So if you're somebody who's new to these platforms, I would suggest you sign up for Microsoft Azure and avail those services.
Now, since I am from India, our currency is Indian rupees, and for our usage we are given somewhere around ₹13,300, which is a very big amount for using a service for a month's time — so it serves my purpose. I have been using it for a while; I've had a couple of accounts, but this one I created some fifteen days back, and as you can see I still have about ₹12,400 left, and we won't be needing much today — maybe ten or twenty rupees at most. So yes, you can go ahead and create this account, and once you have it, you'll have access to all the services Microsoft Azure has to offer: you can create all the resources and utilize its compute services, storage services, database services and everything else it provides. But since we are talking about machine learning, we will stick to those applications. In order to use Azure Machine Learning, you need to create a workspace, where you can put all your data, and once you create your workspace you sign in to the Microsoft Azure ML Studio, which is an interface, or IDE, where you create all those models. To create a workspace, you just come here and type "machine learning" — you might have it in the suggestions; it was already typed, I believe — and you get the options Machine Learning Studio workspace or service workspace; the Studio workspace is what I've clicked. You need to put in some details: the name of your workspace; the subscription you are using; the resource group — you can use one if you already have it, or create one, just give it a name; a resource group is something that holds details about the resources you are using — and the kind of storage account you are using. You can create a storage account as well; it's not a big deal — think of it as a storage place where you can keep your data, that's it. So you put in these details, choose the pricing tier you are using, and enter the region where you want your workspace to reside. What the cloud does is store your data in particular locations on the globe, so you can choose the location closer to you, or closer to your business, depending on your needs; for now I'm going to stick with the basic one, because we are just creating a simple demo. In fact, I'm not going to create a new workspace, because I already have my own, but I would suggest that you put in these details and create one. Once this workspace is created, you can open it, and at the bottom you'll see an option called Machine Learning Studio; else, you can type in the Studio URL, and you'll be redirected to a page where you have to sign in with your Microsoft Azure portal account. Once you do that, you are redirected to the ML Studio I'm talking about. So: the workspace gets created, you log into your Azure ML Studio, and this is what you have at your disposal. These are some of the experiments I might have worked on in the last week or so — some are finished, some are still drafts — and you can create these kinds of workflows.
You have so many options here: the projects you create; the experiments, which we just saw; the different web services made available to you — you can create web services as well; and your notebooks. You might not always start from scratch — you might have code written in, say, R or some other language like Python, and you can import that code; or you can use the existing notebooks Microsoft Azure offers, with ready-to-use code and ready-to-use models. Then we have datasets: you can import your own datasets — there was a financial sample dataset I imported recently — or you can use the sample datasets as well; as you can see, there are quite a few datasets available here which you can use to implement your own algorithms, or the algorithms Microsoft Azure has to offer. So it pretty much depends on what you want to do and what kind of processing you need. If you come here, you can also see your trained models — the work you've done and the models you've implemented. In this case, we are going to implement one of the algorithms that Microsoft Azure provides: if I come to Experiments and go to Samples, you can see we have so many implementations; we could take a look at one of these and then implement it on our own — do not worry, we won't copy it right away. We have quite a few options here, as you can see. Okay, let's do one thing — let's build our own model. For that we go to Experiments, or rather we come down here, click New, and add a Blank Experiment. Let's try to build a recommender system or something like that; let's call it "my recommender" — there you go — and save it. Okay, there are no modules yet, so it cannot be saved; so first, let's get started. For people who do not know what a recommender is: you pass in certain data, and the model might suggest what you might like or what you might want to do. For example, when you shop on Amazon or any other website, you normally get suggestions — "you may like this, you may like that"; same with YouTube: you go through certain videos, and it suggests videos you would probably like as well. That is a recommender system, so let's go ahead and create one — let's create one for movies — and play with the data we have. For that we need a dataset first. If we talk about ML Studio, it is very simple: you just drag and drop things, just like creating workflows — it is as simple as that. In order to use a particular dataset, we open Saved Datasets, then Samples, and search for "movie" — do I have something for movies? Yeah, there you go: Movie Ratings. We'll use this dataset. Once you put the dataset in, you can take a look at it — let's visualize this data. As you can see, the information is here: it has values like user ID, movie ID, rating and timestamp.
Timestamp is a column people don't use frequently, but the factors that matter to us are the movie ID, the user ID, and the rating. The ratings run from roughly 1 up to 10 — I believe they start at 0 or 1 — and those are what we'll use. This data we've just visualized is clean, but in real life data is rarely this simple and well managed: it may have missing values, and you might be required to play with it or make changes to it. You have many options available here for that — you can process and manipulate your data, look at statistical analysis, and so on — but this being clean data, we don't need any of it, so we'll stick to the recommender part. Now that we have the dataset, I need to select the columns I want to use — I'm going to project certain columns out of it. There used to be a module called Project Columns, but I don't see it; I believe they've renamed it — yes, it's now called Select Columns in Dataset. Drag it in, then hold the output circle on the previous module and pull down to connect the two. They're connected, but now there's an error saying "value required": I need to pass in which columns to focus on. Click the Launch Column Selector tab and it gives you options; I'll put in a rule: take all columns and just exclude the ones I don't want. Timestamp won't be very handy, so I exclude it and say OK — and the error is gone. We have the dataset and we've selected the columns; the next phase would normally be pre-processing, but the data is already pre-processed, so we skip that too and instead split the data into two parts: training data and testing data. Training data is what we pass to the model to train it; test data is what we hold back and later use to predict the outcome and see whether the model works. The process is easy — just type what you need and the Studio gives you a module — so search for Split Data, drag the module into the workflow, and pass the data in. The data will be split at a fraction of 0.5 — that is, half the data to train the model and half to test it. I won't tinker with the other factors; the defaults are good enough for me. There's also an option here to zoom the model in or out to fit the screen. So the data is split (a code sketch of this step follows below), and the next job is to train the recommender.
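Outside the Studio, the Split Data module corresponds to an ordinary train/test split. A minimal sketch with scikit-learn, assuming the ratings live in a pandas DataFrame loaded from a hypothetical CSV with the user ID / movie ID / rating columns we just visualized:

```python
import pandas as pd
from sklearn.model_selection import train_test_split

ratings = pd.read_csv("movie_ratings.csv")  # hypothetical file name
# test_size=0.5 mirrors the 0.5 split fraction: half train, half test.
train_df, test_df = train_test_split(ratings, test_size=0.5, random_state=42)
print(len(train_df), "training rows,", len(test_df), "test rows")
```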
Searching for "train" gives us Train Matchbox Recommender, so we select it and pull it in. We pass the first branch of the Split Data output into this training module — or tab; I'm not too good with the nomenclature, hence the confusion, so let's call it a tab. Selecting it shows its details, such as the number of traits to use when building this recommender: 10 is fine — I don't see a problem with it — but let's make it 20. The number of recommendations can stay at 5, no problem, and four training batches is fine, so we move on. Next we score the data: search for "score" and bring in the scoring tab, and to it I pass the trained model on one side and the split data — that is, the testing data — on the other. So the trained output comes here, and the testing data comes here too. In the score tab I specify which predictions I'm looking for: I want related items — if I watch this movie, what movie should I watch next? That's what I enter here. Once I select Related Items, it asks for the maximum number of related items to find per item: let's say I want just one, so if I pass a particular movie to this recommender, it should suggest one movie I might like watching. You can have more than one as well; that's up to you. So we've put in tabs for training and scoring, and next I need to evaluate. It says Evaluate Recommender: bring it in, take the score output and put it into one port, then take the split data and enter it into the other. That looks fine; I'll save here just to be safe. One note, guys: the connections we make are important — which port you pass which value into. This is more or less experimental work, where you may wire in some wrong connections, get errors, and have to troubleshoot at times — not always. We've now almost built the model, but we need one more dataset: Movie IMDb Titles. Let's place it here — okay, it went somewhere, so I place it again, there you go — and whenever we have a new dataset, we always visualize it to understand what it has (this shouldn't take this long, but for some reason it is). It has a movie ID and a movie name, and that's what I'll use for my recommender; note these IMDb-style titles are a sample dataset created for this purpose, not data from the actual IMDb website. Now that we've seen the data, I'm going to use a module called Edit Metadata: I place it here and, as usual, connect the titles dataset into it.
Now, if I come here, you can see there's a field that says to select the columns you want to use — essentially, which columns I want to pass as the metadata. So I click Launch Column Selector, and in this case, with rules, I start with no columns and choose Include. First I try to include "item" — you have to hit the Enter button, as no suggestion appears — but it won't take the value; let me see what's wrong. The reason is that we just looked at this dataset and it has no item column, which is why we can't pass that value in. So let's pass it values that are actually relevant to this dataset: you can see which columns are available, so select those, say OK, and the error is gone. We now have metadata available to us and a model that's up and running, and next I need to put in joins. If you know what joins do, they basically help you select data from one table and combine it with another; we have two tables — two datasets — here, and I want to combine the data they hold. I won't get into the details of joins; we're using them just for general reference. I have these two datasets, and I want to compare one table with the other — build a join that lets me match the movie names from one dataset with the other, and give me recommendations across the two. So search for "join" and bring in a Join Data module. One important point before wiring it: in Edit Metadata, the data type has to be String — there you go. Now I pass values into the join tab as well: I take the score output and place it into one input, and the Edit Metadata output into the other. For the join key on this side, I believe it should be the item column — as I've said, hit the Enter button — and say OK; for the columns from the right, the other table, I pick movie ID and say OK. The kind of join I want is a left outer join — again, I won't go into the details of join types — and don't keep the right key columns, because that duplicate key is the reason I'm using this join. I'll need one more join, and I pull it in here, because the first join only gives me the movie ID and I don't just want the ID — I want the movie name as well. So bring it in, pass it the first join's output and the Edit Metadata output; when it asks for values, I say give me the related items if there are any, and I want the movie name. It says save — but if you missed one thing, come here first and remove the left outer join setting on this one, and now save. We're bound to have some errors, guys, so stay tuned. Now it runs, tab by tab — module by module — until everything has executed, and this might take a couple of minutes.
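The Join Data module is doing an ordinary left outer join. Roughly the same thing sketched in pandas, with hypothetical stand-in column names — `scored` for the recommender output and `titles` for the IMDb-titles dataset:

```python
import pandas as pd

# Hypothetical stand-ins for the two datasets in the experiment.
scored = pd.DataFrame({"Movie": [101], "Related Item 1": [202]})
titles = pd.DataFrame({"Movie ID": [202], "Movie Name": ["Oblivion"]})

joined = scored.merge(
    titles,
    left_on="Related Item 1",  # column produced by the scoring step
    right_on="Movie ID",
    how="left",                # left outer join, keeping all scored rows
)
joined = joined.drop(columns=["Movie ID"])  # like unticking "keep right key columns"
print(joined)
```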
Once a module — or tab — has executed, it shows a green tick on it, as you can see here and here. This run might take longer because we increased the number of iterations to 20 in the training tab when we were working on it, so the whole processing can run a little slower than normal. Okay — it says "related item not found"; let's see why that's happening. Launching the column selector on the failing tab, the issue is the related-item column: probably there's no variable with that exact name in the data we generated. As I told you, in the Score Matchbox tab I had set the number of related items to 1, so by default the output column was named "Related Items 1" — that is the column we're passing through and want to display. So select that for now, say OK, and we've wired up everything we wanted; let's see whether it runs, and once it does I'll explain it again, so don't worry. This time it should be quicker, because most of the modules are already executed — we only had the error in the last tab — and as you can see, the rest finishes fast, the last one runs, and it's done already. So, guys, our model is up and running; let's check. When I click the output and say Visualize, it should give me some values — let's see how relevant they are. This being a simple model it might not be that accurate, but let's hope it gives something. Hmm — it isn't giving me the movie name and movie ID for the related item, so let's see where we went wrong: there's some mismatch here. We should just be matching ID to ID — mapping movie ID to movie ID — so let's change that key to movie ID and see whether we get the output. I say Save — save, not save as — and run it; again it finishes quicker than the last time. Now let's visualize the output — and there you go, guys, it's as simple as this: we pulled in some modules, edited certain values, and we have the result. Here's an Indian movie called Talaash, and it says if you liked this, you might like Oblivion, probably Jack Reacher — I haven't seen either of those — Iron Man, and Stand by Me. I don't think this recommender is that accurate, but I'm sure there are movies in it that are more relatable, and people who are big movie fans would probably relate a lot more to the suggestions. Again, you can change the number of related items you want, and you can tinker with and tailor your algorithm a little more by playing with the values in the training tab — the answers will depend on the inputs you pass to the algorithm. My basic aim was for you to get some hands-on experience with Azure Machine Learning Studio, and nothing more than that.
As far as this model and this session go, we took in two datasets, built a model, trained it, tested it, and then used a join to see which movie you might want to watch if you liked one of those movies. Again, as I've mentioned, it might not be that accurate; you're free to play with it a little more, and you can pass in your own datasets as well. [Music] Now, coming to the other component: the virtual machine. People often get confused about what a virtual machine is. In the computing world, a virtual machine is simply an emulation of a computer system: VMs are based on a computer architecture and provide the functionality of a physical computer. On your own system you can create as many virtual machines as you want — you don't have to worry much; in other words, you basically just need an ISO of the operating system you want to spin up in the virtual machine. For example, if your laptop has enough hard drive space and enough RAM, you can create a new virtual machine, and there are various pieces of software for this — VirtualBox, ESXi, and others — that you can use to build VMs. Let me show you an example: on my local machine I have enough space, so I built this virtual machine, which is opening up in a moment, using Oracle VM VirtualBox Manager. It's a Kali Linux machine, and even though my system runs Windows, launching it opens another operating system — Kali Linux — inside the Oracle VirtualBox console, within my Windows host. This is one virtual machine, and I can have as many as I want as long as my hardware configuration can support them. But all of these virtual machines remain on my local machine: they're not in the cloud, and if I go somewhere else I won't be able to access them — I need my computer or the virtual machine files. The same concept applies in the cloud world: I log into Azure with one account, create as many virtual machines as I want, and pay as I go — I pay only for the virtual machines I use, not for the entire infrastructure — and in the case of the cloud, you can access them anywhere, wherever you are.
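A small sketch of that "reach your VMs from anywhere" idea in code, assuming the azure-identity and azure-mgmt-compute packages and a placeholder subscription ID — this simply lists the virtual machines in a subscription:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

# Signs in with your Azure credentials (CLI login, environment, etc.).
compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Enumerate every VM in the subscription, wherever it runs.
for vm in compute.virtual_machines.list_all():
    print(vm.name, vm.location)
```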
So I hope you now have a fair understanding of everything we've covered so far: what cloud computing is, the different components of the cloud, what Azure is and its features, and what a virtual machine is. Now let's head towards IoT — and before we jump to Azure IoT, what exactly is IoT? We're breaking things into parts because people here come from different backgrounds, so we're taking baby steps: what the cloud is, what cloud computing is, what Azure is and its functionality, what a virtual machine is, and now what IoT is. IoT means the Internet of Things: you connect various devices — various things — over the internet, without any human intervention, partly for resiliency and redundancy, and so that instead of dealing with many separate devices, you focus on one place and get all the information from there. Another example of IoT will make this clearer: in this screenshot, we've connected wearable technology, smartphones, vehicles, home lighting, home appliances, music, personal computers, analytics, flight services, and online shopping — many functions brought together into one solution. Instead of having multiple solutions — a dedicated solution for online shopping, a different one for wearable technology, another for your vehicles, another for home lighting — which would be practically impossible to manage and which you'd have to pay for separately, IoT gives you the advantage of combining multiple technologies into a single one. I hope you now have a thorough understanding of cloud computing, Azure, its components, and IoT, so let's head to the major topic: IoT in Azure. The very common analogy behind Azure is power consumption: pay as you go. Whatever services and resources you use, you pay for those only — just as with electricity, where you pay only for the power units you use and nothing else. The same concept lies here in the case of Azure: you pay only for the component itself, nothing more. This is the symbol of IoT in Azure — I'll show it to you practically once we move to the portal. And these are the different components of IoT on Azure: say I have a use case where I want to combine a web app, a BI tool, and storage, and I need Stream Analytics and an IoT Hub.
With the help of IoT on Azure, I can combine all of those features and use them directly, without worrying about everything else. So if I have to conclude the Azure IoT term: you combine different technologies to build one solution instead of having multiple solutions. Now let's talk about the different Azure IoT components you can use. The very first is Azure IoT Central, which basically means that all the different technologies you've combined live behind a single interface: it's not that each technology you've combined needs its own interface — with one interface, the Azure platform, and a single login account, you can access all the different components. You can build a web application, build a database application, and have the web app communicate with the database application in both directions, all from that one place. The next is Azure IoT solution accelerators. All of these devices and technologies require some kind of interface so they can integrate with each other. Say I have an application that serves on the internet — I want to make a shopping website where people shop directly on my site. That shopping application will need a database, a payment segment, and other such components, and it requires some form of interface between them: the shopping application has to integrate with the database, which has to sit on a different server for security measures, and I want another database that stores only the hashes of the passwords, because I want to be security-centric as well — if my main database is compromised for some reason, I don't want my customers' sensitive information to be exposed. And say I built this application in the UK: it has to answer to the GDPR, the General Data Protection Regulation — if my database is compromised along with customer data, I have to pay a hefty fine to the government, and in addition to the fine, it's going to spoil my name and fame. We don't want that, which is why I want all the different components on different systems, and at the same time I want all these components to interact with each other.
Because your database is on one server, your web application on another, and your hashed passwords on yet another, you have to have some kind of interface through which all these different components can interact — and that's the benefit you get with Azure IoT solution accelerators: they help by providing that interface. The next component is Azure IoT Hub, which gives you a central hub — a central repository — from which you can manage all your components; I'll show you how it looks on the Azure platform, and you can manage things from there alone. The next is Azure Time Series Insights, which means that each and every event is logged with a timestamp: you have your own logs, so you don't have to rely on different people or team members for logging — it logs everything for your application, and you can configure your own alert rules as well. Say I want to depend on Azure completely rather than building an in-house platform: every request coming into my environment, every person trying to log in — each and every request is logged, and based on those events I can set up certain security mechanisms, for example, if requests come from one IP several times, send an alert. That's the meaning of Azure Time Series Insights: clear transparency into all the different kinds of events happening in your environment. Then there's Azure Sphere, one of the components where you can see all your devices and manage them, and the last one is Azure Maps — like Google Maps, you can see the information, manage it, and get what you need from there. So those are the different Azure IoT components. Now, coming to one important part that helps in building up an Azure architecture: virtual networks. For better security as well as better isolation, we can use virtual networks, where we define our own network and put applications into different network zones — I want one application in one network and another application in a different network. People sometimes wonder how you can achieve this on Azure itself, so consider how we do it when building in an in-house network. Let me make it simpler for you: say my organization has 50 servers, and out of those 50, 15 are critical servers that store customers' credit card information and the like. Out of these 50 servers, I want only two to be reachable from the internet.
Being on the internet means exposure: out on the network there can be machines where a hacker sits, trying to eavesdrop on the communication or to plant malware, and if the hacker manages to put malware on one system, he'll be able to replicate it across all the systems. To minimize that, what we generally do in an on-premises environment is place machines in different security zones: the two internet-facing computers go behind a proxy in a DMZ — a different network — so that if something is compromised, only those two computers reachable from the internet are affected, and the rest stay fine. So how do you achieve this with Azure, when the entire estate is in the cloud? People think the network can't be managed by you — that's wrong: Azure has the concept of virtual networks. It means that for the different machines and infrastructure you create, you can create a separate VNet: with an Azure virtual network, or VNet, you can isolate your different components and computers into different networks, so you don't have to rely on one network — you keep your critical components apart. In the Azure portal there's a distinct icon for a virtual network; when you log into your account, you'll see a sign like this. And here is a real environment that should give you a fair understanding: these are three virtual machines in my virtual network, and I want each of them in a different subnet — this machine in one subnet, this one in another, and this one in a third — which gives you the advantage that, instead of having all machines on the same network, you spread them across different subnets to protect yourself. If I have to summarize the whole architecture — the way it will look to you, as in the next slide — you have NSGs, where NSG stands for network security group: these are like firewalls that filter the traffic coming into and going out of your virtual machines, and their settings can be configured easily in Azure itself. Then you have the different subnets, with the virtual machines inside them, and at the top, at the core, you have the virtual network (see the sketch below for the same idea in code).
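A sketch of that isolation idea with the azure-mgmt-network package: one virtual network carved into separate subnets, so machines in different zones don't share a network segment. The names, address ranges, and subscription ID are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

network = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# One VNet with two subnets: an internet-facing tier and an isolated tier.
network.virtual_networks.begin_create_or_update(
    "my-resource-group",
    "demo-vnet",
    {
        "location": "eastus",
        "address_space": {"address_prefixes": ["10.0.0.0/16"]},
        "subnets": [
            {"name": "web", "address_prefix": "10.0.1.0/24"},       # exposed via proxy/DMZ
            {"name": "critical", "address_prefix": "10.0.2.0/24"},  # never internet-facing
        ],
    },
).result()  # wait for the deployment to finish
```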
Now let me show you how you can build Azure IoT yourself. For starters, you need an Azure free account, so create one if you don't have any. Log into your Azure account and you'll land on a page like the one on my screen — this is your home page. You can click on the dashboard, and if you ever get lost, just come back to the dashboard from here. From there you create a resource, as you can see in my screenshot. Remember that everything in Azure has to live inside a resource group — you cannot have anything floating in the air. Whatever configuration you're doing — say I want to create a network, or a network security group, which is just like a firewall — you put it all into a dedicated resource group. So you create a resource, then select Internet of Things — the IoT we talked about — and choose IoT Hub, the same thing we learned about a moment ago. There's a quick-start tutorial you can watch for more information, though you've already gathered all of that with me just now. Then your IoT hub is ready to set up, and it looks something like this: you provide a subscription name, the resource group name, and the region. For the region, you can use the default. What the region means is that if you create a virtual machine in a different region — for example, I'm in, say, South Africa and I create a virtual machine in a region in India — the VM is created in that region and I access it from over here; sometimes that can add a little lag, but not enough to create real difficulty, so even using the default region is fine and won't impact anything. In the case of Azure, everything runs in a clustered environment anyway, with things placed in the cluster directly. And here we go — your IoT hub gets created.
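Once the hub exists, a device can push telemetry to it. A minimal sketch with the azure-iot-device package — the device connection string would come from a device you register in the hub, and is a placeholder here:

```python
# pip install azure-iot-device
import json
from azure.iot.device import IoTHubDeviceClient, Message

# Connection string of a device registered in the IoT hub (placeholder).
client = IoTHubDeviceClient.create_from_connection_string("<device-connection-string>")
client.connect()

# Send one telemetry reading as a JSON message.
client.send_message(Message(json.dumps({"temperature": 22.5})))

client.shutdown()
```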
[Music] What is Azure Service Bus? Azure Service Bus is a fully managed enterprise message broker with three important entities: message queues, publish/subscribe topics, and namespaces. Before dwelling on these entities, let's learn what Azure Service Bus is and why we use it. As I said, Azure Service Bus is a multi-tenant cloud messaging service that sends information between applications and services. It provides a platform-as-a-service communication layer built to allow more robust multi-tenant software systems to be built in the cloud, so when you need a cloud-based solution to broker messages between different applications and reduce the coupling in your system, you can use Azure Service Bus. Now let's understand why we use it. Service Bus is a messaging service in the cloud used to connect any applications, devices, or services running in the cloud to any other applications or services; as a result, it acts as a messaging backbone for applications available in the cloud or across devices. Some key scenarios help explain why. First, it can decouple applications, allowing each component to perform its task independently: the producer and consumer don't have to be online or readily available at the same time, and the load is leveled so that traffic spikes don't overtax any service, which also improves the reliability and scalability of the applications and services. Second, receivers and subscribers can receive copies of messages depending on the filter rules set on a subscription: you can define rules on the subscription, where a rule has a filter defining the condition for a message to be copied into the subscription and an optional action that can modify the message metadata — useful if you don't want a subscription to receive every message sent to a topic, or if you want to mark up messages with extra metadata as they pass through. Third, two or more operations can be grouped into a scope for execution: the transactions feature groups operations together so they either succeed or fail jointly, never partially — the scope is often called atomic, meaning the entire transaction succeeds or fails as one unit of work and is never left in a half-complete state. Fourth, with multiple concurrent consumers you can process messages in parallel to optimize throughput, improve scalability and availability, and balance the workload: multiple competing consumers read from a queue at the same time, each safely obtaining exclusive ownership of specific messages. Fifth, auto-forwarding can be used to scale out individual topics: auto-forwarding lets you chain a queue or subscription to another queue or topic that is part of the same namespace, and when it's enabled, Service Bus automatically removes messages placed in the source (the first queue or subscription) and puts them into the destination (the second queue or topic). Finally, you can schedule a time to process messages: you can submit messages to a queue or topic for delayed processing — for example, scheduling a job to become available to a system at a certain time — which gives you a reliable, distributed, time-based scheduler. Scheduled messages don't materialize in the queue until the defined enqueue time, and before that time they can be canceled, which deletes them; you can schedule messages with any of the clients in two ways — use a dedicated schedule call, or use the regular send API but set the scheduled enqueue time property on the message before sending.
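A sketch of that scheduled-messages scenario with the azure-servicebus Python package — the message only materializes in the queue at the scheduled enqueue time, and it can be cancelled before then. Connection string and queue name are placeholders:

```python
import datetime
from azure.servicebus import ServiceBusClient, ServiceBusMessage

with ServiceBusClient.from_connection_string("<connection-string>") as client:
    with client.get_queue_sender(queue_name="myqueue") as sender:
        # Make the message available for processing five minutes from now.
        due = datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(minutes=5)
        seq_numbers = sender.schedule_messages(ServiceBusMessage("delayed job"), due)

        # Cancelling before the enqueue time deletes the scheduled message.
        sender.cancel_scheduled_messages(seq_numbers)
```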
Now that we know why we use Azure Service Bus, let's look at some of its important entities. First, what do we mean by a namespace? A Service Bus namespace is your own capacity slice of a large cluster made up of dozens of all-active virtual machines — basically, it's a container for all messaging components. Multiple queues and topics can live in a single namespace, and namespaces can later serve as application containers: as you can see in this diagram, the container holds queues, topics, and subscriptions. You can think of a Service Bus namespace as your own server backed by the capacity of a large cluster of all-active VMs; this is what makes Service Bus an available and reliable service that scales without us needing to manage the service ourselves. Next, topics and subscriptions. These are the message-oriented middleware responsible for holding and delivering messages to subscribers: they provide a one-to-many form of communication in a publish/subscribe pattern, which is useful for scaling to large numbers of recipients. Each published message is made available to every subscription registered on that topic, so when a publisher sends a message to a topic, one or more subscribers receive a copy of it, depending on the filter rules set on those subscriptions. As I said in the key scenarios, subscriptions can use filters — additional filters restrict which messages they receive — and consumers don't receive messages directly from the topic; they receive them from the topic's subscriptions. When a subscription is created, you can supply a filter expression that operates on the properties of the messages — both system properties (for example, label) and custom application properties (for example, store name). Looking at the diagram, we have subscriptions alongside the topic to process the messages: the sender sends messages to a topic the same way it sends them to a queue, but the difference is that a topic can have multiple independent subscriptions — here colored red, green, and yellow. Subscriptions are durable by default, but can be configured to expire and then be deleted automatically. We can define rules on a subscription: a rule has a filter defining the condition for a message to be copied into the subscription, plus an optional action that can modify the message metadata. By applying filters, we decide which messages go to which receiver: the messages highlighted in red are sent to the receiver highlighted in red, the green ones go to the green receiver, and the yellow ones to the receiver colored yellow. This is a simultaneous, one-to-many process, so we can fan the messages out to any number of receivers subscribed to the topic.
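A sketch of such a subscription filter rule using the management client in the azure-servicebus package: only messages whose application property `store` equals 'red' are copied into this subscription. All names are placeholders:

```python
from azure.servicebus.management import ServiceBusAdministrationClient, SqlRuleFilter

admin = ServiceBusAdministrationClient.from_connection_string("<connection-string>")

# Copy a message into this subscription only when its custom
# application property `store` equals 'red'.
admin.create_rule(
    topic_name="mytopic",
    subscription_name="red-subscription",
    rule_name="store-is-red",
    filter=SqlRuleFilter("store = 'red'"),
)
```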
Now, queues. As we know, queues offer first-in, first-out message delivery to one or more competing consumers: receivers typically receive and process messages in the order in which they were added to the queue, and only one message consumer receives and processes each message. In the diagram, the sender sends messages and they line up in the queue in sequence, one by one, first in, first out. A system can have multiple senders — here a sender has sent messages and they've been organized in the queue — but only one person consumes a given message: the first person consumes the first message, the second consumes the second, and lastly the third consumer consumes the third. That is how we send and receive messages through a queue. A key benefit of queues is temporal decoupling of application components: in other words, the producer (the sender) and the consumer (the receiver) don't have to send and receive messages at the same time, because messages are stored durably in the queue; furthermore, producers don't have to wait for a reply from the consumer in order to continue processing or sending messages. Now that we know these entities, let's compare the two most important ones — queues and topics — based on a few important constraints. First, consumer options: in a queue, multiple receivers can be attached, but each message is delivered to only one of them — as I said, a one-to-one relationship, with a single consumer receiving a given message; with topics, messages can be received by numerous recipients, a one-to-many relationship, and each message copy can be delivered to any number of subscribers linked to that topic. Second, message filtering: in queues, because each message is received by only one consumer, no filters are required; with topics, a collection of properties can be attached to each message broadcast on the topic, and those properties are used when a custom subscription filter is applied. Third, consumer scalability: if you need to scale a queue, you're still limited to one consumer per message; a topic, by contrast, doesn't need to be recreated when a new subscription is formed — only new messages submitted to the topic are received by the new subscriber — which makes topics more scalable, since more than one consumer can receive the messages, and once a new subscription is created, all new messages sent to that topic are received by the new subscription. Fourth, message auto-forwarding: from a queue, messages can be automatically routed to a queue or a topic, but a topic subscription cannot be the destination; with topics, messages from a topic subscription can be sent automatically to a queue or to another topic. Fifth, message removal: in a queue, the first receiver that finishes reading a message also removes it from the queue, preventing further readers from processing it; with topics, a message is removed only after every subscriber has processed it — when all subscribers have read it, it's removed from the topic. Last, use cases: we choose a Service Bus queue when we need to pass messages in a one-to-one system, and we choose topics when we need to send messages to multiple systems. Those are the key differences between queues and topics.
Now let's understand the concepts with a couple of hands-ons, so we see how to send and receive messages through Azure Service Bus. The first hands-on is sending and receiving messages through a queue: I'll be using a Python program to send messages to and receive messages from an Azure Service Bus queue. For this we must have an Azure subscription — I hope most of us have one; if not, you can sign up for free through the Azure portal and get access to many services. So let's move to the Azure portal: first we sign in, and after signing in we drop down to our dashboard. There are many different services here, so let's create our service bus — find the Service Bus option and click it. As you can see, there are no namespaces yet; as we discussed, namespaces are nothing but containers, so we need to create one. Before using any services we must have a resource group: if you already have one, you can use it, or create a new one — let's name it "servicebusdemo1". Now we need to give our namespace a name; let's try "servicebusdemo1" as well — it says the name already exists, so adjust it and fill in a few more details. Once that's checked, choose your location as per your requirement — I'll choose West US — and then it comes to the pricing tier. The available tiers are Premium, Standard, and Basic: with Basic we might not get all the features we'll need while using the service, and Premium requires paying a higher amount for access to everything, so we'll go with Standard. With the namespace name, location, and pricing tier chosen, keep everything else at the defaults and click Review + Create: it validates all the details, and once validation succeeds we can create the service bus. Validation succeeded, so we create it — it initializes, which takes a minute or two, not that long — and the service bus starts deploying; the deployment is in progress, so this might take a little time.
Once it's deployed, go to the resource. Landing there, you can see the overview of your service bus: which resource group it's located in, its status, location, subscription, and pricing tier. Our next step is to create a queue — as you can see, there are no queues yet. Come to Entities, select Queues, and from there create a new queue. It needs a name, so let's call it "servicebusdemo1queue" — sounds good. The maximum queue size defaults to 1 GB, which you can change based on your requirements, and then there's the maximum delivery count — the number of times a message can be delivered for processing — with a value range from 1 to 2,000; we'll keep the default of 10. The message time-to-live has been set to 14 days; let's keep that as it is too, leave everything else at the defaults, and create. There you go — the queue "servicebusdemo1queue" has been created, and if you click into it you can see the queue's overview, where you can tell how many messages have been sent and whether any have been scheduled.
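The same portal steps can also be done in code. A sketch using the management client from the azure-servicebus package, mirroring the defaults we just accepted — delivery count 10 and a 14-day time-to-live; the connection string is a placeholder:

```python
import datetime
from azure.servicebus.management import ServiceBusAdministrationClient

admin = ServiceBusAdministrationClient.from_connection_string("<connection-string>")

# Create the queue with the same settings chosen in the portal.
admin.create_queue(
    "servicebusdemo1queue",
    max_delivery_count=10,
    default_message_time_to_live=datetime.timedelta(days=14),
)
```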
Now that we've created the service bus and the queue, the next step is a program that sends messages into the queue. First let's create a folder to store all our programming files: as you can see, I've created one called "Azure Service Bus", where I'll keep all my Python code files so they're easy to locate. Next, open a notepad — this is where we'll write the code to send messages into the queue. As I said, we'll use Python to work with the Azure services (you could equally use other languages or packages, such as NuGet packages, to work with Azure Service Bus), so make sure you have a recent version of Python installed so that the packages install cleanly. Let's open a command prompt and check with python --version — as you can see, this is a recent release, which keeps the process smooth. With Python installed, it's time to install the package for Azure Service Bus: we'll install the Azure Service Bus client library for Python, which lets us create the connections that will help us send and receive messages. The command is pip install azure-servicebus — I typed it wrong the first time, so let's correct it and hit Enter — and here it shows that the package already exists; I had installed it previously, and you can see the installed version. Now we write the Python program to send messages. The first thing is the import statement, which brings in the Azure Service Bus package: from azure.servicebus, import the client and the service bus message class. Next we declare our connection string and queue name. What is the connection string here? Back in the Azure portal, in our service bus: once a queue exists, we need to authenticate the sender so it's allowed to send messages into the queue, and for that we use the namespace connection string — whatever the sender wants to send goes through this connection string. Click the RootManageSharedAccessKey policy — as you can see, it covers Manage, Send, and Listen, all three — and copy the primary connection string (you'd use the secondary connection string only if the primary doesn't work), then paste it into the code. Once that's done, we set the queue name, which was "servicebusdemo1queue" — mine came out in capitals, so fix the case. Now the next step is to add a method to create and send a message into the queue — a method to send a single message, where the sender is an object that acts as a client for the queue we created. So define send_single_message(sender): inside it, create the message with message = ServiceBusMessage — since it's a single message, we give it the body "Single Message" — then send it to the queue using our sender object's send-messages call with the message we created, and print "Sent a single message".
That was the method for a single message; in the same way we can send a list of messages or a batch of messages, specifying how many should go. Let's add those methods too. For the list, define send_a_list_of_messages(sender) and reuse the same pattern, but build a list of ServiceBusMessage objects — with the body "Message in list" — over a range of five, send the list, and change the print statement to "Sent a list of 5 messages". For the batch, define send_batch_message(sender), and create the batch with batch_message = sender.create_message_batch() — that's the method we're calling here. Then provide a range for how many messages to send — as before, let's take 10 — and add messages into the batch with add_message, passing a ServiceBusMessage whose body reads "Message inside a ServiceBusMessageBatch"; wrap the add in a try so that on a ValueError — meaning the batch has reached its maximum size — we break out of the loop. Then send the batch into the queue with the sender, and print "Sent a batch of 10 messages". Now that the methods exist, we need to create a Service Bus client and then a queue sender object to send these messages. First create the client from the connection string: servicebus_client = ServiceBusClient.from_connection_string, passing the connection string we declared above and logging_enable=True — that grants us access. With this client we get the queue sender object to send messages to the queue: inside a with block on the client, sender = servicebus_client.get_queue_sender, passing queue_name=QUEUE_NAME, the queue we created. Then, inside the sender's with block, we call the three kinds of sends — single, list, and batch — listing each method we created, and lastly print that we're done sending messages. That's the whole code for sending messages into the queue (the complete program is reproduced below), so let's save it: we'll save it in our Azure Service Bus folder itself, naming it queue_send_message.py, since it's a Python program. Now over to the command prompt: check in the folder whether the file was created — there it is, the Python program exists — then go back to the prompt and run it. As you can see, the program ran successfully and the output appeared: sent the single message, the list of five messages, and a batch of ten messages. Now let's quickly move to the Azure portal and check whether the messages arrived: go to the service bus and open the queue. Previously we had zero active messages; now there are 16 active messages, which means the messages have been sent into the queue. If you want to see the messages themselves — what was actually sent — open the Service Bus Explorer, where you can do all three operations: peek the messages, receive them, and send them. We're currently in peek mode, so choose Peek from start, and as you can see there are almost 16 messages here. Select any one and you can see the single message; another shows the message in the list; and others show "Message inside a ServiceBusMessageBatch" — exactly what we had written in the code. You can also count them against the ranges we set: for the batch we gave a range of 10, so count one, two, three... ten.
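Putting the pieces together, the send-side program described above looks roughly like this — reconstructed from the steps in the walkthrough, with the connection string left as the placeholder you paste from the portal:

```python
from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONNECTION_STR = "<namespace connection string>"  # primary connection string from the portal
QUEUE_NAME = "servicebusdemo1queue"

def send_single_message(sender):
    # The sender acts as a client for the queue we created.
    message = ServiceBusMessage("Single Message")
    sender.send_messages(message)
    print("Sent a single message")

def send_a_list_of_messages(sender):
    messages = [ServiceBusMessage("Message in list") for _ in range(5)]
    sender.send_messages(messages)
    print("Sent a list of 5 messages")

def send_batch_message(sender):
    batch_message = sender.create_message_batch()
    for _ in range(10):
        try:
            batch_message.add_message(
                ServiceBusMessage("Message inside a ServiceBusMessageBatch"))
        except ValueError:
            break  # batch has reached its maximum size; stop adding
    sender.send_messages(batch_message)
    print("Sent a batch of 10 messages")

servicebus_client = ServiceBusClient.from_connection_string(
    conn_str=CONNECTION_STR, logging_enable=True)
with servicebus_client:
    sender = servicebus_client.get_queue_sender(queue_name=QUEUE_NAME)
    with sender:
        send_single_message(sender)
        send_a_list_of_messages(sender)
        send_batch_message(sender)

print("Done sending messages")
```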
Now let's move to our command prompt and check in our folder whether the file has been created or not — as you can see, the Python program is there. Let's run the program. As you can see, it has run successfully and the output has come: sent a single message, sent a list of 5 messages, sent a batch of 10 messages. Now let's quickly move to our Azure portal and check whether the messages have been sent or not. We'll go to our Service Bus and then to our queues. Previously we had zero active messages; now we have 16 active messages, which means our messages have been sent into the queue. If you want to see what messages were sent, go to the Service Bus Explorer; here we can do all three operations, that is peek the messages, receive the messages and send the messages. Currently we are in peek mode, so we'll click Peek from start, and as you can see there are 16 messages here. You can select any one message and view it — this one is "Single Message". Let's open a different one: this one is "Message in list". And whatever we had written into our code is shown over here: "Message inside a ServiceBusMessageBatch". You can also count the messages against the ranges we gave — for the batch we gave a range of 10, and sure enough there are 10 such messages, plus the 5 messages of the list. So this is how you can see your messages in the queue.

Now that we know how to send messages into the queue, it's time to receive the messages from it. For that we'll go back to our notepad, open a new file, and write the code for receiving messages. In our previous program we wrote the code that added the methods for sending messages into the queue; now we need to receive all those messages from the same queue. For that we'll just copy the top of the earlier code — the import from the Azure package, the connection string and the queue name — and paste it over here, and we'll also bring over the ServiceBusClient line we had used to send the messages. After this we write the receiving code: we'll mention our queue name and define a maximum wait time, let's set max_wait_time equal to 5. Then, with the receiver, we receive the messages: for message in receiver, print "Received" along with the message, and then call receiver.complete_message, mentioning the message object. This complete_message call settles the message so that it is removed from the queue — once the receiver has received the messages, they will no longer be available inside the queue. So this is the whole code; let's save it as queue_receive_message.py, move to our command prompt, clear everything and run the program. As you can see, all the messages are being received from the queue — "Received Message in list", "Received Single Message" and so on; all these messages are received by the receiver. Now let's quickly go back to our Azure portal and check whether the messages are still there or not. We'll go back to our queue, and as you can see we don't have any messages now; this is because the receiver has received all the messages from the queue. So this is how we send and receive messages through a queue; end to end, the receive side looks roughly like the sketch below.
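A minimal sketch of that receiver script, again assuming the azure-servicebus v7 package, with placeholder connection string and queue name:

from azure.servicebus import ServiceBusClient

CONNECTION_STR = "<namespace connection string>"
QUEUE_NAME = "<your queue name>"

servicebus_client = ServiceBusClient.from_connection_string(conn_str=CONNECTION_STR, logging_enable=True)
with servicebus_client:
    # stop iterating once the queue stays empty for 5 seconds
    receiver = servicebus_client.get_queue_receiver(queue_name=QUEUE_NAME, max_wait_time=5)
    with receiver:
        for msg in receiver:
            print("Received: " + str(msg))
            # settle the message so it is removed from the queue
            receiver.complete_message(msg)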
Now let's move on to the next part of the hands-on, in which we'll see how to push and pull messages with topics and subscriptions. Here too we'll use a Python program, first to send messages to a Service Bus topic and then to receive those messages from a subscription on that topic. It's quite a simple scenario: we send a batch of messages, a single message or a list of messages to the topic, and then receive them through the subscription we have assigned under that particular topic. For that, let us first create a new topic, so let's go back to our portal and follow the same steps into Azure Service Bus. Any time you create a queue or a topic, the first thing is to create a namespace; then you can add multiple queues and topics under it. So let's create a new namespace, following the same steps: we'll give a resource group name, servicebusdemo, and a name for our namespace, servicebusdemo as well — we'll tweak that name so it's unique. As I said, you can choose any location as per your requirement; then we come to the pricing tiers, select Standard, and create. Our validation has succeeded, so let's create it.

All right, our deployment is complete, so we'll go to the resource and then to Topics — previously we had selected Queues, now we select Topics — and add a new topic. We'll create it the same way we created the queue in the previous hands-on: give it a name, servicebustopic; as you know, the maximum topic size is 1 GB and the message time-to-live is 14 days, so let's keep the defaults and create. Now we need to create our subscription as well, so we'll go under this topic to the Subscriptions section and create a new one. Click on subscription, and here we'll mention the name, servicebussub, and the maximum delivery count — as I said, the default maximum delivery count is 10, but you can change it to any value in the range 1 to 2,000. The rest we keep at the defaults, and create. All right, our topic is created and our subscription has also been created.

Now let's open our notepad; here we'll write the Python code to send messages into the topic. As usual, first the import line: from azure.servicebus import ServiceBusClient, ServiceBusMessage. Next we add the constants we need, that is our connection string, topic name and subscription name — let's write the code first and then fill in the values. To get the connection string we go to our Service Bus namespace, open Shared access policies, and copy the primary connection string, then paste it in. After that, our topic name was servicebustopic, and our subscription name was servicebussub. Now that we have our constants, let's add the methods, just like before: a single message, then a list of messages, then a batch of messages. So, def send_single_message: we create a ServiceBusMessage — message is equal to ServiceBusMessage, and here you can specify your text; let's write "Hello". To send this message to the topic we call sender.send_messages with that message, we print a confirmation, and we check our indentation. Now that we've added the single-message method, let's add a list of messages: def send_a_list_of_messages, with sender. We create the list as messages is equal to, in square brackets, ServiceBusMessage("Hello, how are you doing") — you can type any message of your choice, this is just an example — and then we specify a range, let's say 6, and send the list to the topic with sender.send_messages. Now we'll create the same for a batch of messages: as before, we create a batch message and then add messages into the batch. To complete this method we add a break, so that once the ServiceBusMessageBatch object reaches its maximum size you can create a new
ServiceBusMessageBatch object to send more data if you need to. All right, now we send the batch to our topic: sender.send_messages with batch_message, and print "Sent a batch of 10 messages". Now that we have added the methods, it's time to create a ServiceBusClient and then a topic sender object to send these messages. So let's go down: first we create the client, servicebus_client is equal to ServiceBusClient.from_connection_string, and here we pass the constant we specified at the top — we just need to refer to it here, so conn_str is equal to our connection string constant. After this we get a topic sender object to send the messages to the topic: sender is equal to servicebus_client.get_topic_sender, and we specify the topic name. Next we need to send all the messages — the single message, the list of messages and the batch — so within the sender block we call all the methods over here: send_single_message with sender, send_a_list_of_messages with sender, and then our batch method. Lastly we just print "Done sending messages". Now we'll save this in our previous folder, Azure Service Bus, as service_bus_topics.py.

So this is the whole code we use to send the messages into the topic. Let's quickly move to our command prompt and run the Python code — as you can see, our messages have been sent successfully. Now let's move to our Azure portal, go to Topics, and check whether the messages have gone or not. We'll enter our subscription, and as you can see we have 32 messages. If we go to the Service Bus Explorer we can see the messages that have been sent; let's click Peek from start. You can see "Single Message" and "Message in list" — those were the previous messages — and then over here "Hello" and "Hello, how are you doing", which shows that the messages have been successfully sent into the topic. The whole topic sender script is sketched below for reference.
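A condensed sketch of the topic sender from this demo, using the same azure-servicebus v7 package; the connection string, topic name and message texts are placeholders:

from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONNECTION_STR = "<namespace connection string>"
TOPIC_NAME = "servicebustopic"

def send_single_message(sender):
    sender.send_messages(ServiceBusMessage("Hello"))
    print("Sent a single message")

def send_a_list_of_messages(sender):
    messages = [ServiceBusMessage("Hello, how are you doing") for _ in range(6)]
    sender.send_messages(messages)
    print("Sent a list of 6 messages")

def send_batch_message(sender):
    batch_message = sender.create_message_batch()
    for _ in range(10):
        try:
            batch_message.add_message(ServiceBusMessage("Message inside a ServiceBusMessageBatch"))
        except ValueError:
            break  # batch full; a new batch object would be needed for more data
    sender.send_messages(batch_message)
    print("Sent a batch of 10 messages")

servicebus_client = ServiceBusClient.from_connection_string(conn_str=CONNECTION_STR)
with servicebus_client:
    with servicebus_client.get_topic_sender(topic_name=TOPIC_NAME) as sender:
        send_single_message(sender)
        send_a_list_of_messages(sender)
        send_batch_message(sender)
print("Done sending messages")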
Now we'll see how we receive these messages from the topic. Let's go back to our notepad and open a new file with Ctrl+N, and write the code for receiving the messages. We'll reuse the previous few lines: we need to import our package first, along with the connection string, so we just copy and paste those, and we also take the ServiceBusClient line and paste it here. Now we write the code for receiving the messages, and it's very simple: we need to get a subscription receiver object for the subscription — as we discussed, the receivers receive their messages through the subscription — so we create a receiver object and then complete the messages, so that each message is removed once it is read. That means once the receiver has received the messages, the subscription will be emptied and there won't be any messages left. So let's quickly write the code: with the ServiceBusClient, we mention our topic name, and we also specify our subscription name, since the receiver will receive from the subscription. Then we give the maximum wait time: with max_wait_time set to 5, this code keeps receiving new messages until it hasn't received any new message for five seconds. Now let's complete the messages: with the receiver, for message in receiver, we print "Received" along with the message, and then call receiver.complete_message on it. So this is the whole code for receiving the messages; let's save it as service_bus_subscription.py and run it. As you can see, we have received the messages — all our messages have been received by the subscriber. Now let's go back to our Azure portal and refresh: you can see no messages are left in the subscription, because the subscriber has received them all.

So this is how we send and receive messages through topics and subscriptions. With queues, the sender sends the messages into the queue and the receivers receive them from there one by one, whereas with topics the sender sends the messages into the topic, but unless and until we subscribe we are not able to access any of the messages; once we create a subscription and have a number of subscribers, each subscription can receive all the messages from the topic. The receive side of this demo looks roughly like the sketch below.
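A minimal sketch of that subscription receiver, with the same placeholder names as before:

from azure.servicebus import ServiceBusClient

CONNECTION_STR = "<namespace connection string>"
TOPIC_NAME = "servicebustopic"
SUBSCRIPTION_NAME = "servicebussub"

servicebus_client = ServiceBusClient.from_connection_string(conn_str=CONNECTION_STR)
with servicebus_client:
    # keep receiving until no new message arrives for 5 seconds
    receiver = servicebus_client.get_subscription_receiver(
        topic_name=TOPIC_NAME,
        subscription_name=SUBSCRIPTION_NAME,
        max_wait_time=5,
    )
    with receiver:
        for msg in receiver:
            print("Received: " + str(msg))
            # settle the message so it is removed from the subscription
            receiver.complete_message(msg)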
Now let us see which top companies use these cloud service providers. First, talking about AWS, its services are used by top companies like Netflix, Coca-Cola, McDonald's, Unilever, ESPN and Adobe. Next, coming to Azure, it is used by many Fortune 500 companies; some of the companies which use the Microsoft Azure cloud are Samsung, HP, BMW, FedEx and Pixar Animation Studios. Moving on to GCP, the top companies who use its services are PayPal, Twitter, 20th Century Fox, P&G and King Digital Entertainment.

Now that we have seen the top companies which use these cloud service providers, let us compare them based on compute services. Compute services are among the core services when it comes to cloud computing: they help you create virtual machine instances in minutes and scale instances up instantly if needed. So in today's session we're going to compare these three cloud service providers based on their compute and storage services. The primary compute service for AWS is Amazon EC2, the primary compute service for Azure is Azure Virtual Machines, and for GCP it is Google Compute Engine. All three services are equally powerful but unique in their own way, each with its own advantages and disadvantages. Amazon EC2 offers 99.5 percent annual uptime and can be tailored with a variety of options according to the user's requirements. On the other hand, Azure Virtual Machines provide enhanced security and hybrid cloud capabilities, but when you compare the cost, Azure instances tend to get costlier as the compute size increases. Google Compute Engine instances are comparatively cheaper; they come with persistent disk storage and provide consistent performance.

Next, talking about storage services: AWS offers a variety of storage options like S3 for object storage, EBS for block storage, EFS for file storage, and a few other storage services. Azure cloud storage likewise includes object, file, disk, queue and table storage, along with specialized services for data-rich applications and many data backup services. GCP has fewer storage options compared to the other two and is more targeted at object storage: it offers Cloud Storage for objects, Persistent Disk for block storage to be used with virtual machines, and file storage for storing files. For backing up your data, AWS provides a service called AWS Glacier and Azure provides a service called Azure Backup, but Google does not yet have a comparable backup service.

Next, let us compare these three cloud service providers on pricing. All three offer a pay-as-you-go structure, which means you only pay for the services you use. Pricing varies per service: for compute services one provider could be cheaper while being costlier for database services, and so on. Just to give a general overview of pricing among the three, GCP offers a slightly cheaper project model and flexible cost controls, which let you try the different services and features. AWS charges you on an hourly basis, whereas Azure charges you on a per-minute basis, and GCP provides per-second billing for its resources; the small example below shows why that granularity matters. When it comes to short-term subscription plans, Google Cloud and Azure give you a lot more flexibility, but in certain services Azure tends to be costlier as the architecture starts scaling up.
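To make the granularity point concrete, here is a small, purely illustrative calculation. The hourly rate and the billing rules are assumptions for the sake of the example (and the real providers have since moved to finer-grained billing than described above), so treat this as a sketch of the idea, not a price comparison:

import math

def billed_cost(hourly_rate, runtime_seconds, granularity_seconds):
    # usage is rounded up to the next whole billing unit
    units = math.ceil(runtime_seconds / granularity_seconds)
    return units * hourly_rate * granularity_seconds / 3600

runtime = 10 * 60   # a 10-minute workload
rate = 0.10         # assumed $0.10/hour instance price

print(billed_cost(rate, runtime, 3600))  # hourly billing: $0.10 (a full hour is billed)
print(billed_cost(rate, runtime, 60))    # per-minute billing: about $0.017
print(billed_cost(rate, runtime, 1))     # per-second billing: about $0.017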
Today nearly 80 percent of all business solutions and services have moved to the cloud. Most projects happen on cloud platforms, which include software as a service, infrastructure as a service and platform as a service, and with projects being hosted on cloud architecture it makes perfect sense to utilize DevOps solutions to continuously deliver value for enterprises. Building upon that, we have the Microsoft AZ-400 exam, or the Microsoft Certified DevOps Engineer Expert. But what is the AZ-400 exam? As described by Microsoft, AZ-400 certification training is an intermediate-level course from Microsoft for professionals who want to gain knowledge of designing and implementing DevOps processes and practices for an enterprise. This five-day AZ-400 training course is ideal for professionals who combine people, processes and technologies to deliver value to the organization. Some of the skills outlined by this curriculum include developing an instrumentation strategy, which carries a weightage of about five to ten percent; developing an SRE strategy, which again carries a weightage of five to ten percent; then development of a security and compliance plan, which carries a weightage of 10 to 15 percent, as does managing source control. Apart from that, you also learn how to facilitate communication and collaboration between your teammates, stakeholders and vendors; how to define and implement continuous integration, or CI, which carries the majority of the weightage at about 20 to 25 percent; and finally how to define and implement a continuous delivery and release management strategy, which again carries a weightage of 10 to 15 percent.

Now this is the entire skill set, but before you get ready to take on the world with your newly certified skills, a candidate for this exam must be familiar with both Azure administration and development, and must be an expert in at least one of these areas. There is only one hard prerequisite for taking this exam: as a candidate you must have cleared either the AZ-104 or the AZ-203 exam. Furthermore, there are other prerequisites you should keep in mind before appearing for AZ-400; even though they are not mandatory, they can improve your chances of qualifying for the exam — such as proficiency in agile practices, and the ability to design and implement DevOps practices for version control, configuration management, build, compliance, release and testing, all of this by leveraging Azure technologies.

Now that you know how you can qualify to take this exam, let's look at who this exam is most beneficial for, or who can take up AZ-400. First of all, this course is designed for professionals who aspire to clear the Microsoft Certified DevOps Engineer Expert exam — individuals looking to establish their credibility and value in the market as experienced DevOps practitioners, agile practitioners and cloud computing professionals. Apart from that, this course is also meant for system administrators, software developers, project managers and technical leads, and practically anybody who wishes or aspires to lead a DevOps team.

So now that you understand who can take up this exam, the main question arises: why should you take it? What are the benefits of the AZ-400 examination? The DevOps market is expected to grow from 3.42 billion dollars in 2018-19 to 10.3 billion dollars by 2023, at a compound annual growth rate of nearly 25 percent, which means this industry is going to grow more than threefold in the next few years alone. On top of that, Azure holds 19 percent of the total global cloud infrastructure market, which automatically qualifies you for a majority of the MNCs in the industry today. Not just that — this certification has also proven to give a huge boost to your salary: a mid-level Azure DevOps engineer earns on average 145,000 US dollars per year, whereas a senior-level Azure DevOps engineer can make up to 285,000 dollars per year. Apart from that, the Azure infrastructure is very well suited for DevOps, which is why a lot of companies are quickly moving to it, making you a front-runner in the race for all of these companies. Talking of companies, here are a few aggressively hiring Azure-certified DevOps professionals, including but not limited to Amazon, Ernst & Young, Dell, Accenture, Microsoft, Google and VMware.

Now that you know what's in store for you, I'm sure you're curious about what the curriculum outline looks like, so here it is. First of all, you are introduced to Azure DevOps, where you understand the important aspects of Azure as well as DevOps, along with the role of a DevOps professional in an organization. Then you learn about implementing continuous integration, followed by building containers with Azure DevOps, where you learn to create and deploy a multi-container application in your DevOps pipeline. After that, you learn how to design a dependency management strategy and manage artifact versioning. Next up, you learn how to set up a release management workflow, where you configure a CI/CD pipeline using YAML and manage your secrets using Azure Key Vault. Following that, you implement deployment models and services, where you configure infrastructure as a service and platform as a service on Azure, and then you learn how to implement and optimize a continuous feedback mechanism. Next is the meaty part, where you learn a bunch of different Azure tools for infrastructure configuration, as well as a few third-party tools. And finally, you learn how to implement compliance and security: you learn how to manage and check code quality with SonarCloud on Azure DevOps, and how to integrate Azure Key Vault with Azure DevOps to access secrets in the Azure Pipeline.
And that was all about the course outline of the AZ-400 curriculum. This certification exam not only enhances your skills but also demonstrates them to your employers, making you gain that edge in the industry. The time is right to upskill, and DevOps and cloud are two of the biggest domains in the industry for taking advantage of the career opportunities that come your way.

Now, what is a Microsoft Azure certification? An Azure certification is a level of Microsoft cloud expertise that an individual obtains after passing the certification exam; it validates an individual's cloud expertise and skills. Talking about some of the benefits of Azure certification: the first is that you stay ahead of the crowd. An Azure certification will validate your cloud skills in a selected domain; you will earn credibility, and your present or future employer will know for sure that you've worked on Azure and have the skills you've mentioned in your resume. The next benefit is a higher salary package: Business Wire has estimated that Azure certifications have raised wages by 20 to 40 percent, and payscale.com has reported that, based on the position and job description, those accredited by Microsoft Azure could get a salary of 128,000 dollars annually. The next benefit is that it offers a flexible career: a Microsoft Azure certification enables you to pursue a wide range of career options — you can become a cloud architect, a developer, a system admin and many more — and in addition it enables you to work in various industries at different locations. The fourth benefit is that it acts as proof of commitment: to get certified you have to sign up for a course, study, and then pass the exam, which implies that you can commit your time and resources towards achieving a goal and that you're dedicated to improving your career objectives in the long term. So these were some of the benefits of Azure certification.

Now let us move on to the main topic for today and understand what exactly the Azure fundamentals certification is. The Azure fundamentals certification, or AZ-900, is intended for candidates who are just beginning to work with cloud-based solutions and services, or who are new to Microsoft Azure. It is a basic certification, intended for individuals who just want an entry-level job in Azure and do not want to go in depth into Microsoft Azure. According to Glassdoor, the average salary for a Microsoft Azure fundamentals certified candidate is about 110,000 dollars per annum. The certification can also be a stepping stone to other certifications in Microsoft Azure. With this certification you can prove knowledge of cloud concepts, Azure services, Azure workloads, security and privacy in Azure, as well as Azure pricing and support. To write the certification exam you should be familiar with general technology concepts, including concepts of networking, storage, compute, application support and application development. So this was just a brief introduction to the Azure fundamentals certification.

Now let us talk about the skills measured in the certification exam. Firstly, you should be able to describe cloud concepts and know about core Azure services; next, you should be able to describe core solutions and management tools on Azure; then you should be able to describe the general security and the network security features in Microsoft Azure.
Moving on, you should also be able to describe the identity, governance, privacy and compliance features in Azure, and finally you should be able to describe Azure cost management and service level agreements. In the certification examination your skills on all these topics are measured, so in order to pass you need to have knowledge of all of them.

Now let us move on to the next topic and talk about the certification exam guide. First, the exam format and question types: Microsoft continuously introduces innovative testing technologies and question types, so your exam might contain questions like multiple choice, repeated answer choices, short answer, mark review, drag and drop, and so on. Next, the examination duration is 45 minutes, but the seat duration is 65 minutes in total. The cost of this examination in the United States is 99 US dollars, and in India it is around 3,600 rupees. Talking about the languages in which you can write the certification exam: you can take it in English, Chinese, Korean, Japanese, Spanish, German and six other languages — this is the only certification exam in Azure which you can write in so many languages.

Now let us take a look at the topics from which the questions are asked in the certification examination. The first topic is cloud concepts, from which 20 to 25 percent of the questions are asked. Next, you need to describe core Azure services, and from this topic somewhere around 15 to 20 percent of the questions are asked. Next, you need to describe core solutions and management tools on Azure, and from this section 10 to 15 percent of the questions are asked. Moving on, you should describe general security and network security features, from which 10 to 15 percent of the questions are asked. Next, you should be able to describe identity, governance, privacy and compliance features, from which 20 to 25 percent of the questions are asked. And finally, you should be able to describe Azure cost management and service level agreements, from which 10 to 15 percent of the questions are asked. All this is mentioned in the certification exam guide; to find it, just go to the Microsoft certification page, find the Azure fundamentals certification, and when you click on it you will have an option to download the exam guide. So this was all about the Azure fundamentals certification exam guide.

Now let us move on to our final topic for today and see how you can prepare for the certification. The first thing you should do is see what is asked during the exam, so study the certification exam guide thoroughly. Every Microsoft certification exam page lists the skills measured in its exam guide; the list is usually very accurate and helps you focus on and study the right content. The page also has some training and courses to prepare you for the certification. Also understand the different question types, which will help you prepare. The next step is to understand some of the basic concepts, like what cloud computing is and the various service and deployment models in cloud computing; learn the architecture and a few important service domains such as compute, storage, databases, networking and security — you basically need to learn all the topics mentioned in the exam guide. After you learn all these topics, the best way to pass the Microsoft Azure exam is through real hands-on experience with the technology.
Microsoft gives you some free hands-on learning modules you can practice with, and there is also the Azure free trial account, which will provide you with 12 months of free Azure services — not all the services, but some of the prominent ones will be free for you to use — so you can practice hands-on what you have learned in the previous steps. The next step which will help you prepare for the certification is reading the Microsoft documentation: read the Microsoft Azure documentation so you understand the topics that will come in the examination better. As I mentioned before, read the skills measured on the exam page, look up the corresponding Microsoft documentation pages, read them thoroughly, and try out the tutorials. Also read some books from Microsoft Press. So these are some of the tips which will help you prepare for the Azure fundamentals certification examination.

Now, why should you actually go ahead and become a data engineer? All right guys, so you must have heard about data scientists or data analysts; the core of their job is to work on data. Data engineers are the people who make this data, which has to be worked on, available to the data scientists and data analysts. So if you think about it, any company which has a data scientist or a data analyst will need a data engineer on the team. Now, talking about the data engineer profile: according to a tech job report, by 2028 the number of jobs involving data will rise by 12 percent according to the Bureau of Labor Statistics, and more than 546,200 new roles related to big data will result from this. You must have heard that being a data scientist is probably the go-to job of the 21st century, but understand this, guys: without data engineering there is no data science. That is why the numbers speak for themselves, and hence the data engineering job is one of the fastest-growing jobs. Even in 2023 there is going to be a huge jump in the number of job openings, for several reasons — a lot of businesses have gone online in the past years because of the pandemic.

Now, there are several reasons why an organization may choose to hire an Azure data engineer, or why an individual may choose to pursue a career as one. The first reason is high demand: as organizations increasingly rely on data to drive business decisions, there is high demand for professionals who are skilled in designing and implementing data management solutions. According to Microsoft, the demand for Azure data engineers has increased by over 100 percent in the past years. The second reason is strong earning potential: Azure data engineers are in high demand, which often translates into strong earning potential. According to Glassdoor, the median salary of an Azure data engineer in the United States is anywhere around 120,000 to 133,000 dollars per annum. The third reason is career growth opportunities: the role of an Azure data engineer provides opportunities for career growth, as individuals can advance to positions such as data architect or data scientist. The fourth reason is that Azure is a leading cloud platform, with a growing number of organizations using it to store and manage data; as an Azure data engineer you will have the opportunity to work with the latest technologies and gain valuable experience that is in high demand in the job market.
Now the fifth and last reason is flexibility: as an Azure data engineer you can work in a variety of industries, giving you the opportunity to choose a career path that aligns with your interests and goals, and to work in a variety of arrangements, including full-time or freelancing positions. So overall, becoming an Azure data engineer can provide an individual with a rewarding career path that offers strong earning potential, career growth opportunities, and the chance to work with the latest technology in a flexible and dynamic environment.

Now that you understand why data engineering can be the next big thing in the coming job market, let's go ahead and understand who a data engineer is exactly, and what they do. All right guys, so data engineers are not only responsible for handing data to the data scientists or data analysts; they have a generalist role where they have to do many of the jobs that a data scientist or a data analyst also has to do. The first thing they have to understand is the requirement from the data scientist — this is the kind of data they need — and then they have to figure out where to find that data. Once they have found the data, they have to create a process for how the data can be brought onto their own platform and made ready for the data scientist to use. So you can say they have the job of managing and organizing the data that will be looked at by the data scientist, helping them find the trends and other things which can help a company improve its current business and opportunities.

Secondly, pinpointing the exact role: it is basically divided into three types of data engineers. First, there are the people we generally just call data engineers — people whose roles and responsibilities coincide with those of data scientists and data analysts as well, so they also do the jobs of a data scientist and a data analyst. These are usually people at very senior profiles who have to wear multiple hats. Say, for example, you introduce a new data team into a company: that data team is still at a very nascent stage, so the person who sets up the team initially will have to do all the work required to show the stakeholders that the team can actually deliver, and once it becomes successful they hire more people who do the more specific tasks that this person had been doing previously. That's what a generalist data engineer does. Then you have people who are pipeline-centric data engineers: people whose job is to set up how to bring in the data, which may be online — say, scraping data from the web — then converting or transforming it into a useful state, and then giving it to a data scientist so that they can consume it. This whole path, from the web to the point where the data scientist can use the data, is called a data pipeline; a minimal sketch of the idea follows below.
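Just to make the pipeline idea concrete, here is a tiny, hypothetical extract-transform-load sketch in Python — the source URL, file path and cleaning rules are invented for illustration and are not part of any Azure service:

import csv
import urllib.request

SOURCE_URL = "https://example.com/raw_data.csv"   # assumed raw data source on the web
TARGET_PATH = "curated_data.csv"                  # assumed destination for downstream users

def extract(url):
    # ingest: fetch the raw data from the web
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8").splitlines()

def transform(lines):
    # clean: drop empty rows and normalize whitespace and casing
    rows = [row for row in csv.reader(lines) if any(cell.strip() for cell in row)]
    return [[cell.strip().lower() for cell in row] for row in rows]

def load(rows, path):
    # serve: write the curated data where a data scientist can consume it
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)

load(transform(extract(SOURCE_URL)), TARGET_PATH)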
A data engineer who looks after only this pipeline is called a pipeline-centric data engineer. Then the third kind of data engineer is the database-centric data engineer. In some companies they say: we have all the data in a database; you check this data and figure out what can be used by a data scientist to make their job better, or through which they can actually do their job. In those cases you need skills and expertise in the database the company is using, and for that person the job becomes database-centric — out of that database they will have to create a pipeline. So these are the three kinds of data engineers out there.

If you take all of this together and define the job responsibilities of a data engineer: they are responsible for creating architectures related to data pipelines; they align the architecture with the business requirements; they are responsible for data acquisition; they develop the processes with which they create the data pipeline; they make use of a lot of programming languages and tools to accomplish this; and they are also responsible for further enhancing data reliability, efficiency and quality. So guys, these are the roles of a data engineer, and by now you should have an idea of what a data engineer does in their day-to-day life.

Now let's move on and talk about the salaries, followed by the career opportunities in becoming a data engineer. In the US, the average salary of a data engineer is 133,000 dollars — but remember guys, this is an average, which means there are people above this salary and people below it. Later in this session I'm also going to talk about some job descriptions which can tell you the kind of salary you can earn once you start applying for a data engineer profile. In India the salary is around 6.5 to 7 lakhs per annum, and the same holds for Indian jobs: the salary can be higher than this average package, or lower as well.

Now that you have seen the average pay scale of an Azure data engineer, let's look at the job description of an Azure data engineer. Thinking again about what a data engineer does: an Azure data engineer is responsible for designing, implementing and maintaining data management and data processing systems on the Microsoft Azure cloud platform. They work with large and complex data sets and are responsible for ensuring that data is stored, processed and secured efficiently and effectively. Now, when you define a job role like that, there are many job descriptions floating in the market — how can you identify which one to apply to? Let's go ahead and understand that. Talking about the job descriptions which exist in the market, guys, you will first see an entry-level job description, and then on the next slide I will show a job description which is at the mid or senior level.
All right, so heading to the entry level: if you see the first job description, which is for an Azure data engineer, the salary is anywhere around 6 lakhs per annum to 14 lakhs per annum, and these are the skills required. As I mentioned before, these are the expected skills for an Azure data engineer: you are expected to know NoSQL and Cosmos DB; you are expected to know Data Lake, Data Factory and data warehousing, with strong experience in building pipelines in Azure Data Factory or Azure Data Lake; you should be able to analyze and understand complex data; and you should be able to understand business requirements and actively provide input from a data perspective. Now, if you look at these skill sets, they are all skills you will have after you study for and clear the Azure data engineer certification. So once you're done with the certification and the skill sets it covers, you can easily go ahead and apply for a job in the salary range of an Azure data engineer.

Now, talking about a mid-to-senior-level data engineer profile, the list is quite long, as you can see; apart from all the earlier skill sets, a lot of other things are mentioned here as well. For example, you should have some four or five plus years of experience in implementing or designing solutions using Azure big data technologies; you should have hands-on experience with Azure Data Factory, Azure DevOps, Azure Data Lake Storage and so on; you should have knowledge of big data pipelines; and you should design and build modern data pipelines and maintain data warehouse schematics, layouts, architecture, and relational or non-relational databases for data access and advanced analytics. You should also know Java, jQuery, SQL, Scala, or any preferred programming language. Here, what we recommend to our learners is to go ahead and learn Python: although the description mentions only these programming languages, companies are very flexible if the target profile has strong programming experience in any one language. The next thing they expect is scripting skill in one of the most common languages, for example Python or Bash; Python serves a dual purpose here, as both a scripting language and a programming language. The next thing they expect you to know is the ETL process using big data technologies such as Spark, Kafka, Hadoop and others. Then you need data visualization experience — using Python here is a plus, guys — and you need to learn Tableau or Power BI; having either one of these tools is a plus point, so this again is an important skill to have, and it coincides with what a data analyst does, because a data analyst is also responsible for data visualization to some extent. Then you need a solid understanding of, and experience implementing, a cloud data platform with Microsoft Azure DevOps. So if you're someone who wants to start off, you can start with the fresher profile, and after gaining experience there and learning skill sets like big data and Azure, you can apply to the senior or mid-level positions. So guys, these are a few of the
job descriptions related to the data engineer profile. Now that we are clear on the why, the who, and the career opportunities and salary of an Azure data engineer, let's see the path towards becoming one. First of all, you will have to learn about the different Azure storage options that are out there: Blob storage, Table storage, File storage and Queue storage. You will have to learn about the relational database options in Azure as well, such as SQL Database, SQL Data Warehouse and Analysis Services. You will also have to learn about NoSQL, and about the big data services in Azure like Data Lake Analytics and Data Lake Storage. You also have to learn about Data Factory, Azure Functions, Stream Analytics, IoT Hubs, Event Hubs, etc. Apart from that, you will also need to learn Redis Cache and Azure Search. These are the services you are required to understand in order to clear the certification and become an Azure data engineer — and they coincide with the job descriptions we looked at earlier: all the services mentioned there are part of what you learn in order to crack the certification. Apart from this, we also recommend going through the open-source services of the Hadoop ecosystem, such as Spark, Hive and Hadoop itself.

Now let us go step by step towards our goal of becoming an Azure data engineer. To become one, you will need a strong foundation in data engineering and cloud computing; here are some steps you can take to develop the skills and knowledge needed. The first is to learn the basics of data engineering: develop a strong foundation in data engineering concepts such as data modeling, data pipelines, data processing and data storage. You can learn these concepts through online courses or books, or by working on practical projects. The second is to learn a programming language: as an Azure data engineer you will need proficiency in at least one programming language, as I've already mentioned; Python is a very popular choice for data engineering, but you can also consider learning languages such as Java, C# or Scala. The next step is to learn SQL: as a data engineer you will be working with large amounts of data — you already know that, since it's in the name — so you will need to be proficient in SQL to extract and transform data. The next thing you need to learn is the Azure data storage options: as you know, Azure offers a range of them, such as Azure Blob Storage, Azure Data Lake Storage, Azure Cosmos DB, Azure SQL Database and Azure Synapse Analytics (formerly SQL Data Warehouse); you should familiarize yourself with the features and capabilities of these storage options. The next thing to learn is the Azure data processing technologies: Azure provides several technologies for processing data, such as Azure Stream Analytics, Azure Databricks, Azure Data Factory and Azure HDInsight, and you should learn how to use them to build data pipelines for ingesting, transforming and processing data. The next thing you need to learn is Azure data management and security: as an Azure data engineer you should learn about the Azure tools for managing
and securing data, such as Azure Data Catalog, Azure Data Lake security and Azure Private Link. So what is the next step? The next step is to get certified. As you know, getting certified is crucial and very important these days, because organizations — as I mentioned earlier, if I go back in my slides I can show you the job description — actually state that you need to pass the certification. That slide named the DP-200 and DP-201 certifications, but as of now you don't need DP-200 or DP-201; you only have to take one exam, which I'll talk about further, and that is DP-203. So, moving ahead: you need to earn the Azure Data Engineer Associate certification by passing the DP-203 exam. The next thing you need to do is gain practical experience: the best way to learn Azure data engineering is by working on practical projects. You can find data engineering projects on online platforms such as Kaggle, you can work on projects in your own organization, or you can enroll with edureka, where you get tons of live projects developed by instructors who have already worked as data engineers in their organizations. The next thing is to continue learning: as a data engineer you will need to keep your skills and knowledge up to date as technologies and best practices evolve, so make sure to stay current by learning about new Azure data engineering features and participating in professional development activities.

All right guys, so now let us understand the things you need to know about this certification exam I've just talked about, that is DP-203. If you were applying for the Azure Data Engineer Associate earlier, there used to be two exams you had to take, which I've just mentioned on the screen in the previous slide: one is Implementing an Azure Data Solution and the other is Designing an Azure Data Solution. Clearing both of these exams would give you the Azure Data Engineer Associate certification, and these two exams had the codes, as I've mentioned, DP-200 and DP-201. But these exams were retired on 23rd February 2021.
Now you just have to take one exam to earn the data engineer certification, and this exam is DP-203. What they have done is club both of those exams together and include the syllabus in just one exam, from which they ask the questions. So earlier you had to pay for two exams, prepare for both, take both, and only then could you get the certification; but now, by passing one exam, you can clear the data engineer certification. As part of the new exam the skill sets have been updated, as you can see on the screen; this is the distribution of topics covered in the exam. First of all, most of the questions will be on Design and Implement Data Storage, which carries 40 to 45 percent of the weightage. Let me tell you the topics that come under it, so make sure you write them down: first is design a data storage structure; second is design a partitioning strategy; next is design the serving layer; then implement physical data storage structures; then implement logical data structures; and lastly, implement the serving layer. After this you have Design and Develop Data Processing, which has 25 to 30 percent weightage; there are four points under it: first, ingest and transform data; second, design and develop a batch processing solution; third, design and develop a stream processing solution; and lastly, manage batches and pipelines. The third area is Design and Implement Data Security, which is 10 to 15 percent; there are two main topics here: design security for data policies and standards, and implement data security. And the last is Monitor and Optimize Data Storage and Data Processing, which again is 10 to 15 percent; here too we have two main topics to cover: monitor data storage and data processing, and optimize and troubleshoot data storage and data processing. If you have been with me until this point, you should now understand everything you have to learn in order to clear the exam and become an Azure data engineer, since we have already discussed the path and the skills to prepare ourselves. So, like I said earlier, you now just have to take this DP-203 exam to get the Microsoft Certified Azure Data Engineer Associate certification — just one exam, and you get the certification for it.

So now, when you talk about cloud computing, a lot of different people have different definitions of it. Cloud computing is not a very new technology, I would say — the terminology is new, but the technology is not. The first product talking about cloud, I would say, was Hotmail, which was designed and developed by Sabeer Bhatia; I'm sure a lot of people listening to this webinar would be aware of that name. What he did was design a mail server: I used to go and access my mails on that mail server, the mail server was in the cloud, and I was connecting to it remotely to access my mails. That makes it a cloud product.
Now, when you talk about cloud computing these days, the definition has changed, but as far as the working is concerned it remains pretty much the same. We talk about use and pay — pay-per-use is the model — so we use, and then pay only for the amount of services we use. There's an explanation in the slide: it is the use of servers on the internet to store, manage and process data. The difference is that instead of using your own servers, you're using somebody else's servers to do the task and paying them for the amount of time you use them. So this is the idea: pay-per-use is the model we focus on when we talk about cloud computing these days. This is a basic definition of cloud; if you search the internet you will find more than a hundred definitions, and it's up to you which one you'd like to adopt, but they all work on a single model, which is pay-per-use. When you use, you pay; if you don't, you don't pay; and at the end of the day you're using somebody else's resources, not your own, and paying only for the resources you use.

Now, talking about the different types of services in the cloud: there are three major ones, which are IaaS, PaaS and SaaS. There are other terms too — somebody may talk about DBaaS, database as a service, or about email services — but we majorly categorize cloud services into these three types. IaaS is infrastructure as a service, where you get the hardware from the cloud provider as a service. Say, for example, you want to deploy a machine, a server: in that scenario you are putting a server in IaaS, infrastructure as a service, so you get complete control of a virtual machine which is in the cloud, hosted by Microsoft, Google or AWS, and you can go ahead and put your resources onto that machine. Simple enough. Then you talk about PaaS, platform as a service: here you don't get access to the machine — in other words, you don't get access to the underlying layer — but you get complete access to the resources or the services. It acts more like a platform on which you publish your applications; a few examples are the web apps and mobile apps in Azure. Then you have the last type of service offered in the cloud, which is SaaS: with SaaS you get software as a service, so you don't need to worry about infrastructure and you don't need to worry about the platform — you are basically getting the end product, which may be running on the infrastructure or on the platform. For example, when you launch a VM on Azure you're not buying the OS, and with PaaS you're not buying the service; you're just paying for the software which is running on either IaaS or PaaS. So this is what we get as far as the types of cloud services are concerned.
Now let's jump to another question: what are the different cloud deployment models? The explanation talks about the three models we have: one is the public cloud, one is the private cloud, and one is the hybrid cloud. Let's take these with examples. I am a general user, a home user, and I want to put up a website. I decide that my website will be hosted on the cloud, so I want to deploy a VM on the cloud and put my website over there. I go to the Microsoft Azure website, sign up there, and provision a machine; once I provision a machine I put IIS on top of it, and once IIS is there I just move my resources — meaning my application — onto the cloud. What I'm trying to say is that a general, basic end user sitting at home can log into the cloud, get a machine, and use that machine. This is what the public cloud is: it sits available for the public.

Then you talk about the private cloud. A private cloud is owned by the enterprise, and only the enterprise's people have access to it. Organizations like Accenture, Wipro, Capgemini or Cognizant own private clouds; the idea is that the resources are owned by them, and hence their people are the ones who access them — though at times they also give their customers access to their private cloud. Now you might ask: why build a private cloud at all, when at the end of the day I'm spending a lot of money on infrastructure, air conditioning, electricity and licensing — why can't I just work on the public cloud? Basically, a private cloud is built for automation, which is one of the most important points, and the second important part is that a private cloud gives me more control over the resources, whereas with the public cloud I do not have control over the resources, because everything is owned by the vendor.

Now, talking about the hybrid cloud, I'll give you an example for this as well. I go to Amazon, which is my public cloud provider, and I tell them that I need a private cloud: would it be possible for you to host a private cloud for me? Amazon says yes and dedicates a set of resources only to me, only to my organization. So Amazon is hosting a public cloud, but they are also capable of hosting a private cloud for my organization — that makes it a hybrid cloud. These are the three different ways in which we categorize the cloud models.

Now for the question. The question says: I have some private servers on my premises, and I have also distributed some of my workloads on the public cloud; what is this architecture called? These are the options: virtual private network, private cloud, virtual private cloud, and hybrid cloud. Again, just read the question carefully, and then we'll jump to the next slide and talk about the answer and the explanation. So once more: I have some private servers on my premises, and I have also distributed some of my workloads on the public cloud — what is this architecture model? The answer is hybrid cloud.
The answer is hybrid cloud: this type of architecture is a hybrid cloud, because we use both the public cloud and an on-premises server, which sits on the private cloud side.

Now let's jump on to the next set: a few general Azure questions. These are Azure specific; the questions we discussed previously were general cloud questions, so whether you are interviewing for Azure, AWS, or Google, those would apply to any of those scenarios. These next ones are general questions specific to Azure.

The first question is: what is Microsoft Azure and why is it used? As I said initially, Azure is Microsoft's public cloud, so let's see how it works; we have a definition on the next slide. As discussed above, the companies which provide cloud services are called cloud providers, and of course I explained this in the first slide: Google is a cloud provider, AWS is a cloud provider, and Azure is a cloud provider. Azure is from Microsoft, and the idea behind it is that it acts as a public cloud: you can go to Azure, log on, create a machine, and start working on it. Microsoft is one of the major cloud providers these days.

Next question: which of the following services in Azure is used to manage resources in Azure? The options are Application Insights, ARM (Azure Resource Manager), the Azure portal, and Log Analytics. The answer is ARM. It is the service behind the newer portal, and it is used to manage the infrastructure, which involves a number of Azure services; it is basically tied to the fabric. I'm not sure how many people are aware of the Azure fabric, but the idea is that the Azure fabric now runs on ARM, whereas version one of Azure used the service management model. So when we talk about managing resources in Azure, we talk about the Resource Manager model: how our services run on it, how they work, and how they interact with each other. There is a lot you can dig into as far as ARM is concerned, but for the time being just remember that Azure Resource Manager is the thing that runs everything and takes care of a lot of different aspects of Azure. When ARM came in, a lot of new things came with it: tags were one of them, and resource groups were another. If you are familiar with resource groups, great; if you are familiar with tags, great; if not, just search for them and you'll find your answers.
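Since tags and resource groups just came up, here is a hedged Azure CLI sketch of what they look like in practice; rg-demo and the tag values are assumed names for illustration:

```bash
# Resource groups: a logical container for related Azure resources
az group create --name rg-demo --location eastus \
  --tags environment=dev owner=john costcenter=1234

# Tags: key-value metadata you can filter on across the subscription
az group list --tag environment=dev --output table
```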
Then we have a few more questions. Which of the following web applications can be deployed with Azure? You can of course deploy ASP.NET applications in Azure, and you might ask where exactly these applications can be deployed; the answer is that you can deploy them either in IaaS or in PaaS. You can deploy ASP.NET, you can deploy PHP, and you can deploy WCF, so the answer is all of the above. It is important to understand, though, that an application written in Java may face a few issues when you deploy it on Azure, because Azure is majorly designed for Microsoft services; Java is not a Microsoft product, so it is not quite as friendly compared with the other Microsoft-specific services running in that cloud.

Jumping on to the next one, a fill in the blank: a ___ role is a virtual machine instance running the Microsoft IIS web server that can accept and respond to HTTP or HTTPS requests. We have four options here: web, server, worker, and client. The answer is web. A web role is hosted on a virtual machine running Microsoft's IIS web server, and it is capable of accepting HTTP and HTTPS requests. Basically, on the web role you upload your website, your website runs on top of it, and since a website works over HTTP and HTTPS, the web role is the one capable of running that application, software, or web service.

The other question is: what is the use of roles in Microsoft Azure? We talk about three roles: the web role, the worker role, and the VM role. The web role is basically for hosting and deploying websites. The worker role is the one that helps the web role execute background processes; the example I can give you is WebJobs, and if you are not aware of WebJobs, you can again search on the internet. The worker role in turn resides on the VM role, which is the one that talks to the underlying layer; it acts more like a framework. So on the operating system I have my worker role, on top of that I have my web role, and the web role is the one that hosts the applications.

Jumping on to the next one. The question says: is it possible to create virtual machines using ARM in a virtual network that was created using classic deployment? No, this is not supported. As I said, ARM is something different, and the classic portal used a different deployment model, the classic service management model. If you are aware of the two portals, you know there is a huge difference between how the classic portal worked and how the new portal works, and the same difference applies to deployment, so it is not possible.

Next question: what are virtual machine scale sets? This is about scaling out and scaling in: if I have more requests coming in, my virtual machines have the capacity and capability to scale out and scale back in, based on schedule and performance. In other words, say my CPU is spiking to 90 percent and requests are still coming in for the web portal running on this particular machine; my scale set is intelligent enough to spin up one more instance and then load balance the traffic between the two instances, both running the same application. That is what a scale set does: it automatically autoscales the workload based on performance and on schedule. And how does the schedule part work? Say, for example, that from nine in the morning to nine in the evening you have more visitors on the website; you can scale out to more instances for that window, and after 9 PM you can bring the number of instances back down. That is how a scale set works.
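As a rough sketch of how such a rule is wired up, here is what a CPU-based autoscale setting might look like with the Azure CLI; the resource group and scale set names (rg-demo, vmss-demo) and the thresholds are assumptions for illustration:

```bash
# Create an autoscale profile for the scale set: between 2 and 10 instances
az monitor autoscale create --resource-group rg-demo \
  --resource vmss-demo --resource-type Microsoft.Compute/virtualMachineScaleSets \
  --name autoscale-demo --min-count 2 --max-count 10 --count 2

# Scale out by 1 when average CPU stays above 75% for 5 minutes
az monitor autoscale rule create --resource-group rg-demo \
  --autoscale-name autoscale-demo \
  --condition "Percentage CPU > 75 avg 5m" --scale out 1

# Scale back in by 1 when average CPU drops below 25%
az monitor autoscale rule create --resource-group rg-demo \
  --autoscale-name autoscale-demo \
  --condition "Percentage CPU < 25 avg 5m" --scale in 1
```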
Jumping on to the next question: are data disks supported within scale sets? Yes, of course data disks are supported in scale sets. A scale set can be defined with an attached data disk configuration that applies to all the VMs in the set. There are a few types of data storage given as examples there: Azure Files, which is an SMB share; the OS drives; the temp drives; Azure data services, so blobs, tables, and queues; and external data services such as remote databases. All of the disk types you see there are supported.

Talking about availability sets: do scale sets work with Azure availability sets? First of all, you need to understand what an availability set is. An availability set is basically a grouping of servers so that the load can be balanced among those servers; it is done for high availability, and there are two concepts involved, the fault domain and the update domain. A fault domain is a set of hardware that shares a common power source and network switch, so when the VMs in the same availability set are spread across fault domains, a single power or network failure will not take them all down. The update domain is about planned maintenance: if I have, say, 10 VMs and the update domain count is 5, I will have groups of 2 VMs each, and at a given point in time, if Microsoft needs to perform planned maintenance, only one update domain will go down at a time. That is what the update domain means. So, does a scale set support availability sets? Yes, the scale set supports availability sets, and you can go and fetch more information about availability sets if you are not very well versed with them as far as Azure is concerned. There is a small description there as well, so I'll just read it for you: yes, a scale set is an implicit availability set with five fault domains and five update domains, and scale sets of more than 100 VMs span multiple placement groups, which are equivalent to multiple availability sets. That is what it means; I'll skip the rest, because I have given enough explanation for this, and I would definitely recommend that you read up on availability sets.
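As a quick, assumed illustration of those two knobs, this is roughly how an availability set with explicit fault and update domain counts is created via the Azure CLI (rg-demo and avset-demo are placeholder names):

```bash
# 2 fault domains (separate power/network), 5 update domains (maintenance groups)
az vm availability-set create --resource-group rg-demo --name avset-demo \
  --platform-fault-domain-count 2 --platform-update-domain-count 5

# VMs created with --availability-set avset-demo are then spread across them
```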
Now the question is: what is a break-fix issue? A break-fix issue is a technical problem: your environment is running, and then all of a sudden a problem comes up; that is the break, and fixing that particular problem is the fix. Technical problems like these are called break-fix issues. It is an industry term which refers to the work involved in supporting a technology when it fails in the normal course of its function, requiring intervention by a support organization to restore it to working order. That is what break-fix is. It is not actually specific to Azure; it is just a common term across IT services, so even for on-premises networks we still talk about break-fix issues. I'm not sure how many of you have had a chance to call Microsoft and open a support case when you had issues with something in your environment, but those issues are exactly what we mean by break-fix issues.

Jumping on to the next question: why is Azure Active Directory used? Azure Active Directory is used for identity and access management, first of all, and it is a PaaS-based service. It is used to grant your employees access to specific products and services in your network. There are a lot of examples: Salesforce.com and Twitter, and I can give you a few more, like Office 365 and Intune, which are two different products from Microsoft; they all work with Azure Active Directory. Azure AD also has built-in support for the applications in its gallery, which can be added directly.

Talking about a few other questions regarding Azure AD: what happens when you exhaust the maximum failed attempts for authenticating yourself via Azure AD? Azure uses a more sophisticated account lockout strategy, which is based on the IP of the request and the passwords entered. On-premises we talk about simple account lockouts: if you enter an incorrect password more than, say, three times, your account is locked out, and it is unlocked after maybe 15 minutes, half an hour, or an hour, based on how your administrator has configured it. The same thing happens in Azure AD, but it works off the IP address: it tracks the IP address of the request and the passwords entered, and the duration of the lockout also increases based on the likelihood that it is an attack. That is what happens when you exhaust the maximum failed authentication attempts.

Jumping on to the next question: where can I find a list of applications that are pre-integrated with Azure AD, and their capabilities? Azure AD has around 2,600 pre-integrated applications, and all of them support single sign-on; single sign-on lets you use your organizational credentials to access your apps. Some of the applications also support automated provisioning and deprovisioning. There is a gallery, so you can just go to the gallery and get the list of applications available for use with Azure AD, though of course that is not the complete list of applications you can use; a huge number of applications are supported by Azure AD. Single sign-on is one of the core components of Azure AD, so in case you are not aware of single sign-on, go ahead and do some research on that.
Now, talking about the next question: how can I use applications with Azure AD that I am using on-premises? Of course you can do that. Azure AD gives you an easy and secure way to connect to the web apps of your choice, and you can access those applications the same way you access your SaaS-based applications in Azure AD, so you don't need a VPN installed; you just install a connector component and you are ready to go, using the applications that live on-premises. A few examples I can give you: you can use SharePoint, or web apps whose servers are installed on your premises, and still use Azure AD for authentication; you can do that for sure.

Now an important question, a very important question: what is Azure Service Fabric? Azure Service Fabric is a distributed systems platform that makes it easy to package, deploy, and manage scalable and reliable microservices. This is the layer that actually controls your network, your storage, and your compute, which are basically the most critical components of the cloud: if the fabric were not running in the back end, nobody would be assigning IP addresses, nobody would be assigning the resources, and nobody would be assigning the storage. It is the thing that actually runs everything, I would say, in the cloud infrastructure. Let's go further: Service Fabric also addresses the significant challenges in developing and managing cloud applications; developers and administrators can avoid complex infrastructure problems and focus on implementing mission-critical, demanding workloads that are scalable, reliable, and manageable. Service Fabric represents the next-generation middleware platform for building and managing these enterprise-class, tier-one, cloud-based applications.

Now, what is a VNet? A VNet is basically nothing but a virtual network: it is a representation of your own network in the cloud, and it logically isolates the instances you launch in the cloud from the rest of the resources you have there. It is the overall network on which the configuration of your machines depends, meaning the subnets and the address space.
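To picture the address space and subnet idea, here is a minimal assumed Azure CLI sketch; the names and CIDR ranges are placeholders:

```bash
# A VNet with a 10.0.0.0/16 address space and one 10.0.1.0/24 subnet
az network vnet create --resource-group rg-demo --name vnet-demo \
  --address-prefixes 10.0.0.0/16 \
  --subnet-name web-subnet --subnet-prefixes 10.0.1.0/24

# VMs launched into web-subnet are isolated from resources outside this VNet
```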
Talking about a few other things: what are the differences between a subscription administrator and a directory administrator? The subscription administrator is the one responsible for the complete subscription. By default, only one subscription administrator role is assigned in Azure; a subscription administrator can be either a Microsoft account or a work account, and that person has control over the entire Azure portal and the services that run in that subscription. If you want, you can sign up more accounts as subscription administrators, and those people are the co-admins. As far as Azure AD is concerned, the accounts we set up there for identity and access management are the directory administrators. Let's just read this: Azure AD has a different set of admin roles to manage the directory and identity-related features. These admins have access to various features in the Azure portal or the Azure classic portal, and the admin roles determine what they can do, like creating or editing users, assigning administrative roles to others, and so on. That is what a directory administrator does.

Are there any scale limits for customers using managed disks? Managed disks eliminate the limits associated with storage accounts. First of all, you need to understand what a managed disk is. These days, when you create a machine, you have an option whether you want to use managed or unmanaged disks. When you select managed disks, the Azure fabric manages all your disks for you, so a disk assigned to a VM is managed by Azure. Using managed disks I basically eliminate the limits associated with my storage account; however, the number of managed disks per subscription is limited to 2,000 by default, so you can have 2,000 managed disks in an Azure subscription.

Now, talking about the difference between Service Bus queues and Storage queues. The Azure Storage queue is simple, and the developer experience is quite good: it uses the local Azure Storage emulator, and debugging is made quite easy. The tooling for Azure Storage queues allows you to easily peek at the top 32 messages, and if the messages are in XML or JSON, you can visualize their content directly within Visual Studio. Furthermore, these queues can be purged of their content, which is especially useful during the development process and the QA efforts. As far as Service Bus is concerned, the Azure Service Bus queues are more evolved and are surrounded by many useful mechanisms that make them enterprise-worthy: they are built into Service Bus and are able to forward messages to other queues and topics, they have a built-in dead-letter queue, and message time-to-live is under your control, hence messages don't automatically disappear after seven days. So for this one you'll probably have to go and read about Service Bus queues and Storage queues. Storage queues are a part of the Storage service itself; if you are not aware of Storage, it has four types: blobs, files, queues, and tables, so you need to have more information on what Storage queues are.

Now, talking about: what is Azure Redis Cache? Redis is an open-source, in-memory data structure store used as a database, a cache, and a message broker. Azure Redis Cache is based on the popular open-source Redis cache: it gives you access to a secure, dedicated Redis cache managed by Microsoft and accessible from any application within Azure. It supports data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, HyperLogLogs, and geospatial indexes with radius queries.

Another question we talk about is: why doesn't Microsoft Azure Redis Cache have an MSDN class library reference like some of the other Azure services? Remember one very important thing: this one is open source. Azure Redis Cache is based on the popular open-source Redis cache and can be accessed by a wide range of Redis clients, and that is the reason it cannot just be put into MSDN. Simple enough: because each client is different, there is no one centralized class reference on MSDN; we cannot restrict it to one, you could say. That is why there is no MSDN class library for Redis Cache.
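Since any Redis client can talk to Azure Redis Cache, here is a small hedged sketch with the Python redis package; the host name and access key are placeholders you would pull from the portal:

```python
import redis

# Azure Redis Cache accepts TLS connections on port 6380;
# the access key from the portal acts as the password.
cache = redis.Redis(
    host="mycache.redis.cache.windows.net",  # assumed cache name
    port=6380,
    password="<access-key>",
    ssl=True,
)

cache.set("greeting", "hello from Azure", ex=300)  # expire after 5 minutes
print(cache.get("greeting"))
```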
What are Redis databases? Redis databases are just a logical separation of data within the same Redis instance. The cache memory is shared between all the databases, and the actual memory consumption depends on the key values stored in each database. For example, a C6 cache has 53 GB of memory, and you can choose to put all 53 GB into one database or split it across multiple databases.

Then you have another question: why was my client disconnected from the cache? The following are some common reasons for a cache disconnect. On the client side: the client application was redeployed; the client application performed a scaling operation (in the case of cloud services or web apps this may be due to autoscaling); the networking layer on the client side changed; transient errors occurred in the client or in the network nodes between the client and the server; the bandwidth threshold limits were reached; or CPU-bound operations took too long to complete. And there are a few server-side causes as well: on the standard cache offering, the Azure Redis Cache service initiated a failover from the primary node to the secondary node, or Azure was patching the instances where the cache was deployed, which can be Redis server updates or general VM maintenance.

Then you have: what is Azure Search? Simple enough: Azure Search is a cloud search-as-a-service solution that delegates server and infrastructure management to Microsoft, leaving you with a ready-to-use service that you can populate with your data and then use to add search to your web or mobile application. It works with a REST API or the .NET SDK.

A few other questions. My web app still uses an old Docker container image, and I have updated the image on Docker Hub; do you support continuous integration and deployment of custom containers? Here we are basically talking about container-based web apps on the new portal. For private registries, you can refresh the container by stopping and starting the web app, or you can change or add a dummy application setting to force a refresh of your container. That is what you can do.

Now, talking about a few things: what are the expected values for the startup file section when I configure the runtime stack? For Node.js you specify the PM2 configuration file or your script file; for .NET Core you specify your compiled DLL name; for Ruby you specify the Ruby script that you want to initialize your app with.

How is your Azure Marketplace subscription priced? We have these models. One is a monthly fee, a pay-as-you-go, usage-based kind of thing. Then you have the free trial: Microsoft gives you a free subscription, valid for a period of one month, with approximately 200 dollars of credit that you can use. Then you have free, which is for customers who are not charged Marketplace fees for use of the offering. And you have BYOL, bring your own license. So these are the subscription models, or basically how they are priced, you can say.

Jumping on to the next one: what is the difference between price, software price, and total price in the cost structure for virtual machine offerings in the Marketplace? Let's just see. Price refers to the cost of the Azure virtual machine that runs the software; software price refers to the publisher's software running on that virtual machine; and total price refers to the combined cost of the two. Simple enough; I don't think you need more detail on this, because it's very simple: total price equals price of the VM plus the software price.
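Circling back for a second to the custom-container question above, here is roughly what the two refresh options look like in the Azure CLI; the app and group names are assumed:

```bash
# Option 1: stop and start the web app so it pulls the updated image
az webapp stop  --resource-group rg-demo --name webapp-demo-123
az webapp start --resource-group rg-demo --name webapp-demo-123

# Option 2: change a dummy app setting to force a container refresh
az webapp config appsettings set --resource-group rg-demo \
  --name webapp-demo-123 --settings FORCE_REFRESH=$(date +%s)
```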
Now the other question talks about: what are stateful and stateless microservices for Service Fabric? Service Fabric enables you to build applications that consist of microservices. Stateless microservices, such as a protocol gateway or a web proxy, do not maintain a mutable state outside a request and its response from the service; an Azure Cloud Services worker role is an example of a stateless service. Stateful microservices, such as user accounts, databases, devices, shopping carts, and queues, maintain a mutable, authoritative state beyond the request and its response. So that is how the two differ.

Now, what is an application partition? Application partitions are a part of the Active Directory system, and having said so, they are directory partitions which are replicated to domain controllers; usually, the domain controllers that are included in the replication process for a directory partition hold a replica of that partition. You can read some more on this: a domain controller has four types of partitions, one is configuration, one is schema, one is the domain partition, and then you have the application partition. All the information related to an application goes in the application partition. Again, it actually has nothing to do with the cloud as such, but it is always good to know, because if you are designing and developing an application on the cloud, you can put that information in the application partition on the Active Directory server; it's just very important to understand that you should have an Active Directory server up and running fine in the cloud. [Music]

So the first question is: what is DevOps? Of course, this is one of the first Microsoft Azure DevOps interview questions that you will be asked. The answer to this is: DevOps is a set of cultural ideas, practices, and technologies that improves an organization's capacity to provide applications and services at high velocity, changing and enhancing products at a quicker rate than traditional software development and infrastructure management methods. In simple terms, DevOps essentially speeds up the process of delivering applications and software services.

Coming to our second question: why use DevOps? This is a common Azure DevOps interview question, so how do we answer it? DevOps is significant because it is a software development and operations methodology that allows for faster product creation and simpler maintenance of existing deployments. When a problem arises, the DevOps team is able to listen to the user and provide the necessary features and fixes. It also facilitates the delivery of smaller features to clients in a quick and efficient manner and allows seamless software delivery.

Let's quickly move on to our next question: how does DevOps work? DevOps is about breaking down barriers between previously compartmentalized departments, such as development and operations. Under the DevOps paradigm, these development and operations teams collaborate across the whole software application life cycle, from development and testing to deployment and operations.

Going ahead: what is Azure DevOps? As I said before, Azure DevOps is a Microsoft software-as-a-service platform that offers an end-to-end DevOps toolchain for building and distributing software. It also interfaces with the majority of the market's main products and is an excellent choice for orchestrating a DevOps toolchain.
This process includes test automation, continuous integration, and continuous delivery. People with both development and operations skills work together and implement various tools for CI/CD and monitoring, for quick response to customer requirements and for fixing issues and bugs.

Moving ahead: what are the benefits of DevOps? There are five benefits of DevOps, which we will understand one by one. First is speed: DevOps practices let you move at the velocity you need, so you innovate faster, adapt to changing markets better, and become more efficient at driving business results. The second benefit is rapid delivery: when you increase the pace of releases, you can improve your product faster and build a competitive advantage. Third is reliability: DevOps practices like continuous integration and continuous delivery can ensure the quality of application updates and infrastructure changes, so you can reliably deliver at a more rapid pace while maintaining an optimum experience for end users. At number four, improved collaboration: under the DevOps model, developers and operations teams collaborate closely, share responsibilities, and combine their workflows, which reduces inefficiency and saves time as well. Finally, at number five, we have security: you can adopt a DevOps model without sacrificing security by using automated, integrated security testing tools. So these were the five benefits of DevOps which make it so much in demand.

Let's move on to number six. The question is: name a few DevOps tools. The answer to this is that some of the popular DevOps tools are Git, Selenium, Jenkins, Puppet, Chef, Ansible, Nagios, and Docker, so you can answer with any four DevOps tools that are easy for you to remember.

Moving ahead with the next question: what are the popular DevOps tools for continuous integration and continuous deployment? Azure Pipelines supports macOS, Windows, and Linux, and some of the popular DevOps tools for continuous integration and continuous deployment are Jenkins and TeamCity; these two are DevOps tools which provide you with both continuous integration and continuous deployment.

Quickly moving ahead with question number eight: what are Azure Boards? Azure Boards is a user interface that allows you to keep track of tasks, defects, and features in your software project. It enables you to monitor all your ideas at each development stage and to keep your team on track with all the essential code modifications related to your work procedure. Azure Boards includes features like boards, sprints, work items, dashboards, backlogs, queries, etc.

Going ahead: what is Azure Repos? This is a basic Azure DevOps interview question, but it can be difficult to answer. The answer to this is: Azure Repos is a collection of version control tools for managing your code. Using version control as soon as possible, whether your software project is huge or little, is a smart idea. Version control systems are software that allows you to keep track of changes to your code over time. Azure Repos offers a centralized version control system as well as a distributed version control system, which is Git.

At number 10: what are containers? Containers are software packages that include all the components required to execute in any environment. Containers virtualize the operating system in this fashion, allowing them to run anywhere, from a private data center to a public cloud, or on a developer's own laptop. Containers also provide the means to package software code, its configuration, dependencies, and packages into a single unit or object, and multiple containers can run on the same machine and share the OS, giving you fast, reliable, and consistent deployments anywhere.
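To make the "single unit" idea concrete, here is a minimal assumed sketch: a Dockerfile that packages a Python script together with its dependencies, so the same image runs identically on a laptop, a data-center VM, or a cloud host (the file names are placeholders):

```dockerfile
# Package code + runtime + dependencies into one image
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
CMD ["python", "app.py"]
```

You would build and run it with "docker build -t demo-app ." followed by "docker run demo-app".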
Moving ahead: what containers does Azure DevOps support? Azure DevOps supports the following containers: Docker, Azure Kubernetes Service, ASP.NET with containers, and Azure Service Fabric applications with Docker support. So these are the containers which are supported by Azure DevOps.

Moving ahead: what are Azure Pipelines? Azure Pipelines is a cloud service that allows us to automatically build and test our code projects. Azure Pipelines includes several features, such as continuous integration and continuous delivery, that allow us to test and build our code on a regular and consistent basis and ship it to any destination. It automatically builds and tests code projects, and it is a service on the Azure cloud that works well with most project types and languages.

Going ahead, at number 13: what is the use of Selenium in DevOps? Selenium is a widely used browser automation technology, and testing teams rely on it heavily in the DevOps pipeline. It is an open-source solution that saves money for testing teams and for functional testers in charge of UI testing, and it specializes in regression and functional testing.

At number 14: what are Azure Test Plans? This is another one of the popular interview questions asked on Azure DevOps. The answer to this is: Azure Test Plans is a service provided by Azure DevOps that offers a browser-based test management solution for exploratory, scheduled manual, and user acceptance testing. In addition, Azure Test Plans includes a browser extension for exploratory testing and for gathering feedback from stakeholders. It also supports automated testing: it combines the contributions of developers, managers, testers, product owners, and user experience advocates to enhance the quality of the project.

Moving ahead: what are some important features of Memcached? Memcached offers a wide variety of features; some of them are: it is open source; it is a UDP- and TCP-based client-server program; it works as a standalone service; the cache nodes are deliberately "dumb," so they have no knowledge of the other nodes involved; it is under the Berkeley Software Distribution license; and it reduces the database load. So these were some of the important features of Memcached.

Quickly moving ahead, at number 16: what is the dog-pile effect and how can you prevent it? The dog-pile effect refers to what happens when a cache entry expires and the website is hit by several requests sent by clients at the same time; in simple terms, the expiry of the cache is followed by the website being simultaneously hit by numerous requests, all trying to rebuild the same value. This effect can be avoided by employing a semaphore lock: when the value expires in this system, the first process acquires the lock and begins creating the fresh value, while the others wait.

At number 17: what is continuous testing, and what is the use of test automation in DevOps? As we know, DevOps is all about people, culture, and automation. Continuous testing is the process of running automated tests as part of the software delivery pipeline in order to get feedback on the business risks associated with a software release candidate as soon as possible. DevOps uses test automation to run the test cases, find the flaws, and save time; unlike manual testing, automated test cases accelerate testing, allowing you to get products to market faster.
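Before moving on, here is a small hedged Python sketch of that semaphore-lock idea from the dog-pile question; the cache, lock, and rebuild function are stand-ins assumed for illustration:

```python
import threading
import time

cache = {}                       # key -> (value, expiry timestamp)
rebuild_lock = threading.Lock()  # the "semaphore lock" from the answer

def expensive_rebuild():
    time.sleep(2)                # pretend this is a slow database query
    return "fresh value"

def get_value(key, ttl=60):
    entry = cache.get(key)
    if entry and entry[1] > time.time():
        return entry[0]          # cache hit, still valid
    # Cache expired: only ONE caller rebuilds; the rest wait and reuse
    # its result instead of dog-piling the database with identical queries.
    with rebuild_lock:
        entry = cache.get(key)   # re-check: another thread may have rebuilt it
        if entry and entry[1] > time.time():
            return entry[0]
        value = expensive_rebuild()
        cache[key] = (value, time.time() + ttl)
        return value
```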
Moving ahead, number 18: what is a forking workflow? Before we get to what a forking workflow is, let us understand what forking is: a fork is a server-side copy of a project's repository. The forking workflow is a Git clone process that is performed on that server copy of the project's repository; this method supports open-source projects and isolates changes from the main repository until they are ready to be integrated.

Moving ahead: what are some of the useful plugins in Jenkins? This one has a very simple answer: there are many useful Jenkins plugins, out of which I'll describe a few. Among them are Amazon EC2, Maven 2 Project, HTML Publisher, Join, Copy Artifact, Green Balls, and many more. These are a few useful Jenkins plugins, and they will be very easy for you to remember.

Quickly moving ahead with number 20: can we move or copy Jenkins from one server to another? The answer to this is yes, it is possible. Jenkins can be moved from one server to another just by cloning the job directory from the previous server to the new or current one, and later you can rename it; this enables moving or copying an installation by copying the corresponding job directory. So these were the top 20 Azure DevOps interview questions; now let's look at some scenario-based Azure DevOps interview questions.

The first one: we have 50 commits in the develop branch, out of which only 5 commits need to be pushed to the release branch, which will eventually be deployed to production. So the question is: we have around 50 commits in the develop branch, which is mostly used for development activities, so it has multiple commits, and the release branch is used to release your changes into your proper environments, like stage and production. Here, out of 50, only 5 commits need to be pushed into the release branch, and the best way to achieve that is through Git cherry-picking. Git cherry-picking is the act of picking a commit from one branch and applying it to another. Cherry-picking can also be useful for undoing changes: say a commit is accidentally made to the wrong branch; what you can do is switch to the correct branch and cherry-pick the commit to where it should belong. Most simply, it is like a box of cherries from which you have to pick the five best ones: you start picking the best cherries out. And if you want to run this cherry-picking, you need to use the Git console; using the Git command line you can initiate the cherry-pick.
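As a hedged sketch of that scenario on the Git command line, with made-up commit hashes standing in for the five chosen commits:

```bash
# Find the five commits you want to promote from develop
git log --oneline develop

# Switch to the release branch and apply just those commits
git checkout release
git cherry-pick abc1234 def5678 0a1b2c3 4d5e6f7 89ab012

# Push the release branch on for deployment
git push origin release
```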
Let's move on to our second scenario: a developer has created an ASP.NET Core application and now wishes to automate the entire build and release process through Azure DevOps; guide the developer through the entire end-to-end process and deploy the application on an Azure web app using the IaC methodology. As you can see, we have multiple questions inside this one scenario: there is a developer who is very new to DevOps, he reaches out to a DevOps engineer and asks how to automate the entire build and release process, and on top of that we need to deploy this on an Azure web app using the IaC, infrastructure-as-code, methodology. The first step to be taken by the developer is to build the application locally. After that, we create an Azure Repos repository and push the source code into the remote repository, along with the IaC configuration files. And lastly, we create a build pipeline, classic or YAML based, in which we can build the solution using MSBuild or VSBuild, run the code analysis through SonarQube, Veracode, etc., run code coverage, copy the ARM or Terraform configuration files, and publish the artifact either on a file share or to the Azure Pipelines drop location. So this was the answer to the scenario.

Moving ahead with the next scenario: a developer named Ram has a few uncommitted changes in his local branch, and now he has to empty his current working directory to accommodate an emergency change. How can Ram handle this scenario without losing his uncommitted changes? To achieve this we use git stash. What is git stash? git stash is a command that takes your uncommitted changes, both staged and unstaged, saves them away for later use, and then reverts them from your working copy. So there are commands like git stash, which stashes the uncommitted changes; next is git stash list, with which you can list all the stashed changes; and then there are other commands like git stash pop, to reapply the most recent stashed changes.
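Putting Ram's scenario together on the command line, a minimal sketch; the branch names are hypothetical:

```bash
# Shelve the uncommitted (staged + unstaged) changes and clean the working tree
git stash

# Handle the emergency change on a clean directory
git checkout -b hotfix/urgent-issue
# ... make the fix, commit, push ...

# Back on the original branch, inspect and restore the shelved work
git checkout feature/my-work
git stash list          # e.g. stash@{0}: WIP on feature/my-work: ...
git stash pop           # reapply the most recent stash and drop it
```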