Transcript for:
Quantum Machine Learning (QML) Station Series: Introduction

Welcome, everyone, to the QML Station Series we are starting today. Today we have with us Catalina Albornoz from Xanadu. Catalina holds a master's degree in electronics from Universidad de los Andes and an engineering diploma from IMT Atlantique in France, with a research focus on autonomous systems. She's currently the Quantum Community Manager at Xanadu, where she's helping build the community around PennyLane. In the past she also worked as an IBM Quantum ambassador, and Catalina has been the person championing the entire relationship and partnership between Xanadu and us for the past three years. So we are very happy, and really honored, to have you back, Catalina, for this QML session. Over to you.

Thank you very much, and thanks to everybody attending. Today the goal is that you get all the tools, the learning, the building blocks that will help you do projects, do research, and learn about this very interesting topic, which is quantum machine learning. And we're going to challenge some of the assumptions, some of the knowledge, that we think we have. Before we begin, I know I've already been introduced, but I just want to focus on a couple of things. I didn't study quantum machine learning; I'm an engineer, but I could still make it into the field. I'm not doing research; I work in community, helping everybody who's joining the field and managing a lot of the activities that we host from PennyLane and Xanadu. And this is an invitation for you as well: if you come from a different field and have the assumption that quantum machine learning is going to be too hard, it's not. It's probably going to be easier than you think, and maybe harder in unexpected ways. So this is an invitation to enjoy, learn, and challenge yourself. You're very much welcome in the field, and we need people with different and new ideas coming from different backgrounds.

Today I'm going to first give an intro to what Xanadu is and what PennyLane is. Then I'll talk about some QML basics, terms and terminology that you may not have heard before, which will be the building blocks for what you'll need later. We're going to do a live demo, where I'll show you some code: how it looks, how it works, and how it doesn't work. And at the end I'll share some resources and what I call the homework. I'm not going to check the homework; it's more of a suggested homework. So let's get started.

Xanadu is a Canadian startup founded in 2016. We have over 200 people now, with a very strong mission: we want to build quantum computers that are useful and available to people everywhere. So you're part of the mission; making our software, hardware, knowledge, and expertise available to you is part of it, and we're very happy to be here. Our hardware is focused on photonics, which means we use light to make quantum computers, and we have a full stack: the hardware, the software, and the applications and algorithms research. The hardware works differently from what you may be used to, and we're not going to focus on how it works today. If you are interested in that,
go to xanadu.ai, the company website, where we have a lot of videos and explanations. The main thing to know is that we're in the process of building a new generation of quantum computers, our first fault-tolerant modules, and our architecture is universal: what you program in PennyLane, or what you have programmed in other software, you should be able to run on those new computers. They're not online, they're not ready; we are building them. Building a quantum computer is hard and takes a lot of time. In the meantime, and not just in the meantime but as a very important thing in its own right, we develop software that will allow us to develop meaningful applications.

So why do we even do quantum computing? Ask yourself that question, and if you want, post an answer in the chat. The goal is to solve some very hard problems. Not all problems, only specific problems that the classical computer cannot solve, and not even all of those, just a subset. We hope that quantum machine learning will be one of them, and we think it's going to be a key target area, but there's a lot of work to do. Another area is quantum chemistry; that one is more obvious, since it's easier to see why you'd use quantum computing for chemical systems. In the end we integrate all of these layers together: the hardware, the software, and the algorithms. Today we're going to focus on the latter two, the software and the algorithms; they need to work together.

For example, we have some of the best researchers in the world in quantum machine learning and quantum chemistry. Maria Schuld, for example: maybe you can give me a hands-up or a reaction if you have heard of Maria Schuld. I don't know if I can see it, but you can let me know. I can see the hands, perfect. Yes, give me a reaction if you have heard of her. Good. She works at Xanadu, and there's a whole team working on quantum machine learning. We work not on a specific problem or application but on the foundations, and all of the research quickly turns into something that goes into the software. So you're going to see that we have tutorials and demos, and even the software itself evolves as the research evolves. We want to always be at the frontier, so that you can do impactful projects and research with it.

Why do I say "research" so much? Because today this is still a field of research. This is not a solved problem: everything that happens in quantum machine learning is in the area of research. There's no single product that will work for everything; with everything we're doing, we're still learning along the way.

Now, zooming in a little: what is PennyLane? Let me ask the question again: who knows PennyLane? Lift your hand if you've heard of PennyLane, give me a reaction or a hand up. Yes, perfect, I love it. All right, great. So did you know that PennyLane had all of these things? PennyLane is the software, the SDK that we use to program. PennyLane has high-performance simulators, useful if you have, for example, an HPC machine, and many other things. We have a compiler: Catalyst is a kind of sibling software that goes with PennyLane and integrates with it;
it works very well together with PennyLane. And we have plugins; plugins help you code in PennyLane and run your code in other places. And pennylane.ai is a website, our hub for learning, for content, for engaging with other people, and so on. All of this is part of the PennyLane ecosystem, and people love it; I guess you do too. There are probably a lot of you who have been, or are, part of the PennyLane community: asking questions, answering questions, doing projects, maybe studying at a university where PennyLane is taught or used for teaching. It's a tool, and it's a community as well.

Now, getting into the topic, I'm going to ask you a question, so you can go to that Menti. Sorry, the bar on top is not letting me see. All right, I'm going to launch it; I don't know if you're seeing anything here. What words come to your mind when you think of quantum machine learning? You can scan that QR code, or go to menti.com and put in that number. I want to hear what words come to your mind when you think of quantum machine learning. As we get more answers, I will see them here.

Many people are typing in the Discord chat as well; we will share some of them. VA says variational algorithms, Maya says optimization. More are coming, but currently people are trying to log into Menti. I see VQE; I see people typing here. I think this link should also work; I shared it in the chat.

Yes, we're getting some answers: AI, neural networks, optimization, futuristic. Nice. Hype; ooh, some of it, yes, and we're going to try to break that up a little bit today. Oh, "perfection", I had never seen that one; a crazy answer. Yes, "faster" and "speedup", those come up often. "Original algorithms", yes. Great answers, amazing. Okay, so a lot of optimization, and that is true: optimization is actually the big base, the building block that you're going to use for quantum machine learning. And variational algorithms, those are very important as well.

So what is it, exactly? Let's go to the next stage and try to say what quantum machine learning is. It works at the interface between classical data and algorithms and quantum data and algorithms. For example, if you use classical machine learning to improve a quantum computer, technically that's quantum machine learning. Most of us probably don't think of quantum machine learning in this way, but it falls within the definition. What if you use quantum data and classical algorithms to process it? That counts as quantum machine learning as well. What we usually mean when we think of quantum machine learning is using classical or quantum data and running it with quantum algorithms; that's what most people refer to, but all of these are quantum machine learning. Today we're going to focus on the most popular one, but there are many combinations; we have people at Xanadu, for example, who use machine learning to help improve parts of our quantum computer. So if you're into machine learning, there are many ways to combine quantum and classical knowledge.

Here we have some very important concepts. First, give me a hands-up if you are comfortable with quantum circuits. Okay, perfect, a lot of you. And now give me a hands-down, a reaction, or a sad face if you are not comfortable with quantum circuits, or maybe you can tell me in the chat. If you're not comfortable with quantum circuits, that's totally okay;
just let me know. I guess everybody has been working hard these past weeks to be introduced to them. Nice, perfect, awesome.

So here, on the left-hand side, we're representing a quantum circuit; these yellow, pink, and blue blocks represent the following. First we have an embedding: we put data into the quantum computer, and it creates a specific state in our quantum circuit. Then we have some layers. These pink and blue blocks can represent different sets of gates, combinations of gates, and they can take parameters. These alpha and beta are values that we put into the quantum computer and then modify: we do some tuning, not manually but automatically. Instead of manually tuning alpha and beta, quantum machine learning and optimization allow us to do that tuning. So where is the difference (and this is a common question): where does optimization stop and quantum machine learning begin? Data. Data is key in quantum machine learning. You need data, probably a lot of data (it depends), for it to be considered quantum machine learning instead of just an optimization problem.

The picture in the middle represents an optimization landscape. This is a cost function, again an important concept that you're going to hear a lot in quantum machine learning. Our cost function tells us how well we are doing. Let's say, for the circuit on the left, we want the measurement to be zero, as small as possible. We want to tune the parameters alpha and beta so that the output is zero. Our cost function gives us a map of the output as a function of alpha and beta, except we don't know this map: usually, for a specific circuit and a specific problem, we don't know exactly what it looks like. So an optimization routine allows us to start at any point and then move downhill. To find downhill, we compute something called a gradient. The gradient is the direction of maximum change, and we go downhill against the gradient; if we're rolling down a mountain, we want to roll as fast as possible, step by step. Let's say I start at a point in the pink region. That means specific values of alpha and beta: some initial parameters, a very important concept when you're talking about variational algorithms. You start with some initial parameters, then you take a step, say one little square, in the direction of steepest descent, and now I stand there. After that step I have new parameter values, which means I have to run my circuit again: the layout of the circuit stays the same, but the specific values of the parameters are different. Step by step I get to a lower cost, until I reach the bottom and there's no way to move down anymore. That would basically mean success.

Now, designing these cost functions is a whole science, and there are cases when the cost function does not look so nice.
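To make that step-by-step picture concrete, here is a minimal sketch of gradient descent on a two-parameter circuit using PennyLane's standard API. The circuit, initial values, and step size are illustrative assumptions, not the exact ones from the talk.

```python
import pennylane as qml
from pennylane import numpy as np  # PennyLane's autograd-aware NumPy

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def circuit(params):
    # Two tunable parameters, alpha and beta, applied as rotation angles.
    qml.RY(params[0], wires=0)
    qml.RX(params[1], wires=0)
    return qml.expval(qml.PauliZ(0))

def cost(params):
    # We want the measured expectation value to be as close to zero as possible.
    return circuit(params) ** 2

params = np.array([0.5, 0.5], requires_grad=True)  # initial parameters
for step in range(50):
    grad = qml.grad(cost)(params)  # the direction of steepest ascent
    params = params - 0.4 * grad   # step downhill, against the gradient

print(cost(params))  # close to 0 after the descent
```

Each iteration re-runs the same circuit layout with updated parameter values, which is exactly the loop described above.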
It can actually look pretty flat, and if it's pretty flat, it's hard to know where downhill is. Or you can have what we call local minima: places in the function that are at the bottom of their own little region but are not the overall bottom. There are usually techniques to overcome this, but choosing your cost function is very important, and it depends on your purpose. In this case our purpose was to get the circuit to output a value of zero, or some specific value. Often what we want is for the circuit to output a specific value depending on the input. For example, if my inputs are the numbers 1 to 10 and I want the output to be the sine of the input, then it has to train in a specific way for that. Depending on what you want to do, you have to build a different cost function that takes you there.

What you see on the right is a different representation of this. At the beginning you have an initial state that you have to prepare, and this is very important: you usually don't get these states for free, and it can be a very hard thing to do. Embedding data into the quantum computer is going to be a roadblock, a very big bottleneck, so don't assume we can always prepare an initial state easily. Then we have our gates; this part you probably already know from your previous sessions. At the end you have a new version of the state, right before measurement. You can then use techniques such as gradient descent (there are both gradient-based and gradient-free techniques) to find the new version of the parameters that you'll use in the next iteration of your circuit. This is very important, because differentiable programming has so far been revolutionary; it is how we are thinking about quantum machine learning. It has its pros and cons, but it's important to understand it so that you can either use it or challenge it.

All right, I guess we don't need to go over qubits, circuits, and gates, so we can go straight to embeddings. Let me stop here for a second: give me a hands-up or a reaction to show me how you're doing.

Hey everyone on Discord, how are you doing? Going well, ready to go into embeddings right away. People say "I love the PennyLane visualizations" and ask whether they are done by hand or by software. And Kush asks: is there a way to find global minima for the cost function?

Yes, good questions. We have a designer, and he does all of the drawings by hand, I mean on an iPad or equivalent. Originally it was the person who is now our head of product; he started doing the designs by hand, everybody loved them, and they became iconic PennyLane. Now we have a designer who makes all of those designs. Good question.

And can you find global minima? It depends. It's actually the goal; that's what everybody wants. When you have your quantum machine learning problem, what you want is to find that global minimum, the overall minimum. But depending on how you set up your problem, what optimizer you use, what your data is like, and how your cost function looks, sometimes it's easier and sometimes it's harder. It will happen a lot: as you're trying things out and
exploring, you will for sure fall into local minima, and you'll have to be creative and explore new approaches. Maybe you have to change your cost function, or use something called local cost functions, where you know the answer is around a certain area; you're restricting your problem to a specific part of your cost function, which can keep you from falling into local minima somewhere else. There are many techniques, including stochastic techniques, so this is a big area you can explore, and sometimes it's very hard. There's no way to know in advance whether what you're doing will fall into a local minimum. Usually (okay, I haven't read all of the papers, maybe there's a way, but as far as I know) you try it and then you have to check that you actually landed in the right place.

Wonderful, and everyone is ready and eager to continue, so let's move the other questions to the end of the talk and go to embeddings.

Good. So, I didn't make this picture, I got it from a colleague, but I found it so fun. It shows a little bit how this happens: you have your quantum computer and your classical data, and you just want to put the data in. It's not always easy; as I mentioned, it is a bottleneck, and you'll have to find creative ways of getting your classical data in. Brute force doesn't always work; you'll have to try different approaches for putting your data into your quantum computer without it becoming a roadblock. This is one of the reasons a lot of people are saying maybe we have to think about quantum machine learning in a different way: because embedding that data is hard. So instead of thinking of quantum machine learning in terms of big data (I think some of you mentioned speedups and lots of data), which is going to be hard, since most of the time the data will be small and you'll need to reduce it a lot, maybe we need to be smarter about how we do embeddings.

They're called either embeddings or encodings; you'll hear both terms, and both are fine. They try to represent your data in the quantum computer. Something very important: you want your data points to be identifiable, distinguishable for the quantum computer. For example, if your data is the numbers 1 to 10 and you put every data point in as a 1, then your computer doesn't know the data points are different. If all of your data falls between 0.1 and 0.2, it's going to be hard for the quantum computer to tell the data points apart. So usually you want to use as much of the available feature space as possible, depending on the embedding you choose.

Here we start to talk about some embeddings. Common embeddings you're going to see are amplitude, angle, and basis embedding; there are many others, these are just examples. Amplitude embedding inputs your features, let's say all together, into the qubits.
If you have n qubits, you can have up to 2^n features, which are your data points, and you encode them as an amplitude vector. You remember from when you learned quantum computing: you have your probability amplitudes, and you can encode data in them in different ways. So you can take a lot of features and put them into fewer qubits. You want to make sure the data is not all clumped together, so that you actually get distinct amplitudes; your data will, for example, be normalized.

In terms of angle embedding: your data are angles, and you want your angles to be within a range like 0 to π, 0 to 2π, or -π to π, because if all of your data is, say, between 0 and 0.1, then all of the angle embeddings are going to be basically the same. Angle embedding performs a rotation. You start with a value, let's say your data point is 0.1, and the angle embedding performs (in this case, in the picture) an X rotation: a tiny rotation that prepares the initial state you will then put your ansatz after. So first you have your data, you have to start with your data, and then you have a way to prepare it, to put it into the quantum computer. In the case of angle embedding it's prepared as a rotation; you can use X, Y, or Z rotations. And you can, and should, pre-process your data to make sure the points aren't all essentially the same rotation: you want to actually use all of the feature space you have in the quantum computer. Finally, and this is important for this one (I'll put it in the simplest way): you can only use n features, where n is at most the number of qubits. Meaning, if you have five qubits, or in this case three qubits, you cannot have one layer with more rotations than the number of qubits. If you had five data points, say the numbers 1 to 5, you would need several sequential embeddings: first embed data points one, two, and three, and then embed data points four and five. But that would be a different embedding; it would not be all rotations in a single column, you would have several columns, and this changes things. It's a hack you could use, but usually, with angle embedding, if you have three data points you encode each one in one qubit.

Basis embedding is different: you need the same number of binary features as qubits. If your data is binary, say you want to encode 101, then you can use this embedding to say: apply a Pauli X to make this qubit a 1, do nothing so the next stays 0, and apply another Pauli X so the last becomes a 1 as well. Again, you're limited in the amount of data you can put into the quantum computer with this embedding, but it is very practical if you have binary data. And sometimes you have, say, the number 3: you can turn it into binary and then encode it with basis embedding. So this is something you can do with a quantum computer.
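As a hedged illustration of these three embeddings, here is a short sketch using PennyLane's built-in templates; the feature values are made up for the example.

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=3)

# Amplitude embedding: up to 2**n features in the amplitudes of n qubits.
@qml.qnode(dev)
def amplitude_circuit(features):
    qml.AmplitudeEmbedding(features, wires=range(3), normalize=True)
    return qml.probs(wires=range(3))

# Angle embedding: one feature per qubit, encoded as a rotation angle.
@qml.qnode(dev)
def angle_circuit(features):
    qml.AngleEmbedding(features, wires=range(3), rotation="X")
    return qml.probs(wires=range(3))

# Basis embedding: one binary feature per qubit, here preparing |101>.
@qml.qnode(dev)
def basis_circuit(bits):
    qml.BasisEmbedding(bits, wires=range(3))
    return qml.probs(wires=range(3))

print(amplitude_circuit(np.arange(1.0, 9.0)))    # 8 features on 3 qubits
print(angle_circuit(np.array([0.1, 1.2, 2.3])))  # 3 angles on 3 qubits
print(basis_circuit([1, 0, 1]))                  # binary string 101
```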
All right, just as a wrap-up of what embeddings are: embeddings are the way you put your data into your quantum computer. Most of the time we work with classical data, but if you have quantum data, you also need a way to embed it into the quantum computer. There are different approaches: some work with the amplitudes, creating an amplitude vector; some create specific angle rotations; some work with binary data, Pauli X or not. And there are many others, such as a time-evolution one, but those are a bit more complicated. I would usually recommend angle embedding for starting; it's usually the easiest one to understand. But if you want to try them all, for sure go ahead.

Now, layers. Layers are different, and here it's important to pay attention: "layers" can mean different things in classical and in quantum contexts. In this case you have some blocks, say A and B, and on the right-hand side (maybe you've heard this in previous sessions) you could say block B is kind of like two or three layers. It depends on the context in which you're hearing it, because very often, for example in PennyLane, we refer to layers the same way as classical layers: a combination of things goes into a block A, another combination goes into a block B, A is one layer and B is another layer, and you can interchange them. So when you're reading, learning, or writing code, it's very important to understand which terminology of layers is being used: the one where a layer is a block of operations, or the one meaning a specific time step in your quantum program. Here, for example, how many layers do we have? It depends on whom you ask. I don't know if you've seen the different interpretations of counting layers before, but for practical purposes I think it's smarter to go with the classical machine learning concept: A is a block of stuff in one setup, B is a block of stuff in another setup, each of them is a layer, and I can apply them as needed, more than once, and so on.

Here we have an example of parameterized layers. First we have an embedding, in this case angle embedding: Y rotations embedding our data, which are the angles π/4, π/3, and π/7. Then we add a series of layers: a parameterized layer taking angles theta_0 and theta_1, applied as Z rotations; then some CNOTs, which you can consider an entangling layer; and then another parameterized layer applying Y and X rotations with other values of theta. So we can pass the program a single object called theta that contains all the individual rotation values, and at the end we have a measurement; what we represent here is an expectation value, a very common way of obtaining measurements.

Now, the ansatz. This is a concept that is very confusing at the beginning, but what it means is an educated guess about the layout of your circuit. You don't worry about the specific parameters; here, for example, we have some specific parameters, but the ansatz is about the structure.
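Here is a small sketch of the parameterized-layer circuit just described: angle embedding of π/4, π/3, π/7 as Y rotations, a Z-rotation layer, a CNOT entangling layer, then Y and X rotations. The exact wire placement of the gates is my assumption about the figure.

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=3)

@qml.qnode(dev)
def layered_circuit(theta):
    # Embedding layer: the data (angles pi/4, pi/3, pi/7) as Y rotations.
    data = np.array([np.pi / 4, np.pi / 3, np.pi / 7])
    qml.AngleEmbedding(data, wires=range(3), rotation="Y")
    # Parameterized layer: Z rotations taking theta_0 and theta_1.
    qml.RZ(theta[0], wires=0)
    qml.RZ(theta[1], wires=1)
    # Entangling layer: CNOTs.
    qml.CNOT(wires=[0, 1])
    qml.CNOT(wires=[1, 2])
    # Another parameterized layer: Y and X rotations on further theta values.
    qml.RY(theta[2], wires=1)
    qml.RX(theta[3], wires=2)
    # Measurement: an expectation value.
    return qml.expval(qml.PauliZ(2))

theta = np.array([0.1, 0.2, 0.3, 0.4], requires_grad=True)
print(layered_circuit(theta))
```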
When you talk about your ansatz (or ansätze), it's all about the layout of the circuit: saying I'm going to apply some Z rotations here, some CNOTs, some Y and X rotations. It's about the layout, not the specific parameters that go inside, and there are different ansätze depending on what you want to do. For example, circuit number one has some blocks A and B which depend on alpha and beta, but circuit number two also depends on time: it has some exponentials, some evolutions, that depend on time. So it's very important to choose the right ansatz for your problem. Time-dependent things are hard: you have to make a lot of approximations, and there's a lot of research on how to make better approximations for those Hamiltonians. I hope you have heard of Hamiltonians; anyway, you have these matrices that you want to exponentiate, or evolve in time, and putting them into the quantum circuit efficiently is a research problem. It's not solved, there's no ansatz that works best for everything, and you'll have to test depending on your situation. My recommendation: look for a problem similar to the one you're tackling and see how others designed their ansatz layout and what approximations they made, because very often you'll want to represent a specific physical problem but you cannot represent it exactly. Take that into account when you're looking at your ansätze.

Here are some further ones. I don't know if you can tell, but they are different. The one on the left is a tree tensor network: it has a tree structure, which you can think of as a pyramid. In circuit number two we have something called matrix product states; here you don't have that tree or pyramid structure, you have layers that overlap at certain points. These are two more options for an ansatz, useful if you're working with tensor networks, for example. They're going to come up, and it's good to be familiar with the concepts: tree tensor network, matrix product state. They are ways of creating the layout of your circuit; you encode your parameters in them, and each has a different effect. So let's say you build a circuit for the project you're going to do at the end, and it doesn't work: try changing the ansatz. Maybe it's not expressive enough to represent the solution you want. Test other options as well.

Now, I guess we know about measurements, but they are important. Remember that when you measure, everything becomes classical, so be aware of that, and be careful about whether you use an expectation value, samples, or probabilities; you have all of these options, and they're also part of the circuit. This completes the different parts of the circuit: your encoding or embedding is where you put your data into your circuit, and in some software you might have the impression that the circuit simply starts in a specific state, but in reality it doesn't.
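To show the three measurement options just mentioned, here is a minimal sketch; note that sampling needs a device with a finite number of shots. The Hadamard circuit is an illustrative assumption.

```python
import pennylane as qml

dev_exact = qml.device("default.qubit", wires=1)
dev_shots = qml.device("default.qubit", wires=1, shots=100)

@qml.qnode(dev_exact)
def expval_circuit():
    qml.Hadamard(wires=0)
    return qml.expval(qml.PauliZ(0))  # an expectation value: one number

@qml.qnode(dev_exact)
def probs_circuit():
    qml.Hadamard(wires=0)
    return qml.probs(wires=0)  # probabilities of measuring |0> and |1>

@qml.qnode(dev_shots)
def sample_circuit():
    qml.Hadamard(wires=0)
    return qml.sample(qml.PauliZ(0))  # 100 classical samples, each +1 or -1

print(expval_circuit())  # ~0.0
print(probs_circuit())   # [0.5, 0.5]
print(sample_circuit())  # an array of +1/-1 values
```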
It always starts in zero, and then you put it into a specific state with your encoding or embedding. Then you have your ansatz, the layout of your circuit, where you have your parameters and can vary them. And at the end you have your different kinds of measurements.

So I'm going to show you a little bit of code quickly; we're not going to do the live coding because we won't have enough time. Oh, I think there are some questions I hadn't seen. I'll answer them later, because I think we're going to run out of time.

Yes, most questions are being asked on Discord, and we'll forward them to you at the end.

Perfect, okay. All right, so the first thing I want to show you: if you want to use PennyLane, you can use it in Google Colab, for example. I'm going to run this just in case I want to run something. You import PennyLane as qml; PennyLane is not only used for quantum machine learning, but it did start like that, so for historical reasons we keep that signature. You then create a device; the device is like the backend where you run your circuit. Here we use a simulator, and this one is called default.qubit. You're also going to see a lot that we say "wires". Why wires? If you look at a circuit, the qubits kind of look like wires, those horizontal lines, so we call them wires, but they are qubits; you can treat the terms as the same. Now you can create circuits in a very Python-native way: you create a function, in this case called circuit, you add the different gates in the body of that function, and at the end you return a measurement.
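A minimal sketch of the kind of starter code being described; the Hadamard gate is my assumption for producing the 50/50 outcome mentioned in a moment.

```python
import pennylane as qml

# A simulator device; "wires" are the qubits.
dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)  # the decorator explained next: it binds the circuit to the device
def circuit():
    qml.Hadamard(wires=0)  # assumed gate, giving the 50/50 result below
    return qml.probs(wires=0)

print(qml.draw(circuit)())  # a text drawing of the circuit
print(circuit())            # [0.5, 0.5]: a 50/50 chance of 0 or 1
```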
Here we have the decorator @qml.qnode(dev). Basically this line, a signature PennyLane touch, turns your quantum circuit into something you can run on a specific device, so your QNodes are going to be very important. You can then draw your circuits, and you can even choose different styles. Here's the PennyLane style, for those who asked about it; this one is automatic, so it's not our designer drawing after every circuit. And you can run your circuit and see that you have a 50/50 chance of getting 0 or 1. You can then practice changing all of these things. This was just the information you need to read what's going to happen next.

In this example, I'm going to show you how to create a circuit that will learn a sine function. This is a quantum machine learning problem: we have 10 data points from 0 to 2π, and we're going to teach the circuit, make it learn, that it has to output the sine function. First, step zero: get your data and prepare it. This is very important. In this case it's just 10 data points between 0 and 2π, and I'm also creating 10 test data points that are shifted a little bit. I create a device, which is where I'll run my circuits.

Now we come to the interesting part. I know everything else was very quick, but I want to focus on this specific point. Our quantum circuit must take some arguments: my data point and my parameters. Notice that I actually have 10 data points, but I run my circuit once for each, because at the output, at the measurement, I just want to get the sine of that specific input. I encode the input as an X rotation: with this gate, the first one here, I modify the state of my quantum circuit. Then my circuit is going to learn the parameters of the next two gates, the Y and X rotations; it will learn the right parameters so that the output is always the sine of whatever input I had. This is pretty cool: I need just one gate to do my encoding or embedding. Notice that this is an angle embedding, and then I have an ansatz composed of a Y rotation and an X rotation. I don't care exactly about the parameters that go in, just about the layout; that's my ansatz. At the end I measure an expectation value: a number that gives me the answer, a prediction for that sine function. That's what I want.

Notice that so far I haven't said anything about the sine function; the circuit doesn't know. So what is the part that knows? I have a loss function and a cost function, and they're the ones that help me make the circuit produce the right prediction. What is the difference between a cost function and a loss function? In some cases they're used interchangeably, but there's a technical difference. Because we have 10 data points, we need a loss function to calculate the specific loss for each one, comparing, for each data point, the expected output and the real output. So for my first data point, which was zero, I compare the prediction of the circuit, for a specific set of parameters, with what we call the label.
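Here is a hedged reconstruction of the circuit just described: one qubit, the data point encoded as an X rotation, trainable Y and X rotations as the ansatz, and an expectation value as the prediction. The exact gate ordering is my assumption.

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def circuit(x, params):
    qml.RX(x, wires=0)          # angle embedding: the data point as an X rotation
    qml.RY(params[0], wires=0)  # trainable ansatz: a Y rotation...
    qml.RX(params[1], wires=0)  # ...and an X rotation
    return qml.expval(qml.PauliZ(0))  # the prediction, a number in [-1, 1]
```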
You're going to learn about labels in the next section as well. You compare and see: okay, if I did well, then the difference is probably close to zero, so we get a low loss, and we add those losses up. In the end, our cost function depends just on our parameters, our data, and our labels, while our loss function depends on our predictions and our targets. So the loss function makes that comparison, and the cost function is the overall function we try to optimize; we can include more classical processing there if we want as well. We can use different optimizers. In this case I use a gradient descent optimizer, which is just a very standard optimizer, with an optimization routine. I'm not going to go into the details, because you're not going to remember them; maybe just remember that my optimizer has some helper methods, like step_and_cost. At the end, when I graph my results, I can see that the parameters evolved: the cost gets lower and lower and the parameters change after every iteration, until the circuit can predict the right output for the right input, even for data it has never seen.

I know we're kind of out of time, so I'll skip a little bit of the talk about PennyLane and just give the homework. The homework is to practice: explore pennylane.ai. You have a lot of practice opportunities and learning material there: tutorials, coding challenges, and a Codebook. I absolutely recommend the Codebook; it's good for getting started, though it's not specific to quantum machine learning. The demonstrations are a little harder, but they go deeper into what's going on in research on this topic, and the challenges are great practice. In terms of community, you can always ask questions, and the second homework, aside from practicing, is: ask questions, get involved, learn, like you're doing here on Discord and elsewhere. Definitely join the different communities, because they'll help you out; you don't have to do this on your own. So if you're still in that Menti, let me know how you liked today's session; I hope you learned a lot. If you liked it, definitely stay tuned for next time, when I'll also show you the gate emojis, which are something you can add to Slack as well. Thank you!

Wonderful, thanks a lot, Catalina. We have already started receiving a lot of comments on the Discord channel. "It was wonderful," says Yoses from Brazil, "quite interesting, a very nice idea about predicting sine functions; it was very good." We have a lot more comments too, but maybe I can quickly squeeze in a couple of questions before we run out of time. One is about the Colab file you just demonstrated: can the participants get a link to that Colab file?

It's not in a public repo, but I can share with you a blog post that has the sine function example. That blog post is a good way to remember what we talked about, and you can also get the example at the end.
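And here is a sketch of the training loop described in the talk, separating the per-point loss from the overall cost and using the optimizer's step_and_cost method. The data, initial parameters, step size, and iteration count are illustrative assumptions; the circuit is the one-qubit model from above.

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def circuit(x, params):
    qml.RX(x, wires=0)
    qml.RY(params[0], wires=0)
    qml.RX(params[1], wires=0)
    return qml.expval(qml.PauliZ(0))

# Hypothetical training data: 10 points in [0, 2*pi], labeled with their sines.
X = np.linspace(0, 2 * np.pi, 10)
Y = np.sin(X)

def square_loss(prediction, target):
    # Per-data-point loss: compare the predicted and expected outputs.
    return (prediction - target) ** 2

def cost(params):
    # Overall cost: the sum of per-point losses over the whole dataset.
    return sum(square_loss(circuit(x, params), y) for x, y in zip(X, Y))

params = np.array([0.1, 0.1], requires_grad=True)  # initial parameters
opt = qml.GradientDescentOptimizer(stepsize=0.3)

for step in range(100):
    # step_and_cost returns the updated parameters and the cost before the step.
    params, prev_cost = opt.step_and_cost(cost, params)
    if step % 20 == 0:
        print(f"step {step}: cost {prev_cost:.4f}")
```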
Perfect. So, as and when we receive that blog post from Xanadu, we'll share it with all of our participants on this call. Another question, regarding ansätze and encoding: while you were mentioning encoding, there were three different ways, right? One of the participants asks: is it possible to use all three of them, or maybe a combination of these encodings, in the same quantum circuit?

Good question. Yes, and there are more than three, there are many more; these are just three that are representative of what you usually do. The goal of the encoding is that the initial state of your circuit is something you can then distinguish, with each state representing a specific data point. So you can combine them, but you have to be careful not to dilute the data. What you do not want is for your data to get faded out and become indistinguishable, because then your quantum circuit won't be able to act on each specific data point. But you can test it and see; maybe for your specific problem it will work great.

And then there's one question on the ansatz. The ansatz is an educated guess, right? So is it just trial and error to come up with the best ansatz possible, or is there a more structured way for the participants to work their way to the best ansatz for a particular problem or quantum circuit?

Yeah, good question. There are people who have done research for specific problems. For the example we showed today, we just used one qubit and two rotations, and this is not enough to represent problems that are, let's say, hard; what "hard" means is actually complicated to define. But take the sine function as an example: these circuits can be represented as Fourier series, a combination of sine functions. If you just need one sine function, that's easy, you need only one coefficient. But if you want to represent a straight line with the same kind of layout, it's going to be very hard; you'll need a lot of gates. So you have to get creative and see how others have used different techniques to represent your specific problem, because even a straight line (and I'll show you this next time) can be hard with an ansatz similar to the one we used today, precisely because it's not suited for that. And because this is so domain-specific, it's hard to give a general recipe. For example, part of the research Maria's team is doing is on the structure of problems and how we can use quantum computers to represent that structure. So it's not a solved problem: depending on the problem, you have to choose an ansatz. You can use some generic ones, and a lot of entanglement can help, but be careful, because that doesn't work for all problems.

Another question, and maybe the last one for today. A participant from India asked: what is the physics behind quantum gradient descent in QML? Is it similar to how annealing takes place, where the Hamiltonian is just left to settle into the lowest possible state, something like that?

So, annealing is definitely different, and it's not only about the optimization; it's about the whole picture. If you look at the whole picture, you have these circuits that are universal, meaning you can represent anything; it's a question of information. In terms of
information theory, you can distinguish the two. In annealing, it's like having a hot coffee that gradually gets colder; here, it's more like you're actively stirring that coffee, moving it around until you get something specific. So they're similar in the sense that they both use quantum mechanics, but they're a different approach as a whole; it's not just restricted to the optimizer.

Great, okay. With that we can probably wrap up today's session. Thanks a lot, Catalina, once again, and we'll see you soon with the next lecture on QML, with Xanadu once again. Thanks a lot for today.

Thank you, I'm glad you liked it!