All right, so keep in mind there is zero prompting going on within the actual agent; it just says "You are a helpful assistant." Now I'm going to ask it to get Airbnb listings in Miami. It's hitting Airbnb's MCP server in order to understand: okay, what resources do I have, what actions can I take, what are the tool names? And now it's hitting its execute tool in order to fill in the tool name and the different parameters it needs to pull back the listings we're looking for. It should be finishing up any second now, and we'll be able to see the listings we got. There we go: here are some Airbnb listings in Miami, and we get price information, location, all this kind of stuff. If we click in here we should see an actual picture of the Airbnb listing; this one is actually kind of just a boat, but either way, we got five total. Here's a beautiful suite in Miami, which, once again, is an Airbnb in Miami. Now, to look at what it actually did: as we know, it hit the server first to list out the tools. As you can see, it has Airbnb search, which we use when we want to search for listings, along with the different filters and parameters we can use, and then we have the other tool, listing details, for getting information about a specific listing, and in that case these are the different parameters we would send over. Because the agent was able to understand "this is what I have access to, and this is when I use each one," it fills in its tool over here: the tool it used is Airbnb search, and it knows one of the parameters it needs to fill out is the location, which it set to Miami, and then we get all of our actual results back. Cool. We're going to do one more quick example, which is asking it to scrape Chipotle. So I sent that off; it's going to hit Firecrawl's MCP server this time in order to understand what actions it has over here and what it can do, and now it's hitting the actual tool to take action for us, and we should get back the scrape results from chipotle.com.
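To make that two-step flow concrete, here's a rough sketch of the messages involved. MCP speaks JSON-RPC: a `tools/list` request for discovery, then a `tools/call` request to execute. The tool name and schema below are illustrative guesses, not the Airbnb server's exact API.

```python
# Rough sketch of the two MCP messages described above. MCP speaks JSON-RPC:
# a "tools/list" request for discovery, then a "tools/call" to execute.
# The tool name and schema are illustrative, not Airbnb's exact server API.

# Step 1: the agent asks the server what tools exist.
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# A trimmed-down example of what the server might send back:
list_response = {
    "tools": [
        {
            "name": "airbnb_search",  # hypothetical tool name
            "description": "Search Airbnb listings with optional filters",
            "inputSchema": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        }
    ]
}

# Step 2: the agent fills in the tool name and parameters it just learned.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "airbnb_search", "arguments": {"location": "Miami"}},
}
```

The key point is that nothing in step 2 was hardcoded into the agent; it came entirely from what the server returned in step 1.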
The scrape results from chipotle.com are coming in now, and we should see them right here. Let's make this a little bigger: "I've scraped the main content from Chipotle's homepage; here's a summary." We have the new Chipotle honey chicken, Chipotle Rewards, menu highlights, guacamole, all this kind of stuff. And if we click into the Firecrawl actions tool, we can see that Firecrawl has more than just the two actions Airbnb had: it has scrape, it has map, it has crawl, nine total, even deep research. And then the execute tool is able to say, okay, in this case I need to scrape, I need to use this URL, and I'm going to format it as markdown, and then we get our answer back here. So what's going on in this tool is basically nine tools wrapped into one, and all of these other ones are multiple tools wrapped into one as well. If we wanted to build out this entire agent without accessing MCP servers, it would have to look something like this. And once again, in that demo there was zero prompting going on within the agent, and no prompting in the descriptions of the parameters to be sent over either; all I did was plug in the servers and give it a chat model. If we really got more refined with our prompting, these things are going to get really powerful. So we're going to hop into an Excalidraw, and I'm going to break down what is actually going on, what MCP is, as simply as I can. And if you guys want a step-by-step build of how we actually spin up a self-hosted environment, install the community node we're looking at right here, make sure everything's configured correctly, and then actually connect to the different MCP servers we want, then definitely drop a like and let me know in the comments, because I would love to make a video covering this topic. So let's get into the breakdown. Okay, so Model Context Protocol. I swear,
the past week it's been the only thing I've seen in YouTube comments, YouTube videos, Twitter, LinkedIn; it's just all over. And I don't know about you guys, but when I first started reading about this stuff I was kind of intimidated by it. I didn't completely understand what was going on; it was very techy and kind of abstract, and I also felt like I was getting different information from every source. So we're going to break down, as simply as possible, how it makes AI agents more intelligent. Okay, we're going to start with the basics. Let's pretend we're going back to ChatGPT, which is a large language model. What we have is an input on the left: we're able to ask it a question, "help me write this email," "tell me a joke," whatever it is. We feed in an input, the LLM thinks about it, and it provides some sort of answer to us as an output, and that's really all that happens. The next evolution was when we started to give LLMs tools, and that's when we got AI agents, because now we could ask one to do something like write an email, but rather than just writing the email and giving it back to us, it could call a tool to actually go write that email, and then it would tell us, "there you go, the job's done." This really started to expand the capabilities of these LLMs, because they could actually take action on our behalf rather than just assisting us and getting us 70% of the way there. And before we start talking about MCP servers and how they enhance our agents' abilities, we need to talk about how these tools work and what their limitations are. Okay, so sticking with that email example, let's pretend this is an email agent that's helping us take action in email. Each tool has a very specific function: this first tool over here labels emails, the second tool in the middle gets emails, and this third one on the right sends emails.
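The jump from "LLM" to "agent" described above can be sketched in a few lines. Everything here is a stand-in for illustration: `send_email` fakes a real integration, and `fake_llm` hardcodes the decision a real language model would make.

```python
# Minimal sketch of the "LLM plus tools equals agent" idea.
# send_email and fake_llm are stand-ins, not real integrations.

def send_email(to: str, body: str) -> str:
    # Stand-in for a real email integration.
    return f"sent to {to}"

TOOLS = {"send_email": send_email}

def fake_llm(user_input: str) -> dict:
    # A real LLM would pick the tool and arguments; we hardcode it here.
    return {"tool": "send_email",
            "args": {"to": "bob@example.com", "body": user_input}}

def agent(user_input: str) -> str:
    decision = fake_llm(user_input)                       # 1. model decides
    result = TOOLS[decision["tool"]](**decision["args"])  # 2. tool runs
    return f"Done: {result}"                              # 3. agent reports back

print(agent("Write Bob a thank-you note"))  # prints "Done: sent to bob@example.com"
```

The limitation discussed next lives in that `TOOLS` dictionary: every entry has to be wired up and described by hand.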
If you watched my ultimate assistant video (if you haven't, I'll tag it right up here), what happened was we had a main agent, and it was calling on a separate agent that was an email agent. As you can see, here were all of its different tools, and each one had one very specific function it could do; it was basically just up to the email agent right here to decide which one to use based on the incoming query. The reason these tools aren't super flexible is that within each of these configurations we basically have to hardcode in what operation we're doing and what the resource is, and then we can feed in some dynamic things like different message IDs or label IDs. Over here, the operation is "get" and the resource is "message," so that won't change, and over here the operation is that we're sending a message. This was really cool, because agents were able to use their brains, whatever large language model we had plugged into them, to understand which tool to use, and it still works pretty well. But when it comes to scaling this up, when you want to interact with multiple different things, not just Gmail and Google Calendar but also a CRM and different databases, that's where it starts to get a little confusing. So now we start to interact with something called an MCP server, which is basically a layer between your agent and the tools you want to hit, which would be right here. When the agent sends a request to a specific MCP server, in this case let's pretend it's Notion, it gets back more than just "what tools do I have and what's their functionality?" It also gets information about what the resources are, what the schemas are, what the prompts are, and it uses all of that to understand how to actually take the action we asked for back here in the input that triggered the workflow. When it comes to different services
talking to each other, in this case n8n and Notion, there's been a standard in the way we send data across and get data back, which has been REST APIs. These standards are really important, because we have to understand how to format our data and send it over and know it will be received the way we intend. And that's exactly what was going on back up here: every time we wanted to interact with a specific tool, we were hitting a specific endpoint. The endpoint for labeling emails was different from the endpoint for sending emails, and besides the endpoints or functions being different, there were also different things we had to configure within each tool call. Over here you can see that in order to send an email we have to give it who it's going to, what the subject is, the email type, and the message, which is different from the information we need to send to this tool, which is the message ID you want to label and the label name or ID to give to that message. By going through the MCP server, we get what's basically a universal translator: it takes the information from the LLM and enriches it with everything we need in order to hit the right tool with the right schema, fill in the right parameters, access the right resources, all that kind of stuff. The reason I used Notion as the example of an MCP server is that within your Notion you'll have multiple different databases, within those databases you'll have tons of different columns, and all of those will have different pages. So the MCP server can translate back to your agent: here are all the databases you have, here is the schema, meaning the different fields or columns in each of your databases, and here are the actions you can take; now, using that information, what do you want to do?
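One way to picture the "universal translator" point: with raw REST, every action is its own endpoint with its own payload shape, while MCP funnels every action through the same `tools/call` envelope, so only the name and arguments vary. The endpoints and field names below are made up for illustration, not Gmail's real API.

```python
# With raw REST, each action is a separate endpoint with an unrelated payload;
# through MCP, every action shares one envelope. All names here are illustrative.

# Raw REST: two actions, two endpoints, two different payload shapes.
send_email_call = ("POST", "/v1/messages/send",
                   {"to": "bob@example.com", "subject": "Hi", "message": "Hello"})
label_email_call = ("POST", "/v1/messages/123/modify",
                    {"addLabelIds": ["IMPORTANT"]})

# MCP: one envelope for everything; only name + arguments change.
def mcp_call(name: str, arguments: dict) -> dict:
    return {"method": "tools/call",
            "params": {"name": name, "arguments": arguments}}

send_via_mcp = mcp_call("send_email",
                        {"to": "bob@example.com", "subject": "Hi",
                         "message": "Hello"})
label_via_mcp = mcp_call("label_email",
                         {"message_id": "123", "label": "IMPORTANT"})
```

That uniform envelope is why one MCP node in the workflow can stand in for a whole pile of hand-configured tool nodes.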
Real quick, hopping back to the example of the ultimate assistant: what we have up here is the main agent, and it had four child workflows, child agents it could hit, each with a specialization in a certain area: the Gmail agent, which we talked about right down here, the Google Calendar agent, the contact agent (which was Airtable), and the content creator agent. All this agent had to do was understand, based on the request coming in from the human, which of these different tools to actually access. And we can honestly kind of think of these as MCP servers, because once the query gets passed off to the Gmail agent, the Gmail agent down here is the one that understands "here are the tools I have, here are the different parameters I need to fill out; I'm going to take care of it," and then it responds back to the main agent. This system made things a little more dynamic and flexible, because we didn't have to have the ultimate assistant hooked up to 40 different tools, all the combinations of all of these, and it made its job a little easier by delegating the work to different "MCP servers." Obviously these aren't MCP servers, but it's kind of the same concept. The difference is that if, say, Gmail all of a sudden adds more functionality, we would have to come in here and add more tools, whereas with MCP servers, whatever server you're accessing, it's on them to continuously keep that server updated so people can always access it and do what they need to do. By this point it should be starting to click, but maybe it's not 100% clear, so we're going to look at an actual example of what this really looks like in action. Before we do, I just want to cover one thing, which is the agent sending a request over to a server, and the server translating it in order to get all this information, the tool calls, all that kind of stuff. What's going on here is
called the MCP protocol. We have the client, which is just the interface we're using; in this case it's n8n, but it could be your Claude app or your coding window, whatever it is, and we're sending something over to the MCP server, and that's the MCP protocol. One thing to keep in mind here, which I'm not going to dive into: if you were to create your own MCP server, and it had access to all of your own resources, your schemas, your tools, all that kind of stuff, you've got to be careful, because there are security concerns. If anyone got into that server, they could basically ask for anything back. That's something that was brought up in my paid community, where we were having a great discussion about MCP, so just keep it in mind. Let's look at one more example in n8n. Coming down here, let's pretend we have this beautiful Airtable agent we built out in n8n. As you can see, it has these seven different tools: get record, update record, get bases, create record, search records, delete record, and get base schema. The reason we needed all of these different tools is that, as you know, they each have different operations inside of them, and they each have different parameters to be filled out, and the agent takes care of all of that. But this could be a much leaner system if we were able to access Airtable's MCP server, as you see we're doing right here, because it's able to list all the tools we have available in Airtable. Here you can see I asked the Airtable agent what actions it has, and it listed these 13 different actions, which are actually more than the seven we had built out here; we can see we have list records, search records, and then 11 more. Now, this is just the agent telling us, the human, what we have access to, but what the actual agent would look at in order to use a tool is a list of the tools, where each one has a name, a
description of when you use this tool, and then the schema of what you need to send over to it. When we're listing records, we have to send over different information (the base ID, the table ID, max records, how we want to filter) than when we want to list tables, where we just need a base ID and a detail level. All of this information coming back from the MCP server tells the agent how to fill out the parameters we were talking about earlier, where sending an email needs different things filled out than labeling an email. So once the agent gets this information back from the MCP server, it's going to say, "okay, I know I need to use the search records tool, because the user asked me to search for records with the name Bob in them, so I have this schema I need to use, and I'm going to use my Airtable execute tool to do so." Basically, it chooses which tool it needs based on the information it was fed previously; in this case, the Airtable execute tool would search records, and it would do it by filling in this schema of information we need to pass over to Airtable. So now I hope you can see how what's going on in this tool is basically all 13 tools wrapped up into one, and what's going on here is just feeding in all the information we need in order to make the correct decision. This is the workflow we were looking at for the demo; we're not going to dive into this one, because it's just a lot to look at. I just wanted to put a ton of MCP servers in one agent and see if, even with no system prompt, it could understand which one to use and still understand how to call its tools. I just thought that was a cool experiment. Obviously, what's next is I'm going to try to build some sort of huge personal-assistant type agent with a ton of MCP servers, but for now let's just break it down as simply as possible by looking at an individual MCP agent.
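The dispatch step just described can be sketched like this: the agent reads the tool list the MCP server returned, picks a tool by its description, and fills in that tool's schema. The tool names, field names, and IDs below are illustrative stand-ins, not Airtable's actual MCP API.

```python
# Hedged sketch of MCP-style dispatch. Tool names, schemas, and the IDs
# ("app_example", "tbl_example") are made up for illustration.

TOOL_LIST = [
    {"name": "search_records",
     "description": "Search records in a table by a field value",
     "inputSchema": {"required": ["baseId", "tableId", "filterByFormula"]}},
    {"name": "list_tables",
     "description": "List the tables in a base",
     "inputSchema": {"required": ["baseId"]}},
]

def choose_tool(query: str) -> dict:
    # A real agent uses the LLM for this choice; keyword matching stands in.
    for tool in TOOL_LIST:
        if "search" in query.lower() and "search" in tool["name"]:
            return tool
    return TOOL_LIST[-1]

tool = choose_tool("search for records with the name Bob")
execute_call = {
    "name": tool["name"],
    "arguments": {  # filled in against the chosen tool's required fields
        "baseId": "app_example",
        "tableId": "tbl_example",
        "filterByFormula": "{Name} = 'Bob'",
    },
}
```

All the agent contributed was the choice and the argument values; the menu of options and their shapes came from the server.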
Now, I don't know why it's called an MCP agent in this case; it's really just a Firecrawl agent with access to Firecrawl's MCP server. Okay, so taking a look at the Firecrawl agent: we're going to ask what tools it has. It's hitting the Firecrawl actions tool right now in order to pull back all of the resources, and as you can see it comes back and says, "hey, we have these nine actions you can take." I wasn't sure it would be nine, but it was. As you can see, we have access to scrape, map, crawl, batch scrape, and all the others, and what's really cool is that if we click in here, we can see a description of when to use each tool and what you actually need to send over; if we want to scrape, it's a different schema to fill out than if we want to do something like extract. So let's try actually asking it to do something. Let's say, "extract the rewards program name from chipotle.com," and we'll see what it does. Obviously it's going to do the same thing, listing its actions, and then it should be using the Firecrawl extract method, so we'll see what comes back out of that tool. Okay, it went green; hopefully we actually got a response. It's hitting it again, so we'll see what happened and dive into the logs after this. Okay, so on the third run it finally says the rewards program is called Chipotle Rewards. Let's take a look at run one: it used Firecrawl extract, filled in the prompt "extract the name of the rewards program" as a string, and we got "request failed with status code 400," so not sure what happened there. On run two it did a Firecrawl scrape, and we also got a status code 400. And then on run three it did a Firecrawl scrape once again, and this time it was able to scrape the entire page and then use its brain to figure out what the rewards program was called. Taking a quick look at the Firecrawl documentation, we can see that a 400 error code means the parameters weren't filled out correctly.
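Since a 400 here just means malformed parameters, one hedged sketch of guarding against it is to validate the arguments against the tool's required fields before sending, rather than letting the server bounce the call. The schema fields below are assumptions for illustration, not Firecrawl's exact schema.

```python
# Check a tool call's arguments against its schema before sending, so a
# missing field is caught locally instead of as a 400 from the server.
# The "url"/"formats" schema is an illustrative assumption.

def missing_fields(schema: dict, arguments: dict) -> list:
    """Return the required fields the agent forgot to fill in."""
    return [f for f in schema.get("required", []) if f not in arguments]

scrape_schema = {"required": ["url", "formats"]}

# An incomplete call: "formats" is missing, so a strict server would reject it.
bad_call = missing_fields(scrape_schema, {"url": "https://chipotle.com"})

# A complete call that passes the check.
ok_call = missing_fields(scrape_schema,
                         {"url": "https://chipotle.com",
                          "formats": ["markdown"]})
```

A check like this is one concrete meaning of "making the tool parameters more robust."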
So what happened here was that it just didn't fill out the schema, the prompt and everything to send over, exactly right, and really these kinds of issues come down to making the tool parameters more robust, plus more prompting within the actual Firecrawl agent itself. But it's pretty cool that it was able to understand "okay, this didn't work, let me try some other things." Okay, real quick, I just want to say that if you want to hop into n8n and test out these MCP nodes, you're going to have to self-host your environment, because you need to use the community nodes, and you can only access those if you're self-hosted. I'll leave a link in the description to the documentation for this MCP node, as well as this link, which has basically all the servers you can connect to. Obviously in this one we were doing Airbnb, we had Brave Search, we had Firecrawl; there's a ton you can use. And just so you know, one issue I ran into during setup is that some of these aren't publicly available or completely published yet. I was trying to connect to Perplexity and it wasn't letting me, and I thought I'd done something wrong, but some of these just aren't published yet. Once again, if you guys want a tutorial on spinning up a self-hosted environment, installing the community node, and connecting to a bunch of the different servers here, then let me know in the comments and give this video a like; I'd love to make a video about that. All right, that's going to be it for this one. I hope you enjoyed it and learned something new; if you did, please give it a like, it definitely helps me out a ton. As always, I appreciate you guys sticking around to the end of the video, and I will see you in the next one. Thanks, everyone!