Transcript for:
Highlights from JavaScript Developer Days

Welcome, everyone, to JavaScript Developer Days. In today's session, we're going to present and walk you through our OpenAI JavaScript sample with our special guest, Lars, who is here with us. Hey. Hi, Lars.

Welcome, Lars. Thank you. Thank you for having me.

I'm Lars Brink from Denmark. I've been a Microsoft MVP in developer technologies for several years, a GitHub Star for several years, recently an Nx Champion (Nx, the open-source build framework), and also an Angular Hero of Education for writing a book on Angular. And I help organize communities, both virtually and in person.

That's amazing. You also have this amazing community for education, This is Learning.

Yeah, yeah. And there is another one, This is Angular, right? Yeah, that's our sub-brand. It's part of This is Learning.

Because a lot of us have Angular backgrounds, at least that's how it started. Today, the This is Learning part is even bigger than This is Angular, but we still have so much Angular content that it makes sense to have a separate sub-brand for it. So this is a community I founded with Santosh Yadav three years ago, and it's grown to about 100 contributors: all kinds of public tech contributors, like tech writers, content creators, open-source maintainers, speakers, authors, and everything in between. We accept contributors of all skill levels, which I think is great as well. We support each other in this community and offer free public channels to promote your content and improve as a contributor. We also have Wassim here. Could you introduce yourself too? Yeah, that's me. I work at Microsoft as a Senior Developer Advocate Engineer with a focus on JavaScript.

Yeah, I have a background mostly in JavaScript. I've been doing JavaScript since 2004-ish, before jQuery. And yeah, Angular since 2009 with AngularJS and then Angular V2.

And this is how I got to know Lars and also met Natalia, by the way. Before we got to work at Microsoft as colleagues, we were hanging out at conferences all over the world, Angular conferences. Yeah, and I'm Natalia Venditto. I'm a Principal Product Manager at Microsoft, and I lead the end-to-end developer experience on Azure for JavaScript, across the many services and also developer tools. I collaborate very closely with Wassim on everything from content to actually designing and implementing developer tools and experiences.

This sample that we're going to be talking about is one of those experiences that we have designed and put together so that the Azure community and the JavaScript community can better understand how to implement this type of application with best practices in mind, and how each building block works in integration. Welcome, everyone, again. If you want to follow along, you can check out the repo and clone it: github.com/Azure-Samples/azure-search-openai-javascript.

We have the JavaScript suffix because we have this exact same application in different languages: C#, Python, and Java. We can now start talking about retrieval-augmented generation, the pattern that is part of this architecture sample and powers it.

Lars, have you heard about RAG before? Yeah, I have heard about it, and I was lucky enough to be part of an AI bootcamp through the Microsoft MVP program. But to be honest, I've forgotten about it since I haven't been using it, so hopefully you can help explain it to me again. Some call it a pattern, some call it a technique.

When we are working with generative AI, what the model does is give the system the ability to predict the next word that fits in a sentence. So how does it do that? Well, it has access to a very large, what they call, corpus of data.

And then it ingests all that information, and with all that information passed as parameters to a large language model, it can decide, word by word, token by token, how it's going to construct a response to a question. Basically, this technique is designed to work with those very, very large language models that have parameters in the, I think it's in the billions right now for the latest versions of the GPT models. We're talking about massive compute operations taking place in a very, very short amount of time. And this is why we have to continuously refine these techniques, in order to make better predictions and better content.

I think Copilot has been the most popular use case that developers are interested in these days? Yeah. I've been an early access member of GitHub Copilot for years, and it's a great part of my everyday work as a developer.

I wouldn't work without it. I'm very excited about Microsoft Copilot as well, with the ability to make these custom GPTs or custom Copilots and see how they can be integrated as part of business use cases. I think that's very interesting, and that was primarily what I was learning in these Microsoft AI bootcamps. And then the more you learn, the more you realize that you have to learn even more to get the full picture. Right.

So you have to learn about prompt engineering and agents and the open protocols and all of that goodness. Let's walk through the architecture of this particular sample. Yeah.

Let me give you a high-level overview of the architecture that we designed. This sample application was basically built on top of different services, Azure services, I mean by that.

So one of them is Azure Static Web Apps. Obviously, we use that to deploy our front-end application, the UI of the RAG chat application. The bigger part is the backend, which uses multiple Azure components and Azure services.

One of them is Azure Container Apps. This is where we deploy both the indexer, which we're going to talk about later, and also the search API. And yeah, we leverage Azure Container Apps plus the container registry mechanism in order to deploy the business logic of the indexer and the search. We also use Azure AI Search for storing and accessing the embeddings and the vectors. And we also use, obviously, OpenAI as our main API

for retrieving the models and the predictions and everything. Something which is not shown here: we also use different libraries, like LangChain.js for instance, which help us interact with Azure OpenAI, which itself interacts with OpenAI. This is the high level. We are going to go into more detail in the code in the next slides.

But basically, this is the architecture that you get when you deploy the sample using azd, the Azure Developer CLI. We also have some optional services, like monitoring and logging. Now, let's actually see some code, right?

So the actual sample repo is built on top of npm workspaces as a monorepo, and under the packages folder we have the different components we mentioned earlier. The two that I'm going to present right now are the indexer and then the search. These are basically the two core components of our backend. So the indexer is what allows us to ingest the data and index it.

And by the way, to serve these two components as REST APIs, we're using Fastify. The actual meat is in this file, which is responsible for processing all the files that we support. For now we support Markdown, text, and PDFs. We read the whole text, the whole chunks of text, and then we try to split them into smaller chunks so we can easily index them and upload them to Azure.

In order to split those texts, I'm going to highlight this whole method, which is called split pages. What we do here, basically, is leverage a sliding-window algorithm in order to split the whole text into sections, but in a clever manner, because we want to split the text while keeping the context, since this is what we are going to run the search against in the next phase. We make sure that we split the text along sentences or, in some cases, whole words. We don't want to split in the middle of a word or in the middle of a sentence, because we're going to lose context.
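As a rough illustration of that idea, here is a minimal sliding-window splitter sketch in TypeScript. The chunk size, the overlap, and the function name are assumptions for the sketch, not the sample's actual values or API.

```ts
const CHUNK_SIZE = 1000;    // target characters per section (assumption)
const CHUNK_OVERLAP = 100;  // characters shared between neighbouring sections (assumption)
const SENTENCE_ENDINGS = ['.', '!', '?'];

function splitIntoSections(text: string): string[] {
  const sections: string[] = [];
  let start = 0;
  while (start < text.length) {
    let end = Math.min(start + CHUNK_SIZE, text.length);
    // Walk back to the nearest sentence ending so we don't cut mid-sentence.
    if (end < text.length) {
      let boundary = end;
      while (boundary > start && !SENTENCE_ENDINGS.includes(text[boundary - 1])) {
        boundary--;
      }
      if (boundary > start) end = boundary;
    }
    sections.push(text.slice(start, end).trim());
    if (end >= text.length) break;
    // Slide the window forward, keeping some overlap for context,
    // while guaranteeing forward progress.
    start = Math.max(end - CHUNK_OVERLAP, start + 1);
  }
  return sections.filter((section) => section.length > 0);
}
```

The overlap keeps a bit of shared context between neighbouring sections, which tends to help retrieval later.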

And yeah, once we construct these document sections, we use them and upload them to Azure Storage, so they can be used as indexes for the search. This document processing mechanism is used in the indexer file, where we grab those sections that we constructed using our algorithm and then upload them to our Azure AI Search index. Yeah, and then we expose the service, as I mentioned, using Fastify as a plugin.

We have a bunch of plugins, by the way, but the indexer is exposed in this manner. Most of the endpoints we expose use the same pattern: we decorate the Fastify instance, with the indexer in this case, and expose it as an endpoint. It's a really fantastic way to expose APIs, so I'm really glad we went with Fastify.
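To make that pattern a little more concrete, here is a minimal Fastify sketch. The service name, route path, and payload shape are assumptions for illustration, not the sample's actual API.

```ts
import Fastify from 'fastify';

async function main(): Promise<void> {
  const app = Fastify({ logger: true });

  // Decorate the Fastify instance with an "indexer" service (the sample does
  // this through plugins; decorating directly keeps the sketch short).
  app.decorate('indexer', {
    async indexFile(filename: string, content: string): Promise<void> {
      // ...split the content into sections and push them to the search index...
    },
  });

  // Expose the decorated service as a REST endpoint.
  app.post('/indexes/documents', async (request, reply) => {
    const { filename, content } = request.body as { filename: string; content: string };
    await (app as any).indexer.indexFile(filename, content);
    return reply.code(202).send({ status: 'accepted' });
  });

  await app.listen({ port: 3000 });
}

main();
```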

Lars, do you have any questions on the indexer? Yeah, from what I understand, it's processing different documents to make them available for end users to search via this chat client. Yeah, we use a local ingestion mechanism using this CLI.

You give it a folder, basically a folder called data, where you put all your files. And by the way, we are using this sample application in the context of this other Contoso Real Estate application that we've built. So basically all the knowledge that we are feeding this RAG is based on that Contoso Real Estate application. We're giving it terms of service, guidelines, how to get started with this and that.

And these are being fed as Markdown files that the indexer, through the CLI, is uploading to Azure. And then we are exposing this also as an endpoint. The search, similarly, is a different package under the packages workspace. And the search API follows the same pattern: it is exposed using Fastify as an endpoint and registered through the same mechanism.

Basically, the search that we provide uses the standard search mechanisms in these RAG experiences. They're called approaches. And you'll see them registered here: basically, we support two approaches, or two techniques, I'll call them techniques, which are read, retrieve, read.

I think that's the first one, and then the other one is retrieve, then read. And on top of those approaches, we also support two

search experiences: one is called chat and the other one is called ask. Chat is basically a stream of discussion, you know, like a thread between the user and the system assistant. Ask is like a one-shot question, and then you get the response. We have some explanation for each approach. At its core, you probably wouldn't be too concerned about which approach you use, because basically they will both give you an answer. However, we do support both approaches in our sample, and the search experience we offer is based on these two approaches. And as I mentioned in the introduction, we use this library called LangChain, the JavaScript implementation of LangChain, to access and retrieve all the models, or communicate between our search API and OpenAI, which makes it much easier to get the model and compute everything. Yeah, we have a few examples here of how we can respond and construct the results back to the user. Yeah, Wassim mentioned LangChain, and you might also have heard about Semantic Kernel from Microsoft.

Unfortunately, there's no JavaScript SDK for that yet, so LangChain is definitely your best option there. Yes.
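To make the retrieve-then-read approach mentioned above a bit more concrete, here is a rough sketch in plain TypeScript. The `SearchIndex` and `ChatModel` interfaces are hypothetical stand-ins for Azure AI Search and Azure OpenAI (which the sample wires up through LangChain.js), not the sample's actual types.

```ts
interface SearchIndex {
  similaritySearch(query: string, topK: number): Promise<{ id: string; content: string }[]>;
}
interface ChatModel {
  complete(messages: { role: 'system' | 'user'; content: string }[]): Promise<string>;
}

async function retrieveThenRead(
  question: string,
  searchIndex: SearchIndex,
  chatModel: ChatModel,
): Promise<{ answer: string; citations: string[] }> {
  // 1. Retrieve: find the document sections most relevant to the question.
  const sections = await searchIndex.similaritySearch(question, 3);

  // 2. Read: ask the model to answer using only the retrieved context.
  const context = sections.map((s) => `[${s.id}] ${s.content}`).join('\n\n');
  const answer = await chatModel.complete([
    { role: 'system', content: `Answer using only these sources:\n${context}` },
    { role: 'user', content: question },
  ]);

  return { answer, citations: sections.map((s) => s.id) };
}
```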

And by the way, speaking about LangChain, I just want to give a huge shout-out to Yohan, who did tremendous work contributing to the LangChain.js upstream repo, adding support for tons of Azure services. So yeah, kudos to him. This was one of the objectives when we put together the sample: to understand what gaps may exist between our services and some popular JavaScript libraries.

In this case, LangChain is one of the most popular libraries for implementing this type of application in the JavaScript community. We always want to contribute back to the ecosystem. So yes, definitely great

work from Yohan, making sure that we can work with LangChain and also the Azure OpenAI SDK for JavaScript in this case. I also want to highlight, if you happen to use Java as a backend, that another colleague of ours, Julien Dubois, has added support to LangChain for Java, LangChain4j, I think they call it. So if you're using Azure and using that popular library for Java, there's also support for that.

We have created at Microsoft this specification called the HTTP protocol for AI chat applications. What we try to do with this is create a unified API shape, so we can say: this is how the requests and responses, and any error or success from hitting these APIs, should look. That allows us to swap frontends and backends in a seamless way.
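As a rough illustration of that unified shape, a client request might look something like the sketch below. The field names and the `/chat` path here are assumptions for the example, not the exact specification.

```ts
interface ChatRequest {
  messages: { role: 'user' | 'assistant' | 'system'; content: string }[];
  stream?: boolean;
  context?: Record<string, unknown>; // e.g. retrieval overrides from a settings panel
}

async function askBackend(request: ChatRequest): Promise<unknown> {
  // Always a POST, because the payload carries the conversation so far.
  const response = await fetch('/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(request),
  });
  if (!response.ok) {
    throw new Error(`Chat backend returned ${response.status}`);
  }
  return response.json();
}

// Usage: any frontend that speaks this shape can talk to any compliant backend.
askBackend({
  messages: [{ role: 'user', content: 'How to search and book rentals?' }],
  stream: false,
});
```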

I know that if I am building a frontend application that is going to connect to a service implementing the RAG pattern, in this case, then when I am creating my front-end service or my client-side service and I need to write a request, I have to format it in a specific way, pass some specific headers, and make it a specific type of request. In this case, it's always a POST request, because it passes some payload to the backend service that then delivers the response. That makes it possible to work with different backends and different front-end applications. It is a contract, right, between front-end and back-end, or between different services? Between front-end and backend in this case. So this is adding extra context for the front end, to allow it to display more information. Let's see the application working. I have it locally, and I also have it already deployed. Let's go with the local one.

We were mentioning before that this was built for the use case of our rentals portal. As you can see, there is a set of default questions here in the application: how to search and book rentals, or what is the refund policy? Let's go with the first one.

We could, of course, write our own, but let's use one of the default ones. Yeah, and there is this response being put together with citations, which are these little tokens here that will take you directly to the source of the response, the document where the system got this response from.

If I click here, it's going to load this preview. This is the document where I got my response. In this case, as you can see, there are PDF documents and there are also Markdown documents that are ingested by the same service, as was mentioned earlier. And we can also ask some follow-up questions. Okay, now you asked about how to search and book rentals; maybe you also want to understand what the guest verification requirements are. Just click and go. Because, again, like Wassim explained, we have the ask mode and the chat mode for the application. In this case it's a chat, so it keeps the context of the whole conversation. This is how the application works, in essence, on the front-end side.

But we were also mentioning the settings that we can pass. We saw the overrides when we were looking at the chat protocol specification; they can be passed as a template here. This is not for the end user.

This is because we want to showcase how developers could potentially pass overrides from an application settings dashboard that is restricted, obviously, to the business and not to the end user. But if we want to tweak a little bit how the response arrives to us, we can then pass these overrides.

We can also adjust the retrieval mode directly from the settings. Then we can also exclude certain categories. We can adjust the relevance with the semantic ranker, or query contextual summaries instead of whole documents, so the system will first generate a summary and then use this summary to elaborate the response. The suggestion of follow-up questions that we saw can be enabled or disabled, and we can also stream, or not stream, the response. I would like Wassim to explain streaming, because he is the one who implemented this very, very nice typing effect. Yeah, sure. Well, first of all, I just want to mention something for people who are watching the show: if you want to use the sample app and deploy it to production.

Well, first of all, please run some security audits before you deploy to production. And secondly, you need to disable the app settings and not allow users to play with them. So yeah, disable them and hard-code the values if you can. Yeah, that's my warning.

Yeah, at the end of the day, we are just getting the response from Azure OpenAI, and then OpenAI. And we get those as chunks, like streams, coming from the back end.

So we are passing those streams along, through Fastify and our REST API, to the front end. And basically on the front-end side, and you can find that logic inside the Web Component implementation, we are taking those chunks coming from the back end and parsing them live, on the fly, because the UI you're seeing here is basically being rendered on the fly as we get data from the back end. So all the citations, all the CSS classes that are applied, and the elements are built with Lit; we're going to mention that afterwards. But yeah, these are being parsed, and sometimes it fails, as you can see, depending on the response from the server.
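Here is a minimal sketch of what consuming such a chunked response in the browser can look like, assuming a hypothetical newline-delimited JSON stream; the sample's Lit component does considerably more work (citations, partial tokens, error recovery).

```ts
async function readChatStream(
  url: string,
  body: unknown,
  onDelta: (text: string) => void,
): Promise<void> {
  const response = await fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  });
  if (!response.body) throw new Error('Streaming not supported by this response');

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffered = '';

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });

    // Parse only complete lines; keep the trailing partial line for later.
    const lines = buffered.split('\n');
    buffered = lines.pop() ?? '';
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line); // assumed chunk shape
      onDelta(chunk.delta?.content ?? '');
    }
  }
}
```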

So basically, yeah, we go in trying to find some placeholders and some special characters in the response, in order for us to construct a meaningful UI for the end user. And also, this is open source. That's the opportunity for me to mention that we welcome all contributions if you want to improve this parsing algorithm.

And believe me, parsing streams is really, really hard, because you need to make a lot of decisions, because you're parsing data as it comes in. So yeah, there are some techniques. If you want to help or contribute, please feel free.

But basically, yeah, that's the simple approach to it. And yeah, if you disable the streaming, as Natalia mentioned, then that's easy for us because we are getting the whole piece of text and then we can easily parse it and render it correctly without any errors. But it takes longer, right?

So if I just disable streaming, it comes in as one full chunk, but it takes longer. It's not as much fun. Yeah. Why did we decide to write this application with Web Components?

Because we wanted to create as many samples as possible, and also make sure that any customer using a modern front-end application, whether they're using a framework or plain JavaScript, could just integrate this chat application into their own existing app and have it work without having to install a very large dependency. The applications that we have created so far with React, with Next.js, and with Angular have not presented a lot of difficulty in integration. I think that it was a good decision in the end. If people want to build other applications with Vue or whatever framework, please feel free to do it. We're happy to get your PRs.
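For example, dropping the chat component into any page could look roughly like this; the tag name and attribute used here are placeholders, not the sample's exact element API.

```ts
// Load the published web component bundle first (script tag or import),
// then the element can be used from any framework or from plain JavaScript.
const chat = document.createElement('chat-component'); // hypothetical tag name
chat.setAttribute('data-api-url', 'https://my-backend.example.com'); // hypothetical attribute
chat.title = 'Ask about our rentals'; // set as a DOM property rather than an attribute
document.body.appendChild(chat);
```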

We're happy to. get your PRs. Absolutely. We want to have as many frameworks and samples as possible.

So if you're using a framework that is not in our list, we're very happy to have your contribution. And we have invited Lars to share with us the new capabilities that Angular is offering today that it didn't have in earlier versions. Okay. Yeah.

So as you said, Natalia, lots and lots of new features and changes are coming and have already been released as part of Angular. A lot of exciting things, especially related to signals, which are inspired by features from other frameworks like SolidJS, which has signals. Almost every framework either has signals, or something like them, or is looking to implement them. Angular.dev is part of the new things being released.

It's a new documentation and learning website for Angular. There's this new logo for Angular. So if you've seen this A with the gradient background, that's the new Angular logo.

It's using the Google, what's it called, IDX or something like that, an embedded development environment, similar to StackBlitz. It's using that inside of the documentation to really make learning about Angular an interactive experience. And like we see here, there are guides on signals and the fundamentals of that. The API reference is also part of Angular.dev.

So you can find all the information you need, both about new and older existing features of Angular. So signals are this reactive primitive, a way of managing granular pieces of state. And the Angular story,

the Angular epic of signals, is a long one. There are many, many different features that are going to be released related to signals. Most of signals in Angular, the primitive signal itself, is already here.

But related things, like effect, are still experimental, so some changes are still needed before that's finalized. That's the effect function, which reacts to changes in signal values.
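As a small generic illustration (not code from the sample, and with made-up names), a writable signal, a computed signal, and an effect look roughly like this in Angular 17:

```ts
import { Component, signal, computed, effect } from '@angular/core';

@Component({
  selector: 'app-streaming-toggle', // hypothetical selector
  standalone: true,
  template: `<p>{{ label() }}</p>`,
})
export class StreamingToggleComponent {
  // signal(): a writable container with a required initial value
  streaming = signal(true);

  // computed(): derives a read-only signal from other signals
  label = computed(() => (this.streaming() ? 'Streaming enabled' : 'Streaming disabled'));

  constructor() {
    // effect(): re-runs whenever the signals it reads change (developer preview)
    effect(() => console.log('streaming is now', this.streaming()));
  }

  toggle(): void {
    this.streaming.update((value) => !value);
  }
}
```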

Another feature that I'm looking forward to is signal-based components, which will make change detection in Angular more efficient. And even if you're not going to migrate to signal components, there's also an official zoneless story coming, an official solution for writing zoneless Angular applications without migrating all your components to signal-based components. There are a bunch of really important changes coming to Angular. More related to signals is signal-based forms.

We're going to need those, because Angular forms is a really important library. Well, there are two APIs, and we need official solutions for how to integrate Angular signals with them. That's also something that's not here yet, but it's going to be a major benefit. I think that's about it for the roadmap of signals.

Signals, the values, the data containers, are already here. The related effect function is something that's being finalized right now. You can import it, but it's in developer preview, so you should be ready for changes to occur in any new release of Angular. Another important feature that is not related to signals is this new template syntax.

Historically, Angular versions 2 through 16 have been more HTML-like, with some special symbols like the square brackets for property bindings, the parentheses for event bindings, and so on. Now, some of the most-used directives of Angular, like ngIf and ngFor, have been turned into this new syntax. It's part of the template syntax itself rather than being these structural directive extensions to Angular. So now you might see in a template the at sign, so an @if, the at sign like in an email address, right?

So @if, and @for for the for loop, where before you would use ngIf with the star in front and ngFor with the star in front, for structural directives. So, yeah, here's the sample for the for loop. Now it looks like this. Another important change here is the track keyword.

Before, the track-by function was optional in the ngFor directive, but now it's required. So you always have to track by some value in the items you are iterating over in this for loop, for performance reasons.

So it forces you to think about it. It's similar to React, where you have the key identifier. Here it's track in the new Angular template syntax, and you can track by any property on the object, or the object itself, or the index value as it's iterating through the for loop, for example, whatever makes the most sense for your use case. So those are some of the more interesting features. @for and @if in the new syntax are already stable and part of Angular 17.
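A minimal sketch of the new control flow syntax, with made-up component and data names, might look like this:

```ts
import { Component, signal } from '@angular/core';

@Component({
  selector: 'app-rental-list', // hypothetical, not part of the sample
  standalone: true,
  template: `
    @if (rentals().length > 0) {
      <ul>
        <!-- track is now required: here we track by the id property -->
        @for (rental of rentals(); track rental.id) {
          <li>{{ rental.name }}</li>
        }
      </ul>
    } @else {
      <p>No rentals found.</p>
    }
  `,
})
export class RentalListComponent {
  rentals = signal([
    { id: 1, name: 'Beach house' },
    { id: 2, name: 'City apartment' },
  ]);
}
```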

There's another very, very important one coming: the @defer keyword. It's part of the feature called deferrable views. I think it's in developer preview in version 17, which we're on.

So if we try to look for this defer keyword, @defer, there's definitely the documentation, and I think we would be able to use it in Angular 17 as well, but it's in developer preview. So be a little cautious and expect changes. This is so powerful. What could be a good use case for signals in the context of an application like the RAG chat application?

So in our sample, well, should we go look at the code? To the code. Then you can follow me because right now in this branch we have just this one.

So use Live Share, follow me. Yes. Live Share, by the way, is an extension for VS Code by Microsoft, where we can follow each other and modify the code in real time together. It's a really nice tool, and it's easy to set up. So now you should be able to see where my cursor is and the file I have open.

So this is the settings component that we saw, the developer settings that we saw in the UI when that was loaded. This is the only Angular component. It's also using a relatively new feature: it's a standalone component, meaning that all the imports are here in the component decorator options rather than in an Angular module. And some of these are still modules, mostly the Angular Material ones.

In the latest release, or a soon-to-come release, I think the modern Angular Material components will also be available as standalone components, so we're not importing modules with many components. That's very good. Very good news. So here we see an example of, not a standalone component, but a standalone pipe, the async pipe. Previously,

the only option was to import this CommonModule, which brings in a lot of stuff, and previously also the ngFor and ngIf directives, which are still here, but we can also use this new syntax for a period of time. The settings component, here's the template. Down here, we see that we are using this custom element, the chat component that we saw, which was implemented using Lit. I see that the API of the component has a property, title, and then there are attributes. This is the attribute binding syntax in Angular, so there are data- attributes for the other inputs, but apparently title is a property, so that's something you could access on the DOM element object of this chat component, from what I see here in the template. So here we are passing the different properties, but I'm not sure whether any of these are actually dynamic or whether they are static. They're not assigned elsewhere, are they?

Or are they, maybe they are in, probably here. Yeah, here it is. Okay.

So the streaming setting, whether we should be streaming the output as we saw earlier, or getting the full chunk and waiting for that before rendering the response to the user. This is a Boolean property, set to true by default. Here it's passed to the chat component custom element, but we also see it being bound here using Angular template-driven forms, the ngModel directive, and the Angular Material checkbox component. So this is actually an example of what could be great to have as a signal, because signals are about granular values.

What I mentioned offline prior to this is that a property like this settings default is not a good candidate for a signal, since it's an object with many different properties. So ideally we would split these into individual properties that are each signals.

Each property would be a signal. And I even see here that this settings default has the streaming property, right? So maybe we should even be using this as the default value down here or something.

So this streaming property, which is just a Boolean value contained in a property, could be a signal, so we could try modifying it to be a signal. I'm going to import signal here from Angular core. And here we go.

Oh, I'm just remembering, a relatively new feature is that you can have input properties as signals as well. We don't have any input properties in this component, but that's something that already landed, as far as I remember, in the latest versions of Angular.

Is that correct from what you know? Yeah. Yeah, I think yes. Yeah.

There was a lot of celebration about it, I've seen. Signal inputs would be a good candidate if we had written the chat component in Angular. Yeah.

Yeah. Exactly. Because right now what takes the input is the dynamic property in the Lit component. This is why we are not using outputs and inputs in Angular.

Okay. So now this is a signal that contains the initial value of true. We see here that the type parameter, or the actual type here, is a writable signal containing a Boolean value. And with signals, we always have to have an initial value.

There's no late value; we have to have that first value initially. And we already have that default value of true.

So it makes sense to keep that. Now let's first go to where we pass it to our chat component, the web component. To pass the value, we will call it as a function.

This is how we get the value of a signal, and Angular's template will then be aware that when this signal changes, it has to re-render this part of the template. So it will pass this attribute to our chat component custom element; this is passing the data along. Now we're missing: what about updating the signal? This is where we're using ngModel template-driven forms, to have easy two-way data binding. But now we have to change this banana-in-a-box syntax, which is the special Angular syntax that is actually two bindings: the ngModelChange event binding and the ngModel property binding. So here, I guess that we would be passing the value.

So unwrapping the Boolean value of this streaming signal, that's the property binding part. The event binding part is when the form control, or rather the checkbox, the Angular Material checkbox, emits a new value. We will get the value here as the event, the special $event value.

So we pass that to the streaming signal, and we use the set method here. This is what I think is needed to make this work. So let's... I wanted to wrap the... and it's not showing me. Okay, let's just go ahead.
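Putting those pieces together, a simplified version of what we just did might look like this sketch; the selector, the custom element tag, and the attribute name are placeholders rather than the sample's exact code.

```ts
import { Component, CUSTOM_ELEMENTS_SCHEMA, signal } from '@angular/core';
import { FormsModule } from '@angular/forms';
import { MatCheckboxModule } from '@angular/material/checkbox';

@Component({
  selector: 'app-settings-sketch', // hypothetical component
  standalone: true,
  imports: [FormsModule, MatCheckboxModule],
  // allow the unknown custom element tag in the template
  schemas: [CUSTOM_ELEMENTS_SCHEMA],
  template: `
    <!-- property binding reads the signal; event binding writes it back -->
    <mat-checkbox
      [ngModel]="streaming()"
      (ngModelChange)="streaming.set($event)">
      Stream the chat response
    </mat-checkbox>

    <!-- pass the unwrapped value on to the chat custom element -->
    <chat-component [attr.data-stream]="streaming()"></chat-component>
  `,
})
export class SettingsSketchComponent {
  // a writable signal with the required initial value
  streaming = signal(true);
}
```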

Okay, let's just go ahead. And of course in the future there will be a more an easier way of integrating Angular forms with signals rather you're whether you're using the template driven forms or the reactive forms because signals and observables they are more they make sense to use together and even Angular has some. mapping functions to map an observable to a signal and to map a signal to an observable for interoperability. And the reactive angle forms already has observables.

So even if we were using that, we could also convert, for example, the valueChanges observable to a signal using the Angular RxJS interop package, I think. So Lars, I have a question for you. Do you recommend using signals with primitive values like numbers and booleans, or can we use them with any text or strings or arrays or objects? Yeah, so if we were to wrap an object or data structure, like the data overrides in the settings defaults, the issue here is that... I mean, it wouldn't be an issue to wrap this in a signal.

In the longer term, signals are part of optimizing change detection, as I mentioned a bit earlier, especially when signal-based components become part of Angular in the coming year or two or something like that, I imagine. At that point, it's important to have these granular values, so that rather than having the whole object as one signal, you have each primitive value, as you say, each string and each Boolean, similar to what we did with the streaming signal.

It's a Boolean value; it's not an object, but a property with a Boolean value. The reason for that is that if this were a so-called signal component, which will be available in the future, then Angular would know to re-render this part of the template only once the streaming signal value changes. If streaming were part of an entire object, if we had a settings signal holding an object with a streaming property, then this part of the template, this checkbox, would be re-rendered every time any property of the settings object changed.

So then it's not very granular anymore. It's not much better than just re-rendering the entire component, which has been the default way of change detection when you're looking at a single component. But with signal-based components, we should be seeing local change detection, where parts of the template can be re-rendered based on granular signal values.
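To illustrate that trade-off (with a made-up settings shape, not the sample's actual overrides object), compare a coarse object signal with granular per-value signals:

```ts
import { signal } from '@angular/core';

// Coarse: one signal wrapping the whole settings object. Any change replaces
// the object, so every template binding that reads it is affected.
const settings = signal({ streaming: true, suggestFollowUp: true, temperature: 0.7 });
settings.update((current) => ({ ...current, streaming: false }));

// Granular: one signal per value. A binding that reads only `streaming`
// only has to react when `streaming` itself changes.
const streaming = signal(true);
const suggestFollowUp = signal(true);
const temperature = signal(0.7);
streaming.set(false);
```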

So it's another thing to think of in this new world of signals. Something I suggested when we discussed this is, I mean, there are great libraries in the Angular ecosystem, open-source libraries by the community, and we're already seeing signals being integrated into them. For example, if you're using NgRx Store, you can get a selector as a signal instead of an observable.

But there's also this entirely new package called NgRx Signals, where there's the signal store, where everything is signals. And that is great when you have a lot of data, like a big, complex data structure. This one's not that complex, but

even if you have nested data structures, you can still get each property of this object as its own signal without having to create all these signals manually. If I were to create a signal for every one of these properties, it would be, what, 12 signals or something like that. So I would have an easier time with something like the NgRx signal store or RxAngular state, the new functional APIs that also have signal support.

But here we are trying to keep the dependencies to a minimum, and by that I mean only having Angular. We just saw that it makes sense to have a signal here, which will allow us to have better change detection in the future, when signal components arrive, or even if we turn this application into what's called a zoneless application, where NgZone is disabled, which can have major performance benefits. How do we get this on Azure? Yeah, that sometimes is not trivial,

especially for front-end developers: understanding what resources we need to provision in Azure, how we go about deploying all these parts, and then putting them back together in a way that everything works. What we wanted to provide is a template, in this case, to use with the Azure Developer CLI. That's our provisioning and deployment tool, our deployment engine in this case.

If you look at the repo over here, you'll see that we have all these Bicep files. Bicep is a domain-specific language used to describe all the infrastructure that we want to provision, to pass secrets, to configure things. For example, if we go here to the Azure Static Web Apps resource that this Angular application will be deployed to, we can see that we are defining a resource that is going to use this specific API to pass a name, a location, tags, the SKU or tier where this application is going to be deployed, and a set of properties, in this case. Yeah, and we're going to be doing the deployment with the SWA CLI.

We're going to perform this provisioning and deployment using this tool as part of the Azure Developer CLI. Because we already have all these definitions here, we can do this with one single command. Actually, we should first be logged in to Azure, into the Azure Developer CLI.

That's typically done with azd auth login, which will take you through a flow, or prompt you to log in. Then you can go back, just run azd up, and follow the prompts that start appearing in your terminal, and it should be deployed. We're not going to go through the whole deployment, because provisioning in particular takes around 12 minutes in this case, but we're going to go to the portal, where we already have a deployment, and we can see the resources that are going to be deployed: the search

service in Azure Container Apps, the indexer, and then some other additional resources that make this work, like the Container Registry. That's not something you need to worry about; it's going to be provisioned for you. The Azure OpenAI service requires that you have Azure OpenAI credentials, of course. Finally, the Static Web App is over here.

If we go to this resource and click on the URL, we land on the Angular application over here, fully functional. Azure Developer CLI, azd up, is the command, and this is a template that is already part of the azd gallery, where you can go and find all kinds of templates, including JavaScript templates, to get started quickly on Azure with JavaScript. Thank you, Lars, for being with us and for teaching us about signals and about Angular.

Thank you, Joaquin. I love talking about Angular and Azure and AI, so thank you for inviting me. We will always have you on board whenever we have that combo because you are the right person to chat about these things.