Our program has just three parts. In part 1, Cong will be the guide: we will go through the most important things about NotebookLM, including a short hands-on segment of about 5 minutes, which will help us all start from the same perspective and the same experience.
Then we will move on to part 2, guided by Linh and Mai Anh. With the hands-on experience from the beginning, it will be much easier to follow along, especially for those who are new, and much easier to answer questions.
That is why the organizing team wanted to put the practice part first. We will spend about 30 minutes together on part 1, about 40-45 minutes on part 2, and a large block of time on part 3. That is the plan for our program. Now, I really want to introduce NotebookLM with a slightly different approach, to help you grasp and understand it in a very short time. And this is also a gift I would like to send you after the session: I have gathered many materials related to NotebookLM and used NotebookLM itself to store them. In particular, through the registration form, the team and I received more than 100 questions from people about how to use NotebookLM.
All of those questions have detailed answers and have been added to this knowledge base as well. I will use it to create a short introduction to NotebookLM for everyone. And secondly, there is this link: for those who join the program, I will send it as a gift to anyone who really wants to master NotebookLM and get the most out of the experience that users around the world have built up.
I will share this resource publicly so that everyone can use it and explore it. Now, let's listen to an overview of what NotebookLM is, based on its available features. Hi everyone, welcome to our discussion today. We will explore a special AI tool from Google — not Gemini, but NotebookLM.
The key point is that it works like a research center that uses only the documents we provide. Instead of searching the Internet, it digs deeper into the user's own documents — PDF files, web links, YouTube videos, Google Slides, audio files, or pasted text.
Today we will look at it more closely, based on review and tutorial videos, to see what this tool is good for. The core idea here is called grounded AI, meaning AI grounded in your own sources, right? That's right.
Sounds good, but is limiting the AI to our own documents always a good thing? Sometimes we want it to reach beyond them. Ah, that's a pretty important point — a real trade-off.
The biggest advantage is trust. Because the AI uses only the sources we provide, every answer comes with very clear citations. Ah, there are citations. Yes, that helps us verify things easily and avoid hallucinated information, unlike other large models — ChatGPT, for example.
That's right. But it will not add knowledge beyond what we provide. Basically, this is a tool for understanding what we already have, not for exploring the outside.
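The grounded-AI idea the two voices describe can be pictured in a few lines of code: the assistant answers only from user-supplied sources, attaches a citation to each answer, and refuses when nothing matches. This is a toy sketch of the idea only — the word-overlap scoring and the source names are invented for illustration, and it is nothing like NotebookLM's real pipeline.

```python
# Toy sketch of "grounded" question answering: the assistant may only
# draw on user-supplied sources, and every answer carries a citation.

def score(question, passage):
    """Crude relevance score: number of shared lowercase words."""
    return len(set(question.lower().split()) & set(passage.lower().split()))

def grounded_answer(question, sources):
    """Return the best-matching passage plus its source name, or refuse."""
    best = max(sources, key=lambda s: score(question, s["text"]))
    if score(question, best["text"]) == 0:
        # A grounded assistant refuses rather than inventing an answer.
        return "Not found in the provided sources.", None
    return best["text"], best["name"]

# Hypothetical sources, standing in for uploaded documents.
sources = [
    {"name": "phone_review.txt", "text": "The battery lasts two full days in tests."},
    {"name": "travel_notes.txt", "text": "The best season to visit is the dry season."},
]

answer, citation = grounded_answer("How long does the battery last?", sources)
print(answer, "[source:", citation, "]")
```

The refusal branch is the key design point: a grounded tool trades coverage for verifiability, which is exactly the trade-off the two voices just discussed.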
I see. So the focus is on accuracy, and we stay in control of the information. And loading documents is very flexible too, right? It supports all kinds of sources.
Exactly: PDFs, web pages, video, audio, and Google Drive files like Docs and Slides.
It can even understand labeled images in Slides now. Great. After we upload documents — say, reviews of the Honor XC phone, or travel information about Guatemala — we can start talking to them.
Talk to documents? Yes — ask it to summarize, compare data, or answer a specific point. For example, if you ask about the battery of the Honor XC, it answers from the exact review we uploaded and shows clearly where the information came from. There is also a feature for saving notes, which I think is great.
When the AI answers something I find important, I save it and then write my own opinion into it. That sounds a bit like how I normally take notes, for example in Notion. It's a bit different, though.
The writing happens right where I'm interacting with the AI, and it always ties the AI's answers into my notes. Ah, so it's more proactive.
That's right — I build knowledge together with the AI. In addition, there is an automatic mind map.
The AI draws the map itself, connecting the main ideas across the documents. Quite intuitive — it helps me see the connections more easily.
What about the audio function? I see the audio overview is like a mini podcast. That's right.
It creates a discussion between two AI voices about the content of the documents. But the interactive audio is what's really interesting. I heard we can interrupt and ask those two AI voices questions directly.
Does it feel awkward? According to the reviews, it is quite smooth. When I interrupt with a question, the AI voices pause and answer it —
based on the documents, of course. Then they continue their discussion. It turns passive listening into an active conversation.
Like being in a study group. It's great for learning and revising. Exactly.
In addition, there are other study tools, such as FAQs and study guides. Although the AI cites its sources, it can still make mistakes. Ah, so we still have to double-check? Always check against the source — and the quality of our sources is extremely important. That's right.
Garbage in, garbage out. Exactly. The quality of the source decides the quality of the result.
In summary, NotebookLM seems to be a focused and reliable AI research assistant. It does not replace searching for information on the Internet, but it helps us dig deeper and genuinely converse with our own knowledge base. Yes — and with special features like interactive audio and mind maps,
it has the potential to turn dry materials into a truly dynamic knowledge partner. It saves us time processing information; you could call it a second brain. That raises a rather interesting question for today.
If we could actually talk in depth with the knowledge we have collected, would the nature of learning and exploring change? That's a great question. What happens when my personal library is no longer silent, but becomes an active conversation partner?
A story well worth following. So, everyone has a first impression now, right? I will go through a few more minutes of theory before we do the demo.
In general, this tool is very easy to pick up. And don't worry if I go fast — everything is in the "second brain" file we created for you this afternoon.
It will take about 45 minutes to study it, read it, and work through it, and I'm sure you will master the tool — so don't worry about the pace. Now, here is the summary.
This summary is taken straight from the official NotebookLM help and support pages. Essentially, it is a research assistant: you upload your own PDFs, websites, anything at all, and it interacts with you based only on those documents. And what is especially good is its information-processing ability, which is extremely powerful — even for video and images.
In particular, scanned PDF files, which most tools struggle to extract text from, are handled well by this tool. Just the other day, I had a real "wow" experience with it.
I was training more than 2,000 VietinBank staff on AI applications in the business performance department. I demoed for them two of the newest policies related to personal income tax, business tax, and tax evasion — the government's Resolutions 70 and 181. These two resolutions were published on the official government website as scanned documents.
But when I put them into NotebookLM, it read all the content very accurately. That is a great information-processing capability. Next is the workflow of handling the folder and converting it into sources — I will demonstrate that myself later.
To summarize, here is a quick visual overview. NotebookLM, like other AI tools, changes very quickly — its features are updated continuously.
This is a very clear and intuitive comparison between the free version and the Plus version, made by Mr. Luong Minh Thanh in May 2025. There have been some changes since then — it is now July — but it is basically still the same. Here are some notes. Think of it as a notebook: with the free version, I get a maximum of 100 notebooks, and each notebook holds a maximum of 50 sources.
And look at this: each source has a maximum of about 500,000 words or 200 MB. This explains why some of you say, "I put my document in and it reports an error," or "I put my PDF file in and it reports an error": if the file is too heavy or has too many words compared to these limits, it will fail. The free version also allows at most 50 chat questions and at most 3 audio generations per day. As for the Plus version — Cong will use the Plus version in the demo — it is about 90% the same as the free version.
It only has a few additional features; when Cong demos with that version later, you will see the difference. It allows at most 500 notebooks and at most 300 sources per notebook, and it comes with some additional support features. Notably, the automated source discovery is now available in all free versions as well.
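The per-source limits just mentioned explain most upload errors, so it can help to check a file before uploading. A small sketch under the assumptions above — the numbers are hard-coded from the comparison slide and change over time; NotebookLM itself enforces the real limits:

```python
# Sketch: check a source against the free-tier limits described above.
# The limit values are assumptions taken from the comparison slide,
# not from any official API; they change over time.

FREE_TIER = {
    "max_words_per_source": 500_000,
    "max_megabytes_per_source": 200,
}

def check_source(word_count, size_mb, limits=FREE_TIER):
    """Return a list of human-readable problems; an empty list means it fits."""
    problems = []
    if word_count > limits["max_words_per_source"]:
        problems.append(f"too many words: {word_count:,}")
    if size_mb > limits["max_megabytes_per_source"]:
        problems.append(f"file too large: {size_mb} MB")
    return problems

print(check_source(word_count=120_000, size_mb=8))    # fits
print(check_source(word_count=750_000, size_mb=250))  # fails both limits
```

If either check fails, splitting the document into smaller parts before uploading is the usual workaround.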
Especially now there is a phone app, which is extremely convenient. And the summaries, the mind maps, and the multilingual support are very good. So if I have documents in Chinese, Korean, or English, I put them in here and it converts them all to Vietnamese for me. That is a very special point.
For those who are not strong in languages, you can study original documents from China or from any country whose language NotebookLM supports. That is the general introduction. To make it clearer for everyone: what is the difference between NotebookLM and ChatGPT or Gemini? ChatGPT or Gemini is like a giant library of the world — a library of all the world's knowledge.
We ask it what it knows, and it answers relatively generally, based on global data. That is the chatbot approach. NotebookLM is a bit different.
It's like we are playing detective: we look very deeply into the documents,
into the specific information of a specific case, connecting things together based on what we have. The approach is very different — it only knows what we provide to it, which makes it completely distinct from the other tools. And regarding building a second brain: it struck me why so many people describe it as a second brain — because it meets three important criteria.
First is storing information — capturing it into the system in a way you can easily retrieve and understand. It can store almost any kind of information, but it converts everything into text: audio, video, everything is processed as text content and saved.
It keeps all the sources together. Second — and this is the most powerful thing about NotebookLM — it connects everything, linking all the information we put in, so that when we ask questions, the value we get is very high. And third, from that stored, connected knowledge, it can create many things for us.
That's an interesting point. So why is it called a second brain? Let me give an example — the one I mentioned just now, training the banking team. I created a notebook to support them in selling the iPad product.
Most of the documents in it introduce and analyze the product's features, but I added another document with sales techniques for that product. Immediately I had a knowledge base full of product information plus a document on how to sell it, so we can use it to generate guidance that applies those sales techniques to sell the product effectively. In other words, we can combine different kinds of material — product information, policies, sales skills — into one knowledge base, and when we exploit it, we can draw on all of that value in many effective ways.
That is the strength of NotebookLM compared with other tools. All right — now let's do the demo. Are you ready? I will spend about 10 minutes demonstrating, and after that we will all open NotebookLM
to follow along with the instructions. This is also a gift I would like to give everyone: we will build a knowledge base on learning how to learn, because that is the king skill of this era.
If we can master this skill, then whatever changes life throws at us, we will face and handle them very calmly. So I chose the topic "learning how to learn" so that we can build a knowledge base on it, gathering the most valuable materials in the world and the YouTube channels where practitioners and experts share their learning methods. We will build a notebook like this so that we can talk to the documents, learn from them, and master learning skills 20 times faster. Of course, this is a gift for everyone, so rest assured.
Here is the product I will create. It will look like this: it will have documents from YouTube, from PDF files, and from various other sources, and I will use NotebookLM's features to exploit them. That's the plan. Now, I have already gathered some data sources.
I quickly collected videos from world experts on learning how to learn on YouTube, detailed articles about study methods, PDF documents uploaded online, and even plain text content. Let's go through these sources together. I will guide you from start to finish so everyone can observe this part. For the next 5 minutes, don't follow along yet.
Just focus on the screen, and we will go step by step. Of course, for those who have already used NotebookLM, this part will be very easy, very simple. But for those who have never used it, this part is very important, so let's look at it together.
Let's watch the walkthrough first; afterwards we will ask and answer questions together and observe the real use cases from the two speakers. It will be very easy. OK, let's get started.
Don't do anything yet — there will be time later for everyone to do it themselves. This is the NotebookLM interface when you log in.
When I log in with my Google account — I am using the Pro version, but rest assured, it is basically the same as the free version — I land on my homepage. There is something very interesting here: there are many featured notebooks from experts around the world. They are public, and I can open them — notebooks on the secrets of longevity, on economics, and more. These are ready-made: other people assembled the sources and shared them, so I can go in and explore, like walking into an open house. The section below is my own notebooks. I'm using a few accounts, and this one doesn't have many sources yet, so now I will create a notebook. Say my topic is learning how to learn, and I want to gather all the information about it in one notebook. Step 1 is to click Create new notebook. I will go through each step slowly; watch carefully, and then we will practice. When I click Create new, a screen immediately appears asking for the sources I want to add. The great thing is that I can add them from many different places: uploading PDFs, text files, Markdown files, audio files, MP3s.
I can download a file and drop it in here. For example, I have a summary with a detailed content analysis of Ms. Barbara Oakley's online course Learning How to Learn on Coursera — I will copy that in shortly. Then here you can add Google documents — Google Docs, Google Slides — and all these website and YouTube links. Of course, there will be cases where a link fails; that is due to the website's setup — they may not allow data to be pulled from it, or pulling it would violate
their rules. But most official websites work fine. For example, when I choose Website, a box appears to paste the URL; you paste the whole address here and it is immediately added to the system. You can also paste copied text directly.
With the free version you get up to 50 sources per notebook, and with the paid version up to 300. Another very interesting thing about sources: sometimes I don't know where to find material. There is a feature called Discover sources, which automatically searches online for documents related to the keyword I want. For example, suppose I don't know where to find material on Learning How to Learn. I open Discover sources and type "Learning How to Learn". It searches the Internet very quickly and surfaces at least 10 resources for me to choose from. That said, I don't use this method very often.
The reason is that sometimes the documents it finds aren't good enough, so I pick and choose which to keep or remove. If I already know the content I want, I just click through to check whether a suggested source is really good; if it isn't, I leave it out. That is how you use source discovery.
OK, now we are inside the main notebook interface. On the left is the collection of all the original documents I added from every source — YouTube links, PDF files, MP3s. The middle is the chat panel, where I talk to the documents to use them. And the Studio panel is where I can create my personal notes.
It also offers many features: generating podcast-style summaries, study guides, even mind maps. Now Cong will demo adding sources and then exploiting the content inside. After this demo, Cong will pause for 1-2 minutes for questions.
After that, we will all do it together. I have the information prepared here, and I will start copying. For example, I have about 10 YouTube videos of experts talking about effective learning.
These are quite popular videos on YouTube. I click Add source and choose Website — even for YouTube links, the website option works.
I copy a link, paste it, and confirm. Immediately, all the YouTube sources I add appear here — very fast. Next, I want to add this article, a very famous one about effective studying. I copy its link, go back to Add source, choose Website, paste it in, and it immediately adds the file here. Next, I have a PDF of the detailed Learning How to Learn material by Barbara Oakley on Coursera.
I click to upload the PDF from my computer. You haven't seen this file yet, right? That's fine —
I'll show you. There, I have just uploaded the PDF: Learning How to Learn. Next,
I have a Google Docs file. I used Gemini's Deep Research to survey many websites and combine the most effective learning methods and AI applications for working people. I will add that file: I choose Google documents
and pick the file I just created — here, the learning methods document. Click, done. So if you have a lot of your own resources on Google Drive, Google Slides, or Google Docs, you can use this option. And suppose I find an article so good that I just want to copy its text in: I open the article, select and copy it with Ctrl+C, come back to Add source, choose the Copied text option, and paste it here.
Altogether this is like 10 excellent books' worth of material on Learning How to Learn. So now we have a very good data store on the topic. Now imagine: this tool lets me interact with all of these documents at once, or I can choose just one of them to interact with. For example, if I don't want to work with the whole collection, I just tick this one video, and then all the exchanges and all the notes I create relate only to that source.
If I want to use everything, I select the whole set. As I said, the free version allows 50 sources in one notebook, and each YouTube link counts as one source.
Exactly. Now, it offers these built-in features: even without selecting everything, I can generate an audio summary — that podcast file — with one click; it can create a mind map combining all this content, also with one click; and over here the study guide materials are generated automatically for me. Let me click and see what we get.
And in this panel I can start chatting, interacting with all these sources. For example, I have a series of prompts I can draw on — say: "Based on all the documents provided, identify the five most important golden principles of learning." I can write the prompt myself, and this prompt file will also be sent to you.
You can try using it to explore your own notebooks. So I paste the prompt here, and it starts returning results. The biggest special feature of NotebookLM is that whenever it gives a piece of information, it attaches the specific source so I can check it.
Always verify: for example, here under "active and focused study" it points to citation number 2. When I click that 2, it immediately shows the exact document — this video, "how to learn anything fast" — and exactly where in that video the information comes from. That is the strength of the software: it exploits only the information we provide. I can chat with it as much as I like, but note that the conversation is temporary: every time I refresh the page or come back later, the middle chat panel is cleared. So if an exchange is really good and you want to keep it, you must click Save to note; it is then stored in the notes panel and kept intact — I can close NotebookLM, reopen it, and the saved notes are still all there, even though the chat itself is gone. So whenever I extract good information, I save it here. And if a note is really good, I can even convert it into a source: simply click to turn that note into a source of information here.
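The citation behaviour just demonstrated — clicking marker 2 jumps to source 2 — can be pictured as a simple lookup from marker numbers to sources. A toy sketch, with made-up source names and answer text, not NotebookLM's actual mechanism:

```python
import re

# Toy model of inline citations: an answer contains numeric [n] markers,
# and each marker resolves back to one of the uploaded sources.
# The sources and the answer text below are invented for illustration.

sources = {
    1: "Barbara Oakley - Learning How to Learn (PDF)",
    2: "How to Learn Anything Fast (YouTube)",
}

answer = ("Active, focused study beats passive rereading [2], "
          "and spaced repetition helps retention [1].")

def resolve_citations(text, sources):
    """Map each [n] marker found in the answer to the source it points at."""
    markers = re.findall(r"\[(\d+)\]", text)
    return {int(m): sources[int(m)] for m in markers}

for number, source in sorted(resolve_citations(answer, sources).items()):
    print(f"[{number}] -> {source}")
```

The point of the exercise: because every claim carries a marker that resolves to a concrete source, a reader can always trace an answer back and verify it, which is the trust property described earlier.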
In general, it handles information in a very interesting way. OK, that's the method — we will practice it later.
Now I will demo this part. When I generate the mind map, it combines everything based on the documents I provided, and I can start to see the big picture: the core learning principles, and what else?
Look — active recall. I open that branch and find it very interesting; I want to study it further. Very interesting, right?
Now watch — I'll show you. If you click on any box in the mind map, it immediately processes the information from the original source documents and gives you a lot of detail about that box. Note that this can take a moment to process, everyone.
When you click a box, it immediately does a focused dive into that specific topic — active recall, in this case. And when we want to trace back to the original source, we click the citation number written there to see exactly which source it is stored in. This is how we exploit our NotebookLM.
There is one more feature before we move on: sharing. The happy news is that you can share your NotebookLM notebook with others by clicking the Share button at the top. Then, under access settings, I can choose "anyone with the link".
Or, if I don't want to send it to everyone but only to an internal team or certain people, I choose restricted access and type the email addresses of the specific people I want, then add them. And with the paid Pro account, I can decide whether users get full access or chat-only access — meaning they cannot see the information sources. That is a level of control the paid tier adds over the free one.
Without the paid tier, when I share, everyone sees all the information sources and all my notes. That's it — creating a notebook is very simple. First, gather a store of documents and information in one place; then click Create new notebook; then add all the sources. Once the sources are in, we have the chat panel to converse in and the notes panel to save things — that is the essence of NotebookLM. Next, I would like to invite Ms. Linh to share the real use cases she has applied in her work, her research, and her program design, so we get a more practical perspective from a professional.
Please go ahead, Linh. Thank you, Cong. First of all, hello everyone. Let me briefly introduce myself for those who joined later. My name is Linh, and I currently work in a Design and Experience role at United World College of South East Asia in Singapore.
In my job, having worked in a training department, I deal with a lot of training materials, including internal ones, and I also have to research education technology and current trends in education. Using NotebookLM has helped me enormously in my research and teaching work.
I don't know if anyone here is like me, but we usually save documents all over the place: in the photo album, in phone notes, in Facebook's saved items, in Drive. Our knowledge ends up scattered across many places. When I tried NotebookLM about a year ago, it was truly a game changer — a tool that changes how you save documents. Now I will share the practical use cases where I have applied NotebookLM and found it very effective.
I see NotebookLM as a knowledge companion and a creative assistant. At the beginning, Mr. Cong already introduced everyone to NotebookLM's functions: collecting knowledge, connecting pieces of knowledge — even quite abstract ones — so that we gain new insights, and creating new knowledge from the documents we have.
So how did I apply these functions? The first is the knowledge companion. This is a suggestion I got from an AI group on Facebook — if I remember correctly, the group is called Binh Danh Hoc AI. It is a very good approach.
Because I am a learning experience designer and a content creator, my materials are scattered across many places. So I created an "everything notebook". I don't worry too much about classification; I just keep dumping information into it:
all my documents — videos, YouTube links, statistics, quotes, themes, and so on. I just keep pouring them in. I call it my everything notebook.
And how does it help me produce content? For example, suppose I need to write a piece about love.
I search my library for quotes or viewpoints about love, happiness, or psychology, and it cites those sources back to me. It solves the problem of knowing I have a really good document somewhere but not being able to find it anywhere.
There is also a way — a good one, I think — to use the everything notebook to reflect on yourself. After using it for a long time, once you have uploaded enough documents, notes, and thoughts,
you can reflect on yourself by asking the notebook questions. For example: is there anything that keeps repeating in my notes? Are there words I often use about myself or my life values?
This is a way to practice reflection and increase your metacognition — thinking about your own thinking. That is one way I use the everything notebook.
I put a database of myself into it; it's like a scan of my own mind, a record of how I think.
And because I uploaded it all here, it also reduces hallucination — the AI inventing content — since it draws only on what I provided.
It also means my documents are no longer scattered everywhere. That was my first use case. The second use case is the topic-specific notebook. As I said at the start, I am an educational researcher,
and I have to do a lot of research and paper work. For example, last month I had to prepare slides for workshops on learning experience design in the AI era. I had a lot of material — scientific articles, Substack posts, YouTube, podcasts — close to 50 sources to go through. So how could I synthesize the knowledge from all of that?
I created a NotebookLM notebook. After the workshop, I can share it with you; it holds a lot of information on applying AI in learning design. From this source material I use the Mind Map function, as Mr. Cong showed you, and I use the chat function to ask, answer, and compare across materials.
By tracking and processing the documents in such a comprehensive way, I could build training slides from general information and frameworks far more complete than any single source. That is another way to use NotebookLM. Remember the first method — a very general notebook where we put everything?
This method is the opposite: create NotebookLM notebooks for specific topics. For example, I have a notebook on health and a notebook on psychology. You can create deep, focused notebooks like this to study a topic more thoroughly. That is another way I have applied NotebookLM.
Furthermore — beyond building a knowledge base for myself — I can use NotebookLM to build a collective knowledge base. For example, when I worked in the training department, there were many internal training materials, and every time a new member joined the team — we had a new intern every 6 months — they had to dig through the old folders to find the slides, checklists, and instructions.
Every time, that took a lot of effort. And when an old member left, there was a real risk of losing the material with them. My solution was to build an internal knowledge base for the team:
training topics such as business culture and AI automation, checklists for internal training events, and a training library. Even someone brand new can come here, study, and chat with these documents to understand the processes better — and trace every answer back to its specific source.
Another way to apply this: if you work in HR, under the Human Resources department, you can create NotebookLM notebooks of the documents everyone needs — processes such as paperwork procedures or insurance, the topics people most often bring to the HR department. Once you create a notebook like that, people in other departments can use it too. That is how I build a collective knowledge base in my department and my company, and it is much easier than everyone keeping their own documents, each in their own place.
This is a case that I use. To summarize: the way I build a knowledge system with NotebookLM is to collect and store knowledge from different sources of information, such as PDFs, websites, and YouTube. I practice extracting and finding information with it.
Because when we ask something, it points back to our original data. That's how it helps us research knowledge. Next is knowledge connection. Say we have 50 different documents on how to use AI to automate data.
It will help us get a more comprehensive method for automating data quickly, effectively, and more accurately. Supporting research, as I just said: when I research a topic, I have several ways to approach it. For example, using Mind Map, Audio Overview, or Study Guide.
Those are the features that NotebookLM supports. And what's next? In addition to being useful for me, I can create a knowledge source that I can share with the community.
Why? Because life is too short to learn alone. And when there is a collective contribution, we will always learn more from each other. These are some ways that you can build a knowledge system with NotebookLM. Pause a bit, everyone is still here, right?
Pause a bit, okay, thank you. Thank you, Ms. Nhan. Okay, let's move on. So, those are some of the use cases where I use the features of NotebookLM as a knowledge companion. The second part is content creation.
So, how do I use NotebookLM to support myself there? One use case is that I needed to produce an e-learning course on improving efficiency with Gemini. I used NotebookLM to tailor the materials for learners at different levels. That means you can upload documents to NotebookLM and ask it to produce the corresponding materials for each level.
A level for beginners, a level for those who already have the knowledge from the previous level, an intermediate level, for example. Or you can ask Gemini, sorry, you can ask NotebookLM to create a test, to create questions based on the documents that you have.
That's a way to limit hallucination. And next, still for this course on improving efficiency with Gemini: I didn't just prepare reading materials, I also created a small podcast. I used the Audio Overview tool for the podcast that I just shared. I don't know if I shared...
Let's double check again. Okay. This is a short audio clip that we produced. ...explore how generative AI, specifically Gemini in Google Workspace, is changing the way we work.
Yes, that's right. The documents we looked through emphasized that it is not just a tool, but like a collaborator. That's right, a collaborator.
But how to talk to it effectively? The keyword is in the prompt command, right? Correct. The prompt quality decides a lot.
It's like assigning work to someone else. The clearer, the better. Oh, I see.
I heard you mentioned the PTCF frame. PTCF. Sounds a bit strange.
Yes, PTCF. It is an acronym for four elements. Persona, that is, the role. Task, the specific task.
Context, the background information. And Format, the output format. Uhm. Maybe after this session, I can share this NotebookLM notebook with everyone. It has a lot of documents related to improving the efficiency of working with Gemini. You can see that I asked the notebook to create a conversation script between a student and an AI on the subject of processing data and creating reports.
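As an illustration only (not from the session materials), the PTCF elements discussed in the podcast clip can be assembled into a simple prompt template. The function name and field values below are hypothetical:

```python
# Hypothetical sketch of a PTCF-style prompt (Persona, Task, Context, Format).
# All field values are illustrative examples, not from the workshop.

def build_ptcf_prompt(persona: str, task: str, context: str, fmt: str) -> str:
    """Combine the four PTCF elements into one prompt string."""
    return (
        f"Persona: {persona}\n"
        f"Task: {task}\n"
        f"Context: {context}\n"
        f"Format: {fmt}"
    )

prompt = build_ptcf_prompt(
    persona="You are an internal trainer at a software company.",
    task="Summarize the attached onboarding checklist for new interns.",
    context="The audience is interns joining the training team next week.",
    fmt="A bullet list of at most five items.",
)
print(prompt)
```

The point of the frame is simply that stating all four elements explicitly tends to produce more controllable answers than a bare question.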
And I can take the script that NotebookLM created and put it into another tool to create a podcast for myself, for example. That is how I use NotebookLM to produce materials and lessons from content whose quality I control.
That is the important thing. Because sometimes when I use ChatGPT or other tools, I am a little worried about whether I can control the quality of the output and whether it stays close to the documents I have provided, because many times I use ChatGPT and it still hallucinates, meaning it still makes up information. Another example is an approach people can consider: using NotebookLM as a chatbot for students to ask questions in their course.
It's still the course on using Gemini. I didn't want people to only read slides or watch videos, because with e-learning, people drop off a lot when it is not interactive and the students don't have the feeling that someone is supporting them.
I tried, in that course, to turn NotebookLM into a chatbot. That is, I uploaded all the course documents: slides, transcripts, videos. After that, I asked the students to ask questions directly in the notebook.
That is, I instructed them on what questions they could ask. The chatbot notebook I created is called PromptPal, which means Prompt Assistant. What is AI?
How do I make a prompt effective? And my course received very positive feedback because people really liked this chatbot tool. It feels like it increased the interaction a lot.
And everyone, that's also a way for me to introduce NotebookLM to everyone, that is, to increase recognition for this tool. That is an example of how I turned NotebookLM into a chatbot. For content creators, there are many ways NotebookLM can support your work in creating content of many different styles. However, whenever I talk about tools, the first thing I always note is wrong information.
That is, NotebookLM is still capable of hallucinating. If you ask confusing questions, or your sources are not fully prepared, the AI is still able to guess. This is something I have experienced.
I uploaded a company report and asked it to analyze the report, and NotebookLM was still able to produce information that was not necessarily taken from the report, to misinterpret the report. There is also data bias. Because NotebookLM mainly operates on the data we provide, if the data is full of bias or discriminatory information, NotebookLM will still reproduce or magnify those biases. Therefore, as the data owner, I have to understand that the responsibility of verifying and vetting information remains with me. I cannot rely on NotebookLM or AI 100%.
Secondly, because I know that there will be students using NotebookLM, I feel that I have to warn them. Using tools to support learning also brings certain risks. For example, there is a phenomenon that Paul Kirschner and colleagues have talked about: fake learning, which means using AI tools such as ChatGPT or NotebookLM to do our learning tasks for us. People can be mistaken: for example, you generate a mind map and you think that you really know the topic, but in fact you have just read it. You don't really understand it deeply, you don't actively recall, you don't actively practice to understand it deeply. That is a risk that can come with overusing any tool, along with the trap of metacognitive laziness. It's the same for me: when I do scientific research, I absolutely have to read the research myself first; I have to read it and understand my own research before anything else.
You can't just upload a document you have never read, ask NotebookLM to summarize it, and then think you understand the document. It's not like that.
So we need to use these tools in a very responsible way. That is, we have to take responsibility when using such tools. These are some of my use cases,
some suggestions and some notes. Linh has shared a very important message: we should not put blind trust in the tools. In the first part, everything sounded very positive and breakthrough about the tools, but we have to look at both sides, the actual application and the caveats. Now that we have heard from a trainer, a person who designs learning programs, I think many of you who work in training can relate, right? And I believe many people will find it useful.
Next, from the perspective of a product maker, someone who does very specialized and extremely deep research: how did you use this AI tool? How do you use NotebookLM?
After this, Mai Anh will share more with everyone. Hello everyone, I'm Mai Anh, born in 1993. Today I'm very happy and honored to have the opportunity to sit with Mr. Cong and Mr. Linh here and participate in this workshop this Saturday morning. I am a person who likes technology, data and AI.
I also joined the gen AI wave quite early. I have been a fan of NotebookLM since about a year and a half ago, before it was released to our country. Back then, you had to switch your region to the US to use it, because I really liked the approach of this product.
Because when I first started using AI, I uploaded documents and ran into the problem of the AI hallucinating, when I just wanted it to stay within my data sources. At that time, the AI was not good at that. NotebookLM started as one of Google's 20% lab projects. That means each person at Google has 20% of their time to create a new product. It was a Labs project, and after 2 years of development, it became a very viral and famous tool that researchers can't miss.
I think the philosophy of NotebookLM is very good. It's really interesting. The idea is that the user, that is, we, will be the one who controls the input documents, and we manage that knowledge ourselves. With other tools, the tool will add its own additional data sources, so it will have to deal with errors.
That's the approach and philosophy of NotebookLM. Over the last 2 years, I have used NotebookLM almost daily for reading documents. Because I work in AI, I have done some research to answer a few questions about how it works. Before I start sharing some personal use cases, let me talk a little bit about how this technology works: why it is different from other tools, why it can handle a lot of data, and why we can't do the same in ChatGPT.
I don't know if anyone is curious about that. Has anyone tried uploading a book to ChatGPT to ask questions, then uploading it to NotebookLM and seeing the difference? I think it's worth spending some time to dig deeper into this. When I understand the nature of it, I can create and use it better, not just stay on the surface.
Linh, you can turn on the mic. How do you feel? In terms of data, I think NotebookLM has more advantages. For example, with the same amount of data, NotebookLM will pick it up more comprehensively. If you upload the same amount of data to ChatGPT, some data will be omitted.
Okay, correct. This is because of the difference in the underlying technology of these two tools. With a usual chatbot, when you upload documents, it processes them by adding that content into the context. When we chat with an AI model, it has a context length limit for the input. As you can see, as models are updated, they go from a 128k context length to a 1 million context length. It can read a whole book.
The trend is that the newer the models, the bigger the input they accept. The way normal chatbots operate is that when you upload documents, they divide the documents into smaller chunks. When you ask a question, they find the chunks most relevant to the question, pick the top 5 or 10, and answer from those.
That's why, when the answer to your question lives in more chunks than the top 5 or 10, the rest is excluded by that limit. That's why when people upload a book to ChatGPT and ask questions, information may be left out. But with NotebookLM, the way it works is different.
When we upload the documents, all the content is cut into small pieces and put into an index. It's like a library that connects everything together. It builds a guide for all the content we upload.
It forms a kind of map, so that when we ask any question, it follows that map to make sure nothing is left out. That's the different way of working. With a normal chatbot, every time we ask a new question, it does the whole thing again.
It goes back and forth: it takes all the documents we put in and lists the closest similarities. And information can be dropped if what we need is spread across too many places. That's also the reason why, if you ask questions in ChatGPT, it can only point you to the source link.
But with NotebookLM, it can point to each text, each small section where the answer appears. That's because NotebookLM did this part very carefully. It divided, cut, and indexed the text, and turned everything into a structured format, so that later it can be analyzed as if it were arranged into a library.
There are categories and labels for each section, each piece of content. Everything is organized, and when you ask, it can follow the library to find the information you need. That's the basic way it operates. So, for example, a very simple use case:
If you want to quickly summarize one or two articles, you can use ChatGPT directly, because the model's context length can handle that. But for anything more than that, we should use NotebookLM. If it's at the level of a whole book, we should use NotebookLM.
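To make the chunk-and-retrieve idea above concrete, here is a minimal sketch of top-k retrieval. It uses a toy word-overlap score as a stand-in for the embedding similarity real systems use, and the function names are illustrative, not NotebookLM's actual implementation:

```python
# Toy sketch of the "split into chunks, retrieve top-k" pattern described above.
# Real chatbots score chunks with embedding vectors; word overlap stands in here.

def split_into_chunks(text: str, chunk_size: int = 20) -> list[str]:
    """Split a document into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + chunk_size]) for i in range(0, len(words), chunk_size)]

def retrieve_top_k(chunks: list[str], question: str, k: int = 5) -> list[str]:
    """Score each chunk by word overlap with the question, keep the top k."""
    q_words = set(question.lower().split())
    scored = [(len(q_words & set(c.lower().split())), c) for c in chunks]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c for score, c in scored[:k] if score > 0]

doc = ("notebooklm builds an index over every source "
       "a plain chatbot only keeps the top chunks "
       "so answers spread across many chunks can be missed")
chunks = split_into_chunks(doc, chunk_size=8)
print(retrieve_top_k(chunks, "why are answers missed by the chatbot", k=2))
```

Whatever falls outside the top k never reaches the model, which is exactly the "information left out of a whole book" effect described above.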
I think that's one of the basic differences. When you understand this part, you can take the initiative and make the most of the use cases. I also had a question about how AI understands things.
Actually, AI doesn't understand the way humans do. It learns from inputs and outputs: it learns that with this input, the output will be like that. It learns that input A maps to output B from those data sets.
Then, from a lot of data, it gradually becomes able to produce, for a question like this, the closest and most similar information. That's how it was trained, not understanding like a human. However, AI has been trained on the internet's data and a lot more data in the past 2 years, and now I think AI has quite a good level of understanding. Now I will share some use cases that I use.
I am a product manager for AI. My main job is to study and learn new things.
These are some of my use cases. I will have a quick demo later. My first use case is quick reading to get the gist of a book, especially academic articles. I have to read a lot of papers.
And some of the prompts I use are about making things easy to understand, like "explain like I'm five", especially when I read papers that are quite hard to understand. I think NotebookLM handles that quite well. NotebookLM helps me transform the content into a format that I want, or into a format for newcomers.
This is a prompt not only for you but also for myself; I often ask questions like: explain this to a newcomer, explain this to a child. Or quickly get insights from podcast transcripts or YouTube links.
Because now I have too many things to read in a week. And I don't have enough time to get all of that, so I need to quickly get information. Oh.
Next, we have use cases related to synthesizing insights and content from many different documents. Here are some use cases that I often use. Earlier, one of you also presented this use case about customer feedback surveys.
This is one of the ways I often use it, because I think it can handle a large amount of data. For example, I have a data set of several thousand surveys or several thousand pieces of feedback on a product. If we put that data in ChatGPT, it won't be synthesized well. If we put in a few hundred lines, it will be okay:
it can give the top product pain points and the top product bugs. But if we put in a few thousand of those data lines, the approach is different. If we put it in NotebookLM, it does quite well. It can synthesize the feedback about the product, the features, the bugs that are reported the most, and rank the terms mentioned the most. It does quite okay.
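The kind of "most mentioned" rollup described here can be approximated offline in a few lines, assuming the feedback already sits in a list of plain strings. This is a toy stand-in for the idea, not what NotebookLM does internally:

```python
# Toy frequency rollup over feedback lines: count keyword mentions
# and surface the most-mentioned ones, mimicking a "top pain points" summary.
from collections import Counter

feedback = [
    "app crashes on login",
    "login screen is slow",
    "crashes when uploading photos",
    "great design but login fails",
]

stopwords = {"on", "is", "when", "but", "the", "a"}
counts = Counter(
    word
    for line in feedback
    for word in line.lower().split()
    if word not in stopwords
)
print(counts.most_common(3))
```

Running this on the four sample lines ranks "login" first, which is the same shape of answer ("what is reported the most") that the speaker asks NotebookLM for at scale.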
In general, when you have to do a lot of research, for example competitor research or research on a topic, you collect all the information about a company or a product and put it all in a notebook. Then you can ask questions like: lay out the development process as milestones. I can even drop in the user manual files or the update changelogs.
I drop them all in and ask it to combine them into a milestone timeline. That's a very useful use case. Or you can add several different products and ask it to present them in the form of a table.
You can immediately compare the products you are interested in: when each will be released, when it raised funding, what new features it will release. The next use case is summarizing meeting records or transcripts over time.
For example, you have daily or weekly meetings. And after a few weeks or months, you want to wrap up and summarize the progress made. That's a use case.
I add the transcript for each day. Then NotebookLM summarizes the progress in a timeline, and I can see how the progress went. A recent use case was quite interesting.
For example, at a recent meeting, I had an offline session, so I recorded the whole meeting. Then I uploaded the MP3 file along with the meeting notes, and shared it with everyone who couldn't attend the meeting.
That was a use case. Those are some of the use cases that I often use. There are also some more tangential use cases, such as researching information about a person. There are some interesting use cases that people share.
The worldwide community often shares this: they search the Facebook or Twitter of someone, copy all of their posts, put them into NotebookLM, and ask it to build a profile of that person,
or to surface some interesting points, to understand that person, to understand their insights. Those are some use cases. Also, Steven Johnson is the father of NotebookLM. If you look him up, you'll find he has a pretty good Substack.
He is someone deeply invested in building a tool that supports the flow of thinking. He is a writer and a researcher. He realized that the pain point in writing and researching is that a lot of time is lost switching between tabs to find documents.
That's why he designed NotebookLM the way he did: to help people maintain their flow of thought so they don't waste time switching tabs. You can immediately ask questions, find the information you need, and verify it very quickly.
And then take notes on it. This whole flow, I think, is the key point of this product. You can follow his Twitter or Substack; he has also shared some use cases. For example, he fed in all of his own research documents from the past 20 years to make it his assistant, so that when he needs to find something, he can find it again. This way, everyone can build their own personal knowledge base. I also keep a reading list of books and articles that I like in one place.
And sometimes I go in and ask questions. Or when I encounter a new problem, I ask about it and it searches for me: okay, with this problem of mine, in my knowledge box, what could be the explanation? He also built an "everything notebook", throwing in his entire writing history,
and he asks questions, finds connections between ideas, or finds solutions for new problems. And it will find them and explain why. Some other interesting use cases are shared on social media. For example, one father built a hub to help with raising children: he put in all the guides related to raising children, and when there is a question, he asks it there. Or another use case is tracking your personal journey.
For example, if you are working toward something, it's like the use case of feeding in meeting notes over time. When you have updates about your personal life, or you need to check your progress, you can create a notebook and add your progress daily or over time. Then later on, it can report the data back in a timeline mode. Those are some of my favorite use cases. I'm also a data person, so I like to look at data trends. I like to put data in in a mixed way and have it come back to me in different time frames, so I can see how things change over time. That's it. Before I go into the demo, I would like to share some tips.
My view is that the philosophy of this product is to help us exploit and dig deeper into the data that we put in. Therefore, the choice of data to store is very important. Either you choose data from reliable sources, or the data is messy and complicated but large enough in number that it can find the common points.
That's how I usually do it. So selecting the data first is very important. Then, ask questions actively to explore in multiple directions. Here you can see NotebookLM.
When you open a notebook, it always suggests follow-up questions. Usually, when you add your data here,
NotebookLM summarizes what all of my documents are about and lists some suggested questions below. Where do the questions come from? The AI goes through all of this data and surfaces the most-mentioned information. Following these suggestions is very fast; it helps me immediately ask questions, but it also leads me by the nose.
It's easy to get biased and go too deep into one topic. If you need to understand the document in a broader, more comprehensive way, you have to ask your own questions. First: what are these documents about?
Usually, you can ask it to summarize all the information these documents contain. For example, when you put a podcast in, it also suggests some questions, but those are just the main topics. If you follow that thread, it keeps suggesting the next questions. You will notice that when you chat, it suggests questions related to the previous question, so there is a risk that you go deep into one topic and skip the rest: the document may contain other insights. So it depends on what you want. There's a use case where, if you upload a document and you need to get to exactly one issue, you go straight to that issue.
However, if you have a document and you want to cover all of it, you usually have to ask your own questions. Usually simple questions, like: summarize this.
I have an example here. Let's see. I'll try to give an example. This is a book. I will summarize a book.
With this book, I start from the basics: first I ask for the summary, or what the dimensions of this book are. Then I have a list, and from that list I pick the part I want to go into and ask questions about that part. I think that is a note for me and for everyone: when I get to know a book, first I ask for the summary, then I know the overview. That helps us avoid the issues above, so that we can explore all aspects of a particular topic. Here, I will show you an example.
This is an example: an app from my friend's company. The use case is quite simple: they need to synthesize the reviews to see what next week's work should be. The solution is quite simple.
Every week, the product team just needs to go here and copy the reviews quickly. Actually, copying is very fast; sometimes you don't even have to spend time crawling. You copy all the reviews there and put them in NotebookLM. The next step is to combine different types of sources.
For example, I will combine feedback and surveys. This is a task that NotebookLM does quite well. You can also use this method with your own data.
You not only get feedback from here, but also run surveys. For example, Mr. Cong has a survey for today; I can also try putting this survey in here. And then, let me run it. This use case is hard to replicate by uploading to ChatGPT, because it will be handled differently. And here, I have the insights. You can see them here. You can also turn them into a table so that it's easier to see. You can see it here.
I think that in less than a minute, we can have insights from a series of reviews, feedbacks, and surveys. That's also the goal when collecting feedback: we need to focus on the most-mentioned items, right? That's a use case I often use. For example, I copied and pasted all the questions about NotebookLM. The way to do it is the same: I put them together.
Combine the questions about NotebookLM. I personally think the form of the prompt is not that important in terms of grammar; what matters is that you are aware of what you are asking. There was a question earlier about how to answer generally first and then go deeper. One way is to click on the source and read the details, or ask a further question about it.
For example, ask it to go deeper. It may only surface a few points at first, so you ask it to dig into one of them. You can also use it to analyze data. I will quickly show you one thing: I collected all of a person's Facebook posts from the past 2 years.
I will try to draw a picture of the person I am looking at. Here are some interesting things you can create with this tool. When you put all the posts in, the profile it builds is completely based on those posts. You can use this use case with other people.
When you research someone, for example when you research a product, you research the founders: you take their interviews or blog posts, put them in, and ask questions like this. I think those are some pretty creative use cases that people can easily adapt.
Okay, let me step back a bit. I think the important tips are to choose the right data and to actively ask questions, so that we control the dialogue with the AI. We shouldn't follow the suggested follow-up questions too far. They are good and okay, but they can make us go deeper into one detail while we lose the overall picture.
It depends on what we want. If we put together a document and we just want to find specific information, then that's okay. But if we want to understand more broadly and understand all the details, then we need to be more active in asking questions.
I still keep the habit of reading long form. I use NotebookLM to skim quickly, to understand quickly. But when there's a part that I really like, I still read the details.
For example, I often use NotebookLM to capture the insights from a podcast. I realized that for some parts, reading or listening to the original words is better than reading the synthesized summary, because the summary loses the individual voice of the speaker.
Because it's a kind of conversation, when it's combined into a synthesis, it loses its individuality. Or I have to put in prompts like: quote the person's exact words. But that's just me; the details are better if that's what you really care about.
And the most important thing is that you should combine it flexibly with other tools, by understanding the strengths and weaknesses of each tool. What NotebookLM does very well is synthesis: extracting information from the initial data sources that we control, and transforming it into different formats. But it may not be good at writing. It's not good at writing, so if you need to compile information and then write an article, NotebookLM can only help you up to synthesizing and exporting the information.
Then you save those notes and bring them to a chatbot, for example ChatGPT with model 4.5, to do the writing. Or use Gemini or Grok to write, because at this point NotebookLM is not strong at writing. Or you can use an external Deep Research tool to collect the initial sources.
I used Perplexity to collect the sources and then brought them into NotebookLM to work with them. I think the key is to use each tool in the right place, for the right task. For other tasks, I use other tools.
Those are some of my use cases and personal notes.