Transcript for:
Overview of GPT-5 Developments

Sam Altman teased GPT-4.5 weeks before its public release, and he promised to deliver GPT-5 a few months after that. It's shaping up to be the biggest, grandest update we've ever seen. By the end of this video, you will know exactly what GPT-5 will do differently, how it's going to change the way we all use ChatGPT, and most importantly, what its biggest potential problem might be.

On February 12th, Sam posted a detailed update on X about these two models. This was a huge deal, because users had noticed a strange split in the lineup: some models were fast and light, while others were slower and seemed to think. OpenAI hadn't really explained why, and Sam's post changed that. He admitted that their product lineup was too complicated and promised to simplify things. The key quote was, "We hate the model picker as much as you do and want to return to magic unified intelligence."

So, what did the roadmap reveal? First, Altman confirmed an in-between model called GPT-4.5 was on the way. A few weeks after that tweet, we got the update. Here's a fun tidbit: internally, 4.5 was codenamed Orion, which is the same name tied to the mysterious next GPT in earlier rumors. GPT-4.5 is officially the last non-chain-of-thought model in OpenAI's lineup. It's the final stage of the old approach, scaling up the classic GPT architecture, before the company switches to a whole new method with GPT-5. It's unlikely the prompting game will change with GPT-5, so you should start learning how to prompt right now. For this, check out our prompting guides in Geek Academy: a guide to prompting for developers, prompting for data analysts, writers, and so much more, plus DALL-E prompts, Grok 3 prompts, Midjourney prompts, exclusive video lessons about AI, and some built-in AI tools to try.

As for GPT-4.5, I've tested it almost every day, and it feels more naturally conversational and emotionally aware than GPT-4o. Its knowledge base is broader, and it's a bit less likely to hallucinate facts. Overall, it's a strong upgrade, but it does not do the step-by-step reasoning o1 can. It doesn't think before it responds, as OpenAI's blog put it. It's more of a brute-force intellect than a careful reasoner. Also, OpenAI doesn't view it as a true frontier advance in AI: just hours after launch, they removed a line from its white paper stating it's not a frontier model, probably to keep people's expectations in check. Clearly, GPT-4.5 is just a stepping stone, not the final stop.

Altman's roadmap post then revealed the real destination: GPT-5. He said "the top goal for us is to unify o-series models and GPT-series models." In other words, GPT-5 will merge these two brains into one. According to him, this next system should use all of their tools, know when to think for a long time or not, and generally be useful for a very wide range of tasks. Practically speaking, GPT-5 should be able to decide on its own when to take extra time to reason and when to give a quick reply, no user settings needed. It's a bit like how Claude 3.7 works, a hybrid approach. GPT-5 will also include their most advanced reasoning module in its core. Sam Altman even said o3 will no longer be released as a separate model; it'll be part of GPT-5's unified intelligence. Altman wouldn't commit to a precise release date, but he teased a rough timeline of "weeks / months." GPT-5 could arrive as soon as spring, or maybe by summer if all goes smoothly. After the long silence of 2024, the message suddenly became: just hang on, GPT-5 is almost here.
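Quick aside for the developers watching: that model picker is very real today, not just in the ChatGPT dropdown but in the API, where you choose between a fast general model and a slower reasoning model yourself. Here's a minimal sketch of what that looks like with OpenAI's Python SDK; the model names are just current examples and may differ by the time you read this.

```python
# pip install openai  -- assumes the OPENAI_API_KEY environment variable is set.
# Model names are illustrative; check which models your account actually exposes.
from openai import OpenAI

client = OpenAI()
question = "If a train leaves at 3 pm going 60 mph, when has it covered 150 miles?"

# The fast, general-purpose model: replies quickly, no extended reasoning.
quick = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": question}],
)

# A reasoning model: spends hidden "thinking" tokens before answering.
careful = client.chat.completions.create(
    model="o1",
    messages=[{"role": "user", "content": question}],
)

print("quick answer: ", quick.choices[0].message.content)
print("careful answer:", careful.choices[0].message.content)
```

The whole pitch of GPT-5 is that this choice stops being your problem: one model name, and the system decides internally whether the question deserves a quick reply or a long think.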
Developing GPT-5 has been a story of setbacks and perseverance. Late last year, rumors started circulating that GPT-5's development was falling behind schedule and blowing its budget. A Wall Street Journal leak suggested each big training run for GPT-5 might cost around $500 million in compute power. OpenAI reportedly ran at least two massive training cycles and discovered new problems both times, with results that failed to deliver the big breakthroughs they were hoping for. At best, the early GPT-5 prototypes were only a bit better than GPT-4, not the huge leap OpenAI wanted. That's when the just-make-it-bigger plan went out the window.

One major sticking point now is data. GPT-4 was trained on around 13 trillion tokens of text, a mind-boggling amount. For GPT-5 to improve meaningfully, it needs even more high-quality training data, but OpenAI hit a point where the public internet just doesn't have enough new, diverse text. They'd basically mined all the easily available knowledge. In mid-2023, they tried another training run, codenamed Arrakis, to test a new architecture for GPT-5, but it ran too slowly, indicating a full-scale version would take forever and cost a fortune. The results were underwhelming, confirming that making GPT-5 wouldn't be as smooth as they'd hoped.

OpenAI had to regroup. Engineers tweaked GPT-5's design and started hunting for new data sources. They even hired experts to create fresh training materials, everything from code to math problems, to go beyond what the internet had. By early 2024, they launched a few smaller training runs to gain confidence. Then in May 2024, they kicked off another large-scale training effort expected to last until November. This time they used more carefully curated data, but halfway through they realized the data set still wasn't as diverse as needed. It's that classic moonshot problem: you don't know what's missing until you're way into the mission. Starting over wasn't an option after all that time and money, so the team scrambled to inject extra data mid-training. Whether that patch truly solved things is unclear, but by late 2024, it was obvious GPT-5 had run into serious snags.

While we wait for GPT-5, remember there are AI tools for pretty much any task. To find the right one, open Geek Academy's AI tool finder and just ask. Tell it what you need and we'll suggest the best tool; it can even answer your follow-up questions. If you just want to generate images, head to the AI Art Studio and make whatever you imagine, fast, reliable, and included for all Geek Academy members. And we've just launched our new exclusive course on generative AI, with new lessons being added all the time. It will take you all the way from absolute zero with AI to an AI master like me. We are turning Geek Academy into the ultimate platform for AI fans, packed with educational content, prompting guides, curated tool lists, and exclusive videos. You can even chat with a virtual version of me if you want. There's so much more to explore, so click the link in the description and let's make Geek Academy the best platform ever together.

On the human side, more than two dozen key OpenAI staffers left in 2024 during the GPT-5 push, big names like chief scientist Ilya Sutskever and CTO Mira Murati. By November 2024, outside experts were picking up on the struggle. Famous AI investor Kai-Fu Lee said GPT-5's training hasn't gone smoothly and warned that early predictions of AGI within 3 years might be too optimistic.
GPT-5 was supposed to land by late 2024, but Lee said mid-2025 was more realistic. Sam's recent tweet confirmed we'll have to wait a few more months, and it could get pushed back again. If Sam's tweet is any clue, OpenAI sees GPT-5 not just as a simple upgrade, but as a complete shift. That phrase, magic unified intelligence, suggests one model to do it all instead of the patchwork approach we've had.

To understand why this is such a leap, look at what came right before it. GPT-4.5 launched in late February 2025, and it's massive: OpenAI's biggest model yet, soaking up more compute and data than any earlier release. It's the pinnacle of the classic GPT scaling strategy, more neurons, more data, more of everything. And sure enough, GPT-4.5's huge size delivers some perks. OpenAI says it has deeper world knowledge and a higher emotional IQ than GPT-4o. But GPT-4.5 also showed the limits of brute-force expansion. On a bunch of reasoning-heavy tests, this giant still lost out to smaller AI models built for step-by-step thinking, including some from Anthropic and even OpenAI's own o-series. In tougher math or logic puzzles, just being bigger isn't enough; careful reasoning wins. OpenAI's own tests hint that GPT-4.5's gains from piling on data and compute are starting to level off.

GPT-5 is OpenAI's big solution to these ongoing challenges. Instead of dumping another half billion dollars into making the model bigger for minimal improvements, GPT-5 blends the best of both worlds: the huge knowledge base of the GPT-4 line and the focused step-by-step reasoning of the o-series. This step-by-step thinking is often called chain-of-thought reasoning; it's like the AI scribbles its own notes in the background before handing you the answer. OpenAI's smaller o-series models like o1 and o3 already use chain of thought to great effect, but they didn't have the massive knowledge that GPT-4.5 does. With GPT-5, OpenAI wants to bake that thorough reasoning directly into a model that's just as broad and flexible as GPT-4.5. That's why Sam Altman is confident we won't need a model picker anymore: GPT-5 itself will do the picking, switching between quick responses and slow, detailed thinking automatically.

Under the hood, we're likely looking at a major jump in design. OpenAI hasn't shared specifics, but it's clear GPT-5 won't just be another slight tweak of the classic GPT formula. One rumor from 2024 suggested they might try a mixture-of-experts architecture, basically multiple specialized mini models inside one big AI, possibly pushing the total parameter count into the trillions. No one knows for sure if that's the final route, but in late 2024, OpenAI's CFO hinted the next model would be an order of magnitude bigger, implying GPT-5 could be 10 times larger than GPT-4 in at least one dimension, whether parameters, data, or computational steps. Ten times the size. That's absolutely wild. We're talking about a serious leap in smartness.

And if you are an AI fan who wants to 10x your own skills, join Geek Academy. We've got a huge library of educational content, super useful built-in AI tools, and special deals on some of the most popular AI platforms. And our new course on generative AI that we just started is something everyone should take, a complete 101. You can learn everything in one place. Think of it as your personal mixture of experts for AI.
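And since we keep mentioning mixture of experts: if you're curious what that architecture actually means in code, here's a tiny, purely illustrative sketch in PyTorch. This is not OpenAI's design, just the general idea the rumor describes, a learned gate routing each token to one small expert network so only a fraction of the parameters run per token.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    """Toy mixture-of-experts layer: a learned gate sends each token to one expert."""
    def __init__(self, d_model=64, n_experts=4, d_hidden=128):
        super().__init__()
        self.gate = nn.Linear(d_model, n_experts)  # scores every expert per token
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                          # x: (n_tokens, d_model)
        weights = F.softmax(self.gate(x), dim=-1)  # (n_tokens, n_experts)
        top_w, top_idx = weights.max(dim=-1)       # pick the single best expert
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            routed = top_idx == i                  # tokens assigned to expert i
            if routed.any():
                out[routed] = top_w[routed].unsqueeze(-1) * expert(x[routed])
        return out

# Only one expert runs per token, so compute stays roughly flat even though
# the total parameter count grows with every expert you add.
layer = TinyMoE()
print(layer(torch.randn(10, 64)).shape)  # torch.Size([10, 64])
```

That's the appeal in a nutshell: total parameters can climb toward the trillions without every token paying for all of them.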
Probably the best way to picture GPT-5 is as the first real omni model in OpenAI's lineup. Omni, because it has near limitless knowledge and can tackle all sorts of tasks: logic, creativity, speed, tool usage, and more. For many people in AI, this is the holy grail, one model that adapts instantly to whatever you throw at it without forcing you to choose the right version. In practical terms, chatting with GPT-5 could feel even more natural. You just ask it to do something, and GPT-5 decides whether it needs a quick answer or a deeper, more methodical approach. All that heavy lifting happens behind the scenes, leaving you with one friendly AI personality that's there to help.

GPT-4 introduced multimodal input by allowing us to upload images, GPT-4o took it up a notch, and GPT-5 will push this even further, handling text, images, audio, and maybe even video inputs and outputs. Imagine being able to chat, then switch to speaking a question aloud, upload a photo for analysis, and request a detailed image or diagram, all in one ongoing conversation, like you're talking to a highly skilled assistant who can adapt to any format on the fly. We might also see some level of video support. GPT-5 may not generate Hollywood-level movies at launch like some advanced AI such as Sora, but it could analyze video clips or create simple animations. At the very least, fully integrated voice and audio are practically guaranteed. ChatGPT has had a two-way voice mode since late 2023, letting you talk and hear natural-sounding responses. GPT-5 will definitely build on that, turning ChatGPT into a super convenient conversational partner.

Right now, ChatGPT can browse the web, run code, and analyze files. GPT-5 aims to take all of that and supercharge it. You'll still have access to the Operator mode, where the AI can autonomously perform tasks like web navigation or data extraction on your behalf, but GPT-5 will do it more seamlessly. It won't just wait for you to prompt it; it could proactively say, "Hey, I can solve this by checking a database," and do so safely within your set limits. We'll likely see every separate feature get rolled into a single unified model, including scheduled tasks. Maybe that doesn't sound earth-shattering right now, but imagine GPT-5 integrating with your external calendar or coordinating multiple projects without you constantly supervising it. That could really change how we rely on ChatGPT day to day.

GPT-5 is also poised to ramp up the memory game. Early versions of persistent memory can be a bit hit or miss. In GPT-5, expect that memory to become more reliable and more personal. If you mention your dog's name is Buddy, your favorite color is green, or you're working on a novel, GPT-5 might hold on to these details for future sessions, tailoring its responses to you. And with things like custom GPTs and the GPT Store, you'll probably be able to set up specialized AI helpers for different needs: maybe one for your job, another for creative writing, another for personal fitness goals. Essentially, GPT-5 will help you create multiple AI personalities that all share one strong foundation but are finely tuned for whatever you do most.

GPT-4.5 can already handle up to 128,000 tokens in one go, which is huge. It can read and process entire documents or super long chat logs. GPT-5 is likely to expand that even further, so you could drop in entire books, research papers, or multi-session transcripts without losing track.
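If you want a feel for how much text 128,000 tokens really is, you can count tokens yourself with OpenAI's open-source tiktoken library. A quick sketch; the file name is a placeholder, and cl100k_base is just one of OpenAI's published tokenizers (newer models use different encodings).

```python
# pip install tiktoken
import tiktoken

CONTEXT_WINDOW = 128_000  # the roughly 128k-token limit mentioned above

# cl100k_base is one of OpenAI's published tokenizers; newer models may use others.
enc = tiktoken.get_encoding("cl100k_base")

with open("my_manuscript.txt", encoding="utf-8") as f:  # placeholder file name
    text = f.read()

n_tokens = len(enc.encode(text))
print(f"{n_tokens:,} tokens")
if n_tokens <= CONTEXT_WINDOW:
    print("Fits in a single 128k-token request.")
else:
    print("Too long: chunk it, or wait for a bigger context window.")
```

As a rough rule of thumb, one token is about three quarters of an English word, so 128,000 tokens is on the order of a full-length novel in a single request.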
Think of it as GPT-5 having a near endless working memory, able to pull together info from anywhere in your archives. For context, Gemini 2.5 just came out with a context window of a million tokens, and they're targeting 2 million soon. GPT-5 might match or beat that, meaning it can chew through massive amounts of data without breaking a sweat.

OpenAI also released a collaborative tool called Canvas, which is basically a digital whiteboard where you and ChatGPT can organize and plan ideas together. The current version is pretty basic and a bit rough around the edges, but GPT-5 will likely supercharge this, maybe adding structured content editing, interactive project management, or more polished design help. It could let multiple people collaborate in real time with an AI that keeps everything organized. If they nail these collaboration features, you'll be able to brainstorm visually, rearrange content, and solve complicated tasks alongside GPT-5, all in one shared workspace. That's a big step toward making AI a true team player rather than just a lone chatbot.

Overall, GPT-5 is meant to tie all of these improvements together: chain-of-thought reasoning, multimodal capability, tool integration, deep personalization, and autonomous tasks, all in one smart assistant. No more choosing among different models or flipping modes. GPT-5 will sense when a quick answer is enough and when deeper reasoning is needed, seamlessly switching gears in the background. For users, that means a single AI that feels more flexible and natural than ever.

So, is GPT-5 AGI? Everybody's asking this question, and nobody can say for sure. AGI, or artificial general intelligence, is usually defined as an AI that can learn and do any intellectual task a human can. By that strict definition, GPT-5 probably isn't AGI. It will still make mistakes, miss certain details, and it won't have its own goals or self-awareness. Even OpenAI says so. But for everyday users, GPT-5 might feel like AGI. It will likely blow GPT-4o out of the water in terms of reasoning, flexibility, and handling a crazy variety of tasks, all without special versions or extra settings. If you're not an AI nerd, it's going to look like a super smart system that can write, plan, analyze, visualize, talk, and reason even better than many humans in certain areas. And this might be enough for most people to think, "Whoa, this is basically AGI." We're not saying it's a self-aware robot, just a really advanced assistant, tutor, analyst, researcher, or creative sidekick that goes deeper than anything we've seen before. That alone is a giant leap for how most folks see AI. On paper, GPT-5 won't be true AGI, but for plenty of users, it might as well be. Sure, it's technically not AGI, but it's the next best thing.

OpenAI isn't the only one racing toward the future. Google is working on Gemini, Anthropic is scaling up Claude, Elon Musk has xAI with Grok, and countless startups, plus open-source communities, are making strides faster than anyone expected. We also can't ignore DeepSeek's massive impact in this high-stakes race. GPT-5 isn't just another model; it's OpenAI's answer to all this competition. Think of it as their flagship intelligence, an integration machine, one system that hooks into tools, apps, and daily workflows, and meets both casual and power users right where they are. No more "which model is right for you?" GPT-5 will adapt to whatever you need, and the world's already primed for it.
There are over 12 million ChatGPT Plus users, and 92% of Fortune 500 companies use OpenAI in some way. Entire industries, education, coding, entertainment, are leaning heavily on GPT technology. So when GPT-5 hits, it's going to kick the entire ecosystem up a notch.

So when can we realistically expect GPT-5? Tweets are fun, but if we've learned anything from past launches, it's to be practical. From Sam Altman's comments in February 2025, we know GPT-5 is months away, not years. GPT-4.5 just came out, and GPT-5 might arrive this spring or summer, maybe May or June. Personally, I'm bracing for at least one more delay. Even when it does launch, it might take a few weeks for the infrastructure to catch up. I'm guessing it will roll out in stages, maybe a small launch first with features and tools coming over time. OpenAI is notoriously secretive, but the building blocks are there, the tools are prepped, we have a public roadmap, and the stakes couldn't be higher. GPT-5 might be more than just another chapter; it could be a whole new volume in the AI story, the moment when AI goes from helpful chatbot to something far more capable and deeply woven into daily life. We're not quite at that point yet, but the countdown is on. Thanks for tuning in, and I will catch you in the next one.