The key to finding great stocks before everyone else is understanding a company's products, not just its profits. Huge growth happens when a company has the perfect product for a hot new market. Tesla made investors rich by conquering electric vehicles and, soon, self-driving cars. Nvidia is making millionaires as their data center chips dominate AI training and inference. And in this episode, I'll show you one small hidden-gem company that's taking data center networking to the next level, which could make it a great AI stock with a lot of room to run. Your time is valuable, so let's get right into it. First things first, I'm not here to hold you hostage.
This is a deep dive on a microcap company called Poet Technologies, listed under the ticker symbol POET on the NASDAQ. Fair warning: Poet stock has already tripled since the last time I covered it back in April, thanks to a few new product innovations and big partnerships, like Mitsubishi, who just announced they're working with Poet on next-generation optical chips for AI networks, which I'll talk about more later in the video. This deep dive has four parts.
What Poet's technology does and the key AI challenges it addresses, Poet's competitive advantages and major partnerships, their financials and the risks investors need to know about, and of course, whether or not Poet stock is a good investment as a result. Nothing is more important to me than your trust, so let me say right up front that this video is sponsored by Poet Technologies, but they have no control over what I say, and all opinions in this video are my own. There are no links to click, and I don't make any money based on the performance of this video. What I did do was spend hours with Poet's executive team and their engineers to make sure I got the technical details right.
And as you'll see, there's a lot of tech to talk about. So let's start with the AI market and the challenges that Poet's products help solve. The global artificial intelligence market is expected to almost 12x in size over the next 8 years, which is a massive compound annual growth rate of 37%. And while AI being a $2.75 trillion market by 2032 seems crazy at first, it's important to remember three things. First, AI isn't one market at all.
It's a huge field within computer science that applies to almost every market on earth, from self-driving cars and manufacturing to customer service and media production, and from space and defense to drug discovery. That $2.75 trillion estimate includes all of those different markets. Second, AI isn't only going into new products and services like ChatGPT; it's also getting added to almost everything we already use today. Google Search now has AI Overviews, iPhones are getting Apple Intelligence, and there's a Copilot for Windows 11 and Microsoft Office, all of which count towards that estimate as well.
And third, about 20% of that massive market estimate is just the hardware infrastructure to support it. The massive demand for AI means the global data center accelerator market is expected to more than 5x by 2032, which is a compound annual growth rate of 24% per year. And this is where those big challenges for AI come in.
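As a quick sanity check on those growth figures, here's the implied math. This is my own back-of-the-envelope arithmetic, not from any analyst report, and it assumes the 2024-to-2032 window stated above:

```python
# Back-of-the-envelope checks on the market projections above.
# CAGR = (end / start) ** (1 / years) - 1

def cagr(multiple: float, years: int) -> float:
    """Compound annual growth rate implied by growing `multiple`x over `years` years."""
    return multiple ** (1 / years) - 1

# AI market: ~12x over 8 years (2024 -> 2032)
print(f"AI market CAGR: {cagr(12, 8):.1%}")      # ~36.4%, i.e. roughly the 37% quoted

# Data center accelerator market: more than 5x over the same window
print(f"Accelerator CAGR: {cagr(5, 8):.1%}")     # ~22.3%
# Conversely, compounding the quoted 24% per year for 8 years gives:
print(f"24%/yr over 8 years: {1.24 ** 8:.1f}x")  # ~5.6x, so 'more than 5x' checks out
```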
All of those chips require a lot of power to run and to cool. According to Goldman Sachs, a single ChatGPT query uses nearly 10 times as much energy as a Google search, and Boston Consulting Group estimates that data centers will account for 16% of total US power consumption by 2030, up from just 2.5% in 2022. The second major bottleneck for AI performance isn't the chips at all; it's the speed of the network that connects them. That's because when a workload is broken up and processed in parallel, the final solution is only as fast as the time it takes the last piece to make it back. When most investors think about networking, they think about copper wire, since that's what has connected most servers, routers, switches, and chips for decades now. But today, AI workloads are pushing the limits of what copper connections can handle.
For example, copper's bandwidth is limited to tens of gigabits per second, and it can suffer from signal losses over long distances. On the other hand, fiber optics can transmit much more data per second and lose much less signal over the same distance.
That's why we're seeing so many AI data centers use fiber optics today, like NVIDIA's NVLink for chip-to-chip connections and InfiniBand to connect multiple racks together. But unlike copper, the challenges with fiber optics aren't in the wire. They're everywhere else. When I cover chips from companies like NVIDIA, AMD, or Intel, I'm talking about integrated circuits, which have been getting smaller and more tightly integrated over the last 60 years. But that hasn't been happening in photonics.
Everything from the lasers and waveguides to the filters and detectors, and even the circuit boards themselves, are still separate components that need to be manually aligned, manually wire-bonded, manually tested, and manually packaged by humans. Until now. Poet invented a special kind of waveguide that lets optical components be placed onto a chip by machines, instead of getting wired together by hand.
In electronic chips, the components talk to each other via high-speed metal traces embedded in the wafer. Poet's waveguide works just like those metal traces, except for light instead of electricity, which is a huge deal for three reasons. First, it gives Poet access to the same high-speed manufacturing processes and economies of scale that electronic chip companies have benefited from for decades, while other photonics companies are still stuck aligning and assembling their systems one at a time by hand.
Second, these waveguides have very low losses, so more of the laser's power makes it to the fiber output instead of turning into heat. That helps address our first big AI challenge, because lower losses mean the lasers can use less power, which lowers the need for cooling. And third, it lets Poet integrate electronic and photonic components on the same chip by placing metal traces and their waveguides onto the same wafer.
Poet holds the patent for this waveguide technology, so no other company can copy it, and this is where things get really interesting for investors. This special wafer is called an optical interposer, since it lets Poet and their partners combine optics and electronics that are usually separated inside different data center devices. Fiber optic transceivers are those little plugs that connect to switches in data centers.
They're called transceivers because they transmit and receive data. And there are hundreds of thousands or even millions of these things in every single data center. Let me give you a practical example. There are 8 Hopper GPUs in one DGX H100 pod, and there are 4 pods in one server rack.
32 GPUs with 18 connections each, and a transceiver on both ends of each connection, means one DGX H100 rack has 1,152 optical transceivers in it, while each Blackwell rack will have around 1,296. That's a million transceivers for every 800 racks on average. Amazon Web Services alone has at least 30 data centers around the world right now, each of which has around 2,400 racks, or around 3 million transceivers.
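To make those numbers concrete, here's the arithmetic behind the example above. The per-GPU connection counts are the rough figures stated in the video, not exact specs:

```python
# Transceiver count for one DGX H100 rack, per the figures above.
gpus_per_pod = 8
pods_per_rack = 4
connections_per_gpu = 18
transceivers_per_connection = 2  # one on each end of the fiber link

gpus_per_rack = gpus_per_pod * pods_per_rack  # 32
transceivers_per_rack = gpus_per_rack * connections_per_gpu * transceivers_per_connection
print(transceivers_per_rack)  # 1152

# At roughly 1,200 transceivers per rack (between the H100 and Blackwell figures),
# 800 racks works out to about a million transceivers:
print(800 * 1200)  # 960000, i.e. close to a million

# And a ~2,400-rack data center:
print(2400 * transceivers_per_rack)  # 2764800, i.e. roughly 3 million
```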
That's exactly why the optical transceiver market is expected to more than double over the next 8 years, which would be a compound annual growth rate of almost 11%. But the market for 800 gigabit optics specifically is projected to more than 10x over the next 5 years, since that's where all the demand is. That would be a whopping 65% compound annual growth rate with 1.6 terabit optics following that same growth path in the future.
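Those two projections are internally consistent. Again, this is my own arithmetic, just compounding the quoted rates over the quoted windows:

```python
# Checking the transceiver market projections against their stated CAGRs.

# Overall optical transceiver market: ~11% per year for 8 years
print(f"{1.11 ** 8:.2f}x")  # 2.30x -- 'more than double'

# 800-gigabit optics specifically: ~65% per year for 5 years
print(f"{1.65 ** 5:.1f}x")  # 12.2x -- 'more than 10x'
```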
The optical engine is the part of the transceiver that converts the electrical signals used by AI chips into light for fiber optic transmission and back again. And it's about 50% of a transceiver's total cost. The reason optical engines are so expensive is that the lasers, lenses, filters, and fibers all need to be aligned, wire-bonded, and tested by hand at other companies, like I mentioned earlier.
If they're misaligned even a little, too little light gets to where it needs to go and all that work is wasted, which makes it pretty hard to scale. But Poet can build an optical engine right on top of that special optical interposer wafer, the same way that TSMC builds Nvidia's chips, where the positioning, alignment, testing, and packaging are all handled by automated processes. This isn't just about making more parts faster.
Machine manufacturing means the optical engines have higher signal strength, higher reliability, and lower unit costs. That's a massive edge when data centers decide which vendors to buy hundreds of thousands of transceivers from. Not to mention all the other markets that also use optical engines, like lidar and laser scanners, medical devices, and even virtual reality headsets. Another thing investors should know about Poet's optical engines is that they're modular and designed to scale in terms of speed, so they won't go obsolete as data centers start requiring more throughput.
In fact, they're the building blocks for Poet's three main products today, called Poet Starlight, Poet Infinity, and Poet Wavelight. Poet Starlight is a low-cost light engine designed for optical transmission between chips, kind of like Nvidia's NVLink but using light instead of electricity, for all the reasons I mentioned earlier. Poet Infinity is a fully integrated 400 gigabit per second chiplet that goes inside optical transceivers.
It includes the lasers, drivers, and multiplexers without any wire bonds, which are where a lot of the losses come from. It's called an Infinity chiplet because two can go in the same transceiver to get 800 gigabit speeds instead of 400, or four chiplets can be used to get 1.6 terabit speeds without changing the form factor. That's how Poet Technologies can directly help AI data centers with their second big challenge, which was the bottleneck in network performance. Poet Wavelight is their fully packaged 800G transceiver built with two Infinity chiplets, which means it can be expanded to 1.6 and 3.2 terabit speeds down the road. Poet can sell their Wavelight transceivers directly to end customers like Amazon Web Services, Google Cloud, Microsoft Azure, Meta Platforms, and of course, Nvidia.
Alright, now that you understand Poet's technology, their target market, and their competitive advantages, let's talk about their partnerships, their financials, and the potential risks. That way, we have all the context we need to decide if Poet stock should have a spot in our portfolios. And if you feel I've earned it, consider hitting the like button and subscribing to the channel.
That really helps me out and lets me know to cover more hidden gems like this. Thanks, and with that out of the way, let's talk about Poet's partnerships. Just a few days ago, Poet announced a big partnership with Mitsubishi to create 3.2 terabit per second optical engines by combining Poet's optical interposer with Mitsubishi's leading-edge lasers, to take them and their customers into the next generation of networking for AI and hyperscale data centers. This promises to be a momentous technological achievement when it's unveiled. Foxconn makes a lot of the world's most popular electronics, including the iPhone and iPad, Google Pixel devices, and the modern Xbox, PlayStation, and Nintendo game consoles.
A few months ago, Foxconn Interconnect Technology announced that they're going to design their next-generation 800 gigabit and 1.6 terabit transceivers on top of Poet's optical engines. According to the press release, the design phase for that is wrapping up right now, and production starts next year. And by the way, almost half of all 800 gigabit optical modules go to Nvidia today, and Nvidia recently announced a major partnership with Foxconn to build AI factories, self-driving EVs, manufacturing robots, and more.
So having Poet's transceivers in the data centers powering those projects is a big deal for such a small company. Luxshare is the second largest iPhone manufacturer after Foxconn, and they also make modules for Nvidia's upcoming Blackwell systems. Just last month, Luxshare expanded their existing relationship with Poet, using their optical engines for Luxshare's portfolio of 400 gigabit, 800 gigabit, and 1.6 terabit transceivers for artificial intelligence networks, following the successful tests of modules built on top of Poet's 800 gigabit receive optical engines. And soon after that, Mentech, which is a huge company specializing in optical products, announced that they're placing purchase orders for Poet's optical engines to use in their transceivers as well.
Poet's partnerships with Mitsubishi, Foxconn, Luxshare, and Mentech are important because their customers include massive AI infrastructure companies like Amazon, Google, Microsoft, Meta Platforms, Broadcom, and Nvidia, all of which I cover very often on this channel, and many of which are on my list of top stocks to get rich without getting lucky in 2024. But before we can decide whether Poet stock belongs on that list, let's talk about their financials and their biggest risks, which are currently two sides of the same coin. Poet Technologies is a pre-revenue company. If you want to get in early, it doesn't get much earlier than this. They do have a solid cash position, with almost $29 million in the bank from private placements and selling warrants. They currently burn around $1 million per month, or $3 million per quarter, which gives Poet a roughly two-and-a-half-year runway to ramp up production and sales even if they don't raise another dollar.
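For what it's worth, that runway math checks out against the cash and burn figures stated above:

```python
# Runway check: cash on hand divided by monthly burn rate.
cash_on_hand = 29_000_000  # ~$29M from private placements and warrants
monthly_burn = 1_000_000   # ~$1M per month, i.e. ~$3M per quarter

runway_months = cash_on_hand / monthly_burn
print(f"{runway_months:.0f} months, or about {runway_months / 12:.1f} years")
# 29 months, or about 2.4 years -- roughly the two-and-a-half-year runway mentioned
```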
But I need to emphasize the zero next to their sales, since that also means we have zero insight into their profit margins, their earnings, their revenue growth, or any other financial metric that could justify their $200 million market cap. And that's another risk too: Poet's small market cap means it's a very volatile stock that's going to be sensitive to market factors well outside of their control, like interest rates.
And like I said at the start of this video, Poet stock has already tripled since I covered it back in April, but you can see from the price chart that it's been a bumpy ride, with two separate 40% drops in the last 6 months alone. I also said that nothing is more important to me than your trust, so trust me when I say that there could be many more huge drops in this stock's future, and only you can decide if that risk makes sense for you.
For me, Poet is high on my list of long-term moonshots, which usually end up being around 1 or 2% of my portfolio. That's it. I would definitely consider investing more once there's more clarity around their sales, their product pipeline, and their overall execution. Either way, Poet's patented waveguide technology, the optical engines and fiber optic transceivers they build on top of it, the massive partnerships they've announced since I last covered them, and the insane growth of the AI market make me think that Poet is a hidden gem that deserves attention from investors. And that's why I made this video. A big thank you to Poet Technologies for sponsoring this video and helping me understand the science behind this stock.
And as always, thank you for watching and for supporting the channel. Until next time, this is Ticker Symbol: You, my name is Alex, reminding you that the best investment you can make is in you. Have a great day.