The Bleeding Edge

Amazon's Big Reveal

By Jeff Brown, Editor, The Bleeding Edge

---------------------------------------------------------------

Of all the big tech giants, the one noticeably absent from the race toward artificial general intelligence (AGI) is Amazon (AMZN).

Or at least, that's what the financial and tech media would have us believe. And to be fair, it's not without reason.

After all, if we search Amazon's latest quarterly report for "large language model," there isn't a single match. And if we search for "LLM," we'll find 57 matches – but every one of them is the "llm" buried in the word "fulfillment."

It seems odd. For a company that dominates cloud-based computing and web services – and that provides computational horsepower to countless artificial intelligence companies – it would make sense for Amazon to be competing for the grand prize of artificial intelligence.

It's not like Amazon is short on capital, either. It currently has $88 billion in cash and will generate more than $45 billion in free cash flow this year. Amazon certainly has the capital to build its own generative AI.

And for those who use it, it's very clear that Alexa – Amazon's digital voice assistant – needs a major upgrade…

What gives, Amazon?

Amazon's Stealth AI Strategy

It's not that Amazon hasn't been working on its own artificial intelligence. It just didn't want to talk about it openly.

There's a good reason for that.

Despite Amazon Web Services (AWS) being responsible for less than 20% of Amazon's total revenues, it consistently delivers more than 50% of Amazon's operating income. (A back-of-the-envelope sketch of what that implies for margins appears at the end of this section.)

This division of Amazon – which leases compute and storage to corporations and individuals alike – has been the cash cow that allowed Amazon to incur losses for years in its e-commerce business while it built enough scale to eventually become profitable on typically thin e-commerce margins.

Artificial intelligence has been the single largest growth area for Amazon Web Services, and Amazon has made an effort to position itself as neutral as possible toward the industry.

After all, if you were designing your own large language model (LLM) – and you knew that Amazon was doing the same – would you want to train and operate your LLM on Amazon's computing systems?

Amazon has long been rumored to exploit its own internal data – studying best-selling products, creating knock-off versions, and then manipulating search results to boost its own Amazon product lines. Who would want Amazon to power its proprietary LLM?

That would make many companies nervous, and it could push AI companies to move their cloud computing over to a less "threatening" provider like Oracle or IBM – companies that simply aren't contenders.

Remaining neutral was a smart move by Amazon. Who cares if someone thinks Amazon is losing the race to AGI, or AI in general? Who cares if Alexa isn't that smart?

The cold, hard truth is that Amazon's artificial intelligence business is humming, and the stock is trading at all-time highs.

[Chart: Amazon (AMZN) Share Price Since Its IPO in 1997]

Amazon's outward-facing strategy was to appear neutral… and thus grab as much of the AI-related cloud services business as it possibly could. Smart.

And in the background, it could quietly make strategic investments in private AI companies – seeding its future customers – as well as develop its own AI. Clever.

And there are some interesting developments on both fronts.
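To put that revenue-versus-income split in perspective, here is a quick back-of-the-envelope calculation. The dollar figures below are illustrative placeholders – not Amazon's reported financials – chosen only to mirror the rough proportions described above (AWS under ~20% of revenue but over ~50% of operating income).

```python
# Illustrative figures only -- placeholders that mirror the rough split
# described above, not Amazon's actual reported numbers.
total_revenue = 600e9                      # assumed total annual revenue, USD
aws_revenue = 0.18 * total_revenue         # assume AWS is ~18% of revenue

total_operating_income = 50e9              # assumed total operating income, USD
aws_operating_income = 0.55 * total_operating_income  # assume AWS is ~55% of it

aws_margin = aws_operating_income / aws_revenue
rest_margin = (total_operating_income - aws_operating_income) / (total_revenue - aws_revenue)

print(f"Implied AWS operating margin:      {aws_margin:.1%}")   # roughly 25%
print(f"Implied non-AWS operating margin:  {rest_margin:.1%}")  # roughly 5%
```

Under those placeholder numbers, AWS earns roughly five times the operating margin of the rest of the business – which is exactly what makes it the cash cow described above.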
The "Intelligent Shopper"

A few days ago – surprise! – word leaked that Amazon is getting ready to announce its own generative AI, known as Olympus.

To be fair, this wasn't entirely a secret, as the project had been known since 2023. It just wasn't spoken about much at all. As far as most mainstream news sources knew, there wasn't much happening to report.

But in the background, Olympus was being built with a reported 2 trillion parameters, which makes it one of the largest generative AI models available. It is a multi-modal AI capable of learning from and understanding text, images, and video – and it is reportedly particularly strong at analyzing images and video, which suggests a wider range of utility than most LLMs.

My working assumption is that Amazon will make Olympus available through its Amazon Web Services (AWS) ecosystem, allowing companies to plug their software into Olympus – providing any company or software package with the superhuman powers of generative AI. (A rough sketch of what that kind of integration could look like appears at the end of this section.)

This has implications not just for Amazon's voice assistant Alexa, but also for its e-commerce business. Just imagine how helpful it will be to have an intelligent "shopper" that understands all of your buying preferences, anticipates your daily needs, and can speak with you in natural language almost indistinguishable from a human being.

That's what's coming… and also what we'll learn more about this week.

The timing of these latest developments isn't a coincidence, as Amazon's annual conference, AWS re:Invent 2024, is being held in Las Vegas this week.

And Amazon isn't leaving anything to chance. One of its most strategic AWS AI customers, Anthropic, was potentially at risk with the announcement of Olympus.

How to Buy and Keep Your Customers

Olympus, in many ways, presents direct competition to Anthropic's impressive suite of LLMs. It's the kind of business that AWS wouldn't want to lose.

This is almost certainly why Amazon stepped up and announced an additional $4 billion investment in Anthropic's latest venture capital round, which closed last month.

That's not a typo. Earlier this year, Amazon invested $4 billion in Anthropic, and it did the same again last month for a total of $8 billion.

The last known public valuation of Anthropic was $19.3 billion this January, and this latest deal was rumored to be at $40 billion. I wouldn't be surprised if it were closer to $50 billion.

If we've ever wondered, "What's it all worth?" or "Is it all smoke and mirrors?" – no, it's not smoke and mirrors. The data points are everywhere.

OpenAI will generate about $3.7 billion in revenue this year, and the majority of its costs are in compute and storage. Next year, OpenAI is forecast to generate more than $11 billion in revenue… and the same cost structure will hold.

That's why Amazon is willing to throw down $8 billion. It wants to ensure that Anthropic will continue to use AWS for both training and inference for its Claude LLMs. There are billions of dollars of business at stake.

And while the terms of Amazon's investment were not made public, Amazon did announce that Anthropic has now named Amazon its "primary training partner" for its generative AI technology. This was clearly part of the deal.

This is significant because the two companies collaborated on the development of Amazon's Trainium 2 AI-specific semiconductor. Amazon is now ramping up contract manufacturing of Trainium 2 to offer the new chip to all of its AWS customers. This is a major new AI semiconductor entry in the race to AGI.
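As for the working assumption above – that Amazon will expose Olympus through AWS – here is a minimal sketch of what "plugging into" a hosted model typically looks like today, using the boto3 client for Amazon Bedrock, AWS's managed foundation-model service. The model ID and request format below are hypothetical placeholders; Amazon hasn't said how (or whether) Olympus will be exposed, so treat this purely as an illustration of the integration pattern.

```python
import json
import boto3

# Bedrock is AWS's managed service for hosted foundation models.
# The model ID below is a hypothetical placeholder -- it stands in for
# wherever Amazon ultimately exposes Olympus (or any other hosted model).
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# The request schema is also illustrative; each hosted model defines its own.
request_body = {
    "prompt": "Suggest a gift for someone who buys trail-running gear every spring.",
    "max_tokens": 256,
}

response = bedrock.invoke_model(
    modelId="amazon.olympus-placeholder-v1",  # hypothetical identifier
    contentType="application/json",
    accept="application/json",
    body=json.dumps(request_body),
)

# The response body is a stream of JSON bytes returned by the hosted model.
print(json.loads(response["body"].read()))
```

The appeal of this pattern for AWS customers is that the "intelligent shopper" logic lives behind a single API call – the application code barely changes whether the model behind it is Olympus, Claude, or something else hosted on AWS.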
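And to give a flavor of what "targeting Trainium" means for developers, here is a rough sketch of training code pointed at the chip through AWS's Neuron SDK, which hooks into PyTorch via the XLA backend (torch-neuronx builds on torch-xla). The model and data are toy placeholders; the point is that the device handling and accelerator-specific calls are exactly where code gets coupled to particular hardware – which is what makes switching painful, as discussed next.

```python
import torch
import torch.nn as nn
# torch_xla is the XLA backend that AWS's Neuron SDK (torch-neuronx) builds on;
# on a Trainium instance, xla_device() resolves to a NeuronCore.
import torch_xla.core.xla_model as xm

device = xm.xla_device()  # accelerator-specific hook: Trainium here, TPU elsewhere

# Toy model and synthetic data, purely illustrative.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 2)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

for step in range(10):
    x = torch.randn(32, 128).to(device)
    y = torch.randint(0, 2, (32,)).to(device)

    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    # xm.optimizer_step() triggers execution of the compiled XLA graph on the
    # device -- the kind of accelerator-specific call that accumulates
    # throughout real training code and makes migrating platforms a chore.
    xm.optimizer_step(optimizer)
```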
And depending on Trainium 2's performance, it could be a strategic advantage for Amazon. Either way, Trainium 2 is an accelerant for AI in general.

And from a product perspective, Trainium 2 is designed to make Amazon's relationships with its AWS customers stickier. AI developers spend a lot of time customizing their software code for a specific semiconductor hardware platform, and it can be a pain to switch. So, as long as the hardware is getting the work done and it's economical, there is little incentive to switch.

Amazon's deal with Anthropic also includes the use of Amazon's Inferentia semiconductors. As you might have already guessed, the Inferentia chips are designed for inference – the running of AI software – as opposed to training.

Compute costs are so critical – and so much money is involved – that it makes sense to customize semiconductors for different AI-related tasks. This is why the market for AI-specific semiconductors is so exciting.

This is an exciting week for Amazon and, for that matter, the entire industry.

Amazon will spend about $75 billion this year on capital expenditures, almost all of it AI-related. And next year, Amazon's CEO expects to spend even more.

This is money well spent. And Amazon will win in two ways with its $8 billion investment in Anthropic. Not only will it make billions through its AWS subsidiary, but Anthropic's involvement in Trainium 2 will also help drive other AI players to use Amazon's custom AI-specific semiconductors.

And whether Anthropic goes public or gets acquired, I suspect that Amazon will make an impressive return on its investment, given Anthropic's leadership role in the industry.

Either way, Amazon wins…

Regards,

Jeff

Brownstone Research