After OpenAI’s ChatGPT went viral, Microsoft (MSFT) employed an aggressive strategy to take search traffic away from Alphabet’s (GOOG) Google, letting users jump ahead in the queue by downloading the Bing app, installing Microsoft Edge, and making Bing their default search engine.
This challenged the status quo, as most devices come pre-loaded with Google as the default search engine and that default is rarely ever changed.
While the battle for search is well underway, the AI war is really just heating up as Google, Amazon (AMZN), and Microsoft, among others, increasingly target professionals and startups via cloud services.
Models tied to ChatGPT have a clear edge over competitors thanks to their first-mover and usability advantages. And the newly released GPT-4 is already wowing professional users, creating entire websites from a sketch on a napkin and recreating the game Pong in under 60 seconds.
Google and Microsoft alike have been making a mad dash to integrate AI into their full suite of products, while Amazon Web Services stumbles behind, likely in third, as it partners with the worst-named AI company to date, Hugging Face.
While Bing has aggressively chipped away at Google's more-than-dominant search market share, Google and Amazon have gone where it hurts for Microsoft: both hope to host swarms of AI startups while pursuing generative AI of their own.
Google said it is offering AI-focused startups $250,000 in free cloud use, which it says will cover computing horsepower and storage for the first year.
This is huge because chat may prove hard to monetize with ads. During my time using the new Bing - which I’ve used exclusively for the Chat feature - I haven’t seen a single ad. At least not one that I can recall. I get that there is some scaling going on here, where Microsoft provides an unsustainable ad-free experience now only to flood it with ads later, as Facebook and many others did in the past.
But still, ads in chat might be less useful - or vastly more hated - than what we see on Google today. Consider the ads at the top of search pages, which are all query-related. You could shoehorn those into Chat, sure.
But by the time I finish asking about the pros and cons of various tires, I may have already reached a definitive answer. If my decision gets made in chat, a query-related ad is irrelevant unless the chat's conclusion happens to match it - and in that case, the ad spend is wasted as far as the advertiser is concerned, since the decision already went to their tires.
The other side of this is that, instead of the ad being rendered obsolete by the chat, the chat starts pushing whatever it advertises - becoming a digital sales intelligence that subtly (or not so subtly) influences purchases, either without people's knowledge or even against their will.
Now, the AI that was supposed to be a helpful tool becomes utterly useless, as its immense bias will drive shoppers away.
Unrelated ads in chat could work, but given how Google has refined its advertising over the years, I’d be willing to wager that the success of its ads comes from their relevance - and that's why they have stuck around.
Thus, Google and Amazon’s cloud moves are important steps forward as we plow toward a likely winner-take-all (or most) scenario. AI companies are attractive customers that spend a lot of money on cloud services, and securing them on your platform is profitable today and in the future.
Given the limitations that Chat advertising will likely face, being the cloud of choice for companies pursuing AI, while developing and implementing your own models that improve developer experience, is an important battleground in the AI Wars.
AI is expected to be a winner-take-all industry, which reminds me of an excellent TED Talk on the subject by Sam Harris that very clearly outlines why.
Sam Harris notes that processing speed alone could propel us into the unknown. Around the seventh minute of the video, Harris says: “Imagine if we just build a superintelligent AI that was no smarter than your average team of researchers at Stanford or MIT”.
Harris notes that computers can think “about a million times faster” than humans. When that AI runs for its first week, it could complete about “20,000 years of human-level intellectual work”. Work that gets repeated week after week, tirelessly.
To have an AI that is six months ahead of a rival is to be “500,000 years ahead” technologically, according to Harris.
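Harris's figures are easy to sanity-check with some back-of-the-envelope arithmetic. The sketch below takes his illustrative assumption of a roughly 1,000,000x speed advantage for electronic circuits over biological ones and derives both numbers from it:

```python
# Back-of-the-envelope check of Harris's figures.
# Assumption (from the talk): the AI thinks ~1,000,000x faster than humans.
SPEEDUP = 1_000_000
WEEKS_PER_YEAR = 52

# One week of machine runtime, converted to human-equivalent years of work:
week_equiv_years = SPEEDUP / WEEKS_PER_YEAR
print(f"One week of runtime ≈ {week_equiv_years:,.0f} years of human-level work")
# ≈ 19,231 years - which Harris rounds to "20,000 years"

# A six-month head start at the same speedup:
lead_years = (6 / 12) * SPEEDUP
print(f"Six months ahead ≈ {lead_years:,.0f} years ahead technologically")
# = 500,000 years
```

The exact 1,000,000x multiplier is Harris's rhetorical device rather than a measured quantity, but the compounding logic holds for any large speedup: a small calendar lead translates into an enormous lead in accumulated intellectual work.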
At the singularity - and perhaps even before it, as models creep closer to that point - there will be no reason to use recently outdated models.
Thus, there could really only be a few winners here: the company that hosts the leading AI model on its cloud and the company that creates it - making cloud services just as important to these companies as the AI model itself. The third winner is the leading chipmaker.
As of now, ChatGPT and other AI models are well short of that point - at least, we think they are. AI models have been jokingly described as giving employers unlimited access to dumb people.
But GPT-4, beyond what I mentioned above, has already drawn some serious claims about its increased intelligence. Dan Shipper shared on Twitter that GPT-4 can do drug discovery, saying it can find compounds with similar properties to existing drugs, modify them to avoid patent infringement, and purchase them from suppliers (while even drafting the purchase order).
It might not yet be at the winner-take-all stage, but we are fast approaching a point where it will cease to make sense to even try a model that isn’t the most advanced.
While Amazon and Google play catch-up on the AI front, they, alongside Microsoft, are heating up the battle to host as many AI models as possible - the next battleground for AI supremacy.