Big Tech’s AI Strategies, Each in a Phrase
In this video, I’ll summarize each big tech company’s AI strategy in a phrase and give an informal review. Below is the text version.
OpenAI, the Scaling AI
Back in January 2020, OpenAI published a paper titled Scaling Laws for Neural Language Models. It shows that a language model gets predictably better when you simply increase its size, even with the same architecture, as long as data and compute scale up with it.
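For reference, the paper’s headline result can be written as a simple power law. The constants below are the paper’s approximate fits for the model-size term, so treat them as ballpark figures rather than exact values:

\[
L(N) \approx \left(\frac{N_c}{N}\right)^{\alpha_N}, \qquad \alpha_N \approx 0.076, \quad N_c \approx 8.8 \times 10^{13},
\]

where \(L\) is the test loss and \(N\) is the number of non-embedding parameters; similar power laws hold for dataset size and training compute.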
Just as nobody knew about COVID at the time, nobody cared about scaling laws either. Until ChatGPT launched in November 2022 and reportedly reached 10 million daily active users within 40 days. We know the rest of the story.
It was OpenAI that kicked off this trend of exponentially larger models.
What could be even cooler? They applied the scaling laws to other modalities: DALL·E for images and Sora for videos, with 3D likely coming next.
I think OpenAI shows how the strong can get stronger. The text-to-video model Sora was built on the solid foundation of their own DALL·E 3 and GPT models, plus their reliable data infrastructure.
If there has to be a complaint, it might be the controversy around its non-profit origin and its closed-source practices.
Google, the Ethical AI
Eight years ago, in 2016, Google announced it would shift from a mobile-first company to an AI-first company. Research at Google was scattered across different teams and ran at a relatively relaxed pace.
So when OpenAI took the spotlight and rocked the industry last year, Google was left scrambling.
Please allow me to add a different angle here. Many machine learning methods used in today’s AI products were first proposed by researchers at Google, and a lot of that work is open source. The Gemini models also score well on benchmarks.
The public criticism centers on things like:
- Chaotic naming: Bard was renamed to Gemini for some reason, and now we have Bard, Gemini, Gemini Advanced, Gemini 1.5 Pro, Gemini Ultra, Gemini for Workspace, Gemini Business, and Gemini Enterprise;
- Diversity over-correction: Gemini generated images depicting historical figures who were white men as people of color.
From ChatGPT and other AI products, we already know about the hallucination problem: large models make things up, and we shouldn’t treat AI-generated results as facts.
Because it’s Google, people tend to have much higher expectations. I get that, but let’s be fair. I’d summarize it like this: Google has strong research teams, a wide range of products for potential integration, and an awareness of AI ethics.
Meta, the Open-Source AI
Meta’s original LLaMA model was accidentally leaked in early 2023, and the open release of Llama 2 in July 2023 sparked a series of great open-source alternatives to ChatGPT.
The AI department at Meta advocates for open source. They say, “In the case of AI, the general infrastructure includes our Llama models, including Llama 3 which is training now and is looking great so far, as well as industry-standard tools like PyTorch that we’ve developed. ”
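To make “open” concrete, here is a minimal sketch of running a Llama 2 chat model locally with PyTorch and the Hugging Face transformers library. The model ID and generation settings are just illustrative choices, and downloading the weights requires accepting Meta’s license first:

```python
# Minimal sketch: running Llama 2 with PyTorch + Hugging Face transformers.
# Assumes you have accepted Meta's license for the gated model repo.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # illustrative choice; other sizes work the same way

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so the 7B model fits on a single consumer GPU
    device_map="auto",          # let the library place layers on GPU/CPU automatically
)

prompt = "Explain the scaling laws for neural language models in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```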
As for criticism:
- concerns about AI safety when such powerful tools are out in the wild for anyone to use;
- not being open enough, due to restrictions in the license terms;
- the models not being as good as the closed-source ones.
I won’t complain about Llama 2 not being good enough. It’s free; you get what you pay for.
Another work worth noting is Meta’s open-source model, Segment Anything (SAM). Segmentation is an important research task with wide real-world use, and as a foundation model SAM performs well across many domains, potentially replacing smaller task-specific segmentation models. For example, a paper I contributed to uses Segment Anything to segment roads and turn them into road network graphs, with some cool results.
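For a sense of how it’s used in practice, here is a minimal sketch of prompting SAM with a single foreground point via Meta’s segment-anything package. The image path and checkpoint path are assumptions, and the ViT-H checkpoint has to be downloaded separately:

```python
# Minimal sketch: prompting Segment Anything (SAM) with one foreground point.
import cv2
import numpy as np
from segment_anything import SamPredictor, sam_model_registry

# Checkpoint file is an assumption; download it from Meta's SAM repository first.
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

image = cv2.cvtColor(cv2.imread("road_scene.jpg"), cv2.COLOR_BGR2RGB)  # hypothetical input image
predictor.set_image(image)

# Ask SAM for the object under a single (x, y) point; label 1 marks it as foreground.
masks, scores, _ = predictor.predict(
    point_coords=np.array([[500, 375]]),
    point_labels=np.array([1]),
    multimask_output=True,   # return several candidate masks
)
best_mask = masks[np.argmax(scores)]  # keep the highest-scoring mask
```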
Amazingly, Meta has gone from a social media platform, to the metaverse, to AI. Llama 3 is coming in the next two or three months and will be integrated into Meta’s products like WhatsApp, Messenger, Instagram, and the smart glasses. Let’s look forward to it!
Microsoft, the One Backing OpenAI
Microsoft reportedly holds a 49% stake in OpenAI’s for-profit arm and is entitled to up to 75% of OpenAI’s profits until its investment is paid back. I believe Microsoft’s support also played a big part in Sam Altman’s return to OpenAI.
OpenAI’s models fit naturally into Microsoft’s AI cloud ecosystem through the Azure OpenAI Service.
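As a small illustration of that fit, here is a minimal sketch of calling a GPT deployment through Azure OpenAI with the official openai Python SDK. The endpoint, deployment name, and API version below are placeholders, not values from any real setup:

```python
# Minimal sketch: calling a GPT deployment on Azure OpenAI (openai SDK v1+).
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder Azure resource
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",                               # assumed API version
)

response = client.chat.completions.create(
    model="gpt-4",  # on Azure this is your deployment name, not the raw model name
    messages=[{"role": "user", "content": "Summarize Microsoft's AI strategy in one phrase."}],
)
print(response.choices[0].message.content)
```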
Another good product worth mentioning: Microsoft also acquired GitHub, home of Copilot, one of the best AI coding assistants. Microsoft is smart about getting the cool stuff through investments and acquisitions.
According to IoT Analytics, “Despite its strong partnership with OpenAI, Microsoft also heavily promotes the usage of other models, such as Llama 2, via its platform. Another key priority for Microsoft is integrating AI capabilities into its existing product portfolio, such as Azure, Microsoft/Office 365, and Bing,” which I think is an accurate depiction.
Nvidia, the Hardware AI
Nvidia has been selling GPUs to all the big tech companies mentioned above, and we know that selling shovels during a gold rush is always a safe, rewarding bet.
The demand for GPUs soared last year, and Nvidia therefore strengthened its dominant role in the market.
What’s unique about Nvidia is its software ecosystem. CUDA strongly differentiates Nvidia in the market, and open-source libraries like NeMo and Megatron-LM deliver even more value on top of it.
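A tiny sketch of why this lock-in is so strong: most deep learning code reaches CUDA indirectly through frameworks like PyTorch, so the Nvidia stack is the default path of least resistance. The snippet below is just a generic PyTorch example, not Nvidia-specific code:

```python
# Minimal sketch: how most AI code ends up on CUDA without touching it directly.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"  # falls back to CPU without an Nvidia GPU
x = torch.randn(4096, 4096, device=device)
y = x @ x  # on CUDA devices this matmul is dispatched to Nvidia's cuBLAS libraries
print(device, y.shape)
```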
Nvidia has a few competitors, including AMD and Intel, and the other big tech companies are working on their own chips too. Do you think Nvidia will keep its leading position? Leave a comment below!
Tesla, the Driving AI
Tesla’s approach to self-driving is different because it started with electric cars. Every vehicle it sells can potentially contribute data back to Tesla, which is invaluable for training machine learning models.
Imagine a model that observes the world the way a human wandering around does; it would learn so much about the laws and patterns of the world that we would end up with something like a world model. No wonder “Tesla is building the foundation models for autonomous robots.” (Tesla Tweet)
Tesla is more than cars. Its humanoid robot, Optimus, shares some of the underlying technology with Tesla’s cars and is designed to handle a wide range of tasks.
Amazon & Apple
Although it looks like there isn’t much going on with AI at these two, we do have some clues.
For Amazon, “Jassy said in an annual shareholder letter published Thursday that he is ‘optimistic that much of this world-changing AI will be built on top of AWS.’” (Source)
I agree. From my own experience, running AI on AWS is neat.
Apple “plans to disclose more about its plans to implement generative AI (Gen AI) into its products later this year, according to Chief Executive Officer Tim Cook.” (Source)
Since Apple has always been good at keeping secrets and announcing things only right at release, I believe they have something planned.
Thoughts
Company-wide efforts in the AI battle
Because of the scaling laws, and because they are now in wartime mode, big tech companies are pooling cross-team efforts to move faster. That’s why we saw the merger of DeepMind and Google Brain. By bringing these teams together, Google not only reduces redundancy but also amplifies its capabilities, driving faster progress in the field.
The shift in focus from academic influence to business needs
Tech giants used to publish papers to build their influence in the industry, but now they often withhold technical details. While fundamental research remains important, there is a notable shift toward productionization. We see a lot of engineering effort before and after model training, such as meticulous data cleaning and active collection of user feedback. The goal has become more pragmatic, with an emphasis on business benefits.
Integrations with existing products
Naturally, big tech companies seek to integrate AI into their existing products to leverage their vast user bases and data infrastructure. Pure AI technology is not enough to establish a moat; it’s always AI plus an advantage in a specific domain. For example, Google said in its Q4 2023 earnings call that it will use AI to improve Search and Cloud.
Room for startups is not as big as imagined
Big tech companies have the talent, data, infrastructure, and compute to build powerful foundation models that generalize to many niche areas. Remember the scaling laws: they set a high barrier to entry. Startups face big challenges in finding opportunities and securing resources. The overall space is narrower than it might seem, pushing startups toward underserved markets or genuinely novel approaches to make an impact.
*Disclaimer: Views are my own and based on public information.*