🍔 Quick Bite: While most AI discussions focus on new models like Grok and GPT-4, the real competition might be in AI infrastructure. Companies like SoftBank, Microsoft, and Nvidia are investing billions in chips, computing power, and talent to secure their dominance.
🧠 The Breakdown
Every other week, there’s a new AI model in the spotlight: Grok, Gemini, GPT-4, and whatever’s next. The hype cycle is predictable: a big announcement, social media buzz, and think pieces dissecting its potential. But behind the scenes, the real power struggle is happening elsewhere. The companies shaping the future of AI are not just building models; they are acquiring the infrastructure that makes AI possible.
The biggest players aren’t just chasing breakthroughs in generative AI; they’re making massive bets on chips, computing power, and proprietary AI infrastructure. Because in this race, the ones who control the infrastructure will control the future of AI.
SoftBank’s strategic play for AI dominance
SoftBank’s $6.5 billion acquisition of Ampere Computing is the latest move in this quiet but crucial race. Ampere specialises in designing Arm-based chips optimised for AI and cloud computing. By adding Ampere to its portfolio, alongside its majority stake in Arm Holdings and last year’s acquisition of AI chipmaker Graphcore, SoftBank is positioning itself as a key player in AI hardware. This means that while the world focuses on AI software, SoftBank wants the future of AI to run on its hardware.
If you zoom out, SoftBank Group CEO Masayoshi Son’s strategy becomes obvious: dominate the foundational layers of AI. With Arm, SoftBank controls the fundamental IP behind AI chips. With Ampere, it gains high-performance semiconductor design. Its role in the $500 billion Stargate investment partnership with OpenAI further strengthens its position by ensuring its chips and infrastructure are well placed to power some of the world’s most advanced AI models.
Microsoft’s bet on infrastructure and AI talent
Microsoft is taking a similar approach, investing heavily in AI infrastructure while also acquiring top talent. The company recently hired key figures from Haiper, an AI video startup, to strengthen its AI division. But more importantly, it has been ramping up its investment in computing power.
To reduce its reliance on Nvidia, Microsoft has been developing its own AI chips, such as the Maia AI accelerator, and securing long-term supplies of GPUs (graphics processing units). It has also built deep partnerships with OpenAI and xAI, ensuring that its cloud platform, Azure, remains the backbone of cutting-edge AI development. Microsoft recognises that future AI leadership depends not only on sophisticated models but also on the capacity to run them at scale.
Nvidia: The AI kingmaker under threat
For now, Nvidia remains the dominant force in AI hardware. Its GPUs power the most advanced AI models from OpenAI, Google, and Meta, making it an indispensable player in the ecosystem. However, as demand for computing power skyrockets, major tech companies are working to reduce their dependence on Nvidia by developing their own AI chips.
Amazon has its Trainium and Inferentia chips, Google has its Tensor Processing Units (TPUs), and Meta is reportedly developing in-house AI processors. This trend is driven not merely by cost considerations but by a desire for greater control. By owning their AI infrastructure, these companies can optimise performance, mitigate supply chain risks, and avoid potential constraints related to external suppliers’ pricing and availability.
The billions behind AI infrastructure
The scale of investment in AI infrastructure is staggering. Meta, Microsoft, Amazon, and Google’s parent company, Alphabet, are expected to collectively spend around $325 billion this year, much of it on AI-focused data centres, semiconductor research, and cloud infrastructure.
Even Apple, which has not been at the forefront of AI advancements, is planning a $500 billion US investment over the next four years, part of which will go towards data centres and custom AI servers. These investments signal a shift in AI strategy: companies are no longer competing only on models but on the hardware and infrastructure that power them.
The battle for AI’s future
At its core, this is a power struggle. AI breakthroughs might grab headlines, but without the infrastructure to support them, they don’t mean much. The companies investing in AI hardware today are the ones shaping AI’s future tomorrow.
That’s why SoftBank is doubling down on chips. It’s why Microsoft is bringing talent in-house and building proprietary hardware. And it’s why Nvidia’s grip on AI is being challenged.
This isn’t just about who builds the next big AI model. It’s about who controls the foundation on which AI is built. And right now, that battle is just getting started.
📈 Trending Stories
Here are other important stories in the media:
- Vendease restructures employee salaries, seeks fresh funding
- Safaricom’s M-PESA decline continues as Airtel Money gains
- MTN’s $5.2B fintech spinoff moves forward, but faces regulatory delays in Nigeria
💼 Opportunities
We carefully curate open opportunities in Product & Design, Data & Engineering, and Admin & Growth every week.
Product & Design
- Mainstack — Senior Product Manager, Lagos
- She Code Africa — Design Lead, Lagos
- ReelFruit — Junior Graphic Designer, Remote
Data & Engineering
- Circle — Senior Quality Engineer, Remote
- Mainstack — Engineering Manager, Lagos
- Mainstack — Senior Frontend Engineer, Lagos
- Mainstack — Backend Engineer, Lagos
Admin & Growth
- Bamboo — Executive Assistant, Lagos
- Glovo — Growth & Marketing Lead, Lagos
- Mainstack — Finance Operations Manager, Lagos
- She Code Africa — Community Manager, Lagos