Lambda on the hunt for 'another $800M' to fuel its GPU cloud

Why sell shovels when you can rent them?


In the AI gold rush, if you can't be the one selling the GPUs then the next best thing could be to rent them. This week, we learned that Lambda is seeking $800 million in funding to do just that.

Founded in 2012, San Jose-based Lambda is no stranger to accelerated computing, having got its start building systems specifically for machine-learning R&D. It later expanded into colocation services before launching a GPU cloud service in 2018.

According to a Financial Times report, citing people familiar with the matter, term sheets for the impending funding round are expected by mid-July and JPMorgan is helping to coordinate the affair. The capital would be used to purchase additional Nvidia GPUs and associated network infrastructure and software, and to hire staff.

Leasing out large quantities of GPUs, particularly to customers training custom models, has become a lucrative business in recent years. As our sibling site The Next Platform recently calculated, a cluster of 16,000 H100s costing $1.5 billion to purchase, deploy, and network would generate roughly $5.27 billion in revenue over the course of four years.
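For a rough sense of what those headline numbers imply, here's a back-of-the-envelope sketch that derives the per-GPU-hour rate needed to hit that revenue figure. The cluster size, capex, revenue, and timeframe come from The Next Platform's estimate; the 100 percent utilization assumption and the variable names are ours.

```python
# Back-of-the-envelope: what hourly rate per GPU would a 16,000-H100 cluster
# need to charge to bring in ~$5.27B over four years?
# Cluster size, capex, revenue, and timeframe are The Next Platform's figures;
# the full-utilization assumption is ours.

CLUSTER_GPUS = 16_000          # H100 accelerators in the cluster
CAPEX_USD = 1.5e9              # cost to purchase, deploy, and network
REVENUE_USD = 5.27e9           # estimated rental revenue over the period
YEARS = 4
HOURS_PER_YEAR = 24 * 365

gpu_hours = CLUSTER_GPUS * YEARS * HOURS_PER_YEAR   # ~560.6M GPU-hours
implied_rate = REVENUE_USD / gpu_hours              # ~$9.40 per GPU-hour
gross_multiple = REVENUE_USD / CAPEX_USD            # ~3.5x the upfront outlay

print(f"Implied rate: ${implied_rate:.2f}/GPU-hour at 100% utilization")
print(f"Gross revenue multiple on capex: {gross_multiple:.1f}x")
```

Real-world utilization, power, staffing, and discounting would all pull the effective numbers down, but the arithmetic goes some way to explaining why this business keeps attracting capital.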

Of course, to play this game you still need a lot of capital to buy the GPUs in the first place — something that, so far, Lambda has had no shortage of success doing. Back in April, it secured half a billion dollars in debt financing to purchase "tens of thousands" of Nvidia's fastest accelerators, which served as collateral for the loan.

The debt financing came on top of the $320 million series-C funding round it announced back in February, the majority of which is also going toward Nvidia GPUs. The Register reached out to Lambda for comment regarding the reported funding round.

While $800 million may sound like a lot of capital for an upstart biz, it's far from the biggest figure we've seen in this field lately. Many AI outfits have seen their valuations skyrocket as hype over neural networks reaches new heights.

In May, CoreWeave, another bit barn peddling cut-rate GPU rentals, scored $1.1 billion in a series-C round. That same month, the cloud provider used its enormous collection of GPUs as collateral for a $7.5 billion loan backed by Blackstone, BlackRock, and others.

Meanwhile, similar operations such as Voltage Park and TensorWave have looked to recreate Lambda and CoreWeave's successes. However, it's not just AI infrastructure vendors that have seen their valuations take off.

Back in May, so-called data foundry Scale AI, which provides high-quality datasets used in AI training, saw its valuation touch $14 billion after VC firms and AI heavyweights, such as Nvidia, Amazon, and Meta, injected $1 billion into the firm.

Some feel we're heading into bubble-bursting territory. That's either wishful thinking, because the hype may only just be getting started, or a shallow acknowledgment that all good things eventually come to an end. Nvidia's market cap dipped some $500 billion last week and that proved largely inconsequential: The GPU titan's stock price is holding steady today, up nearly eight percent over the past month and more than 150 percent over the past half-year, and its market cap is sitting pretty north of $3 trillion. ®
