• April 13, 2025
  • Arpan Rai

Almost a year after its launch, Ilya Sutskever's latest co-founded company, Safe Superintelligence (SSI), has surged to become one of the most valuable artificial intelligence companies, with backing from Alphabet and Nvidia.

The funding indicates that, as has long been the case in AI work, infrastructure suppliers and big IT firms alike have an interest in investing in startups developing pioneering artificial intelligence (AI), given those startups' need for massive processing power. Earlier this week, Alphabet's cloud computing business also said it would supply SSI with its own AI processors, known as tensor processing units (TPUs).

Spokespeople for all three firms declined to comment.

As part of the internet giant's evolving AI hardware strategy, Alphabet's cloud business is courting renowned AI labs such as SSI and Anthropic as customers.

At first, TPUs were intended to be used internally at Google. There is nothing novel about businesses paying to run their software on someone else's infrastructure rather than pouring their revenue into building their own, Google managing director Darren Mowry said last week while discussing the agreement to sell chips to SSI, a deal large enough to help fund Google's frontier AI research.

Because of these model makers, he said, the gravity pulling such business toward Google has increased by several orders of magnitude.

Nvidia's graphics processing units (GPUs) hold more than 80 percent of the market for AI chips and have been the preferred choice of AI developers for years.

However, two sources say that SSI now mainly relies on TPUs rather than GPUs for its AI research and development.

Google offers both its own TPUs and Nvidia GPUs through its cloud services. The TPUs are built for specialized AI jobs that demand more efficiency than a general-purpose GPU provides. Google uses these processors to develop its own large-scale AI models, and recent outside customers for them include Apple and Anthropic, which became one of the most successful AI businesses after raising billions of dollars from Google and Amazon.

Amazon is another competitor to Google and Nvidia; it is building its own AI chips, Trainium and Inferentia, to compete with GPUs. Amazon said last year that Anthropic would continue to develop its technology on those chips through the end of next year. The tech giant said in December that Anthropic would be the first customer to use a huge supercomputer built with more than 100,000 of its own chips.

According to two sources, Anthropic has not reduced its spending on Google's processors and continues to use TPUs to develop its artificial intelligence.

Meanwhile, large cloud providers have begun investing billions of dollars in the strongest AI firms, both to back the development of such foundational models and to turn those firms into large users of their own infrastructure. Microsoft, for example, has bet heavily on OpenAI, while both Google and Amazon have put their money into Anthropic. Nvidia, for its part, stands behind OpenAI and xAI, the startup led by Elon Musk.
