In 2022, before ChatGPT completely revolutionized the world of artificial intelligence, Etched decided to invest heavily in transformers.
With this focus, the startup developed Sohu, an ASIC designed specifically for transformer models, the architecture behind ChatGPT, Sora, and Gemini.
Sohu is a one-trick pony: it cannot run other machine learning models such as Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), or Long Short-Term Memory Networks (LSTMs). On transformers, however, Etched claims the chip is unmatched, outperforming Nvidia’s flagship B200 GPU by nearly ten times in speed.
It’s all about scalability
Because Sohu is designed exclusively for transformer models, it can avoid the complex and often unnecessary control logic that general-purpose GPUs must handle to support a wide range of applications.
By narrowing its scope to the computational needs of transformers, Sohu can devote more of its silicon to the mathematical operations, chiefly matrix multiplications, that make up the bulk of transformer processing.
This streamlined approach lets Sohu sustain over 90% of its FLOPS capacity, far above the roughly 30% utilization typical of general-purpose GPUs. In other words, Sohu completes more useful computation in the same amount of time, making it much more efficient for transformer-based workloads.
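To make the utilization argument concrete, here is a rough back-of-the-envelope sketch; the peak-compute figure is an illustrative placeholder rather than a published spec, and both devices are given the same peak so that only utilization differs.

```python
# Back-of-the-envelope: how much effective compute utilization alone buys.
# The peak figure below is a placeholder, not a published spec for either chip;
# both devices are given the same peak so only utilization differs.

PEAK_FLOPS = 2.0e15        # assumed dense peak of 2 PFLOP/s (illustrative)
GPU_UTILIZATION = 0.30     # ~30% utilization cited for general-purpose GPUs
SOHU_UTILIZATION = 0.90    # >90% utilization claimed for Sohu

gpu_effective = PEAK_FLOPS * GPU_UTILIZATION
sohu_effective = PEAK_FLOPS * SOHU_UTILIZATION

print(f"GPU effective compute:  {gpu_effective:.1e} FLOP/s")
print(f"Sohu effective compute: {sohu_effective:.1e} FLOP/s")
print(f"Gain from utilization alone: {sohu_effective / gpu_effective:.1f}x")
```

On those assumptions, utilization alone accounts for roughly a 3x gain; the remainder of any claimed advantage would have to come from packing more raw compute onto the die, which is exactly the space that dropping general-purpose control logic frees up.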
Transformer models have seen a dramatic rise in adoption, and every major AI lab, from Google to Microsoft, is committed to scaling the technology further. With Llama 70B throughput of more than 500,000 tokens per second, Sohu is an order of magnitude faster and more cost-effective than next-generation GPUs.
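As a rough sanity check on that throughput figure, the sketch below estimates the sustained compute it implies, using the common approximation of about 2 FLOPs per parameter per generated token (forward pass only, ignoring attention and KV-cache overhead); the calculation is illustrative, not a vendor-verified benchmark.

```python
# Rough estimate of the sustained compute implied by the quoted throughput,
# using the ~2 FLOPs-per-parameter-per-token rule of thumb for decoding
# (forward pass only; ignores attention/KV-cache overhead and batching details).

PARAMS = 70e9                 # Llama 70B parameter count
TOKENS_PER_SECOND = 500_000   # throughput figure quoted for Sohu

flops_per_token = 2 * PARAMS                      # ~1.4e11 FLOPs per generated token
sustained = flops_per_token * TOKENS_PER_SECOND   # total FLOP/s to hit the quoted rate

print(f"FLOPs per token:   {flops_per_token:.2e}")
print(f"Sustained compute: {sustained:.2e} FLOP/s (~{sustained / 1e15:.0f} PFLOP/s)")
```

That works out to on the order of 70 PFLOP/s of sustained transformer compute, which is why utilization, rather than headline peak numbers, is the figure that matters at this scale.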
Etched believes the shift to specialized chips is inevitable and aims to stay ahead of the curve. “Current and next-generation state-of-the-art models are transformers,” the company says. “Tomorrow’s hardware stack will be optimized for transformers. Nvidia’s GB200s have special support for transformers (TransformerEngine). ASICs like Sohu entering the market mark the point of no return.”
Etched reports that production of Sohu is ramping up, with substantial orders already placed. “We believe in the hardware lottery: the models that win are the ones that can run on hardware the fastest and cheapest. Transformers are powerful, useful, and profitable enough to dominate every major AI computing market before alternatives are ready.”
The company adds: “Transformer killers would have to run faster on GPUs than transformers run on Sohu. If that happens, we’ll build an ASIC for that too!”