USC researchers built a memristor that operates at 700 °C, surviving conditions that have destroyed every Venus probe. TetraMem is commercialising the technology for AI inference.
Nvidia faces competition in AI inference from startups such as SambaNova, Groq, and Cerebras. Inference, the production stage of AI, is seen by these startups as a key market. Startups claim superior ...
Google expects an explosion in demand for AI inference computing capacity. The company's new Ironwood TPUs are designed to be fast and efficient for AI inference workloads. With a decade of AI chip ...
Quantum Computing Inc. ("QCi" or the "Company") (NASDAQ: QUBT), an innovative quantum optics and integrated photonics technology company, today announced that NeuraWave, its next-generation photonic ...
Sales of Intel's central processing units and custom AI processors are gaining traction as AI inference workloads grow.
CHENGDU, April 25 (Xinhua) -- In a futuristic office resembling a space station, Zhao Hongjie, executive vice president of Adaspace Technology Co., Ltd. (ADAspace), outlined a grand vision: to bring ...
Recently, an intense focus in Artificial Intelligence (AI) on floating-point digital hardware has driven the development of revolutionary large foundation models with billions of parameters ...
Machine learning inference models have been running on X86 server processors from the very beginning of the latest – and by far the most successful – AI revolution, and the techies who know both ...