Korean researchers develop tiny, low-power AI chip to enable complex language tasks on mobile devices
• Korean researchers develop a new AI chip called C-Transformer, claiming it uses 625x less power and is 41x smaller than Nvidia's A100 GPU
• C-Transformer leverages refined neuromorphic computing to efficiently process large language models while matching the accuracy of deep neural networks
• The chip has a die area of 20.25 mm², runs at 200 MHz, consumes under 500 mW, and achieves 3.41 TOPS
• The architecture includes a Homogeneous DNN-Transformer Core, an Output Spike Speculation Unit, and an Implicit Weight Generation Unit
• No direct benchmark comparisons were provided between the C-Transformer and the Nvidia A100, but the chip shows promise for mobile applications
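To put the quoted specs in context, a back-of-envelope calculation converts them into the efficiency metrics commonly used to compare accelerators (TOPS/W and TOPS/mm²). This is a sketch using only the peak figures reported above, not an independent benchmark:

```python
# Derive efficiency metrics from the reported C-Transformer figures.
# All inputs are the peak/claimed numbers from the summary above.
tops = 3.41       # reported peak throughput, tera-operations per second
power_w = 0.5     # reported power ceiling: "under 500 mW"
die_mm2 = 20.25   # reported die area in square millimeters

# Energy efficiency: operations delivered per watt consumed.
efficiency_tops_per_w = tops / power_w

# Areal density: throughput packed into each mm² of silicon.
areal_tops_per_mm2 = tops / die_mm2

print(f"Efficiency: {efficiency_tops_per_w:.2f} TOPS/W")
print(f"Areal density: {areal_tops_per_mm2:.3f} TOPS/mm^2")
```

Since 500 mW is stated as an upper bound, the real efficiency would be at least 6.82 TOPS/W at peak throughput; without shared benchmark conditions, though, these numbers cannot be directly compared against the A100.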