According to OpenAI, the amount of computational power needed to train large AI models has grown massively, doubling roughly every three and a half months since 2012. GPT-3, which required about 3.14 x 10^23 FLOPs of compute to train, is a good case in point.
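As a rough sanity check on that figure (not from the article, and assuming the commonly used 6 x parameters x tokens approximation for training compute, with GPT-3's reported 175 billion parameters and roughly 300 billion training tokens), a quick back-of-the-envelope calculation lands in the same ballpark:

```python
# Back-of-the-envelope estimate of GPT-3 training compute.
# Assumptions (not from the article): the common ~6 * N * D FLOPs rule of thumb,
# with N = 175e9 parameters and D = 300e9 training tokens as reported for GPT-3.
params = 175e9          # model parameters (N)
tokens = 300e9          # training tokens (D)
flops = 6 * params * tokens
print(f"Estimated training compute: {flops:.2e} FLOPs")  # ~3.15e+23, close to the cited 3.14e23
```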
Typically, conventional AI chips carry out high-performance computing tasks using transistors that switch electrons. Although they handle a wide array of complex workloads, energy consumption and the heat it produces pose a challenge. Thus, the growing need for computing power has set researchers on a quest to boost these chips' performance without increasing their energy consumption.
That is when experts turned to photons, particles of light that can substitute for electrons in AI chips, reducing heat and, with it, energy consumption, while dramatically increasing processor speed.
While electronic chips perform calculations by reducing information to a series of 1s and 0s, photonic chips split and mix beams of light within tiny channels to carry out the same tasks. Unlike general-purpose AI chips, photonic chips are designed to perform one particular kind of mathematical operation, matrix multiplication, which is critical for running large AI models.
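To make concrete what that operation is: the bulk of neural-network inference reduces to matrix-vector multiplications, which photonic chips carry out in the analog optical domain. The sketch below (plain NumPy, not photonic hardware, and not drawn from any vendor's code) only illustrates the math being offloaded:

```python
import numpy as np

# A single neural-network layer is essentially a matrix-vector multiply plus a
# nonlinearity. Photonic accelerators target the matrix-vector product itself,
# encoding the input vector in light and the weight matrix in an interferometer
# mesh; this NumPy version just shows the operation being offloaded.
rng = np.random.default_rng(0)
weights = rng.standard_normal((512, 256))   # layer weight matrix
x = rng.standard_normal(256)                # input activations

y = weights @ x                             # the product a photonic chip accelerates
activations = np.maximum(y, 0)              # ReLU, typically applied electronically
print(activations.shape)                    # (512,)
```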
Lightmatter, a startup spun out of MIT, last year developed an AI chip called Envise that uses photons (particles of light) to perform computing tasks.
Photonic Computing
Light travels faster than electrons. The concept of using light to carry out heavy computing tasks (also known as photonic or optical computing) dates back to the 1980s, when Bell Labs, the American industrial research and scientific development organisation now owned by Nokia, tried to develop a light-based processor. However, because a practical optical transistor could not be built, the concept didn't take off.
We experience optical technology in cameras, CDs, and even Blu-ray discs, but the photons are usually converted into electrons before being processed on chips. Four decades later, photonic computing gained momentum when IBM and researchers from the universities of Oxford and Münster developed a system that uses light instead of electricity to perform several AI-model computations.
Meanwhile, Lightmatter's new AI chip has created a buzz in the industry. According to the company's website, Envise can run the largest neural networks at three times the inferences per second of the Nvidia DGX-A100, and with seven times the inferences per second per watt on BERT-Base with the SQuAD dataset.
Japan-based NTT has also been developing an optical computer believed to outpace quantum computers at solving optimisation problems. Last year, Chinese quantum physicist Chao-Yang Lu also announced a light-based quantum computing demonstration.
Other companies, such as US-based Honeywell and IonQ, have been tackling the same computing challenges from a different angle, building quantum computers that use trapped ions.