MIT spinout Lightmatter Inc. today announced that it has raised another $80 million in funding to finance the development and commercialization of its optical artificial intelligence chips, which use photons to perform calculations.
Viking Global Investors led the round. The firm was joined by Alphabet Inc.’s GV venture capital arm, Hewlett Packard Enterprise Co. and multiple other institutional backers.
Lightmatter’s flagship product is an optical chip called Envise that can be used to perform AI inference, that is, running machine learning models that have already been trained. The startup says Envise makes it possible to run some AI workloads up to seven times more efficiently than on graphics cards.
The chip is based on optical computing technology that Lightmatter’s founders developed during their time as researchers at MIT. Envise encodes data into laser signals and passes them through tiny optical channels, which manipulate the signals to perform computations. Envise can’t perform as wide a range of tasks as a standard processor, but the chip lends itself well to certain linear algebra operations, the mathematical calculations AI models use to crunch data.
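To make the point concrete, here is a minimal, purely illustrative sketch of the kind of linear algebra operation that dominates neural network inference, the workload that accelerators like Envise are built around. The layer sizes are arbitrary and the code has no connection to Lightmatter’s hardware or software; it simply shows the matrix-vector multiply at the heart of a dense neural network layer.

```python
import numpy as np

rng = np.random.default_rng(0)

# A dense layer's weight matrix and one input activation vector.
# Sizes (256 outputs, 512 inputs) are arbitrary, for illustration only.
weights = rng.standard_normal((256, 512))
inputs = rng.standard_normal(512)

# A single forward pass through the layer is one matrix-vector
# multiply followed by a nonlinearity (ReLU here). Inference on a
# full model repeats operations like this many millions of times,
# which is why accelerating them pays off.
outputs = np.maximum(weights @ inputs, 0)

print(outputs.shape)  # (256,)
```

On a conventional processor each multiply-accumulate in that product is a digital arithmetic operation; an optical approach instead encodes the values into light and lets the physics of the channels carry out the multiplication.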
Each Envise chip combines two optical computing modules with 256 conventional cores that are responsible for orchestrating the data processing workflow. Data is stored on an onboard 500-megabyte pool of memory for quick access. For connectivity, there’s an industry-standard PCIe 4 interface that allows Envise to be linked to existing data center infrastructure in a relatively simple manner.
Lightmatter claims that, though at an early stage, its technology is already fast enough to outperform the industry’s fastest graphics cards in some cases.
The startup has run tests comparing a server packing four Envise chips with an Nvidia Corp. DGX-A100 appliance, which features the chipmaker’s top-end A100 data center graphics cards. The task was to run a version of the popular BERT machine learning model. Lightmatter claims that the Envise-powered server managed to provide three times higher inference performance than the DGX-A100 and seven times better power efficiency.
The startup will reportedly use a significant part of its new $80 million round to build an initial batch of chips for early customers. Lightmatter Chief Executive Officer Nick Harris told TechCrunch that those customers include hyperscale data center operators, but didn’t share any names.
Lightmatter’s go-to-market plan has several other components besides the Envise chip itself. The startup is working on another product, Passage, for connecting multiple processors to each other. AI applications are often distributed across a larger number of chips to speed up computations, which requires linking the chips at the hardware level. Lightmatter claims Passage can offer faster connectivity at lower cost than traditional products.
The startup is also developing a software platform called Idiom to make it easier for developers to deploy AI models on Envise chips. According to Lightmatter, Idiom provides the ability to run neural networks created in popular frameworks such as TensorFlow without major code changes. In data centers containing multiple Envise-powered servers, the software can also handle the logistics of splitting processing tasks among the systems.