Harnessing noise in optical computing for AI

Artificial Intelligence and Machine Learning are currently influencing our lives in many small but impressive ways. For example, AI and machine learning applications recommend entertainment that we can enjoy through streaming services such as Netflix and Spotify.

In the near future, these technologies are predicted to have an even greater impact on society through activities such as driving fully autonomous vehicles, enabling complex scientific research and facilitating medical discoveries.

But the computers used for AI and machine learning require a lot of energy. Currently, the computing power required for these technologies doubles roughly every three to four months. And the cloud computing data centers used by AI and machine learning applications around the world already consume more electricity per year than some small countries. It is easy to see that this level of energy consumption is not sustainable.

A research team led by the University of Washington has developed new optical computing hardware for AI and machine learning that is faster and more energy efficient than conventional electronics. The research also focuses on another challenge: the ‘noise’ inherent in optical computing, which can hinder computing precision.

In a new paper published January 21 in Science Advances, the team demonstrates an optical computing system for AI and machine learning that not only mitigates this noise but also uses some of it as input to enhance the creative output of the artificial neural networks within the system.

“We have built an optical computer that is faster than a conventional digital computer,” said lead author Changming Wu, a UW doctoral student in electrical and computer engineering. “And this optical computer can even create new things based on random inputs generated from the optical noise that most researchers try to avoid.”

Optical computer noise mainly comes from stray light particles, or photons, produced by the laser operating inside the device and by thermal background radiation. To counteract this noise, the researchers connected their optical computing core to a special type of machine learning network called a generative adversarial network, or GAN.

The team tested several noise mitigation techniques, including using some of the noise generated by the optical computing core to serve as a random input to the GAN.
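The idea of feeding physical noise into a GAN can be illustrated with a toy sketch. The snippet below is a minimal, hypothetical illustration, not the team's actual system: the `optical_noise` function merely simulates with Gaussian samples the randomness that, in the real hardware, comes from laser photon scattering and thermal radiation, and `ToyGenerator` stands in for a trained GAN generator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the optical core's physical noise. In the real device the
# randomness comes from stray photons and thermal background radiation;
# here we simply simulate it with Gaussian samples.
def optical_noise(batch_size, dim):
    return rng.normal(0.0, 1.0, size=(batch_size, dim))

# Hypothetical tiny generator: one linear layer followed by tanh.
# A real GAN generator is a deep network trained against a discriminator;
# this sketch only shows how the noise vector enters as the latent input.
class ToyGenerator:
    def __init__(self, latent_dim, out_dim):
        self.W = rng.normal(0.0, 0.1, size=(latent_dim, out_dim))
        self.b = np.zeros(out_dim)

    def __call__(self, z):
        return np.tanh(z @ self.W + self.b)

latent_dim, out_dim = 8, 16
gen = ToyGenerator(latent_dim, out_dim)

# Two different draws of "optical" noise yield two different generated
# samples: the noise is what gives the GAN's output its variety.
z1 = optical_noise(1, latent_dim)
z2 = optical_noise(1, latent_dim)
x1, x2 = gen(z1), gen(z2)
print(x1.shape)              # (1, 16)
print(np.allclose(x1, x2))   # False: distinct noise, distinct outputs
```

The point of the sketch is only the data flow: instead of drawing the latent vector from a software pseudorandom generator, the system taps randomness that the hardware produces anyway.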

For example, the team tasked the GAN with learning how to handwrite the number “7” the way a person would. The optical computer could not simply print the number in a prescribed font.

Like a child, it had to learn the task by looking at visual examples of handwriting and practicing until it could write the number correctly. Of course, the optical computer had no human hand to write with, so its form of “handwriting” was to generate digital images whose style resembled, but was not identical to, the samples it had studied.

“Instead of training the network to read handwritten numbers, we trained the network to learn to write numbers, mimicking visual examples of handwriting,” said senior author Mo Li, a UW professor of electrical and computer engineering.

“With the help of our computer science colleagues at Duke University, we also showed that a GAN can mitigate the negative impact of the optical computing hardware noise by using a training algorithm that is robust to errors and noise. More than that, the network actually uses the noise as the random input needed to generate output instances.”

After learning from handwritten examples of the number seven, which came from a standard AI training image set, the GAN practiced writing “7” until it could do so successfully. Over time, it developed its own distinct writing style and could write numbers from one to 10 in computer simulations.

The next steps include manufacturing this device on a large scale using current semiconductor fabrication techniques. So instead of building the next version of the device in a lab, the team plans to use an industrial semiconductor foundry to realize the wafer-scale technology. A larger-scale device would further improve performance and allow the research team to perform more complex tasks beyond handwriting generation, such as creating images and even videos.

“This optical system represents a computer hardware architecture that can enhance the creativity of artificial neural networks used in AI and machine learning, but more importantly, it demonstrates the viability of this system at a large scale where noise and errors can be mitigated and even harnessed,” Li said.

“AI applications are growing so fast that in the future their energy consumption will be unsustainable. This technology has the potential to help reduce that energy consumption, making AI and machine learning environmentally sustainable and also much faster, helping to achieve higher performance overall.”
