This optical brain like computer chip analyses almost 2 billion images per second – By Futurist and Virtual Keynote Speaker Matthew Griffin


WHY THIS MATTERS IN BRIEF

In the future, machine vision will capture even more videos and images than it does today, synthetic content will generate even more on top of that, and we will need computer chips capable of processing this insane volume of graphical data.


Love the Exponential Future? Join our XPotential Community, subscribe to the podcast, future proof yourself with courses from XPotential University, read about exponential tech and trends, connect, watch a keynote, or browse my blog.

How many images can your brain – the brain in your head that many say is “the most complex thing in the universe” – process? A couple of hundred per second, a few thousand? Well, now your puny mind has been bested good and proper: researchers at the University of Pennsylvania have announced a powerful new optical chip for futuristic photonic computing systems that can process almost 2 billion images per second. The device is built around a neural network that processes information as light, without needing the components that slow down traditional computer chips, such as memory.


The basis of the new chip is a neural network, a system modelled on the way the brain processes information. These networks are made up of nodes that interconnect like neurons, and they even “learn” in a similar way to organic brains, by being trained on sets of data for tasks such as recognizing objects in images or words in speech. Over time, they become much better at these tasks.

But rather than electrical signals, the new chip processes information in the form of light. It uses optical wires as its neurons, stacked in multiple layers that each specialize in a particular type of classification.
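To make the layered-classifier idea a little more concrete, here is a minimal sketch in ordinary Python and NumPy. It is purely illustrative software, not the Penn team’s photonic design: the toy four-pixel “characters”, the layer sizes, and the training loop are all assumptions, standing in for stacked layers of optical wires whose connections get tuned during training.

```python
# Illustrative sketch only: a tiny two-layer classifier in software,
# loosely analogous to the chip's stacked layers of optical "neurons".
# All sizes and data here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: noisy 4-pixel "images" of two character-like patterns.
pattern_a = np.array([1.0, 0.0, 0.0, 1.0])   # hypothetical class A
pattern_b = np.array([0.0, 1.0, 1.0, 0.0])   # hypothetical class B
X = np.vstack([pattern_a + 0.1 * rng.standard_normal(4) for _ in range(50)] +
              [pattern_b + 0.1 * rng.standard_normal(4) for _ in range(50)])
y = np.array([0] * 50 + [1] * 50)            # labels: 0 = A, 1 = B

# Connection weights between layers; training nudges these, loosely
# analogous to tuning the optical elements between layers on the chip.
W1 = 0.1 * rng.standard_normal((4, 8))
W2 = 0.1 * rng.standard_normal((8, 2))

def forward(X):
    hidden = np.maximum(0.0, X @ W1)          # ReLU nonlinearity
    return hidden, hidden @ W2                # hidden layer, class scores

for _ in range(200):                          # plain gradient descent
    hidden, logits = forward(X)
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = exp / exp.sum(axis=1, keepdims=True)
    grad_logits = probs.copy()
    grad_logits[np.arange(len(y)), y] -= 1.0  # softmax cross-entropy gradient
    grad_logits /= len(y)
    grad_hidden = (grad_logits @ W2.T) * (hidden > 0)
    W2 -= 0.5 * hidden.T @ grad_logits
    W1 -= 0.5 * X.T @ grad_hidden

_, logits = forward(X)
accuracy = (logits.argmax(axis=1) == y).mean()
print(f"training accuracy: {accuracy:.1%}")   # should approach 100% on this toy set
```

The point of the analogy is the structure, not the code: data flows through successive layers and training adjusts the connections between them. On the chip, that flow happens as light propagating through optical wires rather than as arithmetic in a processor.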

In tests, the team made a chip measuring 9.3 mm² (0.01 in²) and put it to work categorizing a series of handwritten characters that resembled letters. After being trained on relevant data sets, the chip was able to classify the images with 93.8 percent accuracy for sets containing two types of characters, and 89.8 percent accuracy for four types.


Most impressively, the chip was able to categorize each character within 0.57 nanoseconds, which would allow it to process 1.75 billion images per second. The team says that this speed comes from the chip’s ability to process information as light, which gives it several advantages over existing computer chips.
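The arithmetic behind that throughput figure is easy to check: at one classification every 0.57 nanoseconds, the reciprocal gives the images-per-second rate.

```python
# Sanity check on the reported throughput: one image classified every
# 0.57 nanoseconds corresponds to roughly 1.75 billion images per second.
time_per_image = 0.57e-9                              # seconds per character, as reported
print(f"{1 / time_per_image:.2e} images per second")  # ~1.75e+09
```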

“Our chip processes information through what we call ‘computation-by-propagation,’ meaning that unlike clock-based systems, computations occur as light propagates through the chip,” said Firooz Aflatouni, lead author of the study.

“We are also skipping the step of converting optical signals to electrical signals because our chip can read and process optical signals directly, and both of these changes make our chip a significantly faster technology.”


Another advantage is that the information being processed doesn’t need to be stored, which saves time by not having to send data to memory, and saves space by not needing a memory component at all. The team also says that not storing the data is more secure, since it prevents any possible leaks.

The next steps for the team are to begin scaling up the chip, and adapting the technology to process other types of data.

“What’s really interesting about this technology is that it can do so much more than classify images,” said Aflatouni. “We already know how to convert many data types into the electrical domain – images, audio, speech, and many other data types. Now, we can convert different data types into the optical domain and have them processed almost instantaneously using this technology.”


All of which means that soon, very soon, the most complex thing in the universe might be a bunch of silicon wafers and an artificial intelligence …

The research was published in the journal Nature.

Source: University of Pennsylvania
