Researchers in the emerging field of “neuromorphic computing” at the US National Institute of Standards and Technology have built something that might make a huge impact on the advancement of AI: a magnetically controlled electronic synapse.
Some of you may wonder what a synapse is. MIT News explains it better than I ever could:
“Packed within the squishy, football-sized organ are somewhere around 100 billion neurons. At any given moment, a single neuron can relay instructions to thousands of other neurons via synapses — the spaces between neurons, across which neurotransmitters are exchanged. There are more than 100 trillion synapses that mediate neuron signaling in the brain, strengthening some connections while pruning others, in a process that enables the brain to recognize patterns, remember facts, and carry out other learning tasks, at lightning speeds.”
Let’s not get too technical. What is interesting about this artificial synapse is that it can fire millions of times faster and use 1,000 times less energy than the synapses in your brain. That means it has the potential to let us build super-fast computer chips that work like the human brain, rather than being limited to the binary logic of today’s digital chips.
Like the human brain, these chips could efficiently process millions of streams of parallel computations. Today we can only do this with massive supercomputer banks. This would make the chips perfect for modern AI and would let us tackle one of the biggest challenges we have had with AI: building portable, low-power chips for pattern recognition and learning tasks. This has been particularly tricky to reproduce in hardware because, until now, there has been no synapse-like hardware component.
MIT News writes that their engineers have built a small chip using this technology. In simulation, it can recognize samples of handwriting with 95% accuracy, only about 2 percentage points less accurate than existing software algorithms.
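To give a feel for the kind of software baseline the simulated chip is being compared against, here is a minimal sketch of handwritten-digit recognition with an ordinary neural network. This is purely illustrative: it uses scikit-learn’s small bundled digits dataset, and the network size and resulting accuracy are my own assumptions, not the setup used in the MIT experiment.

```python
# Illustrative software baseline for handwritten-digit recognition.
# Each learned weight here plays roughly the role an artificial
# synapse would play in neuromorphic hardware.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

digits = load_digits()  # 1,797 images of 8x8 grayscale digits
X_train, X_test, y_train, y_test = train_test_split(
    digits.data / 16.0,  # scale pixel values to [0, 1]
    digits.target,
    test_size=0.25,
    random_state=0,
)

# A tiny fully connected network with one hidden layer.
model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
model.fit(X_train, y_train)

print(f"Test accuracy: {accuracy_score(y_test, model.predict(X_test)):.2%}")
```

A conventional classifier like this typically lands in the mid-to-high 90s percent on such small benchmarks, which is the ballpark the article’s “existing software algorithms” figure refers to; the interesting part is not the accuracy itself but doing it in low-power, brain-like hardware.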
There are, however, problems. For example, we have not yet managed to build a large device with this technology, so its future is not exactly clear. Another big problem is that these synapses operate at around 5 K, which is far too cold for any practical computing device.
Jeehwan Kim, principal investigator in MIT’s Research Laboratory of Electronics, says:
“Ultimately we want a chip as big as a fingernail to replace one big supercomputer. This opens a stepping stone to produce real artificial hardware.”
This is very interesting in terms of raw computing power, and to be honest, for the coming decade that is probably the only area where it will matter. In my opinion, it is still impossible to cheaply implement something like this in real life as part of a human brain, even purely for research purposes; the main obstacles are ethical. However, as a way of increasing computing power, this may have a great future. Nevertheless, we can’t forget about the growing potential of quantum computing, which might become a rival to artificial-synapse solutions.