Twenty-five years ago, artificial intelligence (AI) was still largely a pipe dream for most of us, something talked about only in science fiction books and movies. Nowadays, AI is all around us: it fills customer service roles, enhances manufacturing techniques, and even secures people's homes.
Back in 2011, IBM's Watson became the first computer to beat a human player on Jeopardy! But it wasn't really a fair fight when you think about it. "At the time, Watson was connected to a supercomputer the size of a room while the human brain is just a few pounds," says Jeehwan Kim, the Class '47 Career Development Professor and a faculty member of MIT's Department of Mechanical Engineering and the Department of Materials Science and Engineering.
"But the ability to replicate a human brain's ability to learn is incredibly difficult," says Kim, coming to Watson's defense. "Machine learning is cognitive computing. Your computer recognizes things without you telling the computer what it's looking at." Machine learning, though, is just one aspect of AI; smart devices and systems are another. For any of these devices to work, the hardware and software must work together. Radar, cameras, sensors, and light-detection systems all need to feed information back to the computers effectively.
Kim and others at MIT's Department of Mechanical Engineering are developing new software that links up with hardware to create intelligent devices. To date, most neural networks have been software-based and built on the von Neumann computing architecture. Kim has taken a different approach, using neuromorphic computing methods instead. "Neuromorphic computing means portable AI," says Kim. "So, you build artificial neurons and synapses on a small-scale wafer."
Instead of using binary signaling, Kim's neural network processes information like an analog device. The other main difference is the material used to make the artificial synapses. Previously, amorphous materials were used in neuromorphic chips, but with such materials it is hard to control the ions once a voltage has been applied. Kim and colleagues instead built a chip from silicon germanium, which allowed them to reduce variability to just one percent and to control the current flowing out of each synapse.
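To see why that one-percent figure matters, here is a minimal Python sketch, with all numbers invented for illustration, that models device-to-device variability as multiplicative noise on each synapse's programmed conductance. It compares a tight, one-percent spread of the kind reported for the silicon-germanium devices against a much noisier amorphous-like device:

```python
import numpy as np

# Hypothetical sketch: programming artificial synapses to a target conductance.
# Device-to-device variability is modeled as multiplicative Gaussian noise on
# each write. The 1% figure matches the article's silicon-germanium claim; the
# 30% figure for amorphous devices is an illustrative assumption, not a
# measured value.

rng = np.random.default_rng(0)

def program_synapses(target, variability, n_devices=10_000):
    """Return the conductances actually stored on n_devices synapses."""
    noise = rng.normal(loc=1.0, scale=variability, size=n_devices)
    return target * noise

target = 1.0  # arbitrary conductance units
tight = program_synapses(target, variability=0.01)   # SiGe-like, ~1%
loose = program_synapses(target, variability=0.30)   # amorphous-like guess

print(f"1% devices:  std = {tight.std():.4f}")
print(f"30% devices: std = {loose.std():.4f}")
```

With a tight spread, every synapse stores nearly the weight the training algorithm asked for; with a loose spread, the network computes with weights quite different from the ones it learned.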
“We envision that if we build up the actual neural network with the material we can actually do handwriting recognition,” says Kim. So far, their neural network has a success rate of 95 percent when asked to identify handwriting samples. “If you have a camera and an algorithm for the handwriting data set connected to our neural network, you can achieve handwriting recognition.” While handwriting recognition may well be the next step for Kim and his team, the possibilities for this kind of technology are endless. It could be used in computers, phones, and even robots to make them all considerably more intelligent.
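As a rough software-side illustration of that pipeline, the following sketch trains a tiny softmax classifier to recognize noisy copies of a few class prototypes. The data is synthetic and the parameters are invented; this is not Kim's actual network, only the kind of weight-matrix computation an analog synapse array would carry out in hardware:

```python
import numpy as np

# Hypothetical sketch of handwriting-style recognition: a softmax classifier
# trained by batch gradient descent on synthetic 8x8 "digit" images. In
# neuromorphic hardware the weight matrix W would live in the analog
# conductances of the synapse array; here it is just a NumPy array.

rng = np.random.default_rng(1)

n_classes, n_pixels, n_per_class = 4, 64, 200
# Each synthetic "digit" is a noisy copy of a class prototype image.
prototypes = rng.normal(size=(n_classes, n_pixels))
labels = np.repeat(np.arange(n_classes), n_per_class)
images = prototypes[labels] + 0.5 * rng.normal(size=(labels.size, n_pixels))

W = np.zeros((n_pixels, n_classes))
onehot = np.eye(n_classes)[labels]
for _ in range(200):                      # plain batch gradient descent
    logits = images @ W
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    W -= 0.01 * images.T @ (probs - onehot) / labels.size

accuracy = (np.argmax(images @ W, axis=1) == labels).mean()
print(f"training accuracy: {accuracy:.2%}")
```

The camera and the handwriting data set Kim mentions would simply replace the synthetic images here; the learning loop stays the same.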
As well as making portable products more intelligent, there is also a big push to make homes smarter, and that is what Professor Sanjay Sarma and Research Scientist Josh Siegel have set out to do. One evening, Sarma experienced the annoyance of a faulty circuit breaker at home that just kept tripping. These breakers, also known as arc-fault circuit interrupters (AFCIs), are designed to cut power when they detect a potentially dangerous electrical arc, but in this case there was no such hazard.
AFCI trips are both common and incredibly annoying, so Sarma decided to create a solution for the problem. "Think of it as a virus scanner," says Siegel. "Virus scanners are connected to a system that updates them with new virus definitions over time." Sarma and Siegel figured that embedding similar technology into AFCIs would enable the devices to detect what product was plugged in and learn that it is a safe object.
The neural network is built by collating thousands of data points gathered during arcing simulations. Algorithms then help recognize patterns and make probability decisions. The team is confident that with just a sound card and a cheap microcomputer, they can integrate this technology into circuit breakers.
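As a toy illustration of that pipeline, the sketch below generates clean and "arcing" mains waveforms, measures how much of each signal's energy sits above roughly 1 kHz, and learns a simple decision threshold from the labeled examples. The waveforms, the feature, and the noise levels are all invented for illustration; the team's real data and model are not described here:

```python
import numpy as np

# Hypothetical sketch of arcing detection: an arc fault adds broadband
# high-frequency noise to the otherwise clean 60 Hz mains waveform. We
# simulate both cases, compute a crude high-frequency-energy feature of the
# kind a sound card could capture, and pick a decision boundary from the
# labeled "simulation" data.

rng = np.random.default_rng(2)
fs = 8000                                  # assumed sample rate, 1 s windows
t = np.arange(fs) / fs

def waveform(arcing):
    clean = np.sin(2 * np.pi * 60 * t)
    noise = 0.3 if arcing else 0.02        # invented noise amplitudes
    return clean + noise * rng.normal(size=t.size)

def hf_energy(x):
    spectrum = np.abs(np.fft.rfft(x))
    return spectrum[1000:].sum() / spectrum.sum()   # share above ~1 kHz

normal = np.array([hf_energy(waveform(False)) for _ in range(50)])
arcs = np.array([hf_energy(waveform(True)) for _ in range(50)])
threshold = (normal.max() + arcs.min()) / 2         # learned boundary

accuracy = ((arcs > threshold).mean() + (normal <= threshold).mean()) / 2
print(f"detection accuracy on simulated data: {accuracy:.0%}")
```

A real system would use richer features and a trained model rather than a single threshold, but the shape of the pipeline is the same: simulate arcs, extract features, learn a boundary.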
But this is just one area in which neural networks could make homes more intelligent. They could also detect anomalies within the home, such as a burst pipe or an intrusion; control the temperature; or even run diagnostics to see if anything needs fixing. "We're developing software for monitoring mechanical systems that are self-learned," says Siegel. "You don't teach these devices all the rules, you teach them how to learn the rules."
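A minimal sketch of that "learn the rules" idea: instead of hard-coding what a burst pipe looks like, a monitor can learn the normal range of a sensor and flag readings far outside it. The sensor values and thresholds below are invented for illustration:

```python
import numpy as np

# Hypothetical sketch of a self-learned anomaly monitor. The device observes
# normal operation (simulated water-flow readings), learns the baseline mean
# and spread, and flags readings far outside that learned range.

rng = np.random.default_rng(3)

normal_flow = rng.normal(loc=5.0, scale=0.5, size=1000)   # learned baseline
mean, std = normal_flow.mean(), normal_flow.std()

def is_anomaly(reading, k=4.0):
    """Flag readings more than k standard deviations from the learned mean."""
    return abs(reading - mean) > k * std

print(is_anomaly(5.3))    # ordinary flow
print(is_anomaly(25.0))   # burst-pipe-like surge
```

Nothing here encodes what a burst pipe is; the rule emerges from the data, which is the point Siegel is making.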
AI improves how we interact with devices, products, and environments. It also helps to improve the efficiency of manufacturing and design processes. “Growth in automation along with complementary technologies including 3-D printing, AI, and machine learning compels us to, in the long run, rethink how we design factories and supply chains,” says Associate Professor A. John Hart. “Having 3-D printers that learn how to create parts with fewer defects and inspect parts as they make them will be a really big deal – especially when the products you’re making have critical properties such as medical devices or parts for aircraft engines.”
In another corner of the MIT campus, Professor Sangbae Kim is also on a mission to develop the world of AI, together with his robotic cheetah. Using lidar, this four-legged machine senses its environment and moves accordingly; as its name suggests, it can even run and leap over obstacles. Kim hopes to someday team up with Jeehwan Kim and his neural network, so that together they could make the cheetah run and jump as well as recognize people, voices, and even gestures.
While we may still be some way off from interacting with robots on an intellectual level, both AI and machine learning have already become deeply integrated into our lives. Whether it's the internet of things keeping our homes safe or handwriting recognition protecting our information, the benefits of AI are hard to ignore and will only keep growing.