AI drives skin sensor
Advanced techniques have been used to create new skin-like wearable technology.
Researchers from Monash University have combined nanotechnology and artificial intelligence to bring machines one step closer to communicating with the human body.
Using specialised algorithms, personalised AI can now disentangle multiple body signals, interpret them and decide how to respond.
Published recently in Nature Nanotechnology, the research could change how remote healthcare is delivered, and create a new generation of personal alarms and communications devices.
Lead researcher Professor Wenlong Cheng says his team’s new ultra-thin wearable patch, worn on the neck, has three layers that measure speech, neck movement and touch.
It also measures breathing and heart rate.
“Emerging soft electronics have the potential to serve as second-skin-like wearable patches for monitoring human health vitals, designing perception robotics and bridging interactions between natural and artificial intelligence,” Professor Cheng says.
Associate Professor Zongyuan Ge, from the Faculty of Information Technology, is part of the Monash team that developed a frequency- and amplitude-based neural network called Deep Hybrid-Spectro, which can automatically monitor multiple biometrics from a single signal.
“As people all sound and act differently, the next step is to program and personalise the sensors using even more sophisticated algorithms so they can be tailored to individuals,” Associate Professor Ge added.
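To illustrate the general idea of decoding several biometrics from one sensor channel, here is a minimal Python sketch: a spectrogram of the raw signal feeds a shared network trunk with one output head per biometric. The class name MultiBiometricNet, the task list, the sampling rate and the architecture are illustrative assumptions, not the published Deep Hybrid-Spectro model.

```python
# Minimal sketch (not the published model): a shared convolutional trunk over a
# spectrogram of one sensor channel, with a separate output head per biometric.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import spectrogram

class MultiBiometricNet(nn.Module):
    """Hypothetical multi-task classifier over a time-frequency image."""
    def __init__(self, tasks=None):
        super().__init__()
        # Illustrative tasks: number of classes per biometric output head
        tasks = tasks or {"speech": 10, "touch": 2, "neck_movement": 4}
        self.trunk = nn.Sequential(            # shared feature extractor
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.heads = nn.ModuleDict(            # one linear head per biometric
            {name: nn.Linear(32, n_classes) for name, n_classes in tasks.items()}
        )

    def forward(self, spec):                   # spec: (batch, 1, freq, time)
        features = self.trunk(spec)
        return {name: head(features) for name, head in self.heads.items()}

# One second of simulated raw sensor signal, sampled at an assumed 1 kHz
raw = np.random.randn(1000)
_, _, Sxx = spectrogram(raw, fs=1000, nperseg=128)
spec = torch.tensor(np.log1p(Sxx), dtype=torch.float32)[None, None]

model = MultiBiometricNet()
outputs = model(spec)                          # dict of per-task predictions
print({name: tuple(out.shape) for name, out in outputs.items()})
```

A multi-head design of this kind is one plausible way to read several signals from a single patch at once; personalising it, as Associate Professor Ge describes, would amount to fine-tuning such a model on an individual wearer’s data.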
The sensor is made from a laminated cracked platinum film, vertically aligned gold nanowires and a percolated gold nanowire film.
Neck skin is the most sensitive skin on the body and is linked to up to five physiological activities associated with the human throat: speech, heartbeat, breathing, touch and neck movement.