
Researchers Create Robot Skin that Could Transform Neuroprosthetics



Sensitive, anthropomorphic robots creep closer…

A team of National University of Singapore (NUS) researchers say they have created an artificial robot skin that can detect touch “1,000 times faster than the human sensory nervous system and identify the shape, texture, and hardness of objects 10 times faster than the blink of an eye.”

The NUS team’s “Asynchronous Coded Electronic Skin” (ACES) was detailed in a paper in Science Robotics on July 17, 2019.

It could have major implications for progress in human-machine-environment interactions, with potential applications in lifelike, anthropomorphic robots as well as in neuroprosthetics, researchers say. Intel also believes it could dramatically transform how robots can be deployed in factories.

This week the researchers presented several improvements at the Robotics: Science and Systems conference, after underpinning the system with an Intel “Loihi” neuromorphic research chip and combining touch data with vision data, then running the outputs through a spiking neural network. The system, they found, can process the sensory data 21 percent faster than a top-performing GPU, while using a claimed 45 times less power.
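To make that event-driven pipeline concrete, below is a minimal sketch, in plain NumPy rather than anything from the NUS/Intel codebase, of how two asynchronous event streams (touch and vision) can be fused by a small layer of leaky integrate-and-fire spiking neurons. All sizes, rates, and weights are invented placeholders.

```python
# Illustrative sketch only: a toy leaky integrate-and-fire (LIF) layer that
# fuses two event streams (touch and vision). This is NOT the NUS/Intel code;
# all sizes, rates, and weights below are invented placeholders.
import numpy as np

rng = np.random.default_rng(0)

T = 100                          # number of timesteps (hypothetical)
n_touch, n_vision, n_out = 16, 32, 8

# Hypothetical binary event streams: True = a spike/event at that timestep
touch_events = rng.random((T, n_touch)) < 0.05
vision_events = rng.random((T, n_vision)) < 0.05

# Random fusion weights from each input modality to the output layer
w_touch = rng.normal(0.0, 0.5, size=(n_touch, n_out))
w_vision = rng.normal(0.0, 0.5, size=(n_vision, n_out))

v = np.zeros(n_out)              # membrane potentials
decay, threshold = 0.9, 1.0
out_spikes = np.zeros((T, n_out), dtype=bool)

for t in range(T):
    # Integrate weighted input events, apply leak, then fire-and-reset
    v = decay * v + touch_events[t] @ w_touch + vision_events[t] @ w_vision
    fired = v >= threshold
    out_spikes[t] = fired
    v[fired] = 0.0

print("output spike counts per neuron:", out_spikes.sum(axis=0))
```

On neuromorphic hardware such as Loihi, an update of this kind runs only when events actually arrive rather than on a fixed clock, which is where the claimed latency and power savings come from.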

Robot Skin: Tactile Robots, Better Prosthetics a Possibility

Mike Davies, director of Intel’s Neuromorphic Computing Lab, said: “This research from National University of Singapore provides a compelling glimpse to the future of robotics where information is both sensed and processed in an event-driven manner.”

He added in an Intel release: “The work adds to a growing body of results showing that neuromorphic computing can deliver significant gains in latency and power consumption once the entire system is re-engineered in an event-based paradigm spanning sensors, data formats, algorithms, and hardware architecture.”

Intel conjectures that robotic arms fitted with artificial skin could “easily adapt to changes in goods manufactured in a factory, using tactile sensing to identify and grip unfamiliar objects with the right amount of pressure to prevent slipping. The ability to feel and better perceive surroundings could also allow for closer and safer human-robotic interaction, such as in caregiving professions, or bring us closer to automating surgical tasks by giving surgical robots the sense of touch that they lack today.”

Tests Detailed

In their initial experiment, the researchers used a robotic hand fitted with the artificial skin to read Braille, passing the tactile data to Loihi through the cloud. They then tasked a robot to classify various opaque containers holding differing amounts of liquid using sensory inputs from the artificial skin and an event-based camera.

By combining event-based vision and touch, they achieved 10 percent greater accuracy in object classification than with a vision-only system.
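As a toy illustration of why the extra modality helps (not a reconstruction of the NUS experiment), the sketch below builds synthetic “containers” whose class is ambiguous from a noisy vision feature alone and adds a complementary touch feature; a simple nearest-centroid classifier then scores noticeably higher on the fused features. All data and numbers are fabricated for the example.

```python
# Toy illustration (not the NUS experiment): fusing a second sensing modality
# can raise classification accuracy when one modality alone is ambiguous.
# All data here is synthetic and the numbers are fabricated.
import numpy as np

rng = np.random.default_rng(1)
n_classes, n_per_class = 4, 200

X_vis, X_touch, y = [], [], []
for c in range(n_classes):
    # Vision feature: class means overlap heavily (high noise)
    X_vis.append(rng.normal(loc=c, scale=1.5, size=(n_per_class, 1)))
    # Touch feature (e.g. sensed weight/firmness): more separable here
    X_touch.append(rng.normal(loc=2 * c, scale=1.0, size=(n_per_class, 1)))
    y.append(np.full(n_per_class, c))

X_vis, X_touch, y = np.vstack(X_vis), np.vstack(X_touch), np.concatenate(y)

def nearest_centroid_accuracy(X, y):
    """Fit class centroids and report accuracy on the same toy data."""
    centroids = np.stack([X[y == c].mean(axis=0) for c in range(n_classes)])
    pred = np.argmin(((X[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
    return (pred == y).mean()

print("vision only :", nearest_centroid_accuracy(X_vis, y))
print("vision+touch:", nearest_centroid_accuracy(np.hstack([X_vis, X_touch]), y))
```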

“We’re excited by these results. They show that a neuromorphic system is a promising piece of the puzzle for combining multiple sensors to improve robot perception. It’s a step toward building power-efficient and trustworthy robots that can respond quickly and appropriately in unexpected situations,” said Assistant Professor Harold Soh from the Department of Computer Science at the NUS School of Computing.

How the Robot Skin Works

Each ACES sensor, or “receptor”, captures and transmits stimulus information asynchronously as “events”, using electrical pulses spaced in time.

The arrangement of the pulses is unique to each receptor. The spread spectrum nature of the pulse signatures permits multiple sensors to transmit without specific time synchronisation, NUS says, “propagating the combined pulse signatures to the decoders via a single electrical conductor”. The ACES platform is “inherently asynchronous due to its robustness to overlapping signatures and does not require intermediate hubs used in existing approaches to serialize or arbitrate the tactile events.”
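A minimal sketch of that spread-spectrum idea, assuming purely for illustration that each receptor’s signature is a unique ±1 pulse pattern: events from several receptors are summed onto one shared line, and the decoder recovers which receptors fired by correlating the combined signal against each known signature. This is not the ACES implementation, just the general principle.

```python
# Illustrative sketch of the spread-spectrum principle, not the ACES design:
# each receptor owns a unique +/-1 pulse signature; simultaneous events are
# summed onto a single shared line; the decoder correlates the combined
# signal with every known signature to work out which receptors fired.
import numpy as np

rng = np.random.default_rng(2)
n_receptors, sig_len = 8, 128

# Hypothetical signatures, one row per receptor
signatures = rng.choice([-1.0, 1.0], size=(n_receptors, sig_len))

# Suppose receptors 1 and 5 fire at (roughly) the same time
active = [1, 5]
line_signal = signatures[active].sum(axis=0) + rng.normal(0, 0.2, sig_len)

# Decoder: normalised correlation of the shared line against each signature
scores = signatures @ line_signal / sig_len
decoded = np.where(scores > 0.5)[0]

print("correlation scores:", np.round(scores, 2))
print("decoded active receptors:", decoded)   # expected: [1 5]
```

Because the signatures are nearly orthogonal, overlapping transmissions can still be pulled apart at the decoder, which is why no intermediate hub is needed to serialise or arbitrate the events.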

But What’s It Made Of?!

“Battery-powered ACES receptors, connected together with a stretchable conductive fabric (knit jersey conductive fabric, Adafruit), were encapsulated in stretchable silicone rubber (Ecoflex 00-30, Smooth-On),” NUS details in its initial 2019 paper.

“A stretchable coat of silver ink (PE873, DuPont) and encapsulant (PE73, DuPont) was applied over the rubber via screen printing and grounded to provide the charge return path. To construct the conventional cross-bar multiplexed sensor array used in the comparison, we fabricated two flexible printed circuit boards (PCBs) to form the row and column traces. A piezoresistive layer (Velostat, 3M) was sandwiched between the PCBs. Each intersection between a row and a column formed a pressure-sensitive element. Traces from the PCBs were connected to an ATmega328 microcontroller (Atmel). Software running on the microcontroller polled each sensor element sequentially to obtain the pressure distribution of the array.”
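For contrast, here is a hedged sketch of that conventional cross-bar readout: the controller reads every row/column intersection in turn, so total readout time grows with the size of the array. The hardware access function is a hypothetical placeholder, not the paper’s ATmega328 firmware.

```python
# Hedged sketch of the *conventional* cross-bar readout used for comparison,
# not the ACES scheme and not the paper's ATmega328 firmware. The hardware
# access function is a hypothetical placeholder.
import time

N_ROWS, N_COLS = 16, 16

def read_pressure(row: int, col: int) -> float:
    """Placeholder for an ADC read of the piezoresistive element at (row, col)."""
    return 0.0  # a real implementation would sample the sensor hardware here

def poll_array() -> list[list[float]]:
    """Scan every row/column intersection once to build a pressure map."""
    frame = []
    for r in range(N_ROWS):
        # In hardware, the row trace would be selected/driven here
        frame.append([read_pressure(r, c) for c in range(N_COLS)])
    return frame

start = time.perf_counter()
frame = poll_array()
print(f"polled {N_ROWS * N_COLS} elements in {time.perf_counter() - start:.6f} s")
```

This serialised scan is exactly what the asynchronous ACES scheme avoids: its receptors signal events over a single conductor without waiting to be polled.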

A ring-shaped acrylic object was pressed onto the sensor arrays to deliver the stimulus: “We cut the sensor arrays using a pair of scissors to cause damage”.

You can read in more substantial technical detail how the ACES signaling scheme allows it to encode biomimetic somatosensory representations here.

See also: Revealed – Google’s Open Source Brain Mapping Technology

 
