Programmable hydrogel mimics octopus skin


Researchers from Penn State University have developed a new fabrication technique that can print multifunctional ‘smart synthetic skin.’ These adaptable materials can be programmed to perform a wide variety of tasks, including hiding or revealing information, enabling adaptive camouflage, and supporting soft robotic systems.

The programmable smart skin is made from hydrogel, a soft, water-rich material. Compared to conventional synthetic materials with fixed behaviours, this material can be tuned to respond in multiple ways. Its appearance, mechanical behaviour, surface texture, and ability to change shape can all be adjusted when the material is exposed to external triggers such as heat, solvents, or physical stress.

The inspiration

Hongtao Sun, the project’s principal investigator, has said that the concept behind the material was inspired by cephalopods such as octopuses, which can rapidly alter the look and texture of their skin. These animals use such changes to blend into their surroundings or communicate with one another.

“Cephalopods use a complex system of muscles and nerves to exhibit dynamic control over the appearance and texture of their skin,” Sun said. “Inspired by these soft organisms, we developed a 4D-printing system to capture that idea in a synthetic, soft material.”

Sun has described the team’s method as 4D-printing because it produces 3D objects that can reactively adjust based on changes in the environment.

How it was created

The team used a technique known as halftone-encoded printing to encode digital information directly into the material. The method translates image or texture data into a binary pattern of ones and zeros, similar to the dot patterns used to reproduce photographs in newspapers. This allows the team to essentially programme their smart skin to change appearance or texture through exposure to stimuli.

These patterns control how different regions of the material respond to their environment, with some areas deswelling or softening more than others when exposed to changes in temperature, liquids, or mechanical forces. By carefully designing the patterns, the team can decide how the material behaves overall.

“In simple terms, we’re printing instructions into the material,” Sun explained. “Those instructions tell the skin how to react when something changes around it.”
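The halftone principle described here can be illustrated with a short sketch. The code below is a hypothetical example using ordered (Bayer) dithering, not the team's actual printing software: it converts a grayscale image into the kind of binary dot pattern the article describes, where local dot density encodes local brightness.

```python
import numpy as np

def halftone_encode(gray, ):
    """Map a grayscale image (values in 0..1) to a binary
    'deposit / don't deposit' pattern via ordered dithering."""
    # A classic 4x4 Bayer threshold matrix, normalised to (0, 1)
    bayer = np.array([[ 0,  8,  2, 10],
                      [12,  4, 14,  6],
                      [ 3, 11,  1,  9],
                      [15,  7, 13,  5]]) / 16.0
    h, w = gray.shape
    # Tile the threshold matrix across the image, then compare
    thresh = np.tile(bayer, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return (gray > thresh).astype(np.uint8)  # 1 = responsive region

# Example: a left-to-right brightness gradient becomes a dot
# pattern whose density tracks local brightness
gradient = np.linspace(0, 1, 16).reshape(1, -1).repeat(16, axis=0)
pattern = halftone_encode(gradient)
```

In the printed skin, each "1" would correspond to a region where the stimulus-responsive material is deposited, so the dithered pattern is effectively the set of instructions Sun describes.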

According to Haoqing Yang, a doctoral candidate studying IME and first author of the paper, one of the most striking demonstrations of the smart skin is its ability to hide and reveal information. To showcase this feature, the team encoded a photo of the Mona Lisa onto the smart skin. When washed with ethanol, the film appeared transparent, showing no visible image. However, the Mona Lisa became fully visible after immersion in ice water or during gradual heating.

“This behaviour could be used for camouflage, where a surface blends into its environment, or for information encryption, where messages are hidden and only revealed under specific conditions,” Yang said.

The team have also shown that hidden patterns can be uncovered by gently stretching the material and measuring how it deforms via digital image correlation analysis. This means information can be revealed not just by sight, but also through mechanical deformation, adding another layer of security.
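The stretching-based readout relies on digital image correlation, which tracks how surface patterns move between a reference image and a deformed one. The snippet below is a minimal, hypothetical illustration of the core step, recovering a rigid pixel shift via FFT phase correlation, and not the team's full-field DIC analysis:

```python
import numpy as np

def phase_correlate(ref, deformed):
    """Estimate the integer pixel shift between two images via
    phase correlation, the basic matching operation behind
    digital image correlation (DIC)."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(deformed)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12   # keep phase, discard magnitude
    corr = np.fft.ifft2(cross).real  # sharp peak at the shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the image size to negative values
    return tuple(p if p <= s // 2 else p - s
                 for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(0)
ref = rng.random((64, 64))                          # speckle image
deformed = np.roll(ref, shift=(3, -5), axis=(0, 1)) # known shift
print(phase_correlate(ref, deformed))               # → (3, -5)
```

Full DIC repeats this kind of matching over many small subregions to build a displacement map, which is how locally stiffer or softer halftone-patterned regions would show up under stretching.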

Because the material is highly malleable, the smart skin can easily transform from a flat sheet into non-traditional, bio-inspired shapes with complex textures. Unlike many other shape-morphing materials, this effect does not require multiple layers or different materials. Rather, these shapes and textured surfaces – like those seen on cephalopod skin – can be controlled by the digitally printed halftone pattern within a single sheet.

Building on these shape-morphing capabilities, the researchers showed that the smart skin can combine multiple functions simultaneously. By carefully co-designing the halftone patterns, the team was able to encode the Mona Lisa image directly into flat films that later emerged as the material transformed into 3D shapes. As the flat sheets curved into dome-like structures, the hidden image information gradually became visible, demonstrating how changes in shape and appearance can be programmed together.

“Similar to how cephalopods coordinate body shape and skin patterning, the synthetic smart skin can simultaneously control what it looks like and how it deforms, all within a single, soft material,” Sun said.

Looking ahead, the team plans to develop a general and scalable platform that enables precise digital encoding of multiple functions into a single adaptive smart material system.

“This interdisciplinary research at the intersection of advanced manufacturing, intelligent materials and mechanics opens new opportunities with broad implications for stimulus-responsive systems, biomimetic engineering, advanced encryption technologies, biomedical devices and more,” Sun said.

The team detailed their work in a paper published in Nature Communications.
