Smartphone users can now 'feel' images and objects displayed on their touchscreens!
In a game-changing invention, engineers at Disney Research, Pittsburgh, have developed a new technique that allows you to feel the texture of objects seen on a flat touchscreen.
The novel algorithm enables a person sliding a finger across a topographic map displayed on a touchscreen to feel the bumps and curves of hills and valleys, despite the screen's smooth surface.
The technique is based on the fact that when a person slides a finger over a real physical bump, the bump is perceived largely because lateral friction forces stretch and compress skin on the sliding finger.
By altering the friction encountered as a person's fingertip glides across a surface, the Disney algorithm can create the perception of a 3D bump on a touch surface.
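In simplified terms (a small-angle illustration, not a formula taken from the Disney paper): if the virtual surface has height profile h(x) and the finger presses down with normal force F_n, the lateral resistance the finger would feel climbing a real bump of that shape is roughly

\[ F_{\text{lateral}} \approx F_n \, \frac{dh}{dx}, \]

so reproducing that lateral force on a flat screen, by raising and lowering friction as the finger moves, is enough to suggest the bump.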
The method can be used to simulate the feel of a wide variety of objects and textures.
"Our
brain perceives the 3D bump on a surface mostly from information that
it receives via skin stretching," said Ivan Poupyrev, who directs Disney
Research, Pittsburgh's Interaction Group.
"Therefore,
if we can artificially stretch skin on a finger as it slides on the
touchscreen, the brain will be fooled into thinking an actual physical
bump is on a touchscreen even though the touch surface is completely
smooth," Poupyrev said in a statement.
In experiments, the researchers used electrovibration to modulate the friction between the sliding finger and the touch surface with electrostatic forces.
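As background on electrovibration: an alternating voltage applied to a transparent electrode beneath the glass pulls the fingertip toward the surface electrostatically, and that extra attraction raises sliding friction; the attraction grows roughly with the square of the drive amplitude. The sketch below shows how a controller might choose a drive amplitude for a desired friction boost. The constants and function names are illustrative assumptions, not values from the Disney work.

    import math

    # Illustrative constants; these are assumptions, not published values.
    K_ELECTROSTATIC = 5e-6   # newtons of attraction per squared volt of drive amplitude
    MU_FINGER_GLASS = 0.5    # nominal finger-on-glass friction coefficient
    V_MAX = 120.0            # amplifier limit for the drive signal, in volts

    def drive_amplitude_for_friction_boost(delta_friction_n):
        # Choose an electrovibration amplitude that adds roughly delta_friction_n
        # newtons of lateral friction. Assumed model: extra friction =
        # MU * K * V^2, hence V = sqrt(delta_friction / (MU * K)).
        if delta_friction_n <= 0.0:
            return 0.0
        v = math.sqrt(delta_friction_n / (MU_FINGER_GLASS * K_ELECTROSTATIC))
        return min(v, V_MAX)  # clamp to what the hardware can deliver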
The researchers created and validated a psychophysical model that closely simulates the friction forces perceived by the human finger when it slides over a real bump.
The model was then incorporated into an algorithm that dynamically modulates the frictional forces on a sliding finger so that they match the tactile properties of the visual content displayed on the touchscreen along the finger's path.
A broad variety of visual artifacts can thus be dynamically enhanced with tactile feedback that adjusts as the visual display changes.
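As a rough picture of what such an algorithm does on each frame, the sketch below samples a virtual height map under the finger, estimates the slope of the surface along the direction of motion, and turns that slope into a target friction increase (on a real bump, the lateral resistance is roughly the pressing force times the local slope, as above). The function names, constants, and the simple finite-difference slope estimate are assumptions for illustration, not the published Disney model.

    import numpy as np

    def target_friction_change(height_map, pos, velocity,
                               normal_force=0.5, pixels_per_meter=4000.0):
        # height_map: 2D array of virtual surface heights in meters, one value per pixel.
        # pos: finger position (x, y) in pixels; velocity: finger velocity (vx, vy) in px/s.
        # Returns the extra lateral force, in newtons, to render through friction modulation.
        speed = float(np.hypot(velocity[0], velocity[1]))
        if speed < 1e-6:
            return 0.0                     # finger is not sliding, so nothing to render
        ux, uy = velocity[0] / speed, velocity[1] / speed

        x, y = int(round(pos[0])), int(round(pos[1]))
        rows, cols = height_map.shape
        if not (1 <= x < cols - 1 and 1 <= y < rows - 1):
            return 0.0                     # finger is outside the displayed content

        # Central-difference slope of the virtual surface (meters of rise per meter of
        # screen travel) along each axis, projected onto the direction of motion.
        dh_dx = (height_map[y, x + 1] - height_map[y, x - 1]) * pixels_per_meter / 2.0
        dh_dy = (height_map[y + 1, x] - height_map[y - 1, x]) * pixels_per_meter / 2.0
        slope_along_motion = dh_dx * ux + dh_dy * uy

        # Uphill (positive slope) means more resistance; a real renderer would map the
        # downhill side onto a reduction from a nonzero friction baseline.
        return normal_force * max(slope_along_motion, 0.0)

    # Example: a 1 mm tall Gaussian bump rendered on a 400 x 400 pixel patch.
    xs, ys = np.meshgrid(np.arange(400), np.arange(400))
    bump = 0.001 * np.exp(-((xs - 200) ** 2 + (ys - 200) ** 2) / (2 * 40.0 ** 2))
    print(target_friction_change(bump, pos=(170, 200), velocity=(60.0, 0.0)))

A full renderer would then hand the returned force to the friction controller (for instance, an electrovibration mapping like the one sketched earlier) at the display's refresh rate.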
"The
traditional approach to tactile feedback is to have a library of canned
effects that are played back whenever a particular interaction occurs,"
said Ali Israr, a Disney Research, Pittsburgh research engineer who was
the lead on the project.
"This makes it
difficult to create a tactile feedback for dynamic visual content, where
the sizes and orientation of features constantly change. With our
algorithm we do not have one or two effects, but a set of controls that
make it possible to tune tactile effects to a specific visual artifact
on the fly," Israr said.
The new research will be presented at the ACM Symposium on User Interface Software and Technology in St Andrews, Scotland.