Your phone already speaks in tiny buzzes. The next step is more ambitious: making haptics do real communication work, from calming someone before a presentation to sending information to people who can’t rely on sound or sight.
That idea sits at the center of a doctoral thesis from Tallinn University’s Yulia Sion, who explores “tactons” – tactile icons that turn vibration into structured, non-visual messages. It’s a neat twist on a feature most people treat as plumbing, and it fits a broader push across wearables and accessories to make interfaces less screen-bound and more human.
What tactons are supposed to do
Sion’s research suggests vibrations can carry emotion as well as alerts. In one example, tactons were used as a haptic stand-in for reassurance, aiming to reduce anxiety and improve focus in stressful moments such as public speaking. The work also looked at how people can translate memories into patterns of touch, with differences in intensity, rhythm, and duration changing how the message is understood.
That part is both promising and slightly tricky, because urgency is easy while nuance is not. Most people will probably read a sharp, irregular buzz as “pay attention” without any training, but more subtle emotional cues may need a shared vocabulary before they become useful rather than vaguely artsy.
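To make the idea concrete, here is a minimal sketch of what a tacton might look like as a data structure. The `Tacton` class, its field names, and the example patterns are all hypothetical illustrations, not anything from Sion’s thesis; the only real-world detail assumed is that platform APIs such as the W3C Vibration API and Android’s `VibrationEffect.createWaveform` accept patterns as alternating on/off millisecond timings.

```python
from dataclasses import dataclass

@dataclass
class Tacton:
    """A tactile icon: a named vibration pattern.

    `pulses` is a list of (duration_ms, intensity) pairs, with intensity
    in 0.0-1.0; `gap_ms` is the pause inserted between pulses.
    """
    name: str
    pulses: list
    gap_ms: int

    def to_waveform(self):
        """Flatten the pattern into alternating on/off millisecond timings,
        the shape expected by simple haptics APIs (which may ignore the
        intensity component entirely)."""
        timings = []
        for i, (duration, _intensity) in enumerate(self.pulses):
            timings.append(duration)
            if i < len(self.pulses) - 1:
                timings.append(self.gap_ms)
        return timings

# A sharp, irregular pattern most people would read as "pay attention".
urgent = Tacton("urgent", pulses=[(80, 1.0), (40, 1.0), (120, 1.0)], gap_ms=60)

# Longer, softer, evenly spaced pulses as a candidate "reassurance" cue.
reassure = Tacton("reassure", pulses=[(400, 0.3), (400, 0.3)], gap_ms=300)

print(urgent.to_waveform())    # on/off timings for the urgent pattern
print(reassure.to_waveform())  # on/off timings for the reassuring pattern
```

The point of the sketch is the contrast between the two patterns: the same three parameters (intensity, rhythm, duration) produce very different messages, which is exactly the shared-vocabulary problem the research describes.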
Why wearables need better haptic feedback
The current generation of devices already hints at where this is going. Apple Watch notifications can feel surprisingly expressive, Sony’s DualSense controller has shown how haptics can add texture to an experience, and smart clothing could push the idea further if more powerful actuators are built in. That is the real hardware challenge: if you want haptics to say more, the motor has to be able to say it clearly.
There’s also a practical angle here for accessibility. Sion examined how tactons could help low-vision or blind users understand environmental information, either alongside audio or as a substitute when sound is not appropriate. In other words, this is not just about making gadgets feel cooler; it is about giving them another channel to communicate when the screen is the wrong tool.
The near future of touch on devices
If this goes anywhere, it will probably arrive in small, boringly useful steps first: better alerts, clearer navigation cues, more intentional smartwatch feedback, maybe smarter clothing for specific tasks. The bigger bet is that haptics could evolve from a background feature into a kind of language stack for devices, one that complements text, sound, and visuals instead of just vibrating whenever the phone feels ignored.
The open question is whether manufacturers will invest in that kind of precision or keep shipping the same generic buzz patterns and calling it innovation. My money is on the former eventually, because once people notice that touch can carry meaning, the old “one buzz fits all” approach starts to look very 2010.