T.A.I is a tangible AI interface that enhances physical engagement in digital communication between users and a conversational AI agent. A compact, pneumatically actuated shape-changing device performs a rich set of physical gestures on a mobile device during real-time conversations.

In the form of a common phone case, T.A.I enables a text-based conversational AI agent to deliver physical interactions in real time during mobile chat conversations. Going beyond a purely virtual, illusory intelligence, these physical interactions gave users a more "living", creature-like impression of the AI agent.

The hardware of T.A.I is packaged as a lightweight, portable phone case. A low-voltage micro pump and three 8 mm solenoid valves handle air control. Each elastomer bag has its own valve, so it can be inflated and deflated independently. The high-flow micro pump (up to 2.5 LPM free flow) supplies air, which is distributed to each elastomer bag through connected air channels. By switching the valve states and using PWM to modulate the air flow from the pump, physical actuations can be varied in speed, amplitude, waveform (motion combinations of the three air bags), and duration. The main control board was programmed to enable exploration of a range of physical gestures and to communicate with our Android chat application over Bluetooth.
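The control scheme described above (per-bag valve switching plus PWM pump modulation, composed into timed gestures) can be sketched in simplified form. The following Python model is a hypothetical illustration, not the project's actual firmware: the `Step` structure, the `"nudge"` gesture, and all timing values are invented for the example.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Step:
    """One timed actuation step: valve states for the three elastomer
    bags, a pump PWM duty cycle, and how long the step is held."""
    valves: Tuple[bool, bool, bool]  # True = valve open for bags 1-3
    pwm_duty: float                  # pump PWM duty cycle, 0.0 to 1.0
    duration_ms: int                 # how long this step lasts

def gesture_duration(steps: List[Step]) -> int:
    """Total duration of a gesture in milliseconds."""
    return sum(s.duration_ms for s in steps)

# A hypothetical "nudge" gesture: quickly inflate bag 1, hold the shape,
# then vent. Opening a valve with the pump off lets the bag deflate.
NUDGE = [
    Step(valves=(True, False, False), pwm_duty=0.9, duration_ms=150),  # fast inflate
    Step(valves=(False, False, False), pwm_duty=0.0, duration_ms=300), # hold shape
    Step(valves=(True, False, False), pwm_duty=0.0, duration_ms=200),  # vent / deflate
]

print(gesture_duration(NUDGE))  # total gesture time in ms
```

Varying the per-step duty cycle and which valves are open is what lets a single pump produce gestures that differ in speed, amplitude, waveform, and duration.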

T.A.I is a research project I worked on while interning at Microsoft Research NYC, with tremendous help from my mentor Kati London (Fuse Lab) and facility support from NYU ITP.


Liu, Xin, and Kati London. “TAI: A Tangible AI Interface to Enhance Human-Artificial Intelligence (AI) Communication Beyond the Screen.” Proceedings of the 2016 ACM Conference on Designing Interactive Systems. ACM, 2016.