Here we look at the ways in which humans communicate visually through gesture and body language, and show how human conversation can be broken down into general phases of actions and processes. These explorations are manifested as an art installation in which two computers converse using a purely visual language composed of gestural elements.
As humans, we naturally show our emotions and attitudes using our bodies, both consciously and unconsciously. Our hands are used as tools for expression as much as tools for physical interaction, and our faces often communicate more than the words we speak.
A 'body language' of computers has evolved as well. We recognize certain visual cues on a screen as holding particular significance — e.g. a blinking cursor means the computer is waiting, a progress bar means it is working on a given task. A common iconography continues to emerge which enables computers to speak to humans in a non-verbal, unwritten language that spans borders and cultures.
When we communicate with one another, we follow protocols that allow our conversations to flow smoothly — using known cues, we recognize when a conversation is starting, when it's our turn to talk (and when to listen). We're able to make mutual decisions, handle transactions, and manage unexpected interruptions.
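These conversational phases can be made concrete as a small state machine. The sketch below is illustrative only: the phase names and transition rules are our own shorthand for the general pattern (opening, turn-taking, closing, interruption), not the vocabulary used by the installation itself.

```python
from enum import Enum, auto

class Phase(Enum):
    """Illustrative conversation phases (names are assumptions, not the installation's)."""
    IDLE = auto()       # no conversation in progress
    OPENING = auto()    # recognizing that a conversation is starting
    SPEAKING = auto()   # it is our turn to talk
    LISTENING = auto()  # the other party holds the floor
    CLOSING = auto()    # winding the exchange down

# Allowed transitions: turn-taking alternates SPEAKING and LISTENING,
# and an interruption simply hands the floor to the other party mid-turn.
TRANSITIONS = {
    Phase.IDLE:      {Phase.OPENING},
    Phase.OPENING:   {Phase.SPEAKING, Phase.LISTENING},
    Phase.SPEAKING:  {Phase.LISTENING, Phase.CLOSING},
    Phase.LISTENING: {Phase.SPEAKING, Phase.CLOSING},
    Phase.CLOSING:   {Phase.IDLE},
}

def step(current: Phase, nxt: Phase) -> Phase:
    """Advance the conversation, rejecting cues that break the protocol."""
    if nxt not in TRANSITIONS[current]:
        raise ValueError(f"cannot move from {current.name} to {nxt.name}")
    return nxt
```

Each participant would track its own phase and interpret the other's gestures as transition cues; an out-of-protocol cue (say, trying to speak before the conversation has opened) is rejected, mirroring how human conversation falters when cues are missed.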
To represent this exploration, we developed a pair of devices that communicate using a visual language based on the elements of conversation identified above. In contrast to familiar computer-to-human communication, the computers are designed to talk directly to one another using this gestural vocabulary.
Initial research, concepts, and prototypes were presented at Eyebeam Art and Technology Center as part of the exhibition SFPC: The First Class.