Human Media Lab
Established: 2000
Research field: Human-computer interaction, flexible displays
Affiliation: Queen's University
Country: Canada
City: Kingston, Ontario
The Human Media Lab (HML) is a research laboratory in human-computer interaction at Queen's University's School of Computing in Kingston, Ontario. Its goals are to advance user interface design by creating and empirically evaluating disruptive new user interface technologies, and to educate graduate students in this process. The Human Media Lab was founded in 2000 by Prof. Roel Vertegaal and employs an average of 12 graduate students.
The laboratory is known for its pioneering work on flexible display interaction and paper computers, with systems such as PaperWindows (2004), PaperPhone (2010) and PaperTab (2012). HML is also known for inventing ubiquitous eye input, which underlies Samsung's Smart Pause and Smart Scroll technologies.
In 2003, researchers at the Human Media Lab helped shape the Attentive User Interfaces paradigm,[1] demonstrating how groups of computers could use human social cues to deliver notifications considerately.[2] Among HML's early inventions was the eye contact sensor, first demonstrated to the public on ABC's Good Morning America.[3] Attentive User Interfaces developed at the time included an early iPhone prototype that used eye-tracking glasses to determine whether users were in a conversation, an attentive television that paused playback when the viewer looked away, mobile Smart Pause and Smart Scroll (later adopted in Samsung's Galaxy S4),[4] and a technique for calibration-free eye tracking that placed invisible infrared markers in the scene.
Current research at the Human Media Lab focuses on the development of Organic User Interfaces: user interfaces with non-flat displays. In 2004, researchers at HML built the first bendable paper computer, PaperWindows,[5] which premiered at CHI 2005. It featured multiple flexible, high-resolution, colour, wireless, thin-film multitouch displays rendered through real-time, depth-camera-driven 3D spatial augmented reality. In May 2007, HML coined the term Organic User Interfaces.[6] Early Organic User Interfaces developed at HML included the first multitouch spherical display[7] and Dynacan, an interactive pop can: early examples of everyday computational objects with interactive digital skins.[8][9]
In 2010, the Human Media Lab, with Arizona State University, developed the world's first functional flexible smartphone, PaperPhone. It pioneered bend interactions and was first shown to the public at ACM CHI 2011 in Vancouver.[10] In 2012, the Human Media Lab introduced the world's first pseudo-holographic, life-size 3D videoconferencing system,[11] TeleHuman.[12]
In 2013, HML researchers unveiled PaperTab,[13] the world's first flexible tablet PC, at CES 2013 in Las Vegas, in collaboration with Plastic Logic and Intel.
The Human Media Lab is located in Jackson Hall on the Queen's University campus in Kingston, Ontario. The facilities were designed by Karim Rashid.