From Wikipedia, the free encyclopedia
 


Human Media Lab

Established: 2000
Field of research: Human–computer interaction; flexible displays
Location: Kingston, Ontario, Canada
Affiliations: Queen's University

The Human Media Lab (HML) is a research laboratory in human–computer interaction at Queen's University's School of Computing in Kingston, Ontario. Its goals are to advance user interface design by creating and empirically evaluating disruptive new user interface technologies, and to educate graduate students in this process. The Human Media Lab was founded in 2000 by Prof. Roel Vertegaal and employs an average of 12 graduate students.

The laboratory is known for its pioneering work on flexible display interaction and paper computers, with systems such as PaperWindows (2004),[1] PaperPhone (2010)[2] and PaperTab (2012).[3] HML is also known for inventing ubiquitous eye-input technologies such as Samsung's Smart Pause and Smart Scroll.[4]

Research


In 2003, researchers at the Human Media Lab helped shape the Attentive User Interfaces paradigm,[5] demonstrating how groups of computers could use human social cues for considerate notification.[6] Amongst HML's early inventions was the eye contact sensor, first demonstrated to the public on ABC's Good Morning America.[7] Attentive User Interfaces developed at the time included an early attentive cell phone prototype that used eye-tracking electronic glasses to determine whether users were in a conversation,[7] an attentive television that paused and resumed playback when the viewer looked away, mobile Smart Pause and Smart Scroll (adopted in Samsung's Galaxy S4),[4] as well as a technique for calibration-free eye tracking that places invisible infrared markers in the scene.
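
The look-away behaviour described above amounts to a small control loop. The Python sketch below is illustrative only: the gaze sensor and media player objects are hypothetical stand-ins, not HML's or Samsung's implementation.

    # Illustrative attentive media player: pauses when the viewer looks away,
    # resumes on renewed eye contact. The sensor and player objects are
    # hypothetical placeholders, not real HML or Samsung APIs.
    import time

    class AttentivePlayer:
        def __init__(self, sensor, player, grace=1.0):
            self.sensor = sensor    # assumed to expose has_eye_contact() -> bool
            self.player = player    # assumed to expose play() and pause()
            self.grace = grace      # seconds of look-away tolerated before pausing
            self._last_contact = time.monotonic()

        def tick(self):
            """Call once per frame or on a short timer."""
            now = time.monotonic()
            if self.sensor.has_eye_contact():
                self._last_contact = now
                self.player.play()   # no-op if already playing
            elif now - self._last_contact > self.grace:
                self.player.pause()  # viewer has looked away long enough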

Current research at the Human Media Lab focuses on the development of Organic User Interfaces: user interfaces with a non-flat display. In 2004, researchers at HML built the first bendable paper computer, PaperWindows,[1] which premiered at CHI 2005. It simulated multiple flexible, high-resolution, colour, wireless, thin-film multitouch displays by projecting window content onto tracked sheets of paper, using real-time, depth-camera-driven 3D spatial augmented reality. In May 2007, HML coined the term Organic User Interfaces.[8] Early Organic User Interfaces developed at HML included the first multitouch spherical display[9] and Dynacan, an interactive pop can: early examples of everyday computational things with interactive digital skins.[10][11]
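
To give a rough sense of how such projection-based paper displays work: given the four corners of a tracked sheet (e.g. from a depth camera), window content can be warped onto the sheet with a four-point homography. The sketch below uses OpenCV with invented corner coordinates; it illustrates the general technique only and is not HML's PaperWindows code.

    # Sketch: warp a window image onto the tracked corners of a sheet of paper,
    # in the spirit of projector-based spatial augmented reality.
    # Corner coordinates and sizes here are invented for illustration.
    import numpy as np
    import cv2

    def project_window(window_img, paper_corners, projector_size):
        h, w = window_img.shape[:2]
        src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])  # window corners
        dst = np.float32(paper_corners)                     # sheet corners in projector space
        H = cv2.getPerspectiveTransform(src, dst)           # 4-point homography
        return cv2.warpPerspective(window_img, H, projector_size)

    # Example: a 640x480 window onto a slightly tilted sheet, 1280x800 projector.
    frame = project_window(
        np.zeros((480, 640, 3), np.uint8),
        [(300, 200), (900, 240), (880, 640), (280, 600)],
        (1280, 800),
    )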

In 2010, the Human Media Lab, in collaboration with Arizona State University, developed the world's first functional flexible smartphone, PaperPhone. It pioneered bend interactions and was first shown to the public at ACM CHI 2011 in Vancouver.[2]
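
At its simplest, bend interaction of this kind reduces to thresholding signed readings from bend sensors along the display's edges. The following Python sketch is a toy classifier with invented thresholds and action names; PaperPhone's actual gesture set and recognition were more elaborate.

    # Toy bend-gesture classifier: maps a signed curvature reading from one
    # bend sensor to a navigation action. Thresholds and action names are
    # invented for illustration; this is not PaperPhone's implementation.
    def classify_bend(reading, threshold=0.3):
        """reading: signed curvature in roughly [-1, 1]; positive = bend toward user."""
        if reading > threshold:
            return "next_page"
        if reading < -threshold:
            return "previous_page"
        return None  # within the dead zone: no gesture

    for r in (0.5, -0.6, 0.1):
        print(r, "->", classify_bend(r))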

[Photo: Main laboratory space featuring a large wall-sized display with remote gestural interaction]

In 2012, the Human Media Lab introduced the world's first pseudo-holographic, life-size 3D video conferencing system,[12] TeleHuman.[13]

In 2013, HML researchers unveiled PaperTab,[3] the world's first flexible tablet PC, at CES 2013 in Las Vegas, in collaboration with Plastic Logic and Intel.

Location and facilities


The Human Media Lab is located in Jackson Hall on Queen's University campus in Kingston, Ontario. The facilities were designed by Karim Rashid.[citation needed]


References

  1. Holman, D., Vertegaal, R. and Troje, N. (2005). PaperWindows: Interaction Techniques for Digital Paper. In Proceedings of the ACM CHI 2005 Conference on Human Factors in Computing Systems. ACM Press, 591-599.
  2. Lahey, B., Girouard, A., Burleson, W. and Vertegaal, R. (2011). PaperPhone: Understanding the Use of Bend Gestures in Mobile Devices with Flexible Electronic Paper Displays. In Proceedings of the ACM CHI 2011 Conference on Human Factors in Computing Systems. ACM Press, 1303-1312.
  3. Warner, B. (2013). PaperTab: a Fold-Up, Roll-Up Tablet Computer. Bloomberg Businessweek, May 2013.
  4. Dickie, C., Vertegaal, R., Sohn, C. and Cheng, D. (2005). eyeLook: Using Attention to Facilitate Mobile Media Consumption. In Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (UIST '05). ACM Press, 103-106.
  5. Vertegaal, R. (2003). Attentive User Interfaces. Editorial, Special Issue on Attentive User Interfaces, Communications of the ACM 46(3). ACM Press, 30-33.
  6. Gibbs, W. (2005). Considerate Computing. Scientific American 292, 54-61.
  7. Vertegaal, R., Dickie, C., Sohn, C. and Flickner, M. (2002). Designing Attentive Cell Phones Using Wearable Eyecontact Sensors. In CHI '02 Extended Abstracts on Human Factors in Computing Systems. ACM Press, 646-647.
  8. Vertegaal, R. and Poupyrev, I. (2008). Introduction to Organic User Interfaces. Special Issue on Organic User Interfaces, Communications of the ACM 51(6), 5-6.
  9. Holman, D. and Vertegaal, R. (2008). Organic User Interfaces: Designing Computers in Any Way, Shape, or Form. Special Issue on Organic User Interfaces, Communications of the ACM 51(6), 48-55.
  10. Akaoka, E., Ginn, T. and Vertegaal, R. (2010). DisplayObjects: Prototyping Functional Physical Interfaces on 3D Styrofoam, Paper or Cardboard Models. In Proceedings of the TEI '10 Conference on Tangible, Embedded and Embodied Interaction. ACM Press, 49-56.
  11. Vertegaal, R. (2011). The (Re)Usability of Everyday Computational Things. ACM Interactions Magazine, Jan/Feb 2011. ACM Press, 39-41.
  12. Kingsley, J. with will.i.am (2013). Use Your Illusion. Wired UK, August 2013, 140-141.
  13. Kim, K., Bolton, J., Girouard, A., Cooperstock, J. and Vertegaal, R. (2012). TeleHuman: Effects of 3D Perspective on Gaze and Pose Estimation with a Life-size Cylindrical Telepresence Pod. In Proceedings of the ACM CHI 2012 Conference on Human Factors in Computing Systems. ACM Press, 2531-2540.


Categories:
Queen's University at Kingston
Human–computer interaction
Flexible displays