Project Hands and Eyes

Medium Mix 1 - “Phone & Eyes”

2018 / 12 / 15

A tactile touchscreen interface on a phone, used where appropriate, combined with an HMD

Interactions

  • An image shown on the phone screen triggers an interaction when detected by the HMD camera
  • The phone connects to the HMD over a low-latency protocol, acting as a secondary screen and tactile input interface (see the sketch after this list)
  • Mixed use of HMD sensors together with phone sensors
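
To make the second bullet concrete, here is a minimal sketch of how a page running on the phone might stream touch and orientation events to an HMD app over a WebSocket. The endpoint URL, message shapes, and field names are illustrative assumptions, not the actual protocol used in the demos.

```typescript
// Minimal sketch: phone-side page that streams touch and orientation
// events to an HMD app over a WebSocket. The endpoint and message
// shapes are illustrative assumptions, not the project's real protocol.

type PhoneEvent =
  | { kind: "touch"; x: number; y: number; t: number }
  | { kind: "orientation"; alpha: number; beta: number; gamma: number; t: number };

const socket = new WebSocket("wss://example.local/hmd-input"); // hypothetical endpoint

function send(ev: PhoneEvent): void {
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify(ev));
  }
}

// Forward taps on the phone screen as normalized coordinates.
window.addEventListener("touchstart", (e: TouchEvent) => {
  const touch = e.touches[0];
  send({
    kind: "touch",
    x: touch.clientX / window.innerWidth,
    y: touch.clientY / window.innerHeight,
    t: Date.now(),
  });
});

// Forward phone orientation so the HMD can combine it with its own sensors.
window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
  send({
    kind: "orientation",
    alpha: e.alpha ?? 0,
    beta: e.beta ?? 0,
    gamma: e.gamma ?? 0,
    t: Date.now(),
  });
});
```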

Applications

  1. Multi-purpose IoT Device Remote Control - Access anything you see (see the sketch after this list)
    • Real-time hackathon demos
      • CircuitryIOTAR - Grand Prize winner at the AT&T IoT Hackathon
      • HackPack AR
      • VenueTaggerLive
    • Simulation demos for the enterprise from the author’s Twilio SignalConf talk: http://iot.areality3d.com
    • Remote desktop operating system on an HMD with a hybrid phone/ML controller and hand-gesture interaction
  2. Reality Designer / Freeform Any-Object Placement
    • General 3D model placement
    • Interior Design “Portable” Virtual Warehouse: Browse furniture on your phone or tablet and place pieces holographically on the HMD. http://bit.ly/holofurni
    • Magic spell designer: http://bit.ly/magicdex
  3. Identify Who’s Who (as inspired by a Black Mirror episode): Identify people you see and browse more info about them on your phone.
  4. Keynote Reality: Use a phone-based interface to select and format the slides and annotations you want to place in reality on the HMD. http://bit.ly/prototypereality
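
As a concrete illustration of the “access anything you see” flow in application 1, the sketch below shows how a device recognized by the HMD camera (for example via an image target) could map to a control panel on the phone, with taps relayed as IoT commands. The registry, marker ids, and REST endpoint are hypothetical stand-ins, not the interfaces used in the listed demos.

```typescript
// Minimal sketch of the "access anything you see" flow: the HMD reports a
// recognized device id (e.g. from image-target tracking), the controller
// looks up that device's control panel, and phone input is relayed as IoT
// commands. All names, ids, and endpoints here are illustrative assumptions.

interface IotDevice {
  id: string;
  name: string;
  commands: string[]; // commands the device accepts, e.g. "toggle", "dim"
}

// Hypothetical registry keyed by the marker/image-target id the HMD detects.
const registry: Record<string, IotDevice> = {
  "marker-lamp-01": { id: "lamp-01", name: "Desk Lamp", commands: ["toggle", "dim"] },
  "marker-fan-02": { id: "fan-02", name: "Ceiling Fan", commands: ["toggle", "speed"] },
};

// Called when the HMD camera recognizes a device; returns what the phone UI
// should render as the remote-control panel.
function onDeviceSeen(markerId: string): IotDevice | undefined {
  return registry[markerId];
}

// Called when the user taps a command on the phone; forwards it to the
// device over a (hypothetical) REST endpoint.
async function sendCommand(device: IotDevice, command: string): Promise<void> {
  if (!device.commands.includes(command)) {
    throw new Error(`${device.name} does not support "${command}"`);
  }
  await fetch(`https://example.local/devices/${device.id}/commands`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ command }),
  });
}

// Example: the HMD sees the lamp marker, the phone shows its panel,
// and the user taps "toggle".
const seen = onDeviceSeen("marker-lamp-01");
if (seen) {
  void sendCommand(seen, "toggle");
}
```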

Medium Mix 2 - Hands Only

2018 / 12 / 15

Meaningful applications from hand-based interactions