Scientists at the University of Glasgow, UK, have suspended small polystyrene particles in mid-air, supported only by ultrasonic acoustic waves: acoustic levitation. The technology may lead to new kinds of displays for controlling machines, and hence transform human-machine interaction. The work is part of the Levitate project, supported by the European Commission's Future and Emerging Technologies research programme.
Public and Performative Interaction @ CHI 2018
We are presenting two papers at ACM CHI 2018. The links below provide open access copies of these papers.
Object Manipulation in Virtual Reality Under Increasing Levels of Translational Gain
Room-scale Virtual Reality (VR) has become an affordable consumer reality, with applications ranging from entertainment to productivity. However, the limited physical space available for room-scale VR in the typical home or office environment poses a significant problem. To solve this, physical spaces can be extended by amplifying the mapping of physical to virtual movement (translational gain). Although amplified movement has been used since the earliest days of VR, little is known about how it influences reach-based interactions with virtual objects, now a standard feature of consumer VR. Consequently, this paper explores the picking and placing of virtual objects in VR for the first time, with translational gains of between 1x (a one-to-one mapping of a 3.5m*3.5m virtual space to the same sized physical space) and 3x (10.5m*10.5m virtual mapped to 3.5m*3.5m physical). Results show that reaching accuracy is maintained for up to 2x gain; however, going beyond this diminishes accuracy and increases simulator sickness and perceived workload. We suggest gain levels of 1.5x to 1.75x can be utilized without compromising the usability of a VR task, significantly expanding the bounds of interactive room-scale VR.
Object Manipulation in Virtual Reality Under Increasing Levels of Translational Gain
CHI ’18 Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 2018
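At its core, translational gain is a scaling of tracked physical movement before it is applied to the user's virtual position. As a rough illustration only (not the authors' implementation), a minimal sketch in Python might look like the following, assuming the gain is applied to horizontal movement about the centre of the tracked space:

```python
# Minimal sketch of translational gain (hypothetical helper, not the paper's code).
# The user's tracked position is scaled by a gain factor before being applied
# to the virtual camera, so a 3.5 m physical room maps to a 3.5*gain m virtual room.

import numpy as np

PHYSICAL_EXTENT = 3.5  # metres, side length of the tracked physical space


def virtual_position(tracked_pos, gain=1.5, origin=np.zeros(3)):
    """Map a tracked physical position to a virtual position.

    tracked_pos : (3,) array, metres, in physical-space coordinates
    gain        : translational gain; 1.0 is a one-to-one mapping,
                  3.0 maps a 3.5 m room to a 10.5 m virtual space, etc.
    origin      : point about which movement is amplified (e.g. room centre)
    """
    offset = np.asarray(tracked_pos, dtype=float) - origin
    # Amplify horizontal (x, z) movement only; height (y) is left unscaled
    # so the floor and the user's eye height stay consistent.
    scale = np.array([gain, 1.0, gain])
    return origin + offset * scale


# Example: at 2x gain, standing 1 m from the room centre places the user
# 2 m from the centre of the virtual room.
print(virtual_position([1.0, 1.6, 0.5], gain=2.0))  # -> [2.  1.6 1. ]
```

At 2x gain, walking 1 m in the room moves the user 2 m in the virtual space, which is how a 3.5m*3.5m physical room can stand in for a much larger virtual one at higher gains.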
Point-and-Shake: Selecting from Levitating Object Displays
Acoustic levitation enables a radical new type of human-computer interface composed of small levitating objects. For the first time, we investigate the selection of such objects, an important part of interaction with a levitating object display. We present Point-and-Shake, a mid-air pointing interaction for selecting levitating objects, with feedback given through object movement. We describe the implementation of this technique and present two user studies that evaluate it. The first study found that users could accurately (96%) and quickly (4.1s) select objects by pointing at them. The second study found that users were able to accurately (95%) and quickly (3s) select occluded objects. These results show that Point-and-Shake is an effective way of initiating interaction with levitating object displays.
Point-and-Shake: Selecting from Levitating Object Displays
CHI ’18 Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 2018
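To make the interaction concrete, here is a rough sketch (hypothetical code, not the Levitate system's implementation) of how a Point-and-Shake style loop could be structured: cast a ray from the user's hand, select the levitated bead with the smallest angular offset inside a narrow cone, and then oscillate that bead's target position so it visibly shakes as feedback:

```python
# Rough sketch of the idea behind Point-and-Shake (hypothetical code):
# select the levitated object closest to the pointing ray, then acknowledge
# the selection by oscillating ("shaking") that object.

import numpy as np


def select_object(ray_origin, ray_dir, object_positions, max_angle_deg=5.0):
    """Return the index of the object the user is pointing at, or None.

    An object is a candidate if it lies within a small cone around the
    pointing ray; the candidate with the smallest angular offset wins,
    which allows partially occluded objects to be selected.
    """
    ray_dir = np.asarray(ray_dir, dtype=float)
    ray_dir /= np.linalg.norm(ray_dir)
    best_idx, best_angle = None, np.radians(max_angle_deg)
    for i, pos in enumerate(object_positions):
        to_obj = np.asarray(pos, dtype=float) - ray_origin
        dist = np.linalg.norm(to_obj)
        if dist == 0:
            continue
        angle = np.arccos(np.clip(np.dot(to_obj / dist, ray_dir), -1.0, 1.0))
        if angle < best_angle:
            best_idx, best_angle = i, angle
    return best_idx


def shake_offsets(amplitude=0.002, frequency=5.0, duration=1.0, fps=60):
    """Per-frame position offsets (metres) that make the selected bead
    oscillate side to side as selection feedback."""
    t = np.arange(0, duration, 1.0 / fps)
    return amplitude * np.sin(2 * np.pi * frequency * t)


# Example: two beads levitated 0.3 m in front of the user; pointing slightly
# towards the first bead selects index 0, which is then shaken as feedback.
beads = [[0.05, 0.2, 0.3], [-0.05, 0.2, 0.3]]
idx = select_object(ray_origin=np.array([0.0, 0.2, 0.0]),
                    ray_dir=[0.05, 0.0, 0.3],
                    object_positions=beads)
print(idx, shake_offsets(duration=0.1)[:3])
```

In the real system the selected bead is moved by updating the acoustic trap that holds it; the sketch above only illustrates the pointing geometry and the shape of the feedback motion.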