Scientists at the University of Glasgow, UK, have managed to suspend small polystyrene particles in mid-air, supported only by ultrasonic acoustic waves: acoustic levitation. The technology may lead to new kinds of displays for controlling machines, and could hence revolutionise human-machine interaction. The study is part of the Levitate project, supported by the European Commission's Future and Emerging Technologies research programme.
A new performance piece melding live theatre and virtual reality has just opened as part of Army@TheFringe.
Familiar Stranger brings together live acting and virtual reality to tell the story of an Iraq War veteran returning to civilian life.
The 45-minute show is a collaboration between The University of Glasgow Department of Computing Science and Glasgow-based artist and coder collective RealRealReal.
Hosted in the Hepburn House Army Reserve Centre (Venue 210) in East Claremont Street, Edinburgh, it offers an insight into a veteran’s attempts to reintegrate into everyday life.
It opens with a monologue performed by career soldier Sergeant Major Garry Worrall, after which audiences are introduced to the Oculus Go VR headsets that plunge them into a virtual space, and the veteran’s inner life.
It ranges through his home and then into his memories of deployment – an experience that is simultaneously familiar and strange.
Afterwards the audience meet Garry again and have the chance to talk to him about his experiences in and out of the Army – opening up the space between the artists’ ideas of army life and his first hand knowledge.
Dr Julie Williamson, Lecturer in Human Computer Interaction at the university's School of Computing Science, developed the technical setup and collaborated on the script.
She said: “Virtual reality is often considered a solitary activity, but I’m interested in exploring how we can use virtual spaces to expand shared experiences.
“Working with Dennis Reinmuller and Debbie Moody from RealRealReal and Army@TheFringe has given us a great opportunity to explore how theatre can be melded with VR to create an experience that can’t be delivered any other way.”
Familiar Stranger features the voice of Louise Oliver as The Magazine Soldier, guiding the viewer through the fictional veteran’s memory. The music is created by Sarah J Stanley of HQFU together with RealRealReal.
Army@TheFringe is presented by Army Headquarters Scotland as a way of engaging with wider society through the arts and initiating discussion about soldiering.
The venue, open from 10 to 25 August, is staffed by soldiers who run the bar and front of house services, and who mingle with the public before and after shows.
Familiar Stranger is supported by the University of Glasgow’s Dean’s Fund.
It runs until 24 August with performances at 1pm, 3.45pm and 6.45pm daily.
We are presenting two papers at ACM CHI 2018. The links below provide open access copies of these papers.
Object Manipulation in Virtual Reality Under Increasing Levels of Translational Gain
Room-scale Virtual Reality (VR) has become an affordable consumer reality, with applications ranging from entertainment to productivity. However, the limited physical space available for room-scale VR in the typical home or office environment poses a significant problem. To solve this, physical spaces can be extended by amplifying the mapping of physical to virtual movement (translational gain). Although amplified movement has been used since the earliest days of VR, little is known about how it influences reach-based interactions with virtual objects, now a standard feature of consumer VR. Consequently, this paper explores the picking and placing of virtual objects in VR for the first time, with translational gains of between 1x (a one-to-one mapping of a 3.5m*3.5m virtual space to the same sized physical space) and 3x (10.5m*10.5m virtual mapped to 3.5m*3.5m physical). Results show that reaching accuracy is maintained for up to 2x gain, however going beyond this diminishes accuracy and increases simulator sickness and perceived workload. We suggest gain levels of 1.5x to 1.75x can be utilized without compromising the usability of a VR task, significantly expanding the bounds of interactive room-scale VR.
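The mapping the abstract describes can be sketched in a few lines. This is a minimal, illustrative implementation of translational gain, not code from the paper: the function name and the 2D floor-plane simplification are assumptions, but the arithmetic (virtual displacement = gain × physical displacement from the tracking origin) follows directly from the space sizes quoted above.

```python
def apply_gain(physical_pos, origin, gain):
    """Map a tracked physical position to a virtual position by
    amplifying its displacement from the tracking-space origin.
    physical_pos and origin are (x, y) tuples in metres."""
    return tuple(o + gain * (p - o) for p, o in zip(physical_pos, origin))

# At 1x gain the far corner of a 3.5m x 3.5m room maps to itself;
# at 3x gain the same physical corner reaches (10.5, 10.5) in the
# virtual space, i.e. a 10.5m x 10.5m virtual room.
print(apply_gain((3.5, 3.5), (0.0, 0.0), 1.0))  # (3.5, 3.5)
print(apply_gain((3.5, 3.5), (0.0, 0.0), 3.0))  # (10.5, 10.5)
```

In a real system the gain would typically be applied per frame to the headset's tracked pose rather than to a static position, but the mapping itself is the same.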
Acoustic levitation enables a radical new type of human-computer interface composed of small levitating objects. For the first time, we investigate the selection of such objects, an important part of interaction with a levitating object display. We present Point-and-Shake, a mid-air pointing interaction for selecting levitating objects, with feedback given through object movement. We describe the implementation of this technique and present two user studies that evaluate it. The first study found that users could accurately (96%) and quickly (4.1s) select objects by pointing at them. The second study found that users were able to accurately (95%) and quickly (3s) select occluded objects. These results show that Point-and-Shake is an effective way of initiating interaction with levitating object displays.
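The core of a pointing-based selection test like the one evaluated above can be sketched as a cone check: an object counts as pointed at when it lies within a small angular cone around the user's pointing ray. This is an illustrative sketch, not the Point-and-Shake implementation; the function name, the 3D-vector representation, and the 5-degree cone width are assumptions.

```python
import math

def is_pointed_at(ray_origin, ray_dir, obj_pos, cone_deg=5.0):
    """Return True if obj_pos lies within cone_deg degrees of the
    pointing ray defined by ray_origin and ray_dir (3D tuples)."""
    to_obj = [o - r for o, r in zip(obj_pos, ray_origin)]
    dist = math.sqrt(sum(c * c for c in to_obj))
    if dist == 0.0:
        return True  # object at the ray origin: trivially selected
    norm = math.sqrt(sum(d * d for d in ray_dir))
    # Angle between the ray and the direction to the object.
    cos_a = sum(d * t for d, t in zip(ray_dir, to_obj)) / (norm * dist)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return angle <= cone_deg

# Pointing straight ahead at an object slightly off-axis selects it;
# an object 45 degrees off-axis does not.
print(is_pointed_at((0, 0, 0), (0, 0, 1), (0.01, 0, 1.0)))  # True
print(is_pointed_at((0, 0, 0), (0, 0, 1), (1.0, 0, 1.0)))   # False
```

In the levitating-object display, a selected object would then give feedback by shaking, which is the second half of the Point-and-Shake technique.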
We’re very excited that our CHI 2017 paper has been given a Best Paper Award (top 1% of submissions).
Public evaluations are popular because some research questions can only be answered by turning “to the wild.” Different approaches place experimenters in different roles during deployment, which has implications for the kinds of data that can be collected and the potential bias introduced by the experimenter. This paper expands our understanding of how experimenter roles impact public evaluations and provides an empirical basis to consider different evaluation approaches. We completed an evaluation of a playful gesture-controlled display – not to understand interaction at the display but to compare different evaluation approaches. The conditions placed the experimenter in three roles, steward observer, overt observer, and covert observer, to measure the effect of experimenter presence and analyse the strengths and weaknesses of each approach.
Full text will be available after publication in May.
This talk discusses our ongoing work to re-appropriate public spaces through digital interactive art. The Public and Performative Interaction Group recently organised a workshop that brought together artists, designers, and computing scientists for a two-day event. Our goal: to create a working prototype of an interactive installation in just two days. Over the course of the workshop, we developed a concept, implemented the interface, and deployed it on the University of Glasgow campus. Our untitled piece brought light, play, and interaction to a relatively derelict and empty space on campus, bringing new life and new ideas to the digital urban landscape.
We recently completed two showings of the Sunken Ripples Interactive Experience, where a spherical display acts as a portal to an underwater world. During Sunken Ripples, audience members can interact with the sphere to control the jellyfish creatures on the IMAX screen. Small interactions on the sphere ripple out into huge proportions in this playful installation.
This year the SIPS project will be presenting an Interactivity Exhibit and an alt.chi paper at CHI 2015. We’re very excited to bring the sphere to new places and share some of our experiences completing evaluations in public spaces.
Deep Cover HCI: A Case for Covert Research in HCI
Julie R. Williamson and Daniel Sundén
The growing popularity of methodologies that turn “to the wild” for real world data creates new ethical issues for the HCI community. For investigations questioning interactions in public or transient spaces, crowd interaction, or natural behaviour, uncontrolled and uninfluenced (by the experimenter) experiences represent the ideal evaluation environment. We argue that covert research can be completed rigorously and ethically to expand our knowledge of ubiquitous technologies. Our approach, which we call Deep Cover HCI, utilises technology-supported observation in public spaces to stage completely undisturbed experiences for evaluation. We complete studies without informed consent and without intervention from an experimenter in order to gain new insights into how people use technology in public settings. We argue there is clear value in this approach, reflect on the ethical issues of such investigations, and describe our ethical guidelines for completing Deep Cover HCI Research.
Multi-Player Gaming on Spherical Displays – Interactivity
Julie R. Williamson, John Williamson, Daniel Sundén, Jay Bradley
Spherical displays offer unique affordances for multi-player games and playful interactions in social spaces. The shape of a spherical display allows users to face each other and maintain eye contact during interaction, creating a different social dynamic than at a flat display. There is also no intrinsically defined front or centre of the display, offering different views from different viewing angles. This creates shared and private areas of the display given users’ varying perspectives. Trajectory based games have a dramatically different experience when played on a spherical surface. Side-scrolling games are also exciting on a spherical surface, becoming “rotating” games where users’ action affect others playing at different points around the screen. This Interactivity exhibit showcases two multi-player games that specifically exploit the affordances of a spherical display in a social setting.
The Public and Performative Interaction Group has been featured on the BBC Click programme about Glasgow’s Digital Creativity. Julie Williamson’s work on engagement with public displays explores how we can create the best possible user experience on spherical touch-sensitive displays. Evaluating engagement and experience is vital to ensure the novel technologies we develop are actually used in practice.
Watch the programme here: BBC Click: Glasgow’s Digital Creativity (featured at 6:35)
For more information about our projects and public engagement, see publicinteraction.co.uk for our latest updates.
In November, Julie Williamson (University of Glasgow) and Audrey O’Brien (Visual Artist) ran a two-day workshop to explore ideas and concepts for a digital art installation for public spaces. The goal was to create and design with concepts such as playfulness, performative interactions, surveillance, touch, and lighting. The only requirement placed on workshop participants was to create a working prototype together during the two-day event. At the end of the workshop, the participants exhibited the final prototype in a pop-up exhibition on the University of Glasgow campus.
Over two days, this micro-residency brought together artists, designers and computing scientists from a wide variety of backgrounds. The workshop began with an exploration of the installation site: a dark space beneath one of the University buildings, positioned below a busy pedestrianised walkway. The final product was composed of six touch-sensitive pendulums arranged around a spherical display. Touching the pendulums produced music, with each pendulum creating different visualisations on the sphere. The video below showcases the final installation from the pop-up exhibit.