


Shvil: Collaborative Augmented Reality Land Navigation.

Abstract: We present our prototype of Shvil, an Augmented Reality (AR) system for collaborative land navigation. Shvil facilitates path planning and execution by creating a collaborative medium between an overseer (an indoor user) and an explorer (an outdoor user) using AR and 3D printing techniques. Shvil provides a remote overseer with a physical representation of the mission's topography via a 3D printout of the terrain, and merges the physical presence of the explorer with the actions of the overseer via dynamic AR visualization. The system supports collaboration both by overlaying visual information related to the explorer on top of the overseer's scaled-down physical representation, and by overlaying visual information for the explorer in situ as it emerges from the overseer. We report our current prototype effort, preliminary results, and our vision for the future of Shvil.




PhoneEar: Interactions for Mobile Devices that Hear High-Frequency Sound-Encoded Data.

Abstract: We present PhoneEar, a new approach that enables mobile devices to understand the broadcast audio and sounds that we hear every day using existing infrastructure. PhoneEar audio streams are embedded with sound-encoded data using nearly inaudible high frequencies. Mobile devices then listen for messages in the sounds around us, taking actions to ensure we don't miss any important information. In this paper, we detail our implementation of PhoneEar, describe a study demonstrating that mobile devices can effectively receive sound-based data, and report the results of a user study showing that embedding data in sounds is not detrimental to sound quality. We also exemplify the space of new interactions through four PhoneEar-enabled applications. Finally, we discuss the challenges of deploying apps that can hear and react to data in the sounds around us.
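The abstract does not specify PhoneEar's actual encoding scheme, so as an illustrative sketch only: one common way to embed bits in nearly inaudible high-frequency audio is frequency-shift keying (FSK), with a Goertzel filter on the receiving side to decide which carrier each symbol contains. The carrier frequencies, symbol length, and function names below are assumptions, not PhoneEar's implementation.

```python
import math

SAMPLE_RATE = 44100
F0, F1 = 18000, 18500      # assumed near-inaudible carriers for bits 0 and 1
SYMBOL_LEN = 2205          # samples per bit (50 ms at 44.1 kHz)

def encode_bits(bits):
    """FSK-encode a bit string into a list of audio samples."""
    samples = []
    for b in bits:
        f = F1 if b == "1" else F0
        for n in range(SYMBOL_LEN):
            samples.append(math.sin(2 * math.pi * f * n / SAMPLE_RATE))
    return samples

def goertzel_power(samples, freq):
    """Signal power at `freq` in `samples`, via the Goertzel algorithm."""
    k = 2 * math.cos(2 * math.pi * freq / SAMPLE_RATE)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + k * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - k * s1 * s2

def decode_bits(samples):
    """Recover the bit string by comparing carrier powers per symbol window."""
    bits = []
    for i in range(0, len(samples), SYMBOL_LEN):
        chunk = samples[i:i + SYMBOL_LEN]
        bits.append("1" if goertzel_power(chunk, F1) > goertzel_power(chunk, F0) else "0")
    return "".join(bits)
```

A real deployment would also need synchronization, error correction, and robustness to speaker/microphone frequency response, which this sketch omits.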


       



PlanWell: Spatial User Interface for Collaborative Petroleum Well Planning.

Abstract: We present our prototype of PlanWell, a spatial augmented reality interface that facilitates collaborative field operations. PlanWell allows a central overseer (in a command and control center) and a remote explorer (an outdoor user in the field) to explore and collaborate within a geographical area. PlanWell provides the overseer with a tangible user interface (TUI) based on a 3D printout of surface geography, which acts as a physical representation of the region to be explored. Augmented reality is used to dynamically overlay properties of the region, as well as the presence of the remote explorer and their actions, onto the 3D representation of the terrain. The overseer can perform actions directly on the TUI, and these actions are then presented as dynamic AR visualizations superimposed on the explorer's view in the field. Although our interface could be applied to many domains, the PlanWell prototype was developed to facilitate petroleum engineering tasks such as well planning and coordination of drilling operations. Our paper describes the design and implementation of the current PlanWell prototype in the context of petroleum well planning and drilling, and discusses preliminary reflections from a focus group session with domain experts.


         





 

Flying Frustum: A Spatial Interface for Enhancing Human-UAV Awareness

Abstract: We present Flying Frustum, a 3D spatial interface that enables control of semi-autonomous UAVs (Unmanned Aerial Vehicles) using pen interaction on a physical model of the terrain, and that spatially situates the information streaming from the UAVs onto the physical model. Our interface is based on a 3D printout of the terrain, which allows the operator to assign goals and paths to the UAV by drawing them directly on the physical model. In turn, the UAV's streaming reconnaissance information is superimposed on the 3D printout as a view frustum, which is situated according to the UAV's position and orientation on the actual terrain. We argue that Flying Frustum's 3D spatially situated interaction can potentially improve human-UAV awareness, allow a higher operator-to-UAV ratio, and enhance overall situational awareness. We motivate our design approach for Flying Frustum, discuss previous related work in CSCW and HRI, present our current prototype using both handheld and headset augmented reality interfaces, reflect on Flying Frustum's strengths and weaknesses, and discuss our plans for future evaluation and prototype improvements.
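Situating the frustum on the printout requires mapping the UAV's world-space pose onto the scaled-down model. As a minimal sketch (the function name, coordinate convention, and uniform-scale assumption are mine, not Flying Frustum's), a world position can be mapped to model coordinates with a translate-and-scale transform:

```python
def world_to_model(x, y, z, world_origin, metres_per_model_unit):
    """Map a world-space position (metres) onto the scaled-down terrain model.

    world_origin: world coordinates of the model's origin corner.
    metres_per_model_unit: how many metres one model unit represents
    (e.g. 100.0 for a 1:100 printout measured in the same units).
    """
    ox, oy, oz = world_origin
    s = 1.0 / metres_per_model_unit
    return ((x - ox) * s, (y - oy) * s, (z - oz) * s)
```

The UAV's orientation would be applied unchanged (a uniform scale preserves angles), so the frustum is simply rendered at the mapped position with the original camera rotation and field of view.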


       




Exploring the Effect of Color-Coding on Gestural Memory

Abstract: Over the last few years, touch interactions have become one of the most widely used forms of input. While new and advanced gesture sets are developed for different interfaces, their acceptability is often limited for several reasons, such as not feeling "natural" to use, the lack of quick ways to learn the gestures, or the extra effort needed to learn them. In this paper, we approach the problem of learning new touch gestures and propose color coding as an aid for teaching new single-touch gestures. We selected color coding because prior studies have shown it has a significant impact on visual memory; in our work, we investigate whether the same holds for gestural memory. We present a controlled study with twenty-four participants that explores the effect of color coding on learning and recalling eight new single-touch gestures. Color coding gestures refers to mapping gestural strokes by direction, i.e., the up, down, left, and right directions are mapped to yellow, green, red, and blue, respectively. Results show that although mean success rates with color coding differ slightly, there is no statistically significant difference between learning and recalling new single-touch gestures with or without color coding.
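The direction-to-color mapping from the abstract (up/down/left/right to yellow/green/red/blue) can be made concrete with a small sketch. The classifier below is a hypothetical helper of my own, not the study's implementation; it picks the dominant axis of a stroke segment's displacement, using screen coordinates where y grows downward:

```python
# Mapping stated in the abstract: up, down, left, right -> yellow, green, red, blue.
COLOR_MAP = {"up": "yellow", "down": "green", "left": "red", "right": "blue"}

def stroke_color(dx, dy):
    """Color for a stroke segment with displacement (dx, dy); y grows downward."""
    if abs(dx) >= abs(dy):
        direction = "right" if dx >= 0 else "left"
    else:
        direction = "down" if dy >= 0 else "up"
    return COLOR_MAP[direction]
```

A multi-stroke gesture would then be rendered as a sequence of colored segments, one color per dominant direction.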









LightLogger: Eco-feedback Design for Sustainable Light Usage

This was a course project done as part of the "Personal And Collaborative Computing" course during my Master's degree at the University of Calgary.

Abstract: Light pollution, also known as photopollution, is excessive, misdirected, or obtrusive artificial light. Over-illumination refers to the excessive use of light. Lighting is responsible for one fourth of all electricity consumption worldwide, and studies have shown that over-illumination results in energy wastage. Previous HCI research has contributed to the design of eco-feedback systems and behavior change for sustainable electricity consumption, and has focused on various other environmental behavior changes in general, rather than on this specific, narrow problem of light pollution. In this paper, we explore the design of a system that tries to motivate people towards sustainable light usage. We first conducted an online survey to gather input and requirements from people. Based on this input and feedback, we designed a system that people can use to visualize their light usage. We then conducted a pilot study of the system and present our findings.








2D Focus + Context Rendering Of Oil & Gas Reservoir Models

This was a course project done as part of the "Rendering" course during my Master's degree at the University of Calgary.

Abstract: This paper describes a set of interactive 3D visualizations designed to explore oil/gas reservoir simulation post-processing models. The visualizations presented here are: orthogonal 3D cutting of the post-processing models (along all three planes, i.e., the XY, YZ, and XZ planes), and a Focus + Context visualization technique that helps in focusing on and analyzing a specific area of the 3D post-processing model. The remaining sections of the paper present an introduction to reservoir simulation, related work, and the technical implementation of the visualizations; results of the implemented visualizations are presented at the end.
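The orthogonal cutting described above can be sketched in a few lines. This is an illustrative example of my own, assuming the reservoir model is stored as a 3D grid of cell property values (the project's actual data structures and rendering code are not shown here): cutting along a plane amounts to extracting an axis-aligned 2D slice that exposes the model's interior.

```python
# Assumed layout: grid[i][j][k] holds a cell property value (e.g. pressure).
def cut_plane(grid, axis, index):
    """Return the 2D slice exposed by cutting the grid at `index` along `axis`.

    axis "x" cuts on a YZ plane, "y" on an XZ plane, "z" on an XY plane.
    """
    ni, nj, nk = len(grid), len(grid[0]), len(grid[0][0])
    if axis == "x":
        return [[grid[index][j][k] for k in range(nk)] for j in range(nj)]
    if axis == "y":
        return [[grid[i][index][k] for k in range(nk)] for i in range(ni)]
    return [[grid[i][j][index] for j in range(nj)] for i in range(ni)]
```

In the interactive visualization, the slice index would be bound to a UI control so the user can sweep the cutting plane through the model; a GPU implementation would typically do the same with clip planes rather than rebuilding geometry.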








3D Stylized Focus + Context Rendering Of Oil & Gas Reservoir Models

This was a course project done as part of the "Rendering" course during my Master's degree at the University of Calgary.

Abstract: This project extended my previous work on 2D Focus + Context visualization of oil/gas reservoir simulation post-processing models.

I implemented the Stylized Focus + Context rendering technique by F. Cole et al. More details about the rendering algorithm can be found in the paper available at the below link.