
Duke Research - Gesture-Based Interfaces: My Research

Laser pointer plays Tic-tac-toe on wall

December 10, 2010


Blogger Vansh Muttreja (Class of 2012, B.S.E. Electrical and Computer Engineering and B.S. Economics) updates us on his research:

Over the past three months I have been working on some very interesting projects as part of Duke SyNRG (Systems Networking Research Group), which is led by Dr. Romit Roy Choudhury, Nortel Networks Assistant Professor of Electrical and Computer Engineering.

The goal of our research is to bridge the gap between physical and virtual interfaces, and to make collaboration easier and more intuitive. Often the screen of a desktop or phone is too small when we need to work on multiple applications at once. In those cases it is much more convenient to project the data onto a wall using a projector, or in our case, using palm-sized pico-projectors.


Pico-projectors are proliferating rapidly and by next year we expect to see phones with built-in projectors. Imagine the possibilities!

We've come up with two novel applications for pico-projectors. The first is called Virtual White Board, which enables multiple users to remotely and simultaneously edit white boards in real time. Every user's physical white board becomes part of a network. These users, wherever they are located, can write and draw on their boards with a color marker in the natural way, and the strokes are simultaneously projected on all the boards in the virtual network. Here is a screenshot of a remote tic-tac-toe game being played by two users on boards in different physical locations.
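The post doesn't describe how Virtual White Board is implemented, but the core idea of echoing each stroke to every board in the network can be sketched roughly as follows. The class names, in-memory hub, and JSON message format here are illustrative assumptions, not SyNRG's actual code; a real system would ship these messages over the network.

```python
import json


class WhiteboardHub:
    """Toy relay: every stroke drawn on one board is echoed to all others."""

    def __init__(self):
        self.boards = []

    def join(self, board):
        self.boards.append(board)

    def broadcast(self, sender, stroke):
        # Serialize once, as a stand-in for a real network message.
        msg = json.dumps(stroke)
        for board in self.boards:
            if board is not sender:
                board.receive(json.loads(msg))


class Board:
    """One physical white board participating in the virtual network."""

    def __init__(self, name, hub):
        self.name = name
        self.strokes = []  # everything currently visible on this board
        self.hub = hub
        hub.join(self)

    def draw(self, points, color):
        # A local marker stroke: record it, then share it with everyone.
        stroke = {"points": points, "color": color}
        self.strokes.append(stroke)
        self.hub.broadcast(self, stroke)

    def receive(self, stroke):
        # A stroke drawn remotely, projected onto this board.
        self.strokes.append(stroke)
```

With two boards joined to the same hub, an "X" drawn on one board in a tic-tac-toe grid would immediately appear in the stroke list of the other, which is all the projector needs to render it.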

The second program we developed is called Smart Transfer. Some tasks, such as design projects and architectural work, require the user to work on a bigger screen. We wanted to let the user interact directly with the projection itself, so we implemented a system in which a person can use a laser pointer as a point-and-click device on the projection screen. The laser pointer effectively becomes a mouse for the projected screen.
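A common way to build this kind of system (a sketch under my own assumptions, not necessarily what the team did) is to point a camera at the projection, find the bright laser dot in each frame, and map its camera coordinates to screen coordinates. A toy version, with a plain 2D intensity grid standing in for a real camera frame:

```python
def find_laser_dot(frame, threshold=200):
    """Return (row, col) of the brightest pixel above `threshold`, else None.

    `frame` is a 2D list of grayscale intensities; the laser dot is assumed
    to be the brightest thing the camera sees.
    """
    best, pos = threshold, None
    for r, row in enumerate(frame):
        for c, value in enumerate(row):
            if value > best:
                best, pos = value, (r, c)
    return pos


def camera_to_screen(pos, cam_size, screen_size):
    """Linearly rescale camera coordinates to screen coordinates.

    A deployed system would use a calibrated homography instead, since the
    camera views the projection at an angle; this linear map is a toy stand-in.
    """
    r, c = pos
    cam_h, cam_w = cam_size
    scr_h, scr_w = screen_size
    return (r * scr_h // cam_h, c * scr_w // cam_w)
```

Feeding the mapped coordinates to the operating system's cursor APIs each frame is what makes the pointer behave like a mouse; a click can be signaled by, for example, briefly switching the laser off and on.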

The final aim was to make collaboration and file transfer between devices more instinctive. So we came up with a design in which a user can select a file with the laser pointer and then perform an intuitive 'flicking' gesture toward the device the file should go to. For example, imagine there is one computer to the left of our parent screen and one to the right. Selecting a file and flicking to the right initiates a transfer to the computer on the right; flicking to the left sends it to the computer on the left. This is especially useful for transferring standalone objects such as images within a document, so multiple people collaborating on a Word document can easily and intuitively share the charts, images, and text it contains.
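The flick itself can be recognized from the pointer's recent trail of positions: if the horizontal displacement is large enough and dominates the vertical one, it is a left or right flick. A minimal sketch (the function name and threshold are my own assumptions, not the project's code):

```python
def classify_flick(points, min_dist=100):
    """Classify a flick from a trail of (x, y) pointer positions.

    Compares the first and last samples of the trail: returns 'left' or
    'right' when horizontal motion of at least `min_dist` pixels dominates,
    and None when the motion is too small or mostly vertical.
    """
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) < min_dist or abs(dx) < abs(dy):
        return None  # not a deliberate horizontal flick
    return "right" if dx > 0 else "left"
```

The returned direction would then pick which neighboring machine receives the selected file.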


Posted by vm19.

Tags: computers/technology, engineering, students, Visualization



