Hi, I’m David, a researcher and PhD candidate in the field of Human-Computer Interaction.
I work in the Computer Graphics group with Prof. Marc Alexa at TU Berlin.
Previously, I worked with Prof. Jörg Müller at TU Berlin (now Aarhus University) as well as at the Media Interaction Lab in Hagenberg, Austria, where I completed my bachelor’s and master’s degrees. During my master’s, I spent an exchange semester at the University of Waterloo, Canada, working with Prof. Stacey Scott and Prof. Mark Hancock. I have also worked as a software developer at Interactive Pioneers.
My research focuses on exploring how modifying the optical appearance of interactive devices changes the way we use them (e.g., making displays transparent). Additionally, I am interested in new input methods and devices, sketch interpretation, and gestural interfaces.
You can download my cv here: cv_davidlindlbauer.pdf.
GelTouch: Localized Tactile Feedback Through Thin, Programmable Gel
GelTouch is a gel-based layer that can selectively transition between soft and stiff to provide tactile multi-touch feedback. It is flexible, transparent when not activated, and contains no mechanical, electromagnetic, or hydraulic components, resulting in a compact form factor (a 2 mm thin touchscreen layer for our prototype). The activated areas can be morphed freely and continuously, without being limited to fixed, predefined shapes. GelTouch consists of a poly(N-isopropylacrylamide) gel layer which alters its viscoelasticity when activated by applying heat (>32 °C). We present three different activation techniques: 1) Indium Tin Oxide (ITO) as a heating element that enables tactile feedback through individually addressable taxels; 2) predefined tactile areas of engraved ITO, that can be layered and combined; 3) complex arrangements of resistance wire that create thin tactile edges. We present a tablet with 6x4 tactile areas, enabling a tactile numpad, slider, and thumbstick. We show that the gel is up to 25 times stiffer when activated and that users detect tactile features reliably (94.8%).
V. Miruchna, R. Walter, D. Lindlbauer, M. Lehmann, R. von Klitzing, J. Müller, 2015. GelTouch: Localized Tactile Feedback Through Thin, Programmable Gel. UIST '15, Charlotte, NC, USA
ACM UIST 2015 Best Paper Award Honorable Mention
Featured on: MIT Technology Review, Engadget, Wired DE, El País, ...
Accuracy of Monocular Gaze Tracking on 3D Geometry
Many applications in visualization benefit from accurate knowledge of where a person is looking. We present a system for accurately tracking gaze positions on a three-dimensional object using a monocular head-mounted eye tracker. We accomplish this by 1) using digital manufacturing to create stimuli with accurately known geometry, 2) embedding fiducial markers directly into the manufactured objects to reliably estimate the rigid transformation of the object, and 3) using a perspective model to relate pupil positions to 3D locations. This combination enables the efficient and accurate computation of gaze position on an object from measured pupil positions. We validate the accuracy of our system experimentally, achieving an angular resolution of 0.8° and a 1.5% depth error using a simple calibration procedure with 11 points.
X. Wang, D. Lindlbauer, C. Lessig, M. Alexa, 2015.
Accuracy of Monocular Gaze Tracking on 3D Geometry.
ETVIS Workshop '15 (in conj. IEEE VIS '15), Chicago, IL, USA
Analyzing Visual Attention During Whole Body Interaction with Public Displays
While whole body interaction can enrich user experience on public displays, it remains unclear how common visualizations of user representations impact users’ ability to perceive content on the display. In this work we use a head-mounted eye tracker to record visual behavior of 25 users interacting with a public display game that uses a silhouette user representation, mirroring the users’ movements. Results from visual attention analysis as well as post-hoc recall and recognition tasks on display contents reveal that visual attention is mostly on users’ silhouette while peripheral screen elements remain largely unattended. In our experiment, content attached to the user representation attracted significantly more attention than other screen contents, while content placed at the top and bottom of the screen attracted significantly less. Screen contents attached to the user representation were also significantly better remembered than those at the top and bottom of the screen.
R. Walter, A. Bulling, D. Lindlbauer, M. Schüssler, J. Müller, 2015.
Analyzing Visual Attention During Whole Body Interaction with Public Displays. UBICOMP '15, Osaka, Japan
Creature Teacher: A Performance-Based Animation System for Creating Cyclic Movements
Creature Teacher is a performance-based animation system for creating cyclic movements. Users directly manipulate body parts of a virtual character by using their hands. Creature Teacher’s generic approach makes it possible to animate rigged 3D models with nearly arbitrary topology (e.g., non-humanoid) without requiring specialized user-to-character mappings or predefined movements. We use a bimanual interaction paradigm, allowing users to select parts of the model with one hand and manipulate them with the other hand. Cyclic movements of body parts during manipulation are detected and repeatedly played back, even while animating other body parts. Our approach of taking cyclic movements as an input makes mode switching between recording and playback obsolete and allows for fast and seamless creation of animations. We show that novice users with no animation background were able to create expressive cyclic animations for initially static virtual 3D creatures.
A. Fender, J. Müller, D. Lindlbauer, 2015.
Creature Teacher: A Performance-Based Animation System for Creating Cyclic Movements. SUI '15, Los Angeles, CA, USA
Tracs: Transparency-control for See-through Displays
Tracs is a dual-sided see-through display system with controllable transparency. Traditional displays are a constant visual and communication barrier, hindering fast and efficient collaboration of spatially close or facing co-workers. Transparent displays could potentially remove these barriers, but introduce new issues of personal privacy, screen content privacy and visual interference. We therefore propose a solution with controllable transparency to overcome these problems. Tracs consists of two see-through displays, with a transparency-control layer, a backlight layer and a polarization adjustment layer in-between. The transparency-control layer is built as a grid of individually addressable transparency-controlled patches, allowing users to control the transparency overall or just locally. Additionally, the locally switchable backlight layer improves the contrast of LCD screen content. Tracs allows users to switch between personal and collaborative work fast and easily and gives them full control of transparent regions on their display.
D. Lindlbauer, T. Aoki, R. Walter, Y. Uema, A. Höchtl, M. Haller, M. Inami, J. Müller, 2014. Tracs: Transparency-control for See-through Displays.
UIST '14, Honolulu, Hawaii, USA. long video (3 min)
also presented as demo at UIST'14
D. Lindlbauer, T. Aoki, Y. Uema, A. Höchtl, M. Haller, M. Inami, J. Müller, 2014. A Collaborative See-through Display Supporting On-demand Privacy.
Siggraph Emerging Technology '14, Vancouver, Canada, video
Featured on: Gizmodo
A Chair as Ubiquitous Input Device:
Exploring Semaphoric Chair Gestures for Focused and Peripheral Interaction
During everyday office work we are used to controlling our computers with keyboard and mouse, while the majority of our body remains unchallenged and the physical workspace around us stays largely unattended. Addressing this untapped potential, we explore the concept of turning a flexible office chair into a ubiquitous input device. To facilitate daily desktop work, we propose the utilization of semaphoric chair gestures that can be assigned to specific application functionalities. The exploration of two usage scenarios in the context of focused and peripheral interaction demonstrates the high potential of chair gestures as an additional input modality for opportunistic, hands-free interaction.
K. Probst, D. Lindlbauer, M. Haller, B. Schwartz, A. Schrempf, 2014. A Chair as Ubiquitous Input Device: Exploring Semaphoric Chair Gestures for Focused and Peripheral Interaction. CHI '14, Toronto, Canada
K. Probst, D. Lindlbauer, M. Haller, B. Schwartz, A. Schrempf, 2014.
Exploring the Potential of Peripheral Interaction through Smart Furniture.
Workshop on Peripheral Interaction: Shaping the Research and Design Space at CHI '14, CHI '14, Toronto, Canada
K. Probst, D. Lindlbauer, P. Greindl, M. Trapp, M. Haller, B. Schwartz, and A. Schrempf, 2013. Rotating, Tilting, Bouncing: Using an Interactive Chair to Promote Activity in Office Environments. CHI EA ’13, Paris, France
Selection Assistance for Digital Sketching
Modifying a digital sketch may require multiple selections before a particular editing tool can be applied. Especially on large interactive surfaces, such interactions can be fatiguing. Accordingly, we propose a method, called Suggero, to facilitate the selection process of digital ink. Suggero identifies groups of perceptually related drawing objects. These “perceptual groups” are used to suggest possible extensions in response to a person’s initial selection. Two studies were conducted. First, a background study investigated participants’ expectations of such a selection assistance tool. Then, an empirical study compared the effectiveness of Suggero with an existing manual technique. The results revealed that Suggero required fewer pen interactions and less pen movement, suggesting that Suggero minimizes fatigue during digital sketching.
D. Lindlbauer, M. Haller, M. Hancock, S. D. Scott, and W. Stuerzlinger, 2013. Perceptual Grouping: Selection Assistance for Digital Sketching.
ITS ’13, St. Andrews, Scotland
D. Lindlbauer, 2012
Perceptual Grouping of Digital Sketches.
Master’s thesis (supervised by Prof. Michael Haller)
University of Applied Sciences Upper Austria, Hagenberg
Understanding Mid-Air Hand Gestures:
A Study of Human Preferences in Usage of Gesture Types for HCI
In this paper we present the results of a study of human preferences in using mid-air gestures for directing other humans. Rather than contributing a specific set of gestures, we contribute a set of gesture types, which together make up the core actions needed to complete any of our six chosen tasks in the domain of human-to-human gestural communication without the speech channel. We observed 12 participants cooperating to accomplish different tasks using only hand gestures to communicate. We analyzed 5,500 gestures in terms of hand usage and gesture type, using a novel classification scheme which combines three existing taxonomies in order to better capture this interaction space. Our findings indicate that, depending on the meaning of the gesture, there are preferences in the usage of gesture types, such as pointing, pantomimic acting, direct manipulation, semaphoric, or iconic gestures. These results can be used as guidelines to design purely gesture-driven interfaces for interactive environments and surfaces.
R. Aigner, D. Wigdor, H. Benko, M. Haller, D. Lindlbauer, A. Ion, S. Zhao, and J.T.K.V. Koh, 2012. Understanding Mid-Air Hand Gestures: A Study of Human Preferences in Usage of Gesture Types for HCI.
Microsoft Tech Report, Redmond, WA, USA, MSR-TR-2012-11.
Exploring the Use of Distributed Multiple Monitors Within an Activity-Promoting Sit-and-Stand Office Workspace
Nowadays, sedentary behaviors such as prolonged sitting have become a predominant element of our lives. Particularly in the office environment, many people spend the majority of their working day seated in front of a computer. In this paper, we investigate the adoption of a physically active work process within an activity-promoting office workspace design that is composed of a sitting and a standing workstation. Making use of multiple distributed monitors, this environment introduces diversity into the office workflow through the facilitation of transitions between different work-related tasks, workstations, and work postures. We conducted a background study to get a better understanding of how people perform their daily work within this novel workspace. Our findings identify different work patterns and basic approaches for physical activity integration, which indicate a number of challenges for software design. Based on the results of the study, we provide design implications and highlight new directions in the field of HCI design to support seamless alternation between different postures while working in such an environment.
K. Probst, D. Lindlbauer, F. Perteneder, M. Haller, B. Schwartz, and A. Schrempf, 2013. Exploring the Use of Distributed Multiple Monitors Within an Activity-Promoting Sit-and-Stand Office Workspace.
Interact ’13, Cape Town, South Africa
matreco
matreco is an eco-feedback visualisation. The software analyses energy data coming from the home automation system and displays it to users. The energy used by each consumer, along with its status, is presented in a 2D visualisation. Additionally, users can replay the last 12/24/48 hours of energy consumption and listen to a musical interpretation of the data. This way, users can keep track of their energy consumption and change their behavior if necessary.
AEC Facade Visualisation 
This project is a visualization on the interactive facade of the Ars Electronica Center, Linz. Users can play the game Breakout (or "bricks"). The paddle is controlled with the user's body movement: the system tracks the player in front of the building with a camera and positions the paddle accordingly, by sending network commands to the AEC facade interface. The project was realized within 3 days with Alexandra Ion as a course project for the class "Generative Arts" during my master's degree.
KontrollWerk
KontrollWerk is a multitouch MIDI controller software for surface platforms. It was a bachelor project done with Alexandra Ion and Stefan Wasserbauer. KontrollWerk lets users create their own user interface from different types of MIDI controls. The output can be directed to any software or to any kind of internal and external MIDI device. With gesture recognition and a blob menu, the application offers intuitive handling. The software is well suited for DJs and VJs controlling several devices during a live performance.
The Witness 
The iPhone app “The Witness” is an interactive real-life game containing multiple components, realized during my time at Interactive Pioneers. Depending on the level of the game and the location, the player has to complete different tasks to uncover more information and finish the game. Set in Berlin, players were guided to multiple locations through the software. After reaching a location, players used the app to watch videos of the story, fulfill tasks such as finding QR codes, and communicate with actors who were part of the game. The project was realized with Jung von Matt Spree (advertising agency, concept) and 13th Street (client). My part was the complete development of the iPhone app.