This page is no longer maintained or updated.
(January 2020)
David Dobbelstein joined the HCI group in October 2011. During his studies he was an intern at Daimler AG and at Nokia.
David holds an M.Sc. in Applied Computer Science - Systems Engineering (2014) from the University of Duisburg-Essen. He wrote his Master's thesis, titled Investigating Interaction Performance on a Novel Handheld Near-Eye Display, in cooperation with Nokia Research in Sunnyvale (USA).
He obtained his doctoral degree on 18 December 2019 with the dissertation Near-body Interaction for Wearable Interfaces.
If you are interested in writing a thesis, please send me an email or drop by my office.
Belt is a novel unobtrusive input device for wearable displays that incorporates a touch surface encircling the user’s hip. The wide input space is leveraged for a horizontal spatial mapping of quickly accessible information and applications. We discuss social implications and interaction capabilities for unobtrusive touch input and present our hardware implementation and a set of applications that benefit from the quick access time. In a qualitative user study with 14 participants, we found that for short interactions (2-4 seconds), most of the surface area is considered appropriate input space, while for longer interactions (up to 10 seconds), the front areas above the trouser pockets are preferred.
Read more...
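To illustrate the idea of a horizontal spatial mapping around the hip, here is a minimal Python sketch. The slot names, the number of slots, and the dead zone at the back are illustrative assumptions, not the mapping used in the Belt prototype.

```python
from typing import Optional

# Hypothetical applications arranged horizontally around the belt.
APP_SLOTS = ["clock", "notifications", "music", "navigation", "calendar", "settings"]
# Assumed region at the back of the hip that is ignored because it is hard to reach.
BACK_DEAD_ZONE = (0.4, 0.6)

def slot_for_touch(position: float) -> Optional[str]:
    """Map a normalized position along the belt (0.0-1.0, front center at 0) to a slot."""
    if BACK_DEAD_ZONE[0] <= position <= BACK_DEAD_ZONE[1]:
        return None  # touch on the back of the belt: no slot assigned
    return APP_SLOTS[int(position * len(APP_SLOTS)) % len(APP_SLOTS)]

print(slot_for_touch(0.05))  # -> "clock"
print(slot_for_touch(0.5))   # -> None (back of the belt)
```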
We present PocketThumb, a wearable touch interface for smart-eyewear that is embedded into the fabric of the front trouser pocket. The interface is reachable from the outside and the inside of the pocket to allow for combined dual-sided touch input. The user can control an absolute cursor with their thumb sliding along the fabric from the inside, while at the same time tapping or swiping with fingers from the outside to perform joint gestures. This allows for resting the hand in a comfortable and quickly accessible position, while performing interaction with a high expressiveness that is feasible in mobile scenarios. In a cursor-based target selection study, we found that our dual-sided touch interaction is significantly faster than common single-sided absolute and relative touch interaction (∼19% and ∼23% faster, respectively). The effect is largest in the mobile conditions standing and walking (up to ∼31% faster).
Read more...
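A minimal sketch of the dual-sided input idea in Python: the thumb on the inside of the pocket drives an absolute cursor, and a tap from the outside commits the selection. The event handling, target layout, and selection radius are assumptions for illustration, not the study software.

```python
from typing import List, Optional, Tuple

# Hypothetical targets as (label, x, y) in normalized touch-pad coordinates.
TARGETS: List[Tuple[str, float, float]] = [
    ("back", 0.1, 0.5),
    ("select", 0.5, 0.5),
    ("next", 0.9, 0.5),
]

def target_under_cursor(cursor: Tuple[float, float], radius: float = 0.15) -> Optional[str]:
    """Return the target whose center lies within `radius` of the absolute cursor."""
    cx, cy = cursor
    for label, tx, ty in TARGETS:
        if (tx - cx) ** 2 + (ty - cy) ** 2 <= radius ** 2:
            return label
    return None

def on_outside_tap(inside_cursor: Tuple[float, float]) -> Optional[str]:
    """Joint gesture: an outside tap commits whatever the inside thumb is pointing at."""
    return target_under_cursor(inside_cursor)

print(on_outside_tap((0.52, 0.48)))  # -> "select"
```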
We introduce inScent, a wearable olfactory display that can be worn in mobile everyday situations and allows the user to receive personal scented notifications, i.e. scentifications. Olfaction, i.e. the sense of smell, is used by humans as a sensory information channel for experiencing the environment. Olfactory sensations are closely linked to emotions and memories, but also warn of personal dangers such as fire or foulness. We want to utilize these properties of smell as a notification channel by amplifying received mobile notifications with artificially emitted scents. We built a wearable olfactory display that can be worn as a pendant around the neck and contains up to eight different scent aromas that can be inserted and quickly exchanged via small scent cartridges. Upon emission, the scent aroma is vaporized and blown towards the user. A hardware and software framework is presented that allows developers to add scents to their mobile applications. In a qualitative user study, participants wore the inScent wearable in public. We used subsequent semi-structured interviews and grounded theory to build a common understanding of the experience and derived lessons learned for the use of scentifications in mobile situations.
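A hedged sketch of how a developer-facing scentification call might look, in Python. The class name, the category-to-cartridge mapping, and the command string are assumptions for illustration; the actual inScent framework may expose a different interface.

```python
from dataclasses import dataclass

# Assumed mapping from notification categories to one of the eight cartridge slots.
CARTRIDGE_FOR_CATEGORY = {"message": 0, "calendar": 1, "warning": 2}

@dataclass
class Scentification:
    title: str
    category: str
    duration_ms: int = 500  # how long the aroma is vaporized and blown toward the user

def emit(scentification: Scentification, send_command) -> None:
    """Amplify a mobile notification with a scent burst from the matching cartridge."""
    slot = CARTRIDGE_FOR_CATEGORY.get(scentification.category)
    if slot is None:
        return  # no scent assigned to this category; deliver the notification as usual
    # `send_command` stands in for the transport to the pendant (e.g. a Bluetooth serial link).
    send_command(f"EMIT {slot} {scentification.duration_ms}")

emit(Scentification("New message from Alice", "message"), send_command=print)
# prints: EMIT 0 500
```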
DOI: 10.1145/3267242.3267249
File: https://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.inst.100/institut/mitarbeiterbereiche/dobbelstein/movelet.pdf
DOI: 10.1145/3267242.3267248
File: https://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.inst.100/institut/mitarbeiterbereiche/dobbelstein/snapband.pdf
DOI: 10.1145/3090055
Weblink: https://youtu.be/Ep0GUToErJg
File: https://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.inst.100/institut/mitarbeiterbereiche/dobbelstein/pocketThumb.pdf
DOI: 10.1145/3027063.3053202
File: https://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.inst.100/institut/Papers/Prof_Weber/2017/Walch2017Evaluating.pdf
DOI: 10.1145/3123024.3123185
File: https://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.inst.100/institut/Papers/Prof_Rukzio/2016/david_merged.pdf
DOI: 10.1145/3123024.3123185
File: https://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.inst.100/institut/mitarbeiterbereiche/dobbelstein/pocketthumb_demo.pdf
DOI: 10.1145/2984511.2984576
Weblink: https://youtu.be/MHbN9lseHYE
File: https://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.inst.100/institut/Papers/Prof_Rukzio/2016/p49c.pdf
DOI: 10.1145/2851581.2890242
Weblink: https://youtu.be/tvAjOvXB56c
File: https://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.inst.100/institut/Papers/Prof_Rukzio/2016/FaceTouchGugenheimer.pdf
DOI: 10.1145/2851581.2892292
File: https://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.inst.100/institut/Papers/Prof_Rukzio/2016/UnconstrainedDobbelstein.pdf
DOI: 10.1145/2851581.2859016
File: http://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.inst.100/institut/mitarbeiter/dobbelstein/Unobtrusive_Interaction_for_Wearable_Computing.pdf
DOI: 10.1145/2642918.2647361
File: https://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.inst.100/institut/mitarbeiter/dobbelstein/Loupe_A_Handheld_Near-Eye_Display.pdf
DOI: 10.1145/2556288.2557365
Weblink: http://www.youtube.com/watch?v=ahG06CERqAI
File: /fileadmin/website_uni_ulm/iui.inst.100/1-hci/hci-paper/2014/winkler2168-AMP-D.pdf
DOI: 10.1145/2556288.2557075
Weblink: http://www.youtube.com/watch?v=DKofzCI7Yfw
File: /fileadmin/website_uni_ulm/iui.inst.100/institut/Papers/Prof_Rukzio/2014/winkler672-surfacephone.pdf