MULTIMODAL INTERACTION ON MOBILE DEVICES

Teacher: Denis Lalanne

The Topic

In mobile tasks, it is particularly interesting to use multiple input and output modalities to adapt to the context of use (e.g. car, home, work), the type of task (e.g. information search, entertainment), or user preferences. Multimodal user interfaces give users the possibility to interact with machines using multiple modalities such as speech, gestures, multi-touch interaction, eye gaze, etc. These modalities can be used in a complementary or redundant way. As such, multimodal interfaces have proven to enhance human-machine interaction, mainly because they can adapt to users' preferences or particularities, as well as to the context of use.

Description

The goal of this seminar is to give an overview of the domain of "Multimodal Interaction on Mobile Devices". Another objective is to develop skills in reading, writing, and reviewing academic papers. In this context, students will be asked to write and present a state of the art of a sub-domain of the "Multimodal Interaction on Mobile Devices" research field. Sub-topics such as fusion, fission, existing toolkits, visualization on small displays, etc. will be addressed.

Each student will be asked to choose a theme within the domain, select state-of-the-art references relevant to the chosen theme, synthesize these references, present them orally during one of the final seminar sessions, and summarize them in a written report of 4 pages, authored in LaTeX following the ACM strict format.
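For orientation, a minimal skeleton of such a report might look like the sketch below. This is only an illustration: the document class, title, author, and section names are placeholders, and the exact ACM class/template required by the course should be used instead of the generic article class.

  % Minimal, self-contained report skeleton (illustrative only).
  % Replace \documentclass{article} with the ACM class required by the course.
  \documentclass{article}

  \title{Multimodal Fusion on Mobile Devices: A State of the Art} % placeholder title
  \author{Firstname Lastname \\ University of Fribourg}           % placeholder author

  \begin{document}
  \maketitle

  \begin{abstract}
  One-paragraph summary of the surveyed sub-topic and the main findings.
  \end{abstract}

  \section{Introduction}
  Motivation and scope of the chosen theme.

  \section{State of the Art}
  Synthesis of the selected references, e.g.~\cite{example}.

  \section{Conclusion}
  Open problems and current trends.

  \begin{thebibliography}{9}
  \bibitem{example} Placeholder entry for one of the selected references.
  \end{thebibliography}

  \end{document}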

Learning Outcomes

At the end of the seminar, students will know how to conduct bibliographic research, how to judge the quality of a scientific paper, and how to write a scientific article. Further, they will know what "Multimodal Interaction on Mobile Devices" is, be familiar with its current techniques and trends, and will have deepened their knowledge of a particular sub-topic.

Participants

If you wish to participate in this seminar, please contact the organizers. A maximum of 6 students will be accepted.

Semester: Spring 2012

Language: French, English

Time: 5 sessions, Friday mornings in meeting room B 420 (Perolles II).

Themes

  1. Multimodal interaction on mobiles and applications [PDF report] - Arnaud Gaspoz
  2. Toolkits for multimodal interaction on mobile devices [PDF report] - Jan Kuheni
  3. Context recognition on mobiles [PDF report] - Fatih Tuemen
  4. Context-adaptive multimodal interfaces [PDF report] - Herve Sierro
  5. Interactive Visualization on small displays [PDF report] - Siavash Bigdeli
  6. Gestural interaction on mobiles [PDF report] - Dani Rotzetter
  7. Spoken interaction on mobiles [PDF report] - Caroline Voeffray
  8. Multimodal fusion on mobiles [PDF report] - Frederic Aebi
  9. Wearable multimodal interfaces [PDF report] - Stefan Egli
  10. Mixed reality on mobiles [PDF report] - Adel Rizaev