Sponsorship Information


For the first time in 2009, the International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction will be held jointly.  We invite you to become a 2009 corporate sponsor of this important academic conference.  ICMI-MLMI is a truly international conference, with previous venues in Canada, China, the Czech Republic, Greece, Italy, Japan, the Netherlands, the United Kingdom, and the United States.  Next year we will proudly host this important meeting in Boston, hopefully with your help!

Multimodal interface research is a challenging and important field that brings together top-quality work on computer vision, speech recognition, gesture recognition, touch interfaces, biometrics, learning, and cognitive modeling.  A particular attraction of the field is that our work combines these disparate tools into working systems.  Our attendees are not only world-class theoreticians; they also have solid infrastructure and architecture skills.  Corporate sponsorship will make your company a visible member of this talented, interdisciplinary community.

Multimodal interface research is also about the marriage of novel interface, sensing, and actuation hardware with cutting-edge software.  Our attendees are always interested in new tools and better solutions.  Corporate sponsorship is an important opportunity to connect with this broad community of trendsetters in a single location.

In 2009 we expect about 150 attendees from all over the world.  The 2008 ICMI program included authors and speakers from the USA, Greece, the UK, France, Finland, Switzerland, the Netherlands, Germany, Japan, Italy, Sweden, Canada, New Zealand, and Egypt.  Attendees represent an even wider range of locales.  Corporate sponsorship is a great way to gain global visibility for new products.

Our sponsorship options are explained on the following pages, but do not hesitate to contact the Sponsorship Chair at sponsorship-icmi2009@acm.org if you have any questions.

Please consider participating in ICMI-MLMI 2009 as a corporate sponsor, and help us make this a great meeting!

ICMI-MLMI Sponsorship Chair

Hervé Bourlard, Idiap Research Institute


ICMI-MLMI 2009 General Co-Chairs

James Crowley, INRIA

Yuri Ivanov, MERL

Christopher Wren, Google

Sponsorship Ladder for

The International Conference on Multimodal Interfaces and the
Workshop on Machine Learning for Multimodal Interaction

Gift Level: Choice of Benefits

$10,000  Fully Multimodal Sponsor
    Exhibit, Host, and Promote,
    plus 3 free registrations,
    plus prominent recognition in print and electronic media,
    plus special recognition during the opening and closing ceremonies.

$5,000  Bimodal Sponsor
    Choice of two benefits: Exhibit and Host, Exhibit and Promote, or Host and Promote,
    plus 2 free registrations,
    plus prominent recognition in print and electronic media.

$2,500  Unimodal Sponsor
    Exhibit,
    plus 1 free registration,
    plus recognition in print and electronic media.

Up to $2,500  Friend of ICMI-MLMI
    Recognition in print media.

The Benefits of Sponsorship

Exhibit: 10 square meters of exhibit space in the conference lobby, plus one free registration.
Promote: Insert promotional items into the attendee tote bag, or provide name-tag lanyards.
Host: Naming rights for a social gathering or shuttle service at the conference, or for the Outstanding Paper or Outstanding Student Paper Award.

Sponsorship Notes
Outstanding Paper Prizes
Naming rights are available for the Outstanding Paper and Outstanding Student Paper prizes.  Prize sponsorship confers a naming right only: the recipients will be selected by an unbiased committee appointed by the conference organizers.  Outstanding Paper sponsors will be recognized during the prize ceremony at the conference.

Unimodal, Bimodal, and Multimodal Sponsorship Grades
The benefits conferred at each grade are listed in the sponsorship ladder above.

Naming Opportunities
Social events that may be sponsored, available on a first-come, first-served basis, include the Conference Banquet, Welcome Reception, Cocktail Reception, Committee Dinner, and Volunteer Dinner.

Bank Information
All amounts listed above are in US dollars.  All checks should be made out to "ACM/ICMI-MLMI 2009".  Please send donations to:

Janet McAndless
ICMI-MLMI 2009 Treasurer
Mitsubishi Electric Research Labs
201 Broadway, 8th Floor
Cambridge, MA 02139

Since this is the first year that ICMI and MLMI will be held jointly, we expect about 150 attendees.

Please do not hesitate to contact Hervé Bourlard, the ICMI-MLMI sponsorship chair (sponsorship-icmi2009@acm.org) if you have any questions about sponsorship.

Recent Outstanding Paper Awards

ICMI 2008, Chania, Crete

Outstanding Paper:
Context-based Recognition during Human Interactions: Automatic Feature Selection and Encoding Dictionary
    Louis-Philippe Morency, Iwan de Kok, Jonathan Gratch

Outstanding Student Papers:
Crossmodal Congruence: The Look, Feel and Sound of Touchscreen Widgets
    Eve Hoggan, Topi Kaaresoja, Pauli Laitinen, Stephen Brewster
A Wizard of Oz Study for an AR Multimodal Interface
    Minkyung Lee, Mark Billinghurst

ICMI 2007, Nagoya, Japan

Outstanding Paper:
Automatic Inference of Cross-modal Nonverbal Interactions in Multiparty Conversations

    Kazuhiro Otsuka, Hiroshi Sawada, Junji Yamato

Outstanding Student Paper:

Designing Audio and Tactile Crossmodal Icons for Mobile Devices

    Eve Hoggan, Stephen Brewster

ICMI 2006, Banff, Canada

Co-Adaptation of Audio-Visual Speech and Gesture Classifiers

    Christos Christoudias, Kate Saenko, Louis-Philippe Morency, and Trevor Darrell

Using Redundant Speech and Handwriting for Learning New Vocabulary and Understanding Abbreviations

    Edward Kaiser

ICMI 2005, Trento, Italy

Probabilistic Grounding of Situated Speech using Plan Recognition and Reference Resolution

    Peter Gorniak and Deb Roy

Contextual Recognition of Head Gestures

    L.P. Morency, C. Lee, C. Sidner, T. Darrell

Perceiving Ordinal Data Haptically Under Workload

    A. Tang, P. McLachlan, K. Lowe, C. R. Saka, K. MacLean