Publication statistics

Pub. period: 1993-2012
Pub. count: 62
Number of co-authors: 78



Co-authors

Number of publications with the 3 most frequent co-authors:

Marcos Serrano: 9
Joëlle Coutaz: 7
Emmanuel Dubois: 6

 

 

Productive colleagues

Laurence Nigay's 3 most productive colleagues, by number of publications:

Ann Blandford: 85
Philippe A. Palanque: 66
Jean Scholtz: 54
 
 
 

Laurence Nigay

Has also published under the name of:
"L. Nigay"

Personal Homepage:
http://iihm.imag.fr/nigay/


Publications by Laurence Nigay (bibliography)

2012
 

Avouac, Pierre-Alain, Lalanda, Philippe and Nigay, Laurence (2012): Autonomic management of multimodal interaction: DynaMo in action. In: ACM SIGCHI 2012 Symposium on Engineering Interactive Computing Systems 2012. pp. 35-44.

Multimodal interaction can play a dual key role in pervasive environments: it provides naturalness for interacting with distributed, dynamic and heterogeneous digitally controlled equipment, and flexibility for letting users select interaction modalities depending on the context. The DynaMo (Dynamic multiModality) framework is dedicated to the development and runtime management of multimodal interaction in pervasive environments. This paper focuses on the autonomic approach of DynaMo, whose originality lies in its partial interaction models. The autonomic manager combines and completes the partial models available at runtime in order to build multimodal interaction adapted to the current execution conditions and in conformance with the predicted models. We illustrate the autonomic solution by considering several running examples and different partial interaction models.

© All rights reserved Avouac et al. and/or ACM Press

2011
 

Avouac, Pierre-Alain, Lalanda, Philippe and Nigay, Laurence (2011): Service-oriented autonomic multimodal interaction in a pervasive environment. In: Proceedings of the 2011 International Conference on Multimodal Interfaces 2011. pp. 369-376.

Heterogeneity and dynamicity of pervasive environments require the construction of flexible multimodal interfaces at run time. In this paper, we present how we use an autonomic approach to build and maintain adaptable input multimodal interfaces in smart building environments. We have developed an autonomic solution relying on partial interaction models specified by interaction designers and developers. The role of the autonomic manager is to build complete interaction techniques based on runtime conditions and in conformity with the predicted models. The sole purpose here is to combine and complete partial models in order to obtain an appropriate multimodal interface. We illustrate our autonomic solution by considering a running example based on an existing application and several input devices.

© All rights reserved Avouac et al. and/or ACM Press

 

Francone, Jérémie and Nigay, Laurence (2011): Using the user's point of view for interaction on mobile devices. In: Proceedings of the 2011 Conference of the Association Francophone d'Interaction Homme-Machine 2011. p. 4.

We study interaction modalities for mobile devices (smartphones and tablets) that rely on camera-based head tracking. This technique defines new possibilities for input and output interaction. For output, by computing the position of the device relative to the user's head, it is for example possible to realistically control the viewpoint on a 3D scene (Head-Coupled Perspective, HCP). This technique improves the output interaction bandwidth by enhancing the depth perception and by allowing the visualization of large workspaces (virtual window). For input, head movement can be used as a means of interacting with a mobile device. Moreover, such an input modality does not require any additional sensor except the built-in front-facing camera. In this paper, we classify the interaction possibilities offered by head tracking on smartphones and tablets. We then focus on the output interaction by introducing several applications of HCP on both smartphones and tablets and by presenting the results of a qualitative user experiment.

© All rights reserved Francone and Nigay and/or ACM Press
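The head-coupled perspective technique summarized in this abstract maps the tracked head position to the viewpoint on a 3D scene. As a rough illustration of the idea only (the names, units and simple linear mapping below are our assumptions, not the paper's implementation), consider this minimal Python sketch:

```python
# Minimal sketch of Head-Coupled Perspective (HCP): translate the virtual
# camera opposite to the head position reported by a front-camera tracker,
# so the screen behaves like a window onto the 3D scene.
# All names, units and the linear mapping are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class HeadPose:
    x: float  # horizontal offset from the screen centre, in cm
    y: float  # vertical offset, in cm
    z: float  # distance from the screen, in cm

def hcp_camera_offset(head: HeadPose, scene_depth_cm: float = 30.0) -> tuple[float, float]:
    """Parallax grows as the head moves off-axis or closer to the display."""
    scale = scene_depth_cm / max(head.z, 1.0)  # guard against division by zero
    return (-head.x * scale, -head.y * scale)

# Head 5 cm to the right of centre, 40 cm from the screen:
print(hcp_camera_offset(HeadPose(x=5.0, y=0.0, z=40.0)))  # (-3.75, -0.0)
```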

2010
 

Francone, Jérémie, Bailly, Gilles, Lecolinet, Eric, Mandran, Nadine and Nigay, Laurence (2010): Wavelet menus on handheld devices: stacking metaphor for novice mode and eyes-free selection for expert mode. In: Proceedings of the 2010 International Conference on Advanced Visual Interfaces 2010. pp. 173-180.

This paper presents the design and evaluation of the Wavelet menu and its implementation on the iPhone. The Wavelet menu consists of a concentric hierarchical Marking menu using simple gestures. The novice mode, i.e. when the menu is displayed, is well adapted to the limited screen space of handheld devices because the representation of the menu hierarchy is inverted, with the deepest submenu always displayed at the center of the screen. The visual design is based on a stacking metaphor to reinforce the perception of the hierarchy and to help users quickly understand how the technique works. The menu also supports submenu previsualization, a key property for navigating efficiently in a hierarchy of commands. The quantitative evaluation shows that the Wavelet menu provides an intuitive way of supporting efficient gesture-based navigation. The expert mode, i.e. gesturing without waiting for the menu to pop up, is another key property of the Wavelet menu: by providing stroke shortcuts, the Wavelet menu favors the selection of frequent commands in expert mode and makes eyes-free selection possible. A user experiment shows that participants are able to select commands, eyes-free, while walking.

© All rights reserved Francone et al. and/or their publisher

2009
 

Dubois, Emmanuel, Gray, Philip D. and Nigay, Laurence (2009): The Engineering of Mixed Reality Systems. Springer

 

Francone, Jérémie, Bailly, Gilles, Nigay, Laurence and Lecolinet, Eric (2009): Wavelet menus: a stacking metaphor for adapting marking menus to mobile devices. In: Proceedings of 11th Conference on Human-computer interaction with mobile devices and services 2009. p. 49.

Exploration and navigation in multimedia data hierarchies (e.g., photos, music) are frequent tasks on mobile devices. However, visualization and interaction are impoverished due to the limited size of the screen and the lack of precise input devices. As a result, menus on mobile devices do not provide efficient navigation compared to the many innovative menu techniques proposed for desktop platforms. In this paper, we present Wavelet, the adaptation of the Wave menu for navigating multimedia data on the iPhone. Its layout, based on an inverted representation of the hierarchy, is particularly well adapted to mobile devices. Indeed, it guarantees that submenus are always displayed on the screen and it supports efficient navigation by providing previsualization of the submenus.

© All rights reserved Francone et al. and/or their publisher

 

Lalanne, Denis, Nigay, Laurence, Palanque, Philippe A., Robinson, Peter, Vanderdonckt, Jean and Ladry, Jean-François (2009): Fusion engines for multimodal input: a survey. In: Proceedings of the 2009 International Conference on Multimodal Interfaces 2009. pp. 153-160.

Fusion engines are fundamental components of multimodal interactive systems, to interpret input streams whose meaning can vary according to the context, task, user and time. Other surveys have considered multimodal interactive systems; we focus more closely on the design, specification, construction and evaluation of fusion engines. We first introduce some terminology and set out the major challenges that fusion engines propose to solve. A history of past work in the field of fusion engines is then presented using the BRETAM model. These approaches to fusion are then classified. The classification considers the types of application, the fusion principles and the temporal aspects. Finally, the challenges for future work in the field of fusion engines are set out. These include software frameworks, quantitative evaluation, machine learning and adaptation.

© All rights reserved Lalanne et al. and/or their publisher

 

Serrano, Marcos and Nigay, Laurence (2009): Temporal aspects of CARE-based multimodal fusion: from a fusion mechanism to composition components and WoZ components. In: Proceedings of the 2009 International Conference on Multimodal Interfaces 2009. pp. 177-184.

The CARE properties (Complementarity, Assignment, Redundancy and Equivalence) define various forms that multimodal input interaction can take. While Equivalence and Assignment express the availability and respective absence of choice between multiple input modalities for performing a given task, Complementarity and Redundancy describe relationships between modalities and require fusion mechanisms. In this paper we present a summary of the work we have carried out using the CARE properties for conceiving and implementing multimodal interaction, as well as a new approach using WoZ components. We present different technical solutions for implementing the Complementarity and Redundancy of modalities with a focus on the temporal aspects of the fusion. Starting from a monolithic fusion mechanism, we then explain our component-based approach and the composition components (i.e., Redundancy and Complementarity components). As a new contribution for exploring design solutions before implementing an adequate fusion mechanism, as well as for tuning the temporal aspects of the performed fusion, we introduce Wizard of Oz (WoZ) fusion components. We illustrate the composition components, as well as the implemented tools exploiting them, using several multimodal systems including a multimodal slide viewer and a multimodal map navigator.

© All rights reserved Serrano and Nigay and/or their publisher
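To make the temporal aspect of CARE-based fusion concrete, the following Python fragment is a minimal sketch, assuming an invented event format and API (it is not the paper's component code), of a Redundancy composition component that fires a command once when two modalities convey it within a temporal window:

```python
# Illustrative sketch (not the paper's code) of a CARE Redundancy
# composition component: two modalities conveying the same command within
# a temporal window are fused into a single command occurrence.

class RedundancyComponent:
    def __init__(self, window_s: float = 0.5):
        self.window_s = window_s
        self.pending = {}  # modality -> (command, timestamp)

    def on_event(self, modality: str, command: str, t: float):
        # Fuse with a recent event from the *other* modality carrying
        # the same command; otherwise remember this event and wait.
        for mod, (cmd, t0) in list(self.pending.items()):
            if mod != modality and cmd == command and t - t0 <= self.window_s:
                del self.pending[mod]
                return command  # redundant confirmation: fire once
        self.pending[modality] = (command, t)
        return None

fusion = RedundancyComponent(window_s=0.5)
assert fusion.on_event("speech", "next_slide", t=0.00) is None
assert fusion.on_event("gesture", "next_slide", t=0.30) == "next_slide"
assert fusion.on_event("gesture", "next_slide", t=2.00) is None  # window expired
```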

2008
 

Serrano, Marcos, Nigay, Laurence, Lawson, Jean-Yves L., Ramsay, Andrew, Murray-Smith, Roderick and Denef, Sebastian (2008): The openinterface framework: a tool for multimodal interaction. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 3501-3506.

The area of multimodal interaction has expanded rapidly. However, the implementation of multimodal systems still remains a difficult task. Addressing this problem, we describe the OpenInterface (OI) framework, a component-based tool for rapidly developing multimodal input interfaces. The underlying OI conceptual component model includes both generic and tailored components. In addition, to enable the rapid exploration of the multimodal design space for a given system, we need to capitalize on past experiences and include a large set of multimodal interaction techniques, their specifications and documentation. In this work-in-progress report, we present the current state of the OI framework and the two exploratory test-beds developed using the OpenInterface Interaction Development Environment.

© All rights reserved Serrano et al. and/or ACM Press

 

You, Yilun, Chin, Tat-Jun, Lim, Joo-Hwee, Chevallet, Jean-Pierre, Coutrix, Céline and Nigay, Laurence (2008): Deploying and evaluating a mixed reality mobile treasure hunt: Snap2Play. In: Hofte, G. Henri ter, Mulder, Ingrid and Ruyter, Boris E. R. de (eds.) Proceedings of the 10th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2008 September 2-5, 2008, Amsterdam, the Netherlands. pp. 335-338.

 

Serrano, Marcos, Juras, David and Nigay, Laurence (2008): A three-dimensional characterization space of software components for rapidly developing multimodal interfaces. In: Digalakis, Vassilios, Potamianos, Alexandros, Turk, Matthew, Pieraccini, Roberto and Ivanov, Yuri (eds.) Proceedings of the 10th International Conference on Multimodal Interfaces - ICMI 2008 October 20-22, 2008, Chania, Crete, Greece. pp. 149-156.

 

Juras, David, Nigay, Laurence, Ortega, Michaël and Serrano, Marcos (2008): Multimodal slideshow: demonstration of the openinterface interaction development environment. In: Digalakis, Vassilios, Potamianos, Alexandros, Turk, Matthew, Pieraccini, Roberto and Ivanov, Yuri (eds.) Proceedings of the 10th International Conference on Multimodal Interfaces - ICMI 2008 October 20-22, 2008, Chania, Crete, Greece. pp. 193-194.

 

Bailly, Gilles, Lecolinet, Eric and Nigay, Laurence (2008): Flower menus: a new type of marking menu with large menu breadth, within groups and efficient expert mode memorization. In: Levialdi, Stefano (ed.) AVI 2008 - Proceedings of the working conference on Advanced Visual Interfaces May 28-30, 2008, Napoli, Italy. pp. 15-22.

 

Coutrix, Céline and Nigay, Laurence (2008): Balancing physical and digital properties in mixed objects. In: Levialdi, Stefano (ed.) AVI 2008 - Proceedings of the working conference on Advanced Visual Interfaces May 28-30, 2008, Napoli, Italy. pp. 305-308.

 

Serrano, Marcos, Juras, David, Ortega, Michaël and Nigay, Laurence (2008): OIDE: un outil pour la conception et le développement d'interfaces multimodales. In: Proceedings of the 2008 French-speaking conference on Mobility and ubiquity computing 2008. pp. 91-92.

Multimodal interaction software development presents a particular challenge because of the ever-increasing number of novel interaction devices. In this paper, we present the OpenInterface Interaction Development Environment (OIDE) that addresses the design and development of multimodal interfaces. To illustrate our approach, we present a multimodal slideshow implemented with our tool.

© All rights reserved Serrano et al. and/or ACM Press

 

Jourde, Frédéric, Laurillau, Yann, Morán, Alberto L. and Nigay, Laurence (2008): Towards Specifying Multimodal Collaborative User Interfaces: A Comparison of Collaboration Notations. In: Graham, T. C. Nicholas and Palanque, Philippe A. (eds.) DSV-IS 2008 - Interactive Systems. Design, Specification, and Verification, 15th International Workshop July 16-18, 2008, Kingston, Canada. pp. 281-286.

 

Serrano, Marcos, Juras, David and Nigay, Laurence (2008): A three-dimensional characterization space of software components for rapidly developing multimodal interfaces. In: Proceedings of the 2008 International Conference on Multimodal Interfaces 2008. pp. 149-156.

In this paper we address the problem of the development of multimodal interfaces. We describe a three-dimensional characterization space for software components along with its implementation in a component-based platform for rapidly developing multimodal interfaces. By graphically assembling components, the designer/developer describes the transformation chain from physical devices to tasks and vice-versa. In this context, the key point is to identify generic components that can be reused for different multimodal applications. Nevertheless, for flexibility purposes, a mixed approach that enables the designer to use both generic components and tailored components is required. As a consequence, our characterization space includes one axis dedicated to the reusability aspect of a component. The two other axes of our characterization space respectively depict the role of the component in the data-flow from devices to tasks and the level of specification of the component. We illustrate our three-dimensional characterization space as well as the implemented tool based on it using a multimodal map navigator.

© All rights reserved Serrano et al. and/or their publisher

 

Juras, David, Nigay, Laurence, Ortega, Michaël and Serrano, Marcos (2008): Multimodal slideshow: demonstration of the openinterface interaction development environment. In: Proceedings of the 2008 International Conference on Multimodal Interfaces 2008. pp. 193-194.

In this paper, we illustrate the OpenInterface Interaction Development Environment (OIDE) that addresses the design and development of multimodal interfaces. Multimodal interaction software development presents a particular challenge because of the ever-increasing number of novel interaction devices and modalities that can be used for a given interactive application. To demonstrate our graphical OIDE and its underlying approach, we present a multimodal slideshow implemented with our tool.

© All rights reserved Juras et al. and/or their publisher

2007
 

Horchani, Meriam, Nigay, Laurence and Panaget, Franck (2007): A platform for output dialogic strategies in natural multimodal dialogue systems. In: Proceedings of the 2007 International Conference on Intelligent User Interfaces 2007. pp. 206-215.

The development of natural multimodal dialogue systems remains a very difficult task. The flexibility and naturalness they offer result in an increased complexity that current software tools do not address appropriately. One challenging issue we address here is the generation of cooperative responses in an appropriate multimodal form, highlighting the intertwined relation of content and presentation. We identify a key component, the dialogic strategy component, as a mediator between the natural dialogue management and the multimodal presentation. This component selects the semantic information content to be presented according to various presentation constraints. Constraints include inherent characteristics of modalities, the availability of a modality as well as preferences of the user. Thus the cooperative behaviour of the system could be adapted as could its multimodal behaviour. In this paper, we present the dialogic strategy component and an associated platform to quickly develop output multimodal cooperative responses in order to explore different dialogic strategies.

© All rights reserved Horchani et al. and/or ACM Press

 

Horchani, Meriam, Caron, Benjamin, Nigay, Laurence and Panaget, Franck (2007): Natural multimodal dialogue systems: a configurable dialogue and presentation strategies component. In: Massaro, Dominic W., Takeda, Kazuya, Roy, Deb and Potamianos, Alexandros (eds.) Proceedings of the 9th International Conference on Multimodal Interfaces - ICMI 2007 November 12-15, 2007, Nagoya, Aichi, Japan. pp. 291-298.

 

Bailly, Gilles, Lecolinet, Eric and Nigay, Laurence (2007): Wave Menus: Improving the Novice Mode of Hierarchical Marking Menus. In: Baranauskas, Maria Cecília Calani, Palanque, Philippe A., Abascal, Julio and Barbosa, Simone Diniz Junqueira (eds.) DEGAS 2007 - Proceedings of the 1st International Workshop on Design and Evaluation of e-Government Applications and Services September 11th, 2007, Rio de Janeiro, Brazil. pp. 475-488.

 

Horchani, Meriam, Caron, Benjamin, Nigay, Laurence and Panaget, Franck (2007): Natural multimodal dialogue systems: a configurable dialogue and presentation strategies component. In: Proceedings of the 2007 International Conference on Multimodal Interfaces 2007. pp. 291-298.

In the context of natural multimodal dialogue systems, we address the challenging issue of the definition of cooperative answers in an appropriate multimodal form. Highlighting the intertwined relation of multimodal outputs with the content, we focus on the Dialogic strategy component, a component that selects, from the set of possible contents answering a user's request, the content to be presented to the user and its multimodal presentation. The content selection and the presentation allocation managed by the Dialogic strategy component are based on various constraints such as the availability of a modality and the user's preferences. Considering three generic types of dialogue strategies and their corresponding handled types of information, as well as three generic types of presentation tasks, we present a first implementation of the Dialogic strategy component based on rules. By providing a graphical interface to configure the component by editing the rules, we show how the component can be easily modified by ergonomists at design time for exploring different solutions. In further work we envision letting the user modify the component at runtime.

© All rights reserved Horchani et al. and/or their publisher

2006
 

Serrano, Marcos, Nigay, Laurence, Demumieux, Rachel, Descos, Jerome and Losquin, Patrick (2006): Multimodal interaction on mobile phones: development and evaluation using ACICARE. In: Proceedings of 8th conference on Human-computer interaction with mobile devices and services 2006. pp. 129-136.

The development and the evaluation of multimodal interactive systems on mobile phones remains a difficult task. In this paper we address this problem by describing a component-based approach, called ACICARE, for developing and evaluating multimodal interfaces on mobile phones. ACICARE is dedicated to the overall iterative design process of mobile multimodal interfaces, which consists of cycles of designing, prototyping and evaluation. ACICARE is based on two complementary tools that are combined: ICARE and ACIDU. ICARE is a component-based platform for rapidly developing multimodal interfaces. We adapted the ICARE components to run on mobile phones and we connected them to ACIDU, a probe that gathers customers' usage data on mobile phones. By reusing and assembling components, ACICARE enables the rapid development of multimodal interfaces as well as the automatic capture of multimodal usage for in-field evaluations. We illustrate ACICARE using our contact manager system, a multimodal system running on the SPV c500 mobile phone.

© All rights reserved Serrano et al. and/or ACM Press

 

Mansoux, B., Nigay, Laurence and Troccaz, J. (2006): Output Multimodal Interaction: The Case of Augmented Surgery. In: Proceedings of the HCI06 Conference on People and Computers XX 2006. pp. 177-192.

 

Serrano, Marcos, Nigay, Laurence, Demumieux, Rachel, Descos, Jerome and Losquin, Patrick (2006): Multimodal interaction on mobile phones: development and evaluation using ACICARE. In: Nieminen, Marko and Röykkee, Mika (eds.) Proceedings of the 8th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2006 September 12-15, 2006, Helsinki, Finland. pp. 129-136.

 

Bailly, Gilles, Nigay, Laurence and Auber, David (2006): NAVRNA: visualization - exploration - editing of RNA. In: Celentano, Augusto (ed.) AVI 2006 - Proceedings of the working conference on Advanced visual interfaces May 23-26, 2006, Venezia, Italy. pp. 504-507.

 

Coutrix, Céline and Nigay, Laurence (2006): Mixed reality: a model of mixed interaction. In: Celentano, Augusto (ed.) AVI 2006 - Proceedings of the working conference on Advanced visual interfaces May 23-26, 2006, Venezia, Italy. pp. 43-50.

2005
 

Coutrix, Céline, Nigay, Laurence and Renevier, Philippe (2005): Modèle d'interaction mixte: la réalité mixte à la lumière des modalités d'interaction. In: Proceedings of the 2005 French-speaking conference on Mobility and ubiquity computing 2005. pp. 153-160.

Mixed Reality seeks to smoothly link the physical and data processing (digital) environments. In recent years, mixed reality has been the subject of growing interest. Although mixed reality systems are becoming more prevalent, we still do not have a clear understanding of this interaction paradigm. This article introduces a new interaction model called Mixed Interaction model. It adopts a unifying point of view on mixed reality by considering the interaction modalities that are involved for defining mixed environments. This article presents the model after explaining its foundations, applies it to describe and compare existing systems and shows its generative power for designing new forms of mixed interaction.

© All rights reserved Coutrix et al. and/or ACM Press

 

Bailly, Gilles, Nigay, Laurence and Auber, David (2005): 2M: un espace de conception pour l'interaction bi-manuelle. In: Proceedings of the 2005 French-speaking conference on Mobility and ubiquity computing 2005. pp. 177-184.

In the physical world, most human activities involve the use of two hands. As part of ubiquitous computing that aims at combining the physical and the digital worlds, understanding how to design two-handed interaction is therefore important. This article presents a new design space for two-handed interaction, called 2M. The 2M design space takes its foundations in two complementary research domains: multimodal interaction and two-handed interaction. On the one hand we enrich the multimodal design space to take into account the roles of the two hands. On the other hand, we extend the two-handed interaction design space to cover the temporal as well as fusion aspects of multimodal interaction. This article presents the resulting 2M design space and demonstrates its descriptive power. We finally study the generative capacity of 2M for designing new forms of two-handed interaction in the context of the NavGraphe project that focuses on the interactive manipulation of very large biological structures (e.g., RNA structures).

© All rights reserved Bailly et al. and/or ACM Press

 

Adiba, Michel, Alvarado, Patricia Serrano, Nigay, Laurence and Riveill, Michel (2005): Tutorial « informatique mobile et pervasive ». In: Proceedings of the 2005 French-speaking conference on Mobility and ubiquity computing 2005. p. 209.

Mobile Computing defines a vast field of multidisciplinary study. It is part of the broader movement of pervasive, ubiquitous and disappearing computing. The proliferation of handheld computers and personal digital assistants, the generalized use of smart cards and the multiplication of embedded systems in everyday objects (cars, televisions, etc.) all bear witness to this growth. The explosion of mobile telephony is another significant example of the tremendous expansion of this field.

© All rights reserved Adiba et al. and/or ACM Press

2004
 

Bouchet, Jullien, Nigay, Laurence and Ganille, Thierry (2004): ICARE software components for rapidly developing multimodal interfaces. In: Sharma, Rajeev, Darrell, Trevor, Harper, Mary P., Lazzari, Gianni and Turk, Matthew (eds.) Proceedings of the 6th International Conference on Multimodal Interfaces - ICMI 2004 October 13-15, 2004, State College, PA, USA. pp. 251-258.

 

Bouchet, Jullien, Nigay, Laurence and Balzagette, Didier (2004): ICARE: a component-based approach for multimodal interaction. In: Proceedings of the 2004 French-speaking conference on Mobility and ubiquity computing 2004. pp. 36-43.

Mobile and ubiquitous systems support multiple interaction techniques such as the synergistic use of active modalities (speech, gesture, etc.) and passive modalities (gaze, localization of a user, etc.). The flexibility they offer results in an increased complexity that current software development tools do not address appropriately. In this paper we describe a component-based approach, called ICARE, for specifying and developing interfaces combining active and passive modalities. Our approach relies on two types of components: (1) elementary components that describe pure modalities (active and passive) and (2) composition components (Complementarity, Redundancy and Equivalence) that enable the designer to specify combined usage of modalities. The designer graphically assembles the ICARE components and the code of the multimodal user interface is automatically generated. Although the ICARE platform is not fully developed, we illustrate the applicability of the approach with the implementation of three mobile systems: two GeoNote systems and a prototype of cockpit commands for the Rafale (a French military plane).

© All rights reserved Bouchet et al. and/or ACM Press

 

Renevier, Philippe and Nigay, Laurence (2004): A design notation for mobile collaborative mixed systems. In: Proceedings of the 2004 French-speaking conference on Mobility and ubiquity computing 2004. pp. 66-73.

Real needs of mobile users working in groups can be addressed by Mobile Collaborative Mixed Systems. But the design of such systems is a complex task due to the variety of design parameters. For example the characterization of the usage context is difficult because of the mobility of the users as well as the different types of collaboration amongst them. In this paper we present a characterization space and a notation as useful tools for the design of Mobile Collaborative Mixed Systems. Our two contributions are placed within a scenario based design process.

© All rights reserved Renevier and Nigay and/or ACM Press

 

Daudé, Sylvain and Nigay, Laurence (2004): Physical objects linked to audio sources: phicon+earcon=phearcon. In: Proceedings of the 2004 French-speaking conference on Mobility and ubiquity computing 2004. pp. 166-173.

This article introduces a new kind of interface, Phearcons: physical objects linked to audio sources that are spatialized according to the objects' positions and represent information sources. We aim at designing and developing a platform, also called Phearcons, for specifying and running such interfaces. This article provides design rules as well as a process for the specification and implementation of such a platform.

© All rights reserved Daudé and Nigay and/or ACM Press

 

Bouchet, Jullien, Nigay, Laurence and Ganille, Thierry (2004): ICARE software components for rapidly developing multimodal interfaces. In: Proceedings of the 2004 International Conference on Multimodal Interfaces 2004. pp. 251-258.

Although several real multimodal systems have been built, their development still remains a difficult task. In this paper we address this problem of the development of multimodal interfaces by describing a component-based approach, called ICARE, for rapidly developing multimodal interfaces. ICARE stands for Interaction-CARE (Complementarity Assignment Redundancy Equivalence). Our component-based approach relies on two types of software components. First, ICARE elementary components include Device components and Interaction Language components that enable us to develop pure modalities. The second type of components, called Composition components, defines combined usages of modalities. Reusing and assembling ICARE components enables rapid development of multimodal interfaces. We have developed several multimodal systems using ICARE and we illustrate the discussion using one of them: the FACET simulator of the Rafale French military plane cockpit.

© All rights reserved Bouchet et al. and/or their publisher
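As a rough sketch of the two ICARE component types named in this abstract, the following Python fragment wires a hypothetical pipeline of Device, Interaction Language and Composition components; the class names, the toy speech result and the merge-based Complementarity are illustrative assumptions, not ICARE's actual implementation:

```python
# Sketch of the ICARE component types named above: elementary components
# (Device + Interaction Language) and a Composition component, assembled
# into a tiny pipeline. All names and interfaces are illustrative only.

class DeviceComponent:
    """Elementary component: wraps a physical device, emits raw events."""
    def read(self) -> str: ...

class SpeechDevice(DeviceComponent):
    def read(self) -> str:
        return "go to Grenoble"  # stand-in for a speech recogniser result

class InteractionLanguage:
    """Elementary component: turns raw device events into task-level commands."""
    def interpret(self, raw: str) -> dict:
        verb, _, arg = raw.partition(" to ")
        return {"task": verb.strip(), "arg": arg.strip()}

class ComplementarityComponent:
    """Composition component: merge partial commands from two modalities."""
    def fuse(self, a: dict, b: dict) -> dict:
        return {**a, **b}

speech_cmd = InteractionLanguage().interpret(SpeechDevice().read())
pointing_cmd = {"coords": (45.19, 5.72)}  # e.g. from a touch modality
print(ComplementarityComponent().fuse(speech_cmd, pointing_cmd))
```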

2003
 

Dubois, Emmanuel, Gray, Philip D. and Nigay, Laurence (2003): ASUR++: Supporting the design of mobile mixed systems. In Interacting with Computers, 15 (4) pp. 497-520.

In this paper we present ASUR++, a notation for describing, and reasoning about the design of, mobile interactive computer systems that combine physical and digital objects and information: mobile mixed systems. ASUR++ helps a designer to specify the key characteristics of such systems and to focus on the relationship between physical objects and actions and digital information exchanges. Following a brief introduction to the notation, we illustrate its potential usefulness via examples based on the design of an augmented museum gallery. We conclude with a consideration of the integration of ASUR++ into the system development process and its augmentation via associated methods and tools.

© All rights reserved Dubois et al. and/or Elsevier Science

 

Nigay, Laurence, Dubois, E., Renevier, P., Pasqualetti, L. and Troccaz, J. (2003): Mixed Systems: Combining Physical and Digital Worlds. In: Stephanidis, Constantine (ed.) Proceedings of the Tenth International Conference on Human-Computer Interaction June 22-27, 2003, Crete, Greece. pp. 1203-1207.

 

Laurillau, Y. and Nigay, Laurence (2003): CoVitesse: A Groupware Interface for Collaborative Navigation on the WWW. In: Stephanidis, Constantine (ed.) Proceedings of the Tenth International Conference on Human-Computer Interaction June 22-27, 2003, Crete, Greece. pp. 954-958.

2002
 

Laurillau, Yann and Nigay, Laurence (2002): Clover architecture for groupware. In: Churchill, Elizabeth F., McCarthy, Joe, Neuwirth, Christine and Rodden, Tom (eds.) Proceedings of the 2002 ACM conference on Computer supported cooperative work November 16 - 20, 2002, New Orleans, Louisiana, USA. pp. 236-245.

In this paper we present the Clover architectural model, a new conceptual architectural model for groupware. Our model results from the combination of the layer approach of Dewan's generic architecture with the functional decomposition of the Clover design model. The Clover design model defines three classes of services that a groupware application may support, namely, production, communication and coordination services. The three classes of services can be found in each functional layer of our model. Our model is illustrated with a working system, the CoVitesse system, its software being organized according to our Clover architectural model.

© All rights reserved Laurillau and Nigay and/or ACM Press

 

Laurillau, Yann and Nigay, Laurence (2002): Le modèle d'architecture Clover pour les collecticiels. In: Proceedings of the 2002 Conference of the Association Francophone d'Interaction Homme-Machine 2002. pp. 113-120.

In this paper we present the Clover architectural model, a new conceptual architectural model for groupware. Our model results from the combination of the layer approach of Dewan's generic architecture with the functional decomposition of the Clover design model. The Clover design model defines three classes of services that a groupware may support, namely, production, communication and coordination services. The three classes of services can be found in each functional layer of our model. Our model is illustrated with the CoVitesse system, its software being organized according to our Clover architectural model.

© All rights reserved Laurillau and Nigay and/or ACM Press

 

Nigay, Laurence and Gray, Philip D. (2002): Architecture logicielle conceptuelle pour la capture de contexte. In: Proceedings of the 2002 Conference of the Association Francophone d'Interaction Homme-Machine 2002. pp. 211-214.

This paper describes a PAC-Amodeus software architectural solution for context-aware systems. The software architectural solution is based on an architecture dedicated to multimodal interaction. Indeed we argue in this paper that several software design issues are common between multimodal systems and context-aware systems. Our architectural solution is intentionally generic, intended to serve as the basis for a wide range of possible systems and compatible with the existing architectural approaches.

© All rights reserved Nigay and Gray and/or ACM Press

 

Daudé, Sylvain and Nigay, Laurence (2002): Processus et paramètres de conception de la sonification. In: Proceedings of the 2002 Conference of the Association Francophone d'Interaction Homme-Machine 2002. pp. 255-258.

In this paper we present a unified framework for the design of audible interfaces. We describe the steps of the sonification process and their parameters. The process is modeled as a sequence of transformation functions from the data to present to the produced sounds. The usefulness of the framework for classifying existing audible interfaces and for designing new ones is then discussed.

© All rights reserved Daudé and Nigay and/or ACM Press

 

Dubois, E., Nigay, Laurence and Troccaz, J. (2002): Assessing continuity and compatibility in augmented reality systems. In Universal Access in the Information Society, 1 (4) pp. 263-273.

Integrating computer-based information into the real world of the user is becoming a crucial challenge for the designers of interactive systems. The Augmented Reality (AR) paradigm illustrates this trend. Information is provided by an AR system to facilitate or to enrich the natural way in which the user interacts with the real environment. We focus on the output of such systems and, in particular, on the smooth integration of additional information in the real environment of the user. We characterize the integration of the computer-provided entities with the real ones using two new properties: compatibility and continuity. After defining the two properties, we provide factors and an analytical method needed for assessing them. We also empirically study the two properties to highlight their impact on interaction. The CASPER system, developed in our teams, is used to illustrate the discussion.

© All rights reserved Dubois et al. and/or Springer Verlag

 

Dubois, Emmanuel, Gray, Philip D. and Nigay, Laurence (2002): ASUR++: A Design Notation for Mobile Mixed Systems. In: Paterno, Fabio (ed.) Mobile Human-Computer Interaction - 4th International Symposium - Mobile HCI 2002 September 18-20, 2002, Pisa, Italy. pp. 123-139.

 

Nigay, Laurence, Salembier, P., Marchand, T., Renevier, Philippe and Pasqualetti, Laurence (2002): Mobile and Collaborative Augmented Reality: A Scenario Based Design Approach. In: Paterno, Fabio (ed.) Mobile Human-Computer Interaction - 4th International Symposium - Mobile HCI 2002 September 18-20, 2002, Pisa, Italy. pp. 241-255.

2001
 

Little, Murray Reed and Nigay, Laurence (eds.) EHCI 2001 - Engineering for Human-Computer Interaction, 8th IFIP International Conference May 11-13, 2001, Toronto, Canada.

 

Renevier, Philippe and Nigay, Laurence (2001): Mobile Collaborative Augmented Reality: The Augmented Stroll. In: Little, Murray Reed and Nigay, Laurence (eds.) EHCI 2001 - Engineering for Human-Computer Interaction, 8th IFIP International Conference May 11-13, 2001, Toronto, Canada. pp. 299-316.

 

Dubois, Emmanuel, Nigay, Laurence and Troccaz, Jocelyne (2001): Consistency in Augmented Reality Systems. In: Little, Murray Reed and Nigay, Laurence (eds.) EHCI 2001 - Engineering for Human-Computer Interaction, 8th IFIP International Conference May 11-13, 2001, Toronto, Canada. pp. 111-122.

2000
 

Graham, T. C. Nicholas, Watts, Leon A., Calvary, Gaelle, Coutaz, Joëlle, Dubois, Emmanuel and Nigay, Laurence (2000): A Dimension Space for the Design of Interactive Systems Within their Physical Environments. In: Proceedings of DIS00: Designing Interactive Systems: Processes, Practices, Methods, & Techniques 2000. pp. 406-416.

This paper introduces a Dimension Space describing the entities making up richly interactive systems. The Dimension Space is intended to help designers understand both the physical and virtual entities from which their systems are built, and the tradeoffs involved in both the design of the entities themselves and of the combination of these entities in a physical space. Entities are described from the point of view of a person carrying out a task at a particular time, in terms of their attention received, role, manifestation, input and output capacity and informational density. The Dimension Space is applied to two new systems developed at Grenoble, exposing design tradeoffs and design rules for richly interactive systems.

© All rights reserved Graham et al. and/or ACM Press

 

Vernier, Frederic and Nigay, Laurence (2000): A Framework for the Combination and Characterization of Output Modalities. In: DSV-IS 2000 2000. pp. 35-50.

 

Dubois, Emmanuel and Nigay, Laurence (2000): Augmented reality: which augmentation for which reality?. In: Designing Augmented Reality Environments 2000 2000. pp. 165-166.

1999
 

Spence, R., Chatty, Stephane, Christensen, Henrik Bærbak, Fishkin, Kenneth P., Johnston, Lorraine, Koning, Nicole de, Lu, Shijian, Nigay, Laurence, Orosco, Ricardo and Scholtz, Jean (1999): The Visualisation of Web Usage. In: Chatty, Stephane and Dewan, Prasun (eds.) Engineering for Human-Computer Interaction, IFIP TC2/TC13 WG2.7/WG13.4 Seventh Working Conference on Engineering for Human-Computer Interaction September 14-18, 1999, Heraklion, Crete, Greece. pp. 351-361.

1998
 

Johnson, Hilary, Nigay, Laurence and Roast, C. R. (eds.) Proceedings of the Thirteenth Conference of the British Computer Society Human Computer Interaction Specialist Group - People and Computers XIII August 1-4, 1998, Sheffield, UK.

 

Nigay, Laurence and Vernier, Frederic (1998): Design method of interaction techniques for large information spaces. In: Catarci, Tiziana, Costabile, Maria Francesca, Santucci, Giuseppe and Tarantino, Laura (eds.) AVI 1998 - Proceedings of the working conference on Advanced Visual Interfaces May 24 - 27, 1998, LAquila, Italy. pp. 37-46.

1997
 

Calvary, Gaelle, Coutaz, Joëlle and Nigay, Laurence (1997): From Single-User Architectural Design to PAC*: a Generic Software Architecture Model for CSCW. In: Pemberton, Steven (ed.) Proceedings of the ACM CHI 97 Human Factors in Computing Systems Conference March 22-27, 1997, Atlanta, Georgia. pp. 242-249.

This article reports our reflection on software architecture modelling for multi-user systems (or groupware). First, we introduce the notion of software architecture and make explicit the design steps that most software designers in HCI tend to blend in a fuzzy way. Building on general concepts and practice from mainstream software engineering, we then present a comparative analysis of the most significant architecture models developed for single- and multi-user systems. We close with the presentation of PAC*, a new architectural framework for modelling and designing the software architecture of multi-user systems. PAC* is a motivated combination of existing architectural models selected for the complementarity of their "good properties". These include operational heuristics such as rules for deriving agents in accordance with the task model or criteria for reasoning about replication, as well as properties such as support for style heterogeneity, portability, and reusability.

© All rights reserved Calvary et al. and/or ACM Press

1996
 

Bellotti, Victoria, Blandford, Ann, Duke, David, MacLean, Allan, May, Jon and Nigay, Laurence (1996): Interpersonal Access Control in Computer-Mediated Communications: A Systematic Analysis of the Design Space. In Human-Computer Interaction, 11 (4) pp. 357-432.

Certain design projects raise difficult user-interface problems that are not easily amenable to designers' intuition or rapid prototyping due to their novelty, conceptual complexity, and the difficulty of conducting appropriate user studies. Interpersonal access control in computer-mediated communication (CMC) systems is just such a problem. We describe a collection of systematic theory-based analyses of a system prototype that inherited its control mechanism from two preexisting systems. We demonstrate that the collective use of system and user modeling techniques provides insight into this complex design problem and enables us to examine the implications of design decisions for users and implementation. The analyses identify a number of weaknesses in the prototype and are used to propose ways of making substantive refinements to improve its simplicity and appropriateness for two tasks: altering one's accessibility and distinguishing between who can make what kinds of connections. We conclude with a discussion of some critical issues that are relevant for CMC systems, and reflect on the process of applying formal human-computer interaction (HCI) techniques in informal, exploratory design contexts.

© All rights reserved Bellotti et al. and/or Taylor and Francis

 

Mulhem, P. and Nigay, Laurence (1996): Interactive Information Retrieval Systems: From User Centered Interface Design to Software Design. In: Proceedings of the 19th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval 1996. pp. 326-334.

This article is concerned with the design and implementation of Information Retrieval Systems (IRS). We show how theories and models from the domain of Human Computer Interaction (HCI) can be applied to the design of IRS. We first study the user's tasks by modelling the mental activities of the user while accomplishing a task. Adopting a system perspective, we consider the processing tasks of an IRS and organize them in a design space. We then build upon the design space to consider the implications of such data processing and levels of abstraction on software design. Finally we present PAC-Amodeus, a software architecture model and illustrate the applicability of the approach with the implementation of an IRS: the TIAPRI system.

© All rights reserved Mulhem and Nigay and/or ACM Press

1995
 

Nigay, Laurence and Coutaz, Joëlle (1995): A Generic Platform for Addressing the Multimodal Challenge. In: Katz, Irvin R., Mack, Robert L., Marks, Linn, Rosson, Mary Beth and Nielsen, Jakob (eds.) Proceedings of the ACM CHI 95 Human Factors in Computing Systems Conference May 7-11, 1995, Denver, Colorado. pp. 98-105.

Multimodal interactive systems support multiple interaction techniques such as the synergistic use of speech and direct manipulation. The flexibility they offer results in an increased complexity that current software tools do not address appropriately. One of the emerging technical problems in multimodal interaction is concerned with the fusion of information produced through distinct interaction techniques. In this article, we present a generic fusion engine that can be embedded in a multi-agent architecture modelling technique. We demonstrate the fruitful symbiosis of our fusion mechanism with PAC-Amodeus, our agent-based conceptual model, and illustrate the applicability of the approach with the implementation of an effective interactive system: MATIS, a Multimodal Airline Travel Information System.

© All rights reserved Nigay and Coutaz and/or ACM Press

 

Coutaz, Joëlle, Nigay, Laurence and Salber, Daniel (1995): Multimodality from the User and System Perspectives. In: Stephanidis, Constantine (ed.) Proceedings of the 1st ERCIM Workshop on User Interfaces for All October 30-31, 1995, Heraklion, Crete, Greece. p. 17.

This article is concerned with the usability and implementation of multimodal user interfaces. We show how the usability of such systems can be characterized in terms of the relations they are able to maintain between the modalities they support. Equivalence, assignment, redundancy, and complementarity of modalities form an interesting set of relations relevant to usability assessment and software design. We use the notion of compatibility between user preferences and system properties to show how the CARE properties interact with user modelling to predict usability during the design of a system. In addition we demonstrate how experimental evaluations can be based on the CARE properties. We then depart from the HCI perspective to consider the implications of such properties on software design and techniques: we present PAC-Amodeus, a software architecture model, in conjunction with a generic fusion mechanism.

© All rights reserved Coutaz et al. and/or The European Research Consortium for Informatics and Mathematics - ERCIM

 

Nigay, Laurence, Coutaz, Joëlle, Salber, Daniel, Blandford, Ann, May, Jon and Young, Richard M. (1995): Four Easy Pieces for Assessing the Usability of Multimodal Interaction: the CARE Properties. In: Nordby, Knut (ed.) Proceedings of INTERACT 95 - IFIP TC13 Fifth International Conference on Human-Computer Interaction June 25-29, 1995, Lillehammer, Norway. pp. 115-120.

We propose the CARE properties as a simple way of characterising and assessing aspects of multimodal interaction: the Complementarity, Assignment, Redundancy, and Equivalence that may occur between the interaction techniques available in a multimodal user interface. We provide a formal definition of these properties and use the notion of compatibility to show how the system CARE properties interact with user CARE-like properties in the design of a system. The discussion is illustrated with MATIS, a Multimodal Air Travel Information System.

© All rights reserved Nigay et al. and/or Chapman and Hall

 Cited in the following chapter:

User Interface Design Adaptation: [/encyclopedia/user_interface_design_adaptation.html]
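The CARE properties summarized in this entry also lend themselves to a simple relational reading. The toy Python sketch below is our illustration only: the task/modality relation is invented, and the paper's formal definitions are richer (they also involve state reachability and temporal windows):

```python
# Toy reading of two CARE properties over a "task reachable via modality"
# relation. Invented data; the paper's formal definitions are richer
# (they quantify over states and temporal windows).

reaches = {
    "specify_destination": {"speech", "direct_manipulation"},
    "confirm_request": {"speech"},
}

def equivalent(task: str, m1: str, m2: str) -> bool:
    """Equivalence: either modality alone is sufficient for the task."""
    return {m1, m2} <= reaches.get(task, set())

def assigned(task: str, m: str) -> bool:
    """Assignment: m is the only modality that reaches the task."""
    return reaches.get(task, set()) == {m}

print(equivalent("specify_destination", "speech", "direct_manipulation"))  # True
print(assigned("confirm_request", "speech"))                               # True
print(assigned("specify_destination", "speech"))                           # False
```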


 
1993
 

Nigay, Laurence and Coutaz, Joëlle (1993): A Design Space for Multimodal Systems: Concurrent Processing and Data Fusion. In: Ashlund, Stacey, Mullet, Kevin, Henderson, Austin, Hollnagel, Erik and White, Ted (eds.) Proceedings of the ACM CHI 93 Human Factors in Computing Systems Conference April 24-29, 1993, Amsterdam, The Netherlands. pp. 172-178.

Multimodal interaction enables the user to employ different modalities such as voice, gesture and typing for communicating with a computer. This paper presents an analysis of the integration of multiple communication modalities within an interactive system. To do so, a software engineering perspective is adopted. First, the notion of "multimodal system" is clarified. We aim at proving that two main features of a multimodal system are the concurrency of processing and the fusion of input/output data. On the basis of these two features, we then propose a design space and a method for classifying multimodal systems. In the last section, we present a software architecture model of multimodal systems which supports these two salient properties: concurrency of processing and data fusion. Two multimodal systems developed in our team, VoicePaint and NoteBook, are used to illustrate the discussion.

© All rights reserved Nigay and Coutaz and/or ACM Press

 

Coutaz, Joëlle, Nigay, Laurence and Salber, Daniel (1993): The MSM Framework: A Design Space for Multi-Sensori-Motor Systems. In: East-West International Conference on Human-Computer Interaction: Proceedings of the EWHCI93 1993. pp. 220-232.

One of the new design goals in Human Computer Interaction is to extend the sensory-motor capabilities of computer systems to better match the natural communication means of human beings. This article proposes a dimension space that should help in reasoning about current and future Multi-Sensori-Motor systems (MSM). To do so, we adopt a system-centered perspective, although we draw upon the "Interacting Cognitive Subsystems" psychological model. Our problem space comprises 6 dimensions. The first two dimensions deal with the notion of communication channel: the number and direction of the channels that a particular MSM system supports. The other four dimensions are used to characterize the degree of built-in cognitive sophistication of the system: levels of abstraction, context, fusion/fission, and granularity of concurrency. We illustrate the discussion with examples of multimedia and multimodal systems, both MSM systems but with distinct degrees of built-in cognitive sophistication.

© All rights reserved Coutaz et al. and/or Intl. Centre for Scientific And Technical Information

 


Changes to this page (author)

09 Sep 2013: Added
09 Nov 2012: Modified
04 Apr 2012: Modified
04 Apr 2012: Modified
20 Apr 2011: Modified
20 Apr 2011: Modified
20 Apr 2011: Modified
20 Apr 2011: Modified
20 Apr 2011: Modified
20 Apr 2011: Modified
17 Apr 2011: Modified
17 Apr 2011: Modified
17 Apr 2011: Modified
17 Apr 2011: Modified
17 Apr 2011: Modified
17 Apr 2011: Modified
15 Apr 2011: Modified
15 Apr 2011: Modified
15 Apr 2011: Modified
15 Apr 2011: Modified
15 Apr 2011: Modified
15 Apr 2011: Modified
15 Apr 2011: Modified
02 Nov 2010: Modified
02 Nov 2010: Modified
18 Jan 2010: Added
17 Aug 2009: Modified
25 Jul 2009: Modified
20 Jul 2009: Modified
20 Jul 2009: Modified
27 Jun 2009: Modified
27 Jun 2009: Modified
27 Jun 2009: Modified
27 Jun 2009: Modified
27 Jun 2009: Modified
27 Jun 2009: Modified
27 Jun 2009: Modified
17 Jun 2009: Modified
17 Jun 2009: Modified
17 Jun 2009: Modified
17 Jun 2009: Modified
17 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
03 Jun 2009: Modified
30 May 2009: Modified
30 May 2009: Modified
30 May 2009: Modified
30 May 2009: Modified
29 May 2009: Modified
29 May 2009: Modified
29 May 2009: Modified
29 May 2009: Modified
12 May 2008: Modified
12 May 2008: Modified
12 May 2008: Added
12 May 2008: Modified
12 May 2008: Modified
12 May 2008: Modified
25 Jul 2007: Modified
24 Jul 2007: Modified
24 Jul 2007: Modified
28 Jun 2007: Modified
27 Jun 2007: Modified
25 Jun 2007: Modified
24 Jun 2007: Added
22 Jun 2007: Modified
28 Apr 2003: Added

Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/laurence_nigay.html
