Publication statistics

Publication period: 2002-2012
Publication count: 76
Number of co-authors: 130



Co-authors

Number of publications with 3 favourite co-authors:

Farzam Farbiz: 11
Simon Prince: 10
Owen Noel Newton Fernando: 10

 

 

Productive colleagues

Adrian David Cheok's 3 most productive colleagues in number of publications:

Albrecht Schmidt: 110
Mark Billinghurst: 92
Robert J. K. Jacob: 57
 
 
 

Adrian David Cheok

Personal Homepage:
http://www.adriancheok.info/

Adrian David Cheok is director of the Mixed Reality Lab at the National University of Singapore (NUS). He has worked on real-time systems, soft computing, and embedded computing at both Mitsubishi Electric Research Labs (Osaka, Japan) and NUS. At NUS his research has covered mixed reality, human-computer interaction, wearable computers and smart spaces, fuzzy systems, embedded systems, power electronics, and multi-modal recognition. He has obtained funding for four externally funded projects in the area of wearable computers and mixed reality from the Defence Science and Technology Agency (DSTA) Singapore. The research output has included numerous high-quality academic journal papers, research prototype deliverables to DSTA, numerous demonstrations including to the President and Deputy Prime Minister of Singapore, worldwide CNN/CNBC television broadcasts on his research, and invited international new-media exhibits such as Ars Electronica. He is currently an Assistant Professor at NUS, where he leads a team of over 20 researchers and students, and has been a keynote and invited speaker at numerous international and local conferences and events. He has been invited to exhibit for two years in the Ars Electronica Museum of the Future, launching at the Ars Electronica Festival 2003. He was IEEE Singapore Section Chairman in 2003 and is presently ACM SIGCHI Chapter President. He was awarded the Hitachi Fellowship (2003), the A-STAR Young Scientist of the Year Award (2003), and the SCS Singapore Young Professional of the Year Award (2004).

Adrian David Cheok, Mixed Reality Lab, 4 Engineering Drive 3, National University of Singapore, Singapore 117576. Phone: +65 68746850. Email: adriancheok@nus.edu.sg. Web: http://mixedreality.nus.edu.sg


Publications by Adrian David Cheok (bibliography)

2012
 

Samani, Hooman Aghaebrahimi, Parsani, Rahul, Rodriguez, Lenis Tejada, Saadatian, Elham, Dissanayake, Kumudu Harshadeva and Cheok, Adrian David (2012): Kissenger: design of a kiss transmission device. In: Proceedings of DIS12 Designing Interactive Systems 2012. pp. 48-57

In this paper, we present Kissenger (Kiss Messenger), an interactive device that provides a physical interface for transmitting a kiss between two remotely connected people. Each device is paired with another and can sense the force a user applies to a pair of lips, which is recreated on the other device using motors. Kissenger was designed to augment existing remote communication technologies such as video chat, with the goal of promoting intimacy between people in long-distance relationships. After presenting the background and motivation for such a device, we describe a design process consisting of three iteration stages, each with its own focus and evaluation. We then present a preliminary user study with seven couples that compares Kissenger to current video chat technology.

© All rights reserved Samani et al. and/or ACM Press
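
The abstract above describes a sense-and-reproduce loop: force applied to the lips on one device is transmitted and recreated by motors on its paired device. As a hedged sketch only, the core mapping from a sensor reading to a motor drive level might look like the following; the 10-bit sensor range, the percentage duty cycle, and the function name are all assumptions for illustration, not details from the paper.

```python
def force_to_duty_cycle(raw: int, raw_max: int = 1023, duty_max: float = 100.0) -> float:
    """Map a raw force-sensor reading to a motor duty cycle.

    Assumes a 10-bit ADC reading (0-1023) and a 0-100% PWM duty
    cycle; both ranges are illustrative, not taken from the paper.
    """
    raw = max(0, min(raw, raw_max))   # clamp noisy or out-of-range readings
    return duty_max * raw / raw_max   # linear scaling to the motor's range

# The paired device would apply the resulting duty cycle to its
# lip-actuating motors to recreate the sensed pressure.
```

In a real device the scaling would be calibrated against the motors' actual force output rather than assumed linear, as here.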

 

Wei, Jun, Cheok, Adrian David and Nakatsu, Ryohei (2012): Let's have dinner together: evaluate the mediated co-dining experience. In: Proceedings of the 2012 International Conference on Multimodal Interfaces 2012. pp. 225-228

Having dinner together is undoubtedly a pleasurable experience, involving multiple channels of mutual interaction: not only audio, vision, and touch, but also smell and taste. Aiming to extend this rich experience to remote situations, we developed the Co-dining system to support a range of mealtime interactions and enhance the feeling of social togetherness. This paper describes a preliminary study of this interactive multisensory system, investigating the actual effectiveness of the working prototype in enhancing social presence and engagement during telepresent dining, and building a comprehensive understanding of users' perceptions. The evaluation focused on three main aspects: the overall Co-dining feeling, cultural awareness, and engagement. The results revealed that the system achieved a sense of "being together" among users through interactive activities touching upon tableware, tablecloth, and real edible food, and that each interaction module contributed differently to the overall experience. We report the evaluation process, present and interpret the data, and discuss initial insights for enhancing the sense of co-presence through multi-channel interactions.

© All rights reserved Wei et al. and/or ACM Press

 

Ranasinghe, Nimesha, Cheok, Adrian David and Nakatsu, Ryohei (2012): Taste/IP: the sensation of taste for digital communication. In: Proceedings of the 2012 International Conference on Multimodal Interfaces 2012. pp. 409-416

In this paper, we present a new methodology for integrating the sense of taste into the existing digital communication domain. First, we discuss existing problems and limitations of using the sense of taste as a digital communication medium. To address this gap, we then present a solution with three core modules: the transmitter, the form of communication, and the receiver. The transmitter is a mobile application in which the sender formulates a taste message to send. For communication, we present a new extensible markup language (XML) format, TasteXML (TXML), to specify the format of taste messages. As the receiver (actuator), we introduce the Digital Taste Stimulator, a novel method for stimulating taste sensations in humans. Initial user experiments and qualitative feedback are discussed, focusing mainly on the Digital Taste Stimulator. We conclude with a brief overview of future aspects of this technology and possibilities in other application domains.

© All rights reserved Ranasinghe et al. and/or ACM Press
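
The abstract names TasteXML (TXML) as the message format but does not give its schema. Purely as an illustration of what such an XML taste message could look like, the sketch below assembles one with Python's standard library; every element and attribute name (txml, sender, taste, intensity) is invented for this example, not taken from the TXML specification.

```python
import xml.etree.ElementTree as ET

def build_taste_message(sender: str, taste: str, intensity: int) -> str:
    """Assemble a hypothetical TXML-style taste message.

    The element and attribute names are illustrative guesses; the
    actual TasteXML schema is defined by the paper, not shown here.
    """
    root = ET.Element("txml", version="1.0")
    ET.SubElement(root, "sender").text = sender
    taste_el = ET.SubElement(root, "taste", name=taste)
    ET.SubElement(taste_el, "intensity").text = str(intensity)  # e.g. a 0-10 scale
    return ET.tostring(root, encoding="unicode")

msg = build_taste_message("alice@example.org", "sour", 7)
```

A receiver such as the Digital Taste Stimulator would parse a message like this and translate it into stimulation parameters.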

2011
 

Cheok, Adrian David, Koh, Jeffrey Tzu Kwan Valino, Peiris, Roshan Lalintha and Fernando, Owen Noel Newton (2011): Mixed reality lab Singapore: a genealogy of lab projects employing the blue sky innovation research methodology. In: Proceedings of ACM CSCW11 Conference on Computer-Supported Cooperative Work 2011. pp. 17-24.

In this paper we outline a genealogy of Mixed Reality Lab (MXR) projects, their influencing factors both from the Asian region and also within the lab, the employment of "Blue-Sky Innovation" by the lab for ideation, collaboration and project generation, as well as discuss some major points of inspiration for MXR from various sources.

© All rights reserved Cheok et al. and/or their publisher

Cheok, Adrian David (2013): Mixed Reality. In: Soegaard, Mads and Dam, Rikke Friis (eds.). "The Encyclopedia of Human-Computer Interaction, 2nd Ed.". Aarhus, Denmark: The Interaction Design Foundation. Available online at http://www.interaction-design.org/encyclopedia/mixed_reality.html

 

Wei, Jun, Wang, Xuan, Tache, Remi, Peiris, Roshan Lalintha, Choi, Yongsoon, Halupka, Veronica, Koh, Jeffrey Tzu Kwan Valino, Martinez, Xavier Roman and Cheok, Adrian David (2011): Food Media: exploring interactive entertainment over telepresent dinner. In: Advances in Computer Entertainment Technology 2011.

 

 

Choi, Yongsoon, Cheok, Adrian David, Martinez, Xavier Roman, Nguyen, The Anh and Sugimoto, Kenichi (2011): Sound perfume: designing a wearable sound and fragrance media for face-to-face interpersonal interaction. In: Advances in Computer Entertainment Technology 2011.

Sound and smell can sometimes generate stronger emotional feelings than words can, and can even awaken strong, long-forgotten memories. The Sound Perfume system provides users with additional auditory and olfactory sensory inputs through a pair of glasses, to augment their unique identities and impressions to others during face-to-face interpersonal communication. When a user starts a conversation with another person for the first time, sound and fragrance ID information is shared between them through wireless mobile communication. The received ID information is forwarded to the user's glasses, where actuators let the user perceive the other person's sound and fragrance IDs: perfume actuators heat a solid perfume, and a pulsating sound ID from the speaker helps gently emit the fragrance. When the user meets the other person again, their phone recognizes the other person and transfers the ID information to the glasses to regenerate that person's sound and fragrance IDs once again. This indirect stimulation helps users express a unique identity during face-to-face interpersonal interactions and awakens memories of the other person when they meet again after some time. Sound Perfume can also be integrated with a mobile phone camera and photo viewer application: when a user takes the other person's picture, the photo stores the shot location, time, the other person's name, mobile phone MAC address, and sound and fragrance ID information, and when the user views the picture later, the glasses softly replay the other person's sound and fragrance IDs. In this paper, as a first step, we focus on the design of the Sound Perfume system.

© All rights reserved Choi et al. and/or their publisher

 

Choi, Yongsoon, Cheok, Adrian David, Martinez, Xavier Roman, Halupka, Veronica and Sugimoto, Kenichi (2011): Sound Perfume: Augmenting user's identity using sound and fragrance stimulation. In: Oct. 23-28, 2011, Providence, RI. pp. 99-100

The Sound Perfume system is a subtle communication medium that could help a user create a unique identity for others and augment the recollection of memories through indirect sound and smell stimulation from a pair of glasses during face-to-face communication. The system was evaluated with twelve Japanese participants (six males, six females) to measure its effectiveness in augmenting their impressions in face-to-face interpersonal interactions. Results showed that Sound Perfume could help eight of the users, especially during first contact, to augment their impressions on their conversation partner. However, to assert this result concretely, we would need to improve the current version of the system and conduct further user evaluations.

© All rights reserved Choi et al. and/or their publisher

 

Wei, Jun, Wang, Xuan, Peiris, Roshan Lalintha, Choi, Yongsoon, Martinez, Xavier Roman, Tache, Remi, Koh, Jeffrey Tzu Kwan Valino, Halupka, Veronica and Cheok, Adrian David (2011): CoDine: an interactive multi-sensory system for remote dining. In: Proceedings of the 2011 International Conference on Ubiquitous Computing 2011. pp. 21-30

The pervasiveness of computing has extended into domestic realms, including the dining room. Beyond simply a place to consume food, the dining room is a social hub where family members meet and share experiences. Yet busy lifestyles can make it difficult to spend social time with your family. To provide a new solution for family bonding, this paper presents the CoDine system, a dining table embedded with interactive subsystems that augment and transport the experience of communal family dining to create a sense of coexistence among remote family members. CoDine connects people in different locations through shared dining activities: gesture-based screen interaction, mutual food serving, ambient pictures on an animated tablecloth, and the transportation of edible messages. Rather than focusing on functionality or efficiency, CoDine aims to provide people with an engaging interactive dining experience through enriched multi-sensory communication.

© All rights reserved Wei et al. and/or ACM Press

 

Ranasinghe, Nimesha, Cheok, Adrian David, Fernando, Owen Noel Newton, Nii, Hideaki and Ponnampalam, Gopalakrishnakone (2011): Electronic taste stimulation. In: Proceedings of the 2011 International Conference on Ubiquitous Computing 2011. pp. 561-562

In this paper, we present a system that can digitally stimulate the sense of taste (gustation) in humans. The system uses electrical stimulation of the human tongue to produce taste sensations. Initial experiments reveal that the method is viable and deserves further development, which will require further analysis of the properties of the electric pulses (current, frequency, and voltage) applied to the tongue, along with the stimulating material. The experimental results suggest that sourness, bitterness, and saltiness are the main sensations that can be evoked at present.

© All rights reserved Ranasinghe et al. and/or ACM Press

 

Ranasinghe, Nimesha, Cheok, Adrian David, Nii, Hideaki, Fernando, Owen Noel Newton and Ponnampalam, Gopalakrishnakone (2011): Digital taste interface. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 11-12

Thus far, most systems for generating taste sensations have been based on blending several chemicals, and there has been no definite strategy for stimulating the sense of taste digitally. In this paper, a method for digitally actuating the sense of taste is introduced, actuating the tongue through electrical and thermal stimulation. The digital taste interface, a control system, is developed to stimulate taste sensations on the tongue digitally. The effects of the most influential factors, such as current, frequency, and temperature, are accounted for in stimulating the tongue non-invasively. The experimental results suggest that sourness and saltiness are the main sensations that can be evoked, with some evidence of sweet and bitter sensations as well.

© All rights reserved Ranasinghe et al. and/or ACM Press
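
The abstract above identifies current, frequency, and temperature as the control parameters for non-invasive tongue stimulation. As a sketch only, those parameters could be carried in a small record like the following; the field names, units, and sanity bounds are illustrative assumptions, not values reported by the authors.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StimulationParams:
    """Control parameters named in the abstract; units and bounds
    below are illustrative assumptions, not the authors' values."""
    current_ua: float     # electrical current (microamperes, assumed unit)
    frequency_hz: float   # pulse frequency
    temperature_c: float  # thermal stimulation target

    def in_sketch_bounds(self) -> bool:
        # Loose sanity bounds for this sketch; real safety limits
        # would come from the authors' device, not from here.
        return (0 <= self.current_ua <= 200
                and 0 <= self.frequency_hz <= 1000
                and 10 <= self.temperature_c <= 45)
```

A control loop would only pass parameter sets that satisfy such bounds on to the stimulation hardware.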

 

Ranasinghe, Nimesha, Cheok, Adrian David, Nii, Hideaki, Fernando, Owen Noel Newton and Ponnampalam, Gopalakrishnakone (2011): Digital taste for remote multisensory interactions. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 79-80

We present a novel control system that enables digital stimulation of the sense of taste (gustation) in humans to enhance remote multisensory interactions. The system uses two approaches to actuate taste sensations digitally: electrical and thermal stimulation of the tongue. The experimental results suggest that sourness and saltiness are the main sensations that can be evoked, along with some evidence of sweet and bitter sensations.

© All rights reserved Ranasinghe et al. and/or ACM Press

2009
 

Cheok, Adrian David, Fernando, Owen Noel Newton and Fernando, Charith Lasantha (2009): Petimo: safe social networking robot for children. In: Proceedings of ACM IDC09 Interaction Design and Children 2009. pp. 274-275.

As social networking spreads widely through the community, especially among the younger generation, its negative influence on children has become a serious social concern. "Petimo" is an interactive robotic toy designed to protect children from potential risks in social networks and the virtual world, helping them build a safely connected social networking environment. It adds a new physical dimension to social computing by enabling a second authentication mode, providing extra safety in making friends by physically touching each other's robots. Petimo can be connected to any social network, providing safety and security for children. As a proof of concept, we have developed a 3D virtual world, "Petimo-World", which demonstrates all of the basic features realizable with traditional online social networks. Petimo-World stands out from other virtual worlds with interesting and sophisticated interactions such as visualizing friends' relationships through spatial distribution in 3D space to clearly convey the closeness of a friendship, personalized avatars, and the sending of special gifts/emoticons.

© All rights reserved Cheok et al. and/or ACM Press

 

Teh, James Keng Soon, Cheok, Adrian David, Choi, Yongsoon, Fernando, Charith Lasantha, Peiris, Roshan Lalintha and Fernando, Owen Noel Newton (2009): Huggy pajama: a parent and child hugging communication system. In: Proceedings of ACM IDC09 Interaction Design and Children 2009. pp. 290-291.

Huggy Pajama is a novel wearable system aimed at promoting physical interaction in remote communication between parent and child. The system enables parents and children to hug one another through a novel hugging interface device and a wearable, hug-reproducing pajama connected through the Internet. The hugging device is a small, mobile doll with an embedded pressure-sensing circuit that can accurately sense varying levels of the range of human force produced by natural touch. This device sends hug signals to a haptic jacket that simulates the feeling of being hugged, using air pressure actuation to reproduce the hug.

© All rights reserved Teh et al. and/or ACM Press

 

Moloney, Luke, Rod, Jan, Tuters, Marc, Dayarathna, Miyuru and Cheok, Adrian David (2009): Paruresis. In: Proceedings of the 2009 Conference on Creativity and Cognition 2009. pp. 467-468.

Paruresis is a critical design object that addresses the phobia of being unable to urinate in the presence of others. It combines machine vision and sensors: a camera records the onlooker's gaze, and playback is triggered by urination.

© All rights reserved Moloney et al. and/or their publisher

 

Khoo, Eng Tat, Merritt, Tim and Cheok, Adrian David (2009): Designing physical and social intergenerational family entertainment. In Interacting with Computers, 21 (1) pp. 76-87.

Present computer games and digital entertainment do not usually facilitate intergenerational family interactions. According to recent survey results in Japan, a high percentage of older people own and play electronic or computer games, but they rarely play the games with their family members. It is a positive sign that more older people are participating in the digital games arena, but it would be even more beneficial if they could interact actively with young family members through gaming activities. This could strengthen family bonding and bridge the gap between older people and youth culture. This paper presents steps for designing an intergenerational family entertainment system that focuses on physical and social interactions using a mixed reality floor system. The main design goals include: facilitating interactions between users with varied levels of skill in using technology, drawing on familiar physical motions from other activities to make an intuitive physical interface, and encouraging social interactions among families and friends. Detailed implementation of these steps is presented in the design of our intergenerational entertainment system, Age Invaders, and four main prototype iterations of the system are presented. Our design process is based on User Centered Design and relies on constant involvement of users to understand the key issues and to help make effective design decisions. The results of the study help to focus refinements of the existing platform from a usability standpoint and also aid the development of new physical entertainment and interactive applications. This study provides insights into user issues, including how users interact in a complex mixed reality experience that is heavily based in physicality. Involving the portion of the user base most proficient with technology before the novice users was found to empower the novice users to begin using digital technology.

© All rights reserved Khoo et al. and/or Elsevier Science

 

Liu, Wei, Teh, Keng Soon, Peiris, Roshan Lalintha, Choi, Yongsoon, Cheok, Adrian David, Lim, Charissa Mei Ling, Theng, Yin Leng, Nguyen, Ta Huynh Duy, Qui, Tran Cong Thien and Vasilakos, Athanasios V. (2009): Internet-Enabled User Interfaces for Distance Learning. In International Journal of Technology and Human Interaction, 5 (9) pp. 51-77

2008
 

Cheok, Adrian David, Kok, Roger Thomas, Tan, Chuen, Fernando, Owen Noel Newton, Merritt, Tim and Sen, Janyn Yen Ping (2008): Empathetic living media. In: Proceedings of DIS08 Designing Interactive Systems 2008. pp. 465-473.

We describe a new form of interactive living media used to communicate social or ecological information as an empathetic ambient medium. In the fast-paced modern world, people are generally too busy to monitor significant social or human aspects of their lives, such as time spent with their family, their overall health, or the state of the ecology. By quantifying such information digitally, it is semantically coupled into living microorganisms, E. coli: through the use of transformed DNA, the E. coli glow or dim according to the data. The core technical innovation of this system is an information system based on a closed-loop control system through which digital input controls the input fluids to the E. coli, and thereby the output glow of the E. coli in real time. Social or ecological information is thus coupled into a living, organic medium through this control-system capsule, providing a living medium that promotes empathy. We provide user design and feedback results to verify the validity of our hypothesis, and offer not only system results but generalized design frameworks for empathetic living media in general.

© All rights reserved Cheok et al. and/or ACM Press

 

Teh, James Keng Soon, Cheok, Adrian David, Peiris, Roshan L., Choi, Yongsoon, Thuong, Vuong and Lai, Sha (2008): Huggy Pajama: a mobile parent and child hugging communication system. In: Proceedings of ACM IDC08 Interaction Design and Children 2008. pp. 250-257.

Huggy Pajama is a novel wearable system aimed at promoting physical interaction in remote communication between parent and child. The system enables parents and children to hug one another through a novel hugging interface device and a wearable, hug-reproducing pajama connected through the Internet. The hugging device is a small, mobile doll with an embedded pressure-sensing circuit that can accurately sense varying levels of the range of human force produced by natural touch. This device sends hug signals to a haptic jacket that simulates the feeling of being hugged. It features air pockets that actuate to reproduce the hug, heating elements that produce the warmth accompanying a hug, and a color-changing pattern and accessory that indicate distance of separation and communicate expressions. In this paper, we present the system design of Huggy Pajama.

© All rights reserved Teh et al. and/or ACM Press

 Cited in the following chapter:

Human-Robot Interaction: [/encyclopedia/human-robot_interaction.html]


 
 

Inakage, Masa and Cheok, Adrian David (eds.) Proceedings of the International Conference on Advances in Computer Entertainment Technology - ACE 2008 December 3-5, 2008, Yokohama, Japan.

 

Cheok, Adrian David and Li, Yue (2008): Ubiquitous interaction with positioning and navigation using a novel light sensor-based information transmission system. In Personal and Ubiquitous Computing, 12 (6) pp. 445-458.

 

Tsekeridou, Sofia, Cheok, Adrian David, Giannakis, Konstantinos and Karigiannis, John (eds.) Proceedings of the 3rd international Conference on Digital interactive Media in Entertainment and Arts - DIMEA 08, vol. 349 September 10-12, 2008, Athens, Greece.

 

Tat, Khoo Eng, Cheok, Adrian David, Nguyen, Ta Huynh Duy and Pan, Zhigeng (2008): Age invaders: social and physical inter-generational mixed reality family entertainment. In Virtual Reality, 12 (1) pp. 3-16.

 

Hong, Dongpyo, Höllerer, Tobias, Haller, Michael, Takemura, Haruo, Cheok, Adrian David, Kim, Gerard Jounghyun, Billinghurst, Mark, Woo, Woontack, Hornecker, Eva, Jacob, Robert J. K., Hummels, Caroline, Ullmer, Brygg, Schmidt, Albrecht, Hoven, Elise van den and Mazalek, Ali (2008): Advances in Tangible Interaction and Ubiquitous Virtual Reality. In IEEE Pervasive Computing, 7 (2) pp. 90-96

2007
 

Chong, Peter H. J. and Cheok, Adrian David (eds.) Proceedings of the 4th international conference on mobile technology, applications, and systems and the 1st international symposium on Computer human interaction in mobile technology September 10-12, 2007, Helsinki, Finland.

 

Cheok, Adrian David and Chittaro, Luca (eds.) Proceedings of the 9th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2007 September 9-12, 2007, Singapore.

 

Cheok, Adrian David, Mustafa, Abd-ur-Rehman, Fernando, Owen Noel Newton, Barthoff, Anne-Katrin, Wijesena, Imiyage Janaka Prasad and Tosa, Naoko (2007): BlogWall: displaying artistic and poetic messages on public displays via SMS. In: Cheok, Adrian David and Chittaro, Luca (eds.) Proceedings of the 9th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2007 September 9-12, 2007, Singapore. pp. 483-486.

 

Cheok, Adrian David, Fernando, Owen Noel Newton, Wijesena, Imiyage Janaka Prasad, Mustafa, Abd-ur-Rehman, Barthoff, Anne-Katrin and Tosa, Naoko (2007): BlogWall: a new paradigm of artistic public mobile communication. In: Cheok, Adrian David and Chittaro, Luca (eds.) Proceedings of the 9th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2007 September 9-12, 2007, Singapore. pp. 333-334.

 

Fernando, Owen Noel Newton, Cohen, Michael and Cheok, Adrian David (2007): Mobile spatial audio interfaces. In: Cheok, Adrian David and Chittaro, Luca (eds.) Proceedings of the 9th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2007 September 9-12, 2007, Singapore. pp. 345-347.

 

Portalés, Cristina, Perales, Carlos D. and Cheok, Adrian David (2007): Exploring social, cultural and pedagogical issues in AR-gaming through the live lego house. In: Inakage, Masa, Lee, Newton, Tscheligi, Manfred, Bernhaupt, Regina and Natkin, Stéphane (eds.) Proceedings of the International Conference on Advances in Computer Entertainment Technology - ACE 2007 June 13-15, 2007, Salzburg, Austria. pp. 238-239.

 

Cheok, Adrian David (2007): Embodied Media and Mixed Reality for Social and Physical Interactive Communication and Entertainment. In: Gross, Tom (ed.) Mensch and Computer 2007 September 2-5, 2007, Weimar, Germany. pp. 3-4.

 

Theng, Yin Leng, Lim, Charissa Mei Ling, Liu, Wei and Cheok, Adrian David (2007): Mixed Reality Systems for Learning: A Pilot Study Understanding User Perceptions and Acceptance. In: Shumaker, Randall (ed.) ICVR 2007 - Virtual Reality - Second International Conference - Part 1 July 22-27, 2007, Beijing, China. pp. 728-737.

 

Chong, Peter H. J. and Cheok, Adrian David (eds.) International Conference On Mobile Technology, Applications, And Systems - Mobility 2007 September 10-12, 2007, Singapore.

 

Cheok, Adrian David, Lim, Zheng Shawn and Tan, Roger Thomas Kok Chuen (2007): Humanistic Oriental art created using automated computer processing and non-photorealistic rendering. In Computers & Graphics, 31 (2) pp. 280-291

2006
 

 

Tat, Khoo Eng, Lee, Shang Ping and Cheok, Adrian David (2006): Age invaders. In: Ishii, Hiroshi, Lee, Newton, Natkin, Stéphane and Tsushima, Katsuhide (eds.) Proceedings of the International Conference on Advances in Computer Entertainment Technology - ACE 2006 June 14-16, 2006, Hollywood, California, USA. p. 94.

 

Liu, Wei, Cheok, Adrian David, Hwee, Sim and Ivene, Ang (2006): Mixed reality for fun learning in primary school. In: Ishii, Hiroshi, Lee, Newton, Natkin, Stéphane and Tsushima, Katsuhide (eds.) Proceedings of the International Conference on Advances in Computer Entertainment Technology - ACE 2006 June 14-16, 2006, Hollywood, California, USA. p. 107.

 

Teh, James Keng Soon, Lee, Shang Ping and Cheok, Adrian David (2006): Internet pajama. In: Ishii, Hiroshi, Lee, Newton, Natkin, Stéphane and Tsushima, Katsuhide (eds.) Proceedings of the International Conference on Advances in Computer Entertainment Technology - ACE 2006 June 14-16, 2006, Hollywood, California, USA. p. 102.

 

Tan, Roger Thomas Kok Chuen, Todorovic, V., Andrejin, G., Teh, James Keng Soon and Cheok, Adrian David (2006): Metazoa Ludens. In: Ishii, Hiroshi, Lee, Newton, Natkin, Stéphane and Tsushima, Katsuhide (eds.) Proceedings of the International Conference on Advances in Computer Entertainment Technology - ACE 2006 June 14-16, 2006, Hollywood, California, USA. p. 89.

 

Lee, Shang Ping, Cheok, Adrian David, Teh, James Keng Soon, Debra, Goh Pae Lyn, Jie, Chio Wen, Chuang, Wang and Farbiz, Farzam (2006): A mobile pet wearable computer and mixed reality system for human-poultry interaction through the internet. In Personal and Ubiquitous Computing, 10 (5) pp. 301-317.

 

Cheok, Adrian David, Sreekumar, Anuroop, Cao, Lei and Thang, Le Nam (2006): Capture the Flag: Mixed-Reality Social Gaming with Smart Phones. In IEEE Pervasive Computing, 5 (2) pp. 62-69.

 

Cheok, Adrian David and Yu, Gino (2006): Introduction. In Computers in Entertainment, 4 (3).

 

Cheok, Adrian David, Teh, Keng Soon, Nguyen, Ta Huynh Duy, Qui, Tran Cong Thien, Lee, Shang Ping, Liu, Wei, Li, Cheng Chen, Díaz, Diego J. and Tovar, Clara Boj (2006): Social and physical interactive paradigms for mixed-reality entertainment. In Computers in Entertainment, 4 (2).

 

Tat, Khoo Eng and Cheok, Adrian David (2006): Age Invaders: Inter-generational Mixed Reality Family Game. In IJVR, 5 (2) pp. 45-50.

 

Tan, Roger Thomas Kok Chuen, Cheok, Adrian David and Teh, James Keng Soon (2006): Metazoa Ludens: Mixed Reality Environment for Playing Computer Games with Pets. In IJVR, 5 (3) pp. 53-58.

 

Pan, Zhigeng, Cheok, Adrian David, Yang, Hongwei, Zhu, Jiejie and Shi, Jiaoying (2006): Virtual reality and mixed reality for virtual learning environments. In Computers & Graphics, 30 (1) pp. 20-28.

2005
 

Qui, Tran Cong Thien, Nguyen, Ta Huynh Duy, Mallawaarachchi, Asitha, Xu, Ke, Liu, Wei, Lee, Shang Ping, Zhou, Zhi Ying, Teo, Sze Lee, Teo, Hui Siang, Thang, Le Nam, Li, Yu and Cheok, Adrian David (2005): Magic land: live 3D human capture mixed reality interactive system. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems 2005. pp. 1142-1143.

"Magic Land" is a cross-section of art and technology. It not only demonstrates the latest advances in human-computer interaction and human-human communication: mixed reality, tangible interaction, and 3D-live human capture technology; but also defines new approaches to dealing with live mixed reality content for artists of any discipline. In this system, the user is captured by cameras from many angles, and her live 3D avatar is created and placed among 3D computer-generated virtual animations. The avatars and virtual objects can interact with each other in virtual scenery in the mixed reality context, and users can tangibly interact with these characters using their own hands.

© All rights reserved Qui et al. and/or ACM Press

 

Zhou, ZhiYing, Cheok, Adrian David, Li, Yu and Kato, Hirokazu (2005): Magic cubes for social and physical family entertainment. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems 2005. pp. 1156-1157.

Physical and social interactions are constrained, and natural interactions are lost, in most present digital family entertainment systems [5]. Magic Cubes strive to bring computer storytelling, the doll's house, and board games back into reality so that children can interact socially and physically as in the old days. Magic Cubes are novel augmented reality systems that explore the use of cubes to interact with a three-dimensional virtual fantasy world. Magic Cubes encourage discussion, idea exchange, collaboration, and social and physical interaction among families.

© All rights reserved Zhou et al. and/or ACM Press

 

Farbiz, Farzam, Cheok, Adrian David, Wei, Liu, Ying, Zhou Zhi, Ke, Xu, Prince, Simon, Billinghurst, Mark and Kato, Hirokazu (2005): Live 3-dimensional content for augmented reality. In IEEE Transactions on Multimedia, 7 (3).

 

Tovar, Clara Boj, Díaz, Diego J., Cheok, Adrian David, Xu, Ke and Liu, Wei (2005): Free network visible network. In: Lee, Newton (ed.) Proceedings of the International Conference on Advances in Computer Entertainment Technology - ACE 2005 June 15-15, 2005, Valencia, Spain. pp. 395-396.

 

Magerkurth, Carsten, Cheok, Adrian David, Mandryk, Regan L. and Nilsen, Trond (2005): Pervasive games: bringing computer entertainment back to the real world. In Computers in Entertainment, 3 (3) p. 4.

2004
 

Singh, Siddharth, Cheok, Adrian David, Ng, Guo Loong and Farbiz, Farzam (2004): 3D augmented reality comic book and notes for children using mobile phones. In: Proceedings of ACM IDC04: Interaction Design and Children 2004. pp. 149-150.

In this paper we describe two Augmented Reality (AR) applications for children that use mobile phones as the user interface. We make use of standard mobile phones readily available in the consumer market, without any hardware modifications. The first AR application we describe is the AR Comic Book, which allows children to view their favorite cartoon characters in full 3D appearing on books or magazines (or any paper). These 3D virtual characters are rendered into the actual scene captured by the mobile phone's camera. The second AR application is the AR Post-It, which combines the speed of traditional electronic messaging with the tangibility of paper-based messages. The key concept of the AR Post-It system is that messages are displayed only when the intended receiver is within the relevant spatial context. For both applications a server performs the image processing tasks, and the phone connects to the server over Bluetooth.

© All rights reserved Singh et al. and/or ACM Press
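The offloading architecture in the abstract above (the phone streams camera frames to a server, which does the heavy image processing) can be sketched as a minimal length-prefixed framing protocol. The field layout and helper names below are assumptions for illustration only; the paper does not publish its actual wire format.

```python
import struct

# Hypothetical framing for the phone-to-server link described above.
# Each message is [u32 frame_id][u32 payload length][payload bytes],
# big-endian. This layout is an assumption, not the paper's protocol.

def pack_frame(frame_id: int, payload: bytes) -> bytes:
    """Prefix a camera frame with its id and byte count."""
    return struct.pack(">II", frame_id, len(payload)) + payload

def unpack_frame(buf: bytes) -> tuple[int, bytes]:
    """Recover (frame_id, payload) from a packed frame."""
    frame_id, n = struct.unpack_from(">II", buf, 0)
    return frame_id, buf[8:8 + n]

# Round-trip example: pack a (stand-in) JPEG frame, then unpack it.
msg = pack_frame(7, b"\xff\xd8jpeg-bytes")
fid, data = unpack_frame(msg)
```

A fixed-size header like this lets the server read the length first and then block for exactly that many payload bytes, which suits a stream transport such as Bluetooth RFCOMM.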

 

Zhou, ZhiYing, Cheok, Adrian David, Pan, JiunHorng and Li, Yu (2004): An interactive 3D exploration narrative interface for storytelling. In: Proceedings of ACM IDC04: Interaction Design and Children 2004. pp. 155-156.

Storytelling is an effective and important educational means for children. With augmented reality (AR) technology, storytelling becomes increasingly interactive and intuitive from a human-computer interaction perspective. Although AR technology is not new, its potential in education is only beginning to be explored. In this paper, we present a 3D mixed-media story cube which uses a foldable cube as the tangible and interactive storytelling interface. Here, we embed both the concept of AR and the concept of tangible interaction. Multiple modalities, including speech, 3D audio, 3D graphics, and touch, are used to provide the user (especially children) with multi-sensory experiences in the process of storytelling. Our research explores a new interface for children's education.

© All rights reserved Zhou et al. and/or ACM Press

 

Zhou, ZhiYing, Cheok, Adrian David, Yang, Xubo and Qiu, Yan (2004): An experimental study on the role of software synthesized 3D sound in augmented reality environments. In Interacting with Computers, 16 (5) pp. 989-1016.

Investigation of augmented reality (AR) environments has become a popular research topic for engineers, computer scientists, and cognitive scientists. Although application-oriented studies of audio AR environments have been published, little work has rigorously studied and evaluated the effectiveness of 3D sound in the AR context, or the extent to which the addition of 3D sound contributes to the AR experience. We have therefore developed two AR environments and performed rigorous experiments with human subjects to study the effects of 3D sound in the AR context. The study concerns two scenarios. In the first, one participant must judge the relative depth of augmented virtual objects using vision only and vision with 3D sound. In the second, two participants must co-operate to perform a joint task in a game-based AR environment. Hence, the goals of this study are (1) to assess the impact of 3D sound on depth perception in a single-camera AR environment, (2) to study the impact of 3D sound on task performance and the feeling of "human presence and collaboration", (3) to better understand the role of 3D sound in human-computer and human-human interactions, and (4) to investigate whether gender affects the impact of 3D sound in AR environments. The outcomes of this research can usefully inform the development of audio AR systems which provide more immersive, realistic and entertaining experiences by introducing 3D sound. Our results suggest that 3D sound in an AR environment significantly improves the accuracy of depth judgment and improves task performance. Our results also suggest that 3D sound contributes significantly to the feeling of "human presence and collaboration" and helps subjects to "identify spatial objects".

© All rights reserved Zhou et al. and/or Elsevier Science

 

Zhou, ZhiYing, Cheok, Adrian David, Yang, Xubo and Qiu, Yan (2004): An experimental study on the role of 3D sound in augmented reality environment. In Interacting with Computers, 16 (6) pp. 1043-1068.

Investigation of augmented reality (AR) environments has become a popular research topic for engineers, computer scientists, and cognitive scientists. Although application-oriented studies of audio AR environments have been published, little work has rigorously studied and evaluated the effectiveness of three-dimensional (3D) sound in the AR context, or the extent to which the addition of 3D sound contributes to the AR experience. We have therefore developed two AR environments and performed rigorous experiments with human subjects to study the effects of 3D sound in the AR context. The study concerns two scenarios. In the first, one participant must judge the relative depth of augmented virtual objects using vision only and vision with 3D sound. In the second, two participants must cooperate to perform a joint task in a game-based AR environment. Hence, the goals of this study are (1) to assess the impact of 3D sound on depth perception in a single-camera AR environment, (2) to study the impact of 3D sound on task performance and the feeling of "human presence and collaboration", (3) to better understand the role of 3D sound in human-computer and human-human interactions, and (4) to investigate whether gender affects the impact of 3D sound in AR environments. The outcomes of this research can usefully inform the development of audio AR systems, which provide more immersive, realistic and entertaining experiences by introducing 3D sound. Our results suggest that 3D sound in an AR environment significantly improves the accuracy of depth judgment and improves task performance. Our results also suggest that 3D sound contributes significantly to the feeling of human presence and collaboration and helps subjects to "identify spatial objects".

© All rights reserved Zhou et al. and/or Elsevier Science

 

Zhou, ZhiYing, Cheok, Adrian David, Pan, Jiun Horng and Li, Yu (2004): Magic Story Cube: an interactive tangible interface for storytelling. In: Proceedings of the 2004 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology June 3-5, 2004, Singapore. pp. 364-365.

 

Zhou, ZhiYing, Cheok, Adrian David, Chan, Tingting and Li, Yu (2004): Jumanji Singapore: an interactive 3D board game turning hollywood fantasy into reality. In: Proceedings of the 2004 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology June 3-5, 2004, Singapore. pp. 362-363.

 

Cheok, Adrian David, Goh, Kok Hwee, Liu, Wei, Farbiz, Farzam, Teo, Sze Lee, Teo, Hui Siang, Lee, Shang Ping, Li, Yu, Fong, Siew Wan and Yang, Xubo (2004): Human Pacman: a mobile wide-area entertainment system based on physical, social, and ubiquitous computing. In: Proceedings of the 2004 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology June 3-5, 2004, Singapore. pp. 360-361.

 

Singh, Siddharth, Cheok, Adrian David and Kiong, Soh Chor (2004): A step towards anywhere gaming. In: Proceedings of the 2004 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology June 3-5, 2004, Singapore. pp. 357-358.

 

Singh, Siddharth, Cheok, Adrian David, Ng, Guo Loong and Farbiz, Farzam (2004): Augmented reality post-it system. In: Proceedings of the 2004 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology June 3-5, 2004, Singapore. p. 359.

 

Cheok, Adrian David, Goh, Kok Hwee, Liu, Wei, Farbiz, Farzam, Fong, Siew Wan, Teo, Sze Lee, Li, Yu and Yang, Xubo (2004): Human Pacman: a mobile, wide-area entertainment system based on physical, social, and ubiquitous computing. In Personal and Ubiquitous Computing, 8 (2) pp. 71-81.

 

Zhou, ZhiYing, Cheok, Adrian David and Pan, Jiun Horng (2004): 3D story cube: an interactive tangible user interface for storytelling with 3D graphics and audio. In Personal and Ubiquitous Computing, 8 (5) pp. 374-376.

 

Cheok, Adrian David (2004): ACM SIGCHI international conference on advances in computer entertainment technology. In Computers in Entertainment, 2 (1) p. 3.

2003
 

Cheok, Adrian David, Wan, Fong Siew, Goh, Kok Hwee, Yang, Xubo, Liu, Wei, Farbiz, Farzam and Li, Yu (2003): Human Pacman: A Mobile Entertainment System with Ubiquitous Computing and Tangible Interaction over a Wide Outdoor Area. In: Chittaro, Luca (ed.) Human-Computer Interaction with Mobile Devices and Services - 5th International Symposium - Mobile HCI 2003 September 8-11, 2003, Udine, Italy. pp. 209-223.

 

Xu, Ke, Prince, Simon, Cheok, Adrian David, Qiu, Yan and Kumar, Krishnamoorthy Ganesh (2003): Visual registration for unprepared augmented reality environments. In Personal and Ubiquitous Computing, 7 (5) pp. 287-298.

2002
 

Prince, Simon, Cheok, Adrian David, Farbiz, Farzam, Williamson, Todd, Johnson, Nik, Billinghurst, Mark and Kato, Hirokazu (2002): 3-D live: real time interaction for mixed reality. In: Churchill, Elizabeth F., McCarthy, Joe, Neuwirth, Christine and Rodden, Tom (eds.) Proceedings of the 2002 ACM conference on Computer supported cooperative work November 16 - 20, 2002, New Orleans, Louisiana, USA. pp. 364-371.

We describe a real-time 3-D augmented reality video-conferencing system. With this technology, an observer sees the real world from his viewpoint, but modified so that the image of a remote collaborator is rendered into the scene. We register the image of the collaborator with the world by estimating the 3-D transformation between the camera and a fiducial marker. We describe a novel shape-from-silhouette algorithm, which generates the appropriate view of the collaborator and the associated depth map at 30 fps. When this view is superimposed upon the real world, it gives the strong impression that the collaborator is a real part of the scene. We also demonstrate interaction in virtual environments with a "live" fully 3-D collaborator. Finally, we consider interaction between users in the real world and collaborators in a virtual space, using a "tangible" AR interface.

© All rights reserved Prince et al. and/or ACM Press
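The shape-from-silhouette idea in the abstract above can be illustrated with a toy voxel-carving routine. This is a minimal sketch assuming two orthographic binary silhouettes; the paper's novel algorithm instead works from calibrated perspective cameras and produces views with depth maps at 30 fps.

```python
# Toy shape-from-silhouette (visual hull) via voxel carving.
# Assumes two orthographic views: front[y][x] (looking along z) and
# side[y][z] (looking along x). A voxel survives only if its projection
# lies inside every silhouette. Illustrative simplification only.

def visual_hull(front, side):
    """Return the set of (x, y, z) voxels consistent with both silhouettes."""
    hull = set()
    for y, row in enumerate(front):
        for x, inside_front in enumerate(row):
            if not inside_front:
                continue
            for z, inside_side in enumerate(side[y]):
                if inside_side:
                    hull.add((x, y, z))
    return hull

# 2x2x2 example: carve a small voxel grid from two binary masks.
front = [[1, 1],
         [0, 1]]
side = [[1, 0],
        [1, 1]]
hull = visual_hull(front, side)  # 4 voxels survive the carving
```

Because carving only ever removes voxels, the result is a conservative outer bound on the true shape; concavities invisible in every silhouette cannot be recovered, which is one reason real systems add many camera views.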

 

Cheok, Adrian David, Edmund, Neo Weng Chuen and Eng, Ang Wee (2002): Inexpensive Non-Sensor Based Augmented Reality Modeling of Curves and Surfaces in Physical Space. In: 2002 IEEE and ACM International Symposium on Mixed and Augmented Reality ISMAR 2002 30 September-1 October, 2002, Darmstadt, Germany. pp. 273-274.

 

Cheok, Adrian David, Weihua, Wang, Yang, Xubo, Prince, Simon, Wan, Fong Siew, Billinghurst, Mark and Kato, Hirokazu (2002): Interactive Theatre Experience in Embodied + Wearable Mixed Reality Space. In: 2002 IEEE and ACM International Symposium on Mixed and Augmented Reality ISMAR 2002 30 September-1 October, 2002, Darmstadt, Germany. pp. 59-68.

 

Cheok, Adrian David, Weihua, Wang, Yang, Xubo, Prince, Simon, Wan, Fong Siew, Billinghurst, Mark and Kato, Hirokazu (2002): Interactive Theatre Experience in Embodied + Wearable Mixed Reality Space. In: 2002 IEEE and ACM International Symposium on Mixed and Augmented Reality ISMAR 2002 30 September-1 October, 2002, Darmstadt, Germany. p. 317.

 

Chia, Kar Wee, Cheok, Adrian David and Prince, Simon (2002): Online 6 DOF Augmented Reality Registration from Natural Features. In: 2002 IEEE and ACM International Symposium on Mixed and Augmented Reality ISMAR 2002 30 September-1 October, 2002, Darmstadt, Germany. pp. 305-.

 

Prince, Simon, Cheok, Adrian David, Farbiz, Farzam, Williamson, Todd, Johnson, Nikolas, Billinghurst, Mark and Kato, Hirokazu (2002): 3D Live: Real Time Captured Content for Mixed Reality. In: 2002 IEEE and ACM International Symposium on Mixed and Augmented Reality ISMAR 2002 30 September-1 October, 2002, Darmstadt, Germany. pp. 7-13.

 

Prince, Simon, Cheok, Adrian David, Farbiz, Farzam, Williamson, Todd, Johnson, Nikolas, Billinghurst, Mark and Kato, Hirokazu (2002): 3D Live: Real Time Captured Content for Mixed Reality. In: 2002 IEEE and ACM International Symposium on Mixed and Augmented Reality ISMAR 2002 30 September-1 October, 2002, Darmstadt, Germany. p. 317.

 

Cheok, Adrian David, Yang, Xubo, Zhou, ZhiYing, Billinghurst, Mark and Kato, Hirokazu (2002): Touch-Space: Mixed Reality Game Space Based on Ubiquitous, Tangible, and Social Computing. In Personal and Ubiquitous Computing, 6 (5) pp. 430-442.

 

Billinghurst, Mark, Cheok, Adrian David, Prince, Simon and Kato, Hirokazu (2002): Real World Teleconferencing. In IEEE Computer Graphics and Applications, 22 (6) pp. 11-13.

 

Prince, Simon, Xu, Ke and Cheok, Adrian David (2002): Augmented Reality Camera Tracking with Homographies. In IEEE Computer Graphics and Applications, 22 (6) pp. 39-45.

 
 
 
 
 


Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/adrian_david_cheok.html
