Context-Aware Mobile AR System for Personalization, Selective Sharing, and Interaction of Contents in Ubiquitous Computing Environments*

Youngjung Suh1, Youngmin Park1, Hyoseok Yoon1, Yoonje Chang2, and Woontack Woo1

1 GIST U-VR Lab., 500-712, S.Korea
{ysuh,ypark,hyoon,wwoo}@gist.ac.kr
2 GIST CTRC, 500-712, S.Korea

[email protected]

Abstract. With advances in tracking and increased computing power, mobile AR systems have become popular in daily life. Research on mobile AR technology has emphasized the technical challenges imposed by mobility, but has paid little attention to context-aware services with user-related information annotation, even though in ubiquitous computing environments various contexts of both the user and the environment can be utilized easily and effectively. Moreover, it is difficult to access pervasive but invisible computing resources, and the more features smart appliances acquire, the harder their user interfaces tend to become to use. Thus, in this paper, we propose the Context-aware Mobile Augmented Reality (CAMAR) system. It lets users interact with their smart objects through personalized control interfaces on their mobile AR devices, and it enables contents to be not only personalized but also shared selectively and interactively among user communities.

Keywords: context-aware, mobile AR, personalization, selective sharing.

1 Introduction

Until now, augmented reality (AR) systems have been developed to realize the basic concept of supplementing the real world with virtual objects (or information) to enhance the user's perception of and interaction with the real world [1]. Research on AR technologies has emphasized the technical challenges involved in accurate augmentation, natural interaction, and realistic rendering. With advances in tracking and increased computing power, mobile AR systems are developing rapidly, and studies on mobile AR technologies likewise concentrate on these technical challenges [2].
However, these studies have not considered context-aware services with user-related information annotation, even though in ubiquitous computing environments various contexts of both the user and the environment can be utilized easily and effectively. At the same time, uniform provision of such context information through mobile AR technology can leave the user confused. Thus, bridging the variety of contexts in ubiquitous computing environments and mobile AR technology, several studies have tried to overcome the limitation of providing uniform information to every person [2][3]. aPost-it [4] was proposed as a context-based information augmentation and sharing system. However, it personalized only media contents, through a webpage obtained from an object; it did not deal with personalization in controlling smart objects. In addition, for selective sharing of media contents, users had to explicitly inform the system of their sharing intentions through user interfaces on a web page. Yet it is difficult to access pervasive but invisible computing resources, and the more features smart appliances acquire, the harder their user interfaces tend to become to use. Moreover, it is not convenient to explicitly request selective sharing of contents. Thus, in this paper, we propose the Context-aware Mobile Augmented Reality (CAMAR) system. It enables contents to be not only personalized but also shared selectively and interactively among user communities, based on the mobile user's profile and context in ubiquitous computing environments.

* This research is supported by the Ubiquitous Computing and Network (UCN) Project, the Ministry of Information and Communication (MIC) 21st Century Frontier R&D Program in Korea.

J. Jacko (Ed.): Human-Computer Interaction, Part II, HCII 2007, LNCS 4551, pp. 966–974, 2007. © Springer-Verlag Berlin Heidelberg 2007
Moreover, users can control various kinds of smart objects in the environment in their own personalized manner, simply by taking a picture of a smart object with the camera of a mobile AR device. Thus, by bridging the variety of contexts in ubiquitous computing environments and mobile AR technologies, the proposed system overcomes the limitation of existing information technologies that provide uniform information to every person. Applicable areas of the proposed system include mobile-AR-enabled applications such as a meeting system that supports information augmentation of the real environment and collaboration, a universal remote controller for various kinds of smart objects, and a mobile service agent that utilizes the user's location and activity to diversify and expand mobile AR-based services.

This paper is organized as follows. In Section 2, an overview of the CAMAR system is given. We describe applicable scenarios in Section 3. Section 4 deals with implementation. Finally, we conclude our work and briefly outline remaining work in Section 5.

2 System Overview

The Context-aware Mobile AR system supports personalization, selective sharing, and mobile AR interaction-based collaboration. It supports user-centric monitoring and control of smart services as well as personalized contents in ubiquitous computing environments. A personalized UI for controlling smart services lets the AR system provide an intuitive, transparent interface that allows the user to concentrate on the original task. It also presents private information to individuals without fear that others will see it. As for personalized contents, it allows personalized responsive content to be augmented. It constructs group or community spaces based on common interests among users in ubiquitous computing environments. Based on the
constructed communities, selective contents sharing and collaboration are supported; specifically, the system enables common knowledge and experience of contents to be shared interactively and collaboratively among user communities. Fig. 1 shows the concept diagram of the CAMAR system supporting personalization, selective sharing, and mobile AR interaction-based collaboration in an environment equipped with a smart TV, window, table, etc.

Fig. 1. The concept diagram for CAMAR

3 Applicable Scenarios

As a test-bed for contents, we model a rich body of knowledge containing many different types of information on a specific domain, the Unju temple. Our test data thus includes cultural heritage information, legendary characters, photos, geographic information, and detailed 3D models of places, characters, and items [7][8]. Fig. 2 shows a conceptual diagram of the scenarios applicable to the CAMAR system.

Fig. 2. Visualization of overall scenarios applicable to CAMAR

3.1 Personalized Smart Object Control

"We can control, in a personalized manner, smart objects of which we take pictures with the camera of a mobile AR device."

When a user takes a picture of a specific smart object with the camera of a mobile AR device, a controller personalized to the captured smart object is augmented on the device: the ubiTV Controller, ARTable Controller, MRWindow Controller, or Light Controller [9][10].

Mr. Kang comes home from work. He becomes aware that his wife, Ms. Lee, has gone to a market in the neighborhood. As soon as he enters the living room, the lighting service prepares the green lighting he usually prefers. He changes the color of the lighting to blue, since the weather is hot and he wants to make the room feel refreshing. Then he approaches the ubiTV to watch TV. The ubiTV recognizes that Mr. Kang is approaching and provides recommendations based on his usual preferences: only media contents with a high degree of similarity to his preferences are recommended. Mr. Kang selects a sports news channel using the ubiTV Controller on his PDA and begins to watch TV. Ms. Lee comes back home from the market. After she confirms that Mr. Kang has picked a sports channel, she decides to pay a visit to the virtual Unju temple through her PDA before watching TV. She moves to the MRWindow. First, she checks the available services via service discovery on her PDA. Then she is about to enjoy an interesting service found in the service list. Unfortunately, she has forgotten how to use the MRWindow services, since she has hardly ever used them. She gets a chance to review how to use them through sound guidance from her PDA. She begins to control the MRWindow services and navigates the virtual Unju temple. After she has enjoyed the virtual navigation to the full, she comes to the ubiTV, because she notices that the sports news program Mr. Kang was watching is over. At this time, the MRWindow displays a painting she normally prefers. When she is in front of the ubiTV, an authority transfer button appears in Mr. Kang's ubiTV Controller on his PDA. Mr. Kang transfers his authority over the ubiTV to Ms. Lee. The ubiTV recognizes that Ms. Lee has received the authority from Mr. Kang and provides recommendations based on her usual preferences: only media contents with a high degree of similarity to her preferences are recommended. Ms. Lee selects a cooking channel using the ubiTV Controller on her PDA and begins to watch TV. After Ms. Lee moves to the dining room, the ubiTV and Light services are automatically turned off. Fig. 3 shows a visualization of the scenario for the Personalized Smart Object Controller in ubiquitous computing environments.

3.2 Context-Based Contents Augmentation and Sharing

1) Mr.
Kang Got Back from Visiting Unju Temple Last Week. Mr. Kang got back from visiting Unju temple with his family last week. He really loves photography, so he took pictures in the precincts of Unju temple. The photos are indexed spatially through the GPS receiver in the mobile device used to take them, so that Mr. Kang can later know where a certain photo was shot.

Fig. 3. Visualization of the scenario applicable to the Personalized Smart Object Controller in ubiquitous computing environments

Mr. Kang decides to recommend visiting Unju temple to his friend, and invites Mr. Kim over to show him the pictures he took there. A week has gone by, so it is difficult for Mr. Kang to remember which scenes he photographed at which places and when. Mr. Kang would like to revive his memory of Unju temple through the ARTable and navigate a road in the virtual Unju temple. He recognizes at a single glance the places where he took the pictures through the map displayed on the ARTable. Mr. Kang moves his UMPC to a specific region on the map to see again the pictures he took in that region, and enjoys the pictures augmented on the UMPC screen. As an optional service, he can recapture the feeling of the visit by searching all over the region with an enlarged version of the pictures.

2) Mr. Kim Visits Mr. Kang's Home. Mr. Kim has never visited Unju temple. He decides to visit the virtual Unju temple through the ARTable, having heard from Mr. Kang that Unju temple gives an unusual impression unlike other temples. The places where Mr. Kang took pictures are indicated by markers displayed on the map on the ARTable. "Wabul" is indicated on the map by a distinguished marker. Mr. Kim decides to look into that place by moving his UMPC to it. He finds the place impressive and gains a good understanding of it, and he becomes interested in Unju temple with the help of sound guidance on the tale of the place. Immediately he feels like visiting the "Chilsung" and "Gongsa" rocks in Unju temple as well. He makes a definite promise to visit Unju temple and have a novel experience with his family the next week. Fig. 4 shows a visualization of the scenario for the context-based contents augmentation and sharing service.

Fig. 4. Visualization of the scenario applicable to the context-based contents augmentation and sharing service

4 Implementations

We present the Context-aware Mobile Augmented Reality (CAMAR) system, which supports two main functionalities. One is controlling smart objects with mobile AR devices. The other is enabling contents to be not only personalized but also shared selectively and interactively among user communities. Fig. 5 shows the overall system block diagram: the mobile device (UMPC) hosts the mobile UI and two applications, app.(1) Smart Object AR Controller and app.(2) Contents Augmentation and Sharing; the AR server handles contents personalization and management, user profile management, community management, multiple users' collision management, object tracking (object ID and pose), user access detection, and offline camera calibration; and the smart objects include the ubiTV, ARTable, MRWindow, and ubiLight.

Fig. 5. CAMAR System block diagram

4.1 Smart Object Control: Controlling Smart Objects with a Mobile AR Device

The personalized smart object controller system lets users interact with their smart objects through personalized control interfaces on their mobile AR devices. Most mobile devices interact with only one user, making it easy for the mobile device to provide personalized interfaces.
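The generation of a personalized controller from a photographed smart object can be illustrated with a minimal sketch. The data model below (function kinds, widget names, profile fields) is a hypothetical assumption for illustration, not the paper's actual implementation: the smart object advertises an abstract functional description, and the user's profile selects preferred widgets and favorite values.

```python
# Hypothetical sketch of CAMAR-style personalized controller generation.
# The descriptions, profile fields, and widget names are illustrative
# assumptions, not the paper's actual data model.

from dataclasses import dataclass, field

@dataclass
class FunctionDescription:
    """One function advertised in a smart object's abstract description."""
    name: str                      # e.g. "power", "channel", "volume"
    kind: str                      # "toggle", "choice", or "range"
    options: list = field(default_factory=list)

@dataclass
class UserProfile:
    user_id: str
    preferred_widgets: dict = field(default_factory=dict)  # function -> widget
    favorites: dict = field(default_factory=dict)          # function -> values

def build_controller(functions, profile):
    """Map each advertised function to a UI widget, personalized by profile."""
    defaults = {"toggle": "button", "choice": "list", "range": "slider"}
    widgets = []
    for f in functions:
        # The user's preferred widget wins over the generic default.
        widget = profile.preferred_widgets.get(f.name, defaults[f.kind])
        # Put the user's favorite values first so they are one tap away.
        favs = profile.favorites.get(f.name, [])
        options = favs + [o for o in f.options if o not in favs]
        widgets.append({"function": f.name, "widget": widget, "options": options})
    return widgets

# Usage: a ubiTV-like object advertises its functions after being photographed.
tv_functions = [
    FunctionDescription("power", "toggle"),
    FunctionDescription("channel", "choice", ["news", "sports", "cooking"]),
]
kang = UserProfile("kang", preferred_widgets={"channel": "grid"},
                   favorites={"channel": ["sports"]})
ui = build_controller(tv_functions, kang)
```

Under these assumptions, Mr. Kang's controller would render the channel chooser as his preferred grid widget with the sports channel listed first, while another user photographing the same ubiTV would get a differently arranged interface from the same functional description.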
For example, a mobile AR device provides interfaces that are consistent with the smart object interfaces the user is familiar with or prefers. We have designed and implemented a personalized smart object controller that lets users control smart objects in their environment through a personalized user interface. When a user wants to control a smart object, the user only has to take a picture of it with the camera attached to the mobile AR device. The controller device then obtains a context containing an abstract functional description from the smart object and uses that description to generate an appropriate interface. We explored developing graphical user interfaces on a PDA. The personalized controllers for smart objects augmented on mobile AR devices are the ubiTV Controller, MRWindow Controller, Light Controller, and ARTable Controller [5][6]; some of them are shown in Fig. 6.

Fig. 6. Smart Object Control with AR Controllers

For prototyping, the AR Controller is implemented on both PDA and UMPC platforms. The PDA platform has the advantage of being relatively smaller, cheaper, and highly portable, whereas the UMPC performs better at the image processing required for pattern matching. For research purposes, we used both platforms interchangeably to develop components compatible with PDA and UMPC alike. Fig. 7 shows the two versions of the AR Controller.

Fig. 7. AR Controller on (a) the UMPC platform and (b) the PDA platform

4.2 Context-Based Contents Augmentation and Sharing

The context-based contents augmentation and sharing system lets users experience personalized contents and share them with others who have common interests. To demonstrate contents augmentation and sharing with mobile AR technology, we implemented an edutainment system that augments and shares photos at a site.
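The spatial photo indexing behind this edutainment system can be sketched as follows. The function names, the 50 m radius, the coordinates, and the record layout are all illustrative assumptions: each photo is tagged with a GPS fix and capture time, and looking at a map marker retrieves the photos shot near the place that marker denotes.

```python
# Illustrative sketch of GPS-indexed photo retrieval as described in Sect. 4.2.
# Record layout, radius, and coordinates are assumptions for illustration.

import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def photos_near(photos, marker, radius_m=50.0):
    """Return photos shot within radius_m of the place a map marker denotes."""
    return [p for p in photos
            if haversine_m(p["lat"], p["lon"],
                           marker["lat"], marker["lon"]) <= radius_m]

# Usage: a small photo store on a UMPC; one photo was shot near the
# "Wabul" marker, the other elsewhere on the site (coordinates invented).
photos = [
    {"file": "wabul.jpg", "lat": 34.9650, "lon": 126.9584, "time": "10:31"},
    {"file": "gate.jpg",  "lat": 34.9700, "lon": 126.9700, "time": "09:05"},
]
wabul_marker = {"name": "Wabul", "lat": 34.9651, "lon": 126.9584}
hits = photos_near(photos, wabul_marker)
```

In this sketch, pointing the UMPC at the "Wabul" marker would augment only the photo taken near that spot; the same query run against another user's photo store would yield that user's own pictures of the place, which is what makes per-user augmentation from shared markers possible.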
When we explore a cultural site, we tend to take pictures of cultural assets and save them on our mobile device to treasure the memory or experience. The context information we can acquire in this respect includes the time at which we photographed the cultural assets and the locations of the impressive places we photographed. Personal context information such as time and location is managed on an individual mobile device such as a UMPC. We made the system show different contents to two users depending on whether a user has previously visited Unju temple [7][8]. Fig. 8 shows the system augmenting personalized photo contents and making them shareable within a specific site on the ARTable [6]. Suppose a user has been to Unju temple; then the pictures he took at several places there are saved on his UMPC. When the user looks at a specific AR marker on a map on the ARTable with his UMPC in hand, the pictures he took at that place are augmented. The user can flick through and enlarge the pictures one by one by clicking buttons.

Fig. 8. Context-based AR contents augmentation and sharing

For selective sharing, our system interprets a user's content preferences and provides personalized contents. Additionally, the system generates and manages group context by extracting common preferences after analyzing multiple users' integrated contexts and relationships. By managing group context, it selectively allows users who have interests in common to share contents.

5 Conclusion and Future Work

Our work demonstrates the feasibility of personalized smart object AR controllers as well as context-based contents augmentation and selective sharing in ubiquitous computing environments.
The system enables contents to be not only personalized but also shared selectively and interactively among user communities, based on the mobile user's profile and context. Moreover, users can control various kinds of smart objects in the environment in their own personalized manner, simply by taking a picture of a smart object with the camera of a mobile AR device. Thus, by bridging the variety of contexts in ubiquitous computing environments and mobile AR technologies, the proposed system overcomes the limitation of existing information technologies that provide uniform information to every person. We plan to conduct user studies to see how interesting our system is to users and whether it is accepted as novel compared with existing similar systems. Considering the compatibility of the embodied technology and contents, we will try to find contents that support concretizing the user's experience and sharing something meaningful with family through a CAMAR device. Also, to increase satisfaction with the embodied technology, we need to demonstrate the possibility of controlling, with a CAMAR device, the various smart appliances that users consider most useful.

References

1. Höllerer, T., Feiner, S., Pavlik, J.: Situated Documentaries: Embedding Multimedia Presentations in the Real World. In: Proc. ISWC '99 (3rd Int. Symp. on Wearable Computers), San Francisco, CA, October 18-19, pp. 79–86 (1999)
2. Henrysson, A., Ollila, M.: UMAR - Ubiquitous Mobile Augmented Reality. In: Proc. 3rd Int. Conf. on Mobile and Ubiquitous Multimedia (MUM 2004), Washington, USA (2004)
3. Reitmayr, G., Schmalstieg, D.: Collaborative Augmented Reality for Outdoor Navigation and Information Browsing. In: Proc. Symposium Location Based Services and TeleCartography (2004)
4. Oh, Y., Lee, M., Jung, S., Woo, W.: Dynamic Contents Provision of Context-based Information Augmentation & Sharing System. In: ACM/IEEE ICAT 2004, pp. 594–597 (2004)
5. Oh, Y., Shin, C., Jung, W., Woo, W.: The ubiTV Application for a Family in ubiHome. In: 2nd Ubiquitous Home Workshop, pp. 23–32 (2005)
6. Park, Y., Woo, W.: The ARTable: An AR-based Tangible User Interface System. In: Pan, Z., Aylett, R., Diener, H., Jin, X., Göbel, S., Li, L. (eds.) LNCS, vol. 3942, pp. 1198–1207. Springer, Heidelberg (2006)
7. Lee, Y., Oh, S., Lee, B., Park, J., Park, Y., Oh, Y., Lee, S., Oh, H., Ryu, J., Lee, K.H., Kim, H., Lee, Y., Kim, J., Ho, Y., Woo, W.: Responsive Multimedia System for Context-based Storytelling. In: LNCS (PCM), vol. 3767, pp. 361–372 (2005)
8. Lee, Y., Oh, S., Woo, W.: A Context-based Storytelling with a Responsive Multimedia System (RMS). In: LNCS (ICVS), vol. 3805, pp. 12–21 (2005)
9. Oh, Y., Shin, C., Jung, W., Woo, W.: The ubiTV Application for a Family in ubiHome. In: 2nd Ubiquitous Home Workshop, pp. 23–32 (2005)
10. Park, Y., Woo, W.: The ARTable: An AR-based Tangible User Interface System. In: LNCS, vol. 3942, pp. 1198–1207 (2006)