TY - JOUR
T1 - Enabling lightweight immersive user interaction in smart buildings through learning-based mobile panorama streaming
AU - Xu, Chi
AU - Li, Zhengzhe
AU - Huai, Guo Qing
AU - Zhao, Jia
AU - Zhu, Yifei
AU - Ma, Xiaoqiang
AU - Wang, Haiyang
N1 - Publisher Copyright:
© 2024 Elsevier B.V.
PY - 2024/6
Y1 - 2024/6
N2 - Smart buildings integrate users’ digitized wearables with their physical surroundings, creating a seamless and interactive user experience. This is achieved through the use of multiple sensors, video streaming, artificial intelligence, and edge computing. These technologies gather extensive data and provide users with a wide range of applications, such as 3D audio/video in AR/VR, localization, virtual tours, and vigilant monitoring. Nevertheless, current AR/VR devices face limitations due to the bulkiness and discomfort of the hardware used for on-body sensing, such as headsets and specialized glasses. These components often become uncomfortable during prolonged use, posing a challenge for creating an immersive system that combines lightweight interaction with high-quality presentation. This paper presents a comprehensive system designed to enable immersive interaction in smart buildings with a focus on lightweight solutions. The system consists of the following components: (1) a lightweight panoramic imaging framework that addresses challenges related to hardware size and functionality; (2) a learning-based video transcoding cost prediction framework for efficient load balancing; and (3) a layered networking architecture designed to facilitate high-quality mobile panorama live streaming. Collectively, these components offer lightweight interaction paired with enhanced presentation quality. Our experimental results demonstrate the effectiveness of the system design, showcasing its seamless operation across different times, geographical locations, and heterogeneous wireless networks.
AB - Smart buildings integrate users’ digitized wearables with their physical surroundings, creating a seamless and interactive user experience. This is achieved through the use of multiple sensors, video streaming, artificial intelligence, and edge computing. These technologies gather extensive data and provide users with a wide range of applications, such as 3D audio/video in AR/VR, localization, virtual tours, and vigilant monitoring. Nevertheless, current AR/VR devices face limitations due to the bulkiness and discomfort of the hardware used for on-body sensing, such as headsets and specialized glasses. These components often become uncomfortable during prolonged use, posing a challenge for creating an immersive system that combines lightweight interaction with high-quality presentation. This paper presents a comprehensive system designed to enable immersive interaction in smart buildings with a focus on lightweight solutions. The system consists of the following components: (1) a lightweight panoramic imaging framework that addresses challenges related to hardware size and functionality; (2) a learning-based video transcoding cost prediction framework for efficient load balancing; and (3) a layered networking architecture designed to facilitate high-quality mobile panorama live streaming. Collectively, these components offer lightweight interaction paired with enhanced presentation quality. Our experimental results demonstrate the effectiveness of the system design, showcasing its seamless operation across different times, geographical locations, and heterogeneous wireless networks.
KW - Artificial intelligence
KW - Interactive systems
KW - Mobile video
KW - Smart devices
KW - Streaming media
UR - http://www.scopus.com/inward/record.url?scp=85191657322&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85191657322&partnerID=8YFLogxK
U2 - 10.1016/j.comcom.2024.04.002
DO - 10.1016/j.comcom.2024.04.002
M3 - Article
AN - SCOPUS:85191657322
SN - 0140-3664
VL - 222
SP - 68
EP - 76
JO - Computer Communications
JF - Computer Communications
ER -