Point clouds (PCs)—sets of three-dimensional (3D) data points and their attributes collectively representing an object or scene—play a crucial role in distributed autonomous systems, enabling applications such as augmented reality, autonomous vehicles, and environmental monitoring. These applications rely on remote sensors to capture PCs and wirelessly transmit them to edge servers for downstream tasks, such as registration, i.e., aligning multiple PCs within the same 3D coordinate system. Current approaches treat PC transmission and registration as separate modules, leading to suboptimal latency and registration accuracy. This project takes a joint design approach that integrates wireless communication, computer vision, and machine learning to optimize PC transmission from sensors to an edge server for remote registration. We start with a single-sensor scenario, using registration loss to guide the joint training of both transmission and registration components. We then extend to multiple coordinated sensors, addressing inter-sensor interference and signal separation. Finally, we tackle a swarm of uncoordinated sensors, developing scalable transmission schemes that handle sporadic transmissions without requiring known sensor identities. In each scenario, we will develop novel techniques that enable efficient, low-latency, and reliable PC transmission and registration. The results will enhance spatial awareness in PC-reliant applications, advancing autonomous systems’ capabilities to perceive, analyze, and interact with their surroundings in real time. The project leverages the complementary expertise of the PI and co-PI, as well as our excellent research environment and strong connections with industrial and academic partners.
Duration: 2025-2030
Funding: XX
Research leaders: Khac-Hoang Ngo, Per-Erik Forssén
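To illustrate the registration task introduced above—aligning two PCs in the same 3D coordinate system—here is a minimal sketch of rigid registration with known point correspondences, using the classical Kabsch (orthogonal Procrustes) algorithm. This is a textbook baseline for illustration only, not the learned registration component proposed in the project; all function and variable names are hypothetical.

```python
import numpy as np

def register_point_clouds(src, dst):
    """Estimate the rotation R and translation t such that dst ~ src @ R.T + t,
    given two (N, 3) point clouds with known one-to-one correspondences."""
    # Center both clouds at their centroids
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    S, D = src - mu_s, dst - mu_d
    # SVD of the 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(S.T @ D)
    # Sign correction keeps R a proper rotation (det(R) = +1, no reflection)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Usage: recover a known rigid transform from synthetic data
rng = np.random.default_rng(0)
src = rng.standard_normal((100, 3))
angle = np.pi / 6
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
dst = src @ R_true.T + t_true
R, t = register_point_clouds(src, dst)
assert np.allclose(R, R_true) and np.allclose(t, t_true)
```

In the project's remote-registration setting, the correspondences are unknown and the PCs arrive over a noisy wireless channel, which is why a learned, jointly trained pipeline is pursued instead of this closed-form baseline.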