Developing Novel Wireless Sensing Modalities to Aid Robot Automation
The idea is to readily integrate WiFi as an alternative to GPS for indoor robot navigation. We have already demonstrated the use of WiFi for robot navigation, as shown above, where the robot's movement over three timestamps is shown in the figure at the center, and we have also designed a toolbox that readily integrates WiFi into existing SLAM algorithms.
2022
P2SLAM: Bearing-Based WiFi SLAM for Indoor Robots
A recent surge of interest in indoor robotics has increased the importance of robust simultaneous localization and mapping (SLAM) algorithms in indoor scenarios. This robustness is typically provided by using multiple sensors that can correct each other's deficiencies. In this vein, exteroceptive sensors like cameras and LiDARs, employed for fusion, are capable of correcting the drift accumulated by wheel odometry or inertial measurement units (IMUs). However, these exteroceptive sensors are deficient in highly structured environments and dynamic lighting conditions. This letter presents WiFi as a robust and straightforward sensing modality capable of circumventing these issues. Specifically, we make three contributions. First, we identify the necessary features to extract from WiFi signals. Second, we characterize the quality of these measurements. Third, we integrate these features with odometry into a state-of-the-art GraphSLAM backend. We present our results in 25×30 m and 50×40 m environments and robustly test the system by driving the robot a cumulative distance of over 1225 m in these two environments. We show an improvement of at least 6× compared to odometry-only estimation and perform on par with a state-of-the-art visual SLAM system.
@article{arun2022p2slam,
  title={{P2SLAM}: Bearing-based {WiFi} {SLAM} for indoor robots},
  author={Arun, Aditya and Ayyalasomayajula, Roshan and Hunter, William and Bharadia, Dinesh},
  journal={IEEE Robotics and Automation Letters},
  volume={7},
  number={2},
  pages={3326--3333},
  year={2022},
  publisher={IEEE},
  url={https://ieeexplore.ieee.org/abstract/document/9691786},
}
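The bearing features P2SLAM builds on come from angle-of-arrival estimation over WiFi channel phase. A minimal sketch of the underlying plane-wave model (this is not the paper's actual pipeline; the carrier frequency, antenna spacing, and function names here are illustrative assumptions):

```python
import numpy as np

# Illustrative 5 GHz WiFi parameters (assumptions, not from the paper)
WAVELENGTH = 3e8 / 5e9      # ~0.06 m carrier wavelength
SPACING = WAVELENGTH / 2    # half-wavelength antenna separation

def phase_for_bearing(theta, d=SPACING, lam=WAVELENGTH):
    """Forward model: inter-antenna phase difference induced by a plane
    wave arriving at bearing theta (radians): 2*pi*d*sin(theta)/lam."""
    return 2 * np.pi * d * np.sin(theta) / lam

def bearing_from_phase(delta_phi, d=SPACING, lam=WAVELENGTH):
    """Invert the model: estimate the angle of arrival (radians) from the
    measured phase difference between two antennas."""
    return np.arcsin(delta_phi * lam / (2 * np.pi * d))

# A source at 30 degrees induces a pi/2 phase difference at half-wavelength
# spacing, and inverting the model recovers the bearing.
theta_true = np.deg2rad(30.0)
theta_est = bearing_from_phase(phase_for_bearing(theta_true))
print(np.rad2deg(theta_est))  # approximately 30 degrees
```

In a real system the phase difference is extracted from channel state information across all antenna pairs and subcarriers; bearings like `theta_est` are then what a GraphSLAM backend can consume as robot-to-AP measurement factors.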
ViWiD: Leveraging WiFi for Robust and Resource-Efficient SLAM
Recent interest in autonomous navigation and exploration robots for indoor applications has spurred research into indoor Simultaneous Localization and Mapping (SLAM) systems. While most of these SLAM systems use visual and LiDAR sensors in tandem with an odometry sensor, these odometry sensors drift over time. To combat this drift, visual SLAM systems deploy compute- and memory-intensive search algorithms to detect ‘loop closures’, which make the trajectory estimate globally consistent. To circumvent these resource-intensive algorithms, we present ViWiD, which integrates WiFi and visual sensors in a dual-layered system. This dual-layered approach separates the tasks of local and global trajectory estimation, making ViWiD resource-efficient while achieving performance on par with or better than state-of-the-art visual SLAM. We demonstrate ViWiD’s performance on four datasets, covering over 1500 m of traversed path, and show 4.3× and 4× reductions in compute and memory consumption respectively compared to state-of-the-art visual and LiDAR SLAM systems, with on-par SLAM performance.
@article{arun2022viwid,
  title={ViWiD: Leveraging WiFi for Robust and Resource-Efficient SLAM},
  author={Arun, Aditya and Hunter, William and Ayyalasomayajula, Roshan and Bharadia, Dinesh},
  journal={arXiv preprint arXiv:2209.08091},
  year={2022},
  url={https://arxiv.org/pdf/2209.08091},
}
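The dual-layered idea can be illustrated with a toy estimator (this is a sketch of the concept, not ViWiD's actual architecture; the class and method names are invented for illustration): a local layer cheaply dead-reckons high-rate odometry, while a global layer applies a correction whenever a low-rate WiFi-derived position fix arrives, standing in for loop-closure search.

```python
import numpy as np

class DualLayerEstimator:
    """Toy two-layer trajectory estimator: a local layer integrates odometry
    (fast but drifting), and a global layer anchors it with WiFi fixes."""

    def __init__(self):
        self.local = np.zeros(2)        # dead-reckoned (x, y), drifts over time
        self.correction = np.zeros(2)   # global-layer offset from WiFi fixes

    def step_odometry(self, dxy):
        # Local layer: cheap, high-rate integration of odometry increments.
        self.local = self.local + np.asarray(dxy, dtype=float)

    def wifi_fix(self, global_xy):
        # Global layer: a low-rate WiFi position estimate re-anchors the
        # trajectory, playing the role of a loop closure.
        self.correction = np.asarray(global_xy, dtype=float) - self.local

    @property
    def pose(self):
        return self.local + self.correction

est = DualLayerEstimator()
for _ in range(10):
    est.step_odometry([1.0, 0.1])  # biased odometry: drifts +0.1 m/step in y
est.wifi_fix([10.0, 0.0])          # WiFi says we are actually at (10, 0)
print(est.pose)                    # drift removed at the fix: pose ~ (10, 0)
```

The separation is what makes the approach resource-efficient: the local layer never searches for correspondences, and the global layer only runs when a WiFi measurement arrives.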
2020
LocAP: Autonomous millimeter accurate mapping of WiFi infrastructure
Indoor localization has been studied for nearly two decades, fueled by wide interest in indoor navigation, and has achieved the necessary decimeter-level accuracy. However, there are no real-world deployments of WiFi-based user localization algorithms, primarily because these algorithms are triangulation-based and therefore assume knowledge of the access points' locations, antenna geometries, and deployment orientations in the physical map. In the real world, such detailed knowledge of the access points' location attributes is seldom available, making WiFi localization hard to deploy. In this paper, for the first time, we establish the accuracy requirements for the location attributes of access points needed to achieve decimeter-level user localization accuracy. Surprisingly, these requirements are very stringent, demanding millimeter-level accuracy for antenna geometries and sub-10-degree accuracy for deployment orientation, which is hard to achieve with manual effort. To ease the deployment of real-world WiFi localization, we present LocAP, an autonomous system that physically maps the environment and accurately locates the attributes of existing infrastructure APs in physical space, down to the required stringent median errors of 3 mm in antenna separation and 3 degrees in deployment orientation, whereas the state of the art reports 150 mm and 25 degrees respectively.
@inproceedings{ayyalasomayajula2020locap,
  title={LocAP: Autonomous millimeter accurate mapping of WiFi infrastructure},
  author={Ayyalasomayajula, Roshan and Arun, Aditya and Wu, Chenfeng and Rajagopalan, Shrivatsan and Ganesaraman, Shreya and Seetharaman, Aravind and Jain, Ish Kumar and Bharadia, Dinesh},
  booktitle={17th USENIX Symposium on Networked Systems Design and Implementation (NSDI 20)},
  pages={1115--1129},
  year={2020},
  url={https://www.usenix.org/conference/nsdi20/presentation/ayyalasomayajula},
}
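Why a few millimeters of antenna-separation error matter can be sanity-checked with the plane-wave phase model: if the true spacing is d but the localization system inverts the model with d + 3 mm, the recovered bearing shifts by a few degrees, which at room-scale ranges translates into decimeters of user position error. A small numeric check (the 5 GHz wavelength and half-wavelength spacing are illustrative assumptions, not LocAP's measured values):

```python
import numpy as np

LAM = 3e8 / 5e9              # ~0.06 m wavelength at 5 GHz (illustrative)
D_TRUE = LAM / 2             # true antenna spacing: 30 mm
D_ASSUMED = D_TRUE + 0.003   # spacing as (mis)measured, 3 mm off

def aoa_error_deg(theta_deg):
    """Bearing error caused by inverting the phase model with wrong spacing."""
    theta = np.deg2rad(theta_deg)
    # Actual phase difference produced by the true geometry.
    delta_phi = 2 * np.pi * D_TRUE * np.sin(theta) / LAM
    # Bearing estimated under the erroneous spacing assumption.
    theta_est = np.arcsin(delta_phi * LAM / (2 * np.pi * D_ASSUMED))
    return abs(np.rad2deg(theta_est) - theta_deg)

print(aoa_error_deg(30.0))   # roughly 3 degrees of bearing error
```

A bearing error of roughly 3 degrees corresponds to about half a meter of lateral error at a 10 m range, which is why millimeter-accurate mapping of AP antenna geometry is needed before triangulation-based WiFi localization can deliver decimeter-level accuracy.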