
Campus Autonomy: Navigating the Future with Autonomous Indoor-Outdoor Delivery Vehicles

Yongqi Zhang (zhangyq12023@shanghaitech.edu.cn) & Jiajie Zhang (zhangjj2023@shanghaitech.edu.cn)

Video

Abstract

The "Campus Autonomy" project focuses on developing an autonomous delivery vehicle capable of navigating both indoor and outdoor environments within a campus setting. By assembling an Agile X HUNTER SE Ackermann model drive vehicle equipped with advanced sensors like Lidar and panoramic camera, the project aims to address the complex challenges of autonomous localization, path planning and navigation. A significant component of this initiative involves migrating existing software from ROS1 to ROS2, leveraging ROS2's enhanced features for better performance and reliability. The project's objective is to create a versatile and efficient delivery system that can autonomously transport meals and parcels across campus, showcasing the practical application and integration of cutting-edge robotic technologies in a real-world environment.

Introduction

The integration of autonomous mobile robots into daily life promises significant advancements in efficiency and convenience, especially in the context of delivery services within campus environments. "Campus Autonomy" aims to harness these technologies to create an autonomous vehicle that can navigate the complexities of both indoor and outdoor spaces for the purpose of delivering meals and parcels. The importance of localization, navigation, and path planning in the development of such autonomous systems cannot be overstated, as they form the backbone of a robot's ability to understand its surroundings, determine its own position, and chart a course to its destination while avoiding obstacles.

[Figure: Agile X HUNTER SE vehicle]

To realize this vision, the project employs an Agile X HUNTER SE vehicle, chosen for its agility and adaptability, outfitted with state-of-the-art sensors including the Hesai PandarQT64 Lidar and Insta360 Air panoramic camera. These technologies provide the vehicle with the detailed environmental data necessary for precise localization and obstacle avoidance, key components in ensuring the safety and efficiency of the delivery service.

[Figure: Hesai PandarQT64 Lidar and Insta360 Air panoramic camera]

One of the project's core challenges is the migration of its software foundation from ROS1 to ROS2. This transition is crucial for harnessing the improved communication, real-time processing, and security features of ROS2, which are vital for autonomous operations in potentially crowded and unpredictable campus environments.
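To give a sense of what the migration target looks like, the sketch below is a minimal ROS2 (rclpy) node of the kind each existing ROS1 (rospy) node is ported to. The topic, message type, and publishing rate are illustrative placeholders, not the project's actual interfaces.

```python
# Minimal rclpy node: a rough ROS2 counterpart of a simple rospy publisher.
# Topic name, message type, and rate are illustrative placeholders.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist


class CmdVelPublisher(Node):
    def __init__(self):
        super().__init__('cmd_vel_publisher')
        # ROS2 publishers take explicit QoS settings (here: queue depth 10),
        # part of the improved communication model the migration targets.
        self.pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.timer = self.create_timer(0.1, self.tick)  # 10 Hz

    def tick(self):
        msg = Twist()
        msg.linear.x = 0.5  # placeholder forward velocity in m/s
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = CmdVelPublisher()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
```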

System Description

The "Campus Autonomy" initiative is conceptualized to deliver an innovative autonomous navigation solution, specifically designed for efficient and reliable package delivery across both indoor and outdoor campus environments. Central to this initiative is the Agile X HUNTER SE vehicle, meticulously selected for its adaptability and robustness, enabling it to navigate the intricate landscapes of academic institutions. This vehicle is equipped with an array of sophisticated sensors, notably the Hesai PandarQT64 Lidar and the Insta 360 Air panoramic camera, which together provide comprehensive environmental awareness. The integration of an odometer and an IMU further enhances the system's capability to track its movement and orientation with high precision.

[Figure: the assembled delivery robot]

In the realm of software, the project is anchored in the ROS2 framework, chosen for its advanced communication features, security enhancements, and support for real-time operations. Utilizing ROS2's modular design, the system incorporates the Navigation2 package for advanced path planning and navigation, and Cartographer for efficient SLAM. This fusion of technologies enables the vehicle to dynamically map its surroundings while navigating to deliver packages, adjusting its path in response to environmental changes and obstacles.
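As a concrete illustration of how this stack is driven, the sketch below sends a single delivery goal through Nav2's Python interface (nav2_simple_commander). The map-frame coordinates are placeholders; actual goal poses depend on the map built with Cartographer.

```python
# Send one navigation goal through Nav2's simple commander API.
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator, TaskResult

rclpy.init()
navigator = BasicNavigator()
# By default this also waits for AMCL; pass localizer=... if localization
# is provided by another node (e.g. Cartographer) instead.
navigator.waitUntilNav2Active()

# Placeholder goal pose in the map frame (e.g. a dormitory entrance).
goal = PoseStamped()
goal.header.frame_id = 'map'
goal.header.stamp = navigator.get_clock().now().to_msg()
goal.pose.position.x = 12.0
goal.pose.position.y = -3.5
goal.pose.orientation.w = 1.0

navigator.goToPose(goal)
while not navigator.isTaskComplete():
    feedback = navigator.getFeedback()  # e.g. feedback.distance_remaining

if navigator.getResult() == TaskResult.SUCCEEDED:
    print('Delivery goal reached')
navigator.lifecycleShutdown()
```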

[Figure: overall system architecture]

A distinctive feature of our system is the use of behavior trees within Navigation2, enabling the customization of navigation strategies to suit varied delivery requirements. This flexible approach allows for the potential inclusion of additional functionalities, such as object tracking or comprehensive area scanning, in future iterations of the project.
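Continuing the sketch above, Nav2 allows a behavior tree to be supplied per navigation request (a default tree can also be set on the bt_navigator node), which is how such custom delivery behaviors would be plugged in. The XML path below is hypothetical.

```python
# Hypothetical custom behavior tree for a delivery run; the XML file itself
# would define the recovery, waiting, or scanning behaviors around navigation.
delivery_bt = '/home/robot/bt/campus_delivery_nav_to_pose.xml'
navigator.goToPose(goal, behavior_tree=delivery_bt)
```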

[Figure: overall behavior tree]

Specifically, when selecting Nav2 plugins, and given that our robot uses an Ackermann steering model, we chose Smac Hybrid A* as the planner and MPPI as the controller. We initially tried the TEB Local Planner, but it is not officially supported on ROS2, and its maintainer has stated that the MPPI Controller is its official successor; after several comparisons, we also found that MPPI produces better local plans. The local costmap, global costmap, and BT navigator all use Nav2's official default parameters and plugins.
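These selections would normally live in the Nav2 parameters file; the launch sketch below expresses the equivalent settings as Python dictionaries for illustration. The plugin identifiers follow current Nav2 naming (they vary slightly between releases), and the motion model and turning radius are assumptions rather than measured values for the HUNTER SE. A full bringup would additionally start the costmaps, bt_navigator, and a lifecycle manager.

```python
# Sketch of the planner/controller plugin selection as launch parameters.
# Plugin identifiers and numeric values are assumptions; adjust them to the
# installed Nav2 release and the vehicle's real geometry.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    planner_params = {
        'planner_plugins': ['GridBased'],
        'GridBased': {
            'plugin': 'nav2_smac_planner/SmacPlannerHybrid',
            # REEDS_SHEPP allows reversing; DUBIN restricts to forward arcs.
            'motion_model_for_search': 'REEDS_SHEPP',
            'minimum_turning_radius': 1.6,  # assumed value, in metres
        },
    }
    controller_params = {
        'controller_plugins': ['FollowPath'],
        'FollowPath': {
            'plugin': 'nav2_mppi_controller::MPPIController',
        },
    }
    return LaunchDescription([
        Node(package='nav2_planner', executable='planner_server',
             name='planner_server', parameters=[planner_params]),
        Node(package='nav2_controller', executable='controller_server',
             name='controller_server', parameters=[controller_params]),
    ])
```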

Despite the comprehensive planning and advanced technological integration, the project faces several challenges and uncertainties. One such area is the optimization of the sensor fusion process to enhance obstacle detection and avoidance in densely populated or dynamically changing environments, where the precision and reliability of data from different sensors are critical. Another challenge is the development of robust algorithms for indoor localization without GPS, requiring high accuracy in diverse indoor settings. Additionally, the system's ability to adapt to unexpected environmental changes, such as construction zones or temporary obstacles, remains a complex problem requiring further research and innovation.

The software architecture's scalability and the hardware system's modularity are designed with future expansion and maintenance in mind, promoting easy integration of new technologies and functionalities. Despite these provisions, identifying the most efficient and user-friendly approach to system monitoring and control presents another challenge, emphasizing the need for a sophisticated yet intuitive interface for system operators.

In conclusion, while the "Campus Autonomy" project leverages cutting-edge technologies to tackle the complex issue of autonomous campus delivery, it acknowledges existing gaps in knowledge and technology. These challenges represent opportunities for innovation and further research, promising to advance the field of autonomous navigation and robotic delivery services significantly.

System Evaluation

To align the system evaluation with the project's specific implementation goal of demonstrating an autonomous robot's ability to navigate from an indoor location to the corridor elevator entrance, we will tailor the experiments and success metrics to assess the system's performance in achieving this task effectively.

Experimental Design:

Indoor Navigation Accuracy: The robot will initiate its journey from various indoor starting points, navigating towards the designated elevator entrance. Success will be quantified by the robot's ability to accurately arrive at the elevator entrance within a defined error margin (e.g., less than 0.5 meters from the designated point).

Obstacle Detection and Avoidance in Indoor Settings: The experiment will introduce static and dynamic obstacles common in indoor settings, such as furniture and moving people, to assess the robot's capability to detect and navigate around them. A successful navigation is one that avoids collisions while maintaining a minimum safe distance of 0.5 meters from any obstacle.

Time Efficiency: The robot's efficiency will be evaluated based on the time taken to complete the navigation from the starting point to the elevator entrance. A successful system will demonstrate the ability to complete the task within a predetermined timeframe, reflecting efficient path planning and execution (e.g., achieving the task in less than 5 minutes).

Adaptability to Environmental Changes: The robot's response to sudden environmental changes, such as closed pathways or unexpected obstacles, will be tested. Success in this context means the system can quickly adapt, recalculating a new path to the destination without manual intervention and within a reasonable time delay.

System Reliability: Throughout multiple trials, the robot's reliability will be assessed by its consistent performance in achieving the set task. A high success rate (e.g., completing the task in 95% of trials) will be indicative of the system's reliability.

Evaluation Metrics:

1. Navigation Accuracy: Deviation from the elevator entrance point in meters.

2. Obstacle Avoidance Success Rate: Percentage of obstacles successfully detected and avoided.

3. Task Completion Time: Time taken to navigate from the starting point to the elevator entrance.

4. Adaptability Measure: Time required for path re-planning in response to environmental changes.

5. Reliability Index: The percentage of successful task completions across all trials.
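To make these metrics concrete, a minimal post-processing sketch is shown below. The trial records, goal coordinate, and thresholds are illustrative placeholders, not measured results.

```python
# Toy evaluation script for the metrics above; all data here is invented.
import math
import statistics

GOAL = (12.0, -3.5)    # assumed elevator-entrance coordinate in the map frame
ACCURACY_TOL = 0.5     # metres
TIME_LIMIT = 300.0     # seconds (5 minutes)

# Hypothetical per-trial records.
trials = [
    {'final': (12.2, -3.4), 'duration': 214.0, 'collided': False, 'replan_times': [1.8]},
    {'final': (12.9, -3.1), 'duration': 305.0, 'collided': False, 'replan_times': []},
]

def deviation(final):
    """Euclidean distance from the final pose to the goal point (metric 1)."""
    return math.hypot(final[0] - GOAL[0], final[1] - GOAL[1])

# A trial succeeds if it reaches the goal within tolerance, on time, collision-free.
successes = [t for t in trials
             if deviation(t['final']) <= ACCURACY_TOL
             and t['duration'] <= TIME_LIMIT
             and not t['collided']]

replans = [dt for t in trials for dt in t['replan_times']]

print(f"mean deviation: {statistics.mean(deviation(t['final']) for t in trials):.2f} m")
print(f"mean completion time: {statistics.mean(t['duration'] for t in trials):.1f} s")
print(f"mean re-planning delay: {statistics.mean(replans):.1f} s" if replans
      else "no re-plans recorded")
print(f"reliability: {100 * len(successes) / len(trials):.0f}% of trials succeeded")
```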

Conclusion & Future Work

The "Campus Autonomy" project aims to revolutionize campus logistics through the development of an autonomous vehicle capable of seamless indoor and outdoor navigation for efficient package delivery. Utilizing the Agile X HUNTER SE vehicle, equipped with state-of-the-art sensors like the Hesai QT64 Lidar and a panoramic camera, alongside the advanced capabilities of ROS2, this project addresses the growing demand for autonomous delivery solutions. By leveraging the Navigation2 framework and incorporating innovative algorithms for localization, mapping, and navigation, we propose a system designed for high reliability, safety, and flexibility in complex campus environments. The project's successful implementation will not only enhance operational efficiency and convenience on campuses but also contribute significantly to the broader field of robotics by demonstrating the practical application of ROS2 in autonomous navigation. Through "Campus Autonomy," we endeavor to push the boundaries of what is possible in autonomous delivery services, paving the way for future advancements in robotic transportation.

In subsequent work, we will introduce the osmAG map format into the entire navigation stack. We will replace the Global Planner in Nav2 with an osmAG plugin that plans the global path in an Area Graph containing only permanent building information (such as rooms and corridors). We will also use the Smac Hybrid A* algorithm in place of the original A* algorithm: the former takes the kinematic constraints and physical characteristics of an Ackermann-model robot into account when planning the global path, producing more feasible routes. Furthermore, since osmAG stores indoor information in a very compact format, models only permanent indoor obstacles, and combines WiFi signals for localization, we will customize the local costmap to perform real-time dynamic obstacle detection on the grid map converted from osmAG. We believe that integrating osmAG into the ROS2 navigation stack will let robots overcome several limitations, such as the storage cost of grid maps and the separation between indoor and outdoor environments, so that mobile robots can better serve humans.

[Figure: osmAG map]
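As a rough illustration of what planning over an Area Graph (rather than a grid) looks like, the toy sketch below runs a shortest-path search over a hand-written graph of rooms and corridors. The area names, connectivity, and costs are invented; the real osmAG plugin would parse the OpenStreetMap-style osmAG file and hand each area-level leg to the Smac Hybrid A* planner for a kinematically feasible path.

```python
# Toy area-level planner: Dijkstra over an invented room/corridor graph.
import heapq

# Nodes are areas (rooms, corridors); edge weights approximate travel distance in metres.
AREA_GRAPH = {
    'room_207':      {'corridor_A': 4.0},
    'corridor_A':    {'room_207': 4.0, 'elevator_hall': 22.0, 'corridor_B': 15.0},
    'corridor_B':    {'corridor_A': 15.0, 'elevator_hall': 30.0},
    'elevator_hall': {'corridor_A': 22.0, 'corridor_B': 30.0},
}

def plan_area_path(start, goal):
    """Return (cost, list of areas) for the cheapest route through the graph."""
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, area, path = heapq.heappop(frontier)
        if area == goal:
            return cost, path
        if area in visited:
            continue
        visited.add(area)
        for neighbour, weight in AREA_GRAPH[area].items():
            if neighbour not in visited:
                heapq.heappush(frontier, (cost + weight, neighbour, path + [neighbour]))
    return float('inf'), []

cost, path = plan_area_path('room_207', 'elevator_hall')
print(f"{' -> '.join(path)}  (~{cost:.0f} m)")
```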
