This is the code repository for my final year project titled "Mixed Traffic Simulation for Autonomous Systems in Shared Spaces". The project extends the existing AirSim simulator created by Microsoft.
The project extends AirSim to support mixed forms of traffic: running several entities simultaneously, adding pedestrians to the simulator, and making the simulator simpler to extend. A wide range of APIs has also been added. The simulator builds on the existing Unity prototype. The changes can be found inside the Unity directory, which contains the Unity project and the AirLib wrapper, the AirLib directory, and the PythonClient directory.
The report and documentation can be found here.
Build the project as specified here.
After the project has been built, UMA2 has to be downloaded from the Unity Asset Store.
If this was not done automatically (for example, because Unity was running), run the build script inside Unity/build. This will build the AirLib DLL.
To turn a scene into an AirSim environment, simply drag the AirSim prefab from the prefab directory into the scene, as shown in the image below. Make sure that UMA2 is installed and that the AirLib DLL is in the Plugins directory.
As can be seen, the AirSim prefab consists of five components:
- AssetHandler - Holds the available vehicle models. This can be extended for other entities as well.
- Main Camera - Contains the free-moving camera script and the entity tracking script.
- UMA_GLIB - The rendering prefab for UMA2.
- AirSimGlobal - Contains global AirSim configuration as well as the game server controls.
- AirSimHUD - The simulator HUD.
There are two ways of adding a new vehicle.
- Method one is to use the existing AirSim car controller script to control the vehicle, and the car script for the logic and interaction with the vehicle companion. Simply attaching these two scripts will link the entity to the vehicle API. The final step is to add the asset to the asset handler. This method still allows for different vehicle physics, as seen below.
- Method two is slightly more complicated but allows almost complete freedom. To implement different vehicle controls, or to change any other behaviour, overload the vehicles class. This will still connect the entity to the vehicle API. If a completely new set of APIs is required, see Adding additional entities below. Either way, the linked entity can then be driven through the vehicle API, as sketched after this list.
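Once an entity is linked by either method, it should respond to the standard AirSim car API. A minimal sketch in Python, assuming the stock `airsim` client package and the default RPC port (41451):

```python
import airsim

# Connect to the simulator's RPC server and take API control of the car.
client = airsim.CarClient()
client.confirmConnection()
client.enableApiControl(True)

# Drive forward with a slight right turn.
controls = airsim.CarControls()
controls.throttle = 0.5
controls.steering = 0.2
client.setCarControls(controls)

# Read back the vehicle state.
state = client.getCarState()
print("speed: %.2f m/s, gear: %d" % (state.speed, state.gear))
```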
To add additional entities with a new set of APIs, the simplest approach is to base them on the pedestrian. The code for the pedestrian can be found here: the PedestrianCompanion is a static class that keeps track of all the pedestrians in the scene and communicates with the APIs, while the pedestrian script is the behaviour script attached to each pedestrian.
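The pedestrian API is specific to this project, so the exact client calls live in the PythonClient code rather than in stock AirSim. Purely as an illustration of the shape such an API could take, a session might look like the following; `PedestrianClient`, `setPedestrianDestination` and `getPedestrianPose` are placeholder names, not confirmed identifiers:

```python
import airsim  # this project's extended client package

# NOTE: all pedestrian-related names below are hypothetical placeholders;
# check the PythonClient code in this repository for the real API.
client = airsim.PedestrianClient()  # placeholder class
client.confirmConnection()

# Command a named pedestrian to walk towards a point in the scene.
client.setPedestrianDestination("pedestrian_0", x=10.0, y=5.0)  # placeholder call

# Query where the pedestrian currently is.
print(client.getPedestrianPose("pedestrian_0"))  # placeholder call
```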
The figure below illustrates how the API calls interact with each software component. Since many files have changed, this overview is easier to follow than reading the code directly.
To capture a video feed from an entity, add a GameObject called CaptureCameras and attach cameras to it. Each camera carries two scripts: the data capture script and the camera filters script. Look at the existing vehicles to see how this is done. Once set up, frames can be requested through the image API, as sketched below.
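A minimal sketch of pulling frames through the standard AirSim image API. The camera name "0" is an assumption; use whatever name your CaptureCameras setup defines:

```python
import numpy as np
import airsim

client = airsim.CarClient()
client.confirmConnection()

# Request an uncompressed RGB frame and a depth visualisation image
# from camera "0" (adjust the name to match your scene setup).
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene, False, False),
    airsim.ImageRequest("0", airsim.ImageType.DepthVis, True),
])

# Reshape the raw bytes into an image array (3 channels in recent clients).
rgb = responses[0]
img1d = np.frombuffer(rgb.image_data_uint8, dtype=np.uint8)
img_rgb = img1d.reshape(rgb.height, rgb.width, 3)
print("captured %dx%d frame" % (rgb.width, rgb.height))
```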
A key update in this project was to divide the existing server into three components. To extend the server for a new type of entity, simply replicate the process used to create the pedestrians. This keeps the project easy to extend in the future.
For a more detailed description of how this was done, see the report here.
A large variety of demos can be found in the PythonClient directory. The main ones worked on for this project are in the car folder.
AirSim is a simulator for drones, cars and more, built on Unreal Engine (we now also have an experimental Unity release). It is open-source, cross-platform, and supports software-in-the-loop simulation with popular flight controllers such as PX4 & ArduPilot, and hardware-in-the-loop with PX4, for physically and visually realistic simulations. It is developed as an Unreal plugin that can simply be dropped into any Unreal environment. Similarly, we have an experimental release for a Unity plugin.
Our goal is to develop AirSim as a platform for AI research to experiment with deep learning, computer vision and reinforcement learning algorithms for autonomous vehicles. For this purpose, AirSim also exposes APIs to retrieve data and control vehicles in a platform independent way.
Check out the quick 1.5-minute demos:
Drones in AirSim
Cars in AirSim
For more details, see the use precompiled binaries document.
View our detailed documentation on all aspects of AirSim.
If you have remote control (RC) as shown below, you can manually control the drone in the simulator. For cars, you can use arrow keys to drive manually.
AirSim exposes APIs so you can interact with the vehicle in the simulation programmatically. You can use these APIs to retrieve images, get state, control the vehicle and so on. The APIs are exposed through RPC and are accessible via a variety of languages, including C++, Python, C# and Java. A minimal Python example is sketched below.
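For instance, a minimal session with the Python multirotor client, assuming the `airsim` package is installed and the simulator is running with default settings:

```python
import airsim

# Connect to the RPC server and take API control of the default drone.
client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)

# Take off, fly to a point 10 m up and 5 m forward, then land.
client.takeoffAsync().join()
client.moveToPositionAsync(5, 0, -10, 3).join()  # NED frame: negative z is up
client.landAsync().join()
```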
These APIs are also available as part of a separate, independent cross-platform library, so you can deploy them on a companion computer on your vehicle. This way you can write and test your code in the simulator, and later execute it on the real vehicles. Transfer learning and related research is one of our focus areas.
Note that you can use the SimMode setting to specify the default vehicle, or the new ComputerVision mode, so you don't get prompted each time you start AirSim.
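For example, a minimal settings.json (by default under Documents/AirSim) that always starts in car mode:

```json
{
  "SettingsVersion": 1.2,
  "SimMode": "Car"
}
```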
There are two ways you can generate training data from AirSim for deep learning. The easiest way is to simply press the record button in the lower right corner. This will start writing pose and images for each frame. The data logging code is pretty simple and you can modify it to your heart's content.
A better way to generate training data exactly the way you want is by accessing the APIs. This allows you to be in full control of how, what, where and when you want to log data.
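As a sketch, a simple API-driven logging loop might look like this; the output directory, CSV fields and sampling interval are arbitrary choices:

```python
import os
import time
import airsim

client = airsim.CarClient()
client.confirmConnection()

out_dir = "recorded_data"  # arbitrary output location
os.makedirs(out_dir, exist_ok=True)

for i in range(100):
    state = client.getCarState()
    # The default ImageRequest returns a compressed PNG for the scene camera.
    responses = client.simGetImages(
        [airsim.ImageRequest("0", airsim.ImageType.Scene)])
    airsim.write_file(os.path.join(out_dir, "frame_%04d.png" % i),
                      responses[0].image_data_uint8)
    # Log a matching line of state data per frame.
    with open(os.path.join(out_dir, "log.csv"), "a") as f:
        f.write("%d,%f\n" % (i, state.speed))
    time.sleep(0.1)  # arbitrary sampling interval
```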
Yet another way to use AirSim is the so-called "Computer Vision" mode. In this mode, you don't have vehicles or physics. You can use the keyboard to move around the scene, or use APIs to position available cameras in any arbitrary pose, and collect images such as depth, disparity, surface normals or object segmentation.
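A sketch of positioning the camera rig and collecting images via the API, assuming SimMode is set to "ComputerVision" in settings.json:

```python
import airsim

client = airsim.VehicleClient()
client.confirmConnection()

# Move the camera rig to an arbitrary pose (x, y, z in metres, NED frame).
pose = airsim.Pose(airsim.Vector3r(0, 0, -2),
                   airsim.to_quaternion(0, 0, 0.7))  # pitch, roll, yaw (radians)
client.simSetVehiclePose(pose, True)

# Collect depth and segmentation images from camera "0".
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.DepthPerspective, True),
    airsim.ImageRequest("0", airsim.ImageType.Segmentation, False, False),
])
```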
Press F10 to see various options available for weather effects. You can also control the weather using APIs. Press F1 to see other options available.
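A sketch of the weather API, available in recent AirSim versions:

```python
import airsim

client = airsim.VehicleClient()
client.confirmConnection()

# Weather effects must be enabled before parameters take effect.
client.simEnableWeather(True)
client.simSetWeatherParameter(airsim.WeatherParameter.Rain, 0.75)
client.simSetWeatherParameter(airsim.WeatherParameter.Fog, 0.25)
```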
- Video - Setting up AirSim with Pixhawk Tutorial by Chris Lovett
- Video - Using AirSim with Pixhawk Tutorial by Chris Lovett
- Video - Using off-the-shelf environments with AirSim by Jim Piavis
- Reinforcement Learning with AirSim by Ashish Kapoor
- The Autonomous Driving Cookbook by Microsoft Deep Learning and Robotics Garage Chapter
- Using TensorFlow for simple collision avoidance by Simon Levy and WLU team
More technical details are available in the AirSim paper (FSR 2017 conference). Please cite it as:
@inproceedings{airsim2017fsr,
author = {Shital Shah and Debadeepta Dey and Chris Lovett and Ashish Kapoor},
title = {AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles},
year = {2017},
booktitle = {Field and Service Robotics},
eprint = {arXiv:1705.05065},
url = {https://arxiv.org/abs/1705.05065}
}
Please take a look at open issues if you are looking for areas to contribute to.
We are maintaining a list of a few projects, people and groups that we are aware of. If you would like to be featured in this list, please make a request here.
Join our GitHub Discussions group to stay up to date or ask any questions.
We also have an AirSim group on Facebook.
- Python wrapper for OpenAI Gym interfaces.
- Python wrapper for event camera simulation.
- Voxel grid construction.
- Programmable camera distortion.
- Wind simulation.
- Azure development environment with documentation.
- ROS wrapper for multirotor and car.
For a complete list of changes, view our Changelog.
If you run into problems, check the FAQ and feel free to post issues in the AirSim repository.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.
This project is released under the MIT License. Please review the License file for more details.