The toolbox equips engineers working on autonomous systems in aerospace and defence, automotive, consumer electronics, and other industries with algorithms and tools to maintain position, orientation, and situational awareness. It extends MATLAB-based workflows to help engineers develop accurate perception algorithms for autonomous systems.
Engineers working on the perception stage of autonomous system development need to fuse inputs from various sensors to estimate the positions of objects around these systems. They can now use the toolbox's localisation and tracking algorithms, along with its reference examples, as a starting point for implementing components of airborne, ground-based, shipborne, and underwater surveillance, navigation, and autonomous systems.
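As a minimal sketch of that fusion workflow (the measurement values, noise level, and two-sensor layout here are illustrative, not taken from the toolbox documentation), position reports from two sensors can be fed to the toolbox's `trackerGNN` multi-object tracker as `objectDetection` objects:

```matlab
% Sketch: fuse noisy position reports from two sensors with a GNN
% multi-object tracker. All values are made up for illustration.
tracker = trackerGNN('FilterInitializationFcn', @initcvekf);

for t = 0:0.5:2
    truePos = [10 + t; 0; 0];   % one object moving along the x-axis
    dets = {objectDetection(t, truePos + 0.1*randn(3,1), 'SensorIndex', 1), ...
            objectDetection(t, truePos + 0.1*randn(3,1), 'SensorIndex', 2)};
    tracks = tracker(dets, t);  % confirmed tracks appear once enough
end                             % detections have accumulated
```

Each `objectDetection` carries a timestamp, a measurement, and the index of the reporting sensor; the tracker handles association and filtering internally.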
The toolbox provides a flexible, reusable environment that can be shared across development teams, with capabilities to simulate sensor detections, perform localisation, test sensor fusion architectures, and evaluate tracking results.
The Sensor Fusion and Tracking Toolbox includes:
- Algorithms and tools to design, simulate, and analyse systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness
- Reference examples that provide a starting point for airborne, ground-based, shipborne, and underwater surveillance, navigation, and autonomous systems
- Multi-object trackers, sensor fusion filters, motion and sensor models, and data association algorithms that can be used to evaluate fusion architectures with real and synthetic data (a filter sketch follows this list)
- Scenario and trajectory generation tools (a scenario sketch follows this list)
- Synthetic data generation for active and passive sensors, including RF, acoustic, EO/IR, and GPS/IMU models (a sensor-model sketch follows this list)
- Standard benchmarks, metrics, and animated plots for evaluating system accuracy and performance
- Deployment options for simulation acceleration or desktop prototyping using C code generation (a code-generation sketch closes this section)
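As a hedged illustration of the filter and model building blocks, a constant-velocity extended Kalman filter can be assembled from the toolbox's `constvel` motion model and `cvmeas` measurement model; the initial state and measurement below are arbitrary:

```matlab
% Sketch: a constant-velocity EKF from toolbox motion/measurement models.
% The constvel state order is [x; vx; y; vy; z; vz].
ekf = trackingEKF(@constvel, @cvmeas, [10; 0; 0; 0; 0; 0]);
predict(ekf, 0.5);              % predict the state 0.5 s ahead
correct(ekf, [10.3; 0.1; 0]);   % correct with a position measurement
```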
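The scenario and trajectory tools can be sketched as follows; the waypoints, arrival times, and update rate are made-up values:

```matlab
% Sketch: one platform following a waypoint trajectory in a scenario.
scene = trackingScenario('UpdateRate', 10);
plat = platform(scene);
plat.Trajectory = waypointTrajectory([0 0 0; 100 0 0; 100 100 0], ...
                                     [0 10 20]);  % waypoints (m), times (s)
while advance(scene)
    poses = platformPoses(scene);  % ground-truth poses at each time step
end
```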
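For the GPS/IMU end of the synthetic-sensor list, a minimal sketch using the toolbox's `imuSensor` and `gpsSensor` models (single samples, stationary platform, all values illustrative):

```matlab
% Sketch: synthetic IMU and GPS readings for a stationary platform.
imu = imuSensor('accel-gyro', 'SampleRate', 100);
[accelReading, gyroReading] = imu(zeros(1,3), zeros(1,3));  % true accel., ang. vel.

gps = gpsSensor('SampleRate', 1);
[llaPos, nedVel] = gps(zeros(1,3), zeros(1,3));  % true NED position, velocity
```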
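Finally, a sketch of the C code generation path: wrap the tracker in an entry-point function and pass it to MATLAB Coder's `codegen` command. The function name `trackOneStep` and its interface are assumptions made for illustration:

```matlab
% trackOneStep.m - hypothetical entry-point function for code generation.
function tracks = trackOneStep(time, measurement) %#codegen
persistent tracker
if isempty(tracker)
    tracker = trackerGNN('FilterInitializationFcn', @initcvekf);
end
det = objectDetection(time, measurement);  % wrap the raw measurement
tracks = tracker(det, time);               % one tracker update step
end
```

Generating a MEX target (`codegen trackOneStep -args {0, [0;0;0]}`) accelerates simulation, while a library target (`-config:lib`) produces C code for desktop prototyping.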