TESIS: Member of the Vector Group

DYNA4 Vehicle and Environment Simulation

Sensor simulation: Physical simulation of environment sensors for ADAS & AD

Environment perception forms the basis for assisted and automated driving (ADAS/AD). For the development and testing of driving functions, the simulation environment DYNA4 offers physical models of ultrasonic, lidar, camera and radar sensors. For functions located downstream of the sensor fusion, an automatic semantic image segmentation of all visible objects is generated. Because the computation runs on the graphics card (GPU) of a standard PC, the simulation achieves maximum time and cost efficiency.

Virtual test driving for assisted and automated driving

  • Real-time capable simulation for development and testing of sensor-based ADAS/AD functions
  • From early development phases, where the sensor configuration is determined, up to virtual validation
  • Physical modeling of ultrasonic, lidar, camera and radar sensors
  • Variety of scenarios from parking to autonomous driving in surrounding traffic
  • Detailed vehicle dynamics for realistic sensor movements
  • Efficient test coverage through test automation with numerous variants

Animation-based sensor simulation

  • Simulation of DYNA4 with DYNAanimation for optimal utilization of CPU and GPU
  • All objects in the animation are considered by the sensor simulation, including their geometry, possible occlusion and sensor-specific material properties
  • Flexible usage and further processing of sensor signals thanks to the Data Distribution Service (DDS) standard
  • Pre-configured receiver modules for function development in Simulink and ROS

Ultrasound

  • Consideration of propagation and atmospheric attenuation 
  • Absorption and reflection based on object geometry and material properties
  • Adjustable opening angle and signal resolution
  • Output of an intensity depth histogram
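
Behind these outputs lies simple time-of-flight physics: distance follows from half the echo round-trip time at the speed of sound, while spherical spreading and atmospheric attenuation reduce the received intensity. A minimal sketch, with the attenuation coefficient and all example values as illustrative assumptions (not DYNA4 parameters):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def echo_distance(round_trip_time_s: float) -> float:
    """Distance to a reflector from the ultrasonic round-trip time."""
    return SPEED_OF_SOUND * round_trip_time_s / 2.0

def received_intensity(i0: float, distance_m: float,
                       alpha_db_per_m: float = 1.3) -> float:
    """Received intensity after spherical spreading (1/r^2 over the
    two-way path) and atmospheric attenuation (alpha in dB/m)."""
    r = 2.0 * distance_m  # two-way path length
    spreading = 1.0 / (r * r)
    attenuation = 10.0 ** (-alpha_db_per_m * r / 10.0)
    return i0 * spreading * attenuation

# An echo arriving after 5.8 ms corresponds to roughly 1 m
print(round(echo_distance(5.8e-3), 2))  # → 0.99
```

Binning such intensities over echo delay yields exactly the intensity depth histogram listed above.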

Camera

  • Configurable cameras with opening angles up to 360°
  • Distortion parameterization with OpenCV or Scaramuzza parameters
  • Simulation of dirt on the lens
  • Display of RGB image streams on separate screens for image injection
  • Usage from MiL (algorithm development) to HiL (image injection on ECU)
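
The OpenCV distortion parameterization mentioned above is the Brown-Conrady model of radial and tangential lens distortion. A minimal sketch of applying it to a normalized image point; the coefficient values in the example are illustrative, not calibration results:

```python
def distort_point(x: float, y: float,
                  k1: float, k2: float, k3: float,
                  p1: float, p2: float) -> tuple:
    """Apply the OpenCV (Brown-Conrady) distortion model to a
    normalized image point (x, y): radial terms k1..k3,
    tangential terms p1, p2."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

# With all coefficients zero the point is unchanged
print(distort_point(0.5, 0.5, 0, 0, 0, 0, 0))  # → (0.5, 0.5)
```

With these coefficients known, the rendered image can be warped to match a real camera's distortion before injection.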

Video: Configurable cameras for virtual ADAS testing with DYNA4

Lidar

  • Reflection intensity based on the angle of incidence between laser beam and object surface as well as on the material properties
  • Availability of rotating and non-rotating lidar sensors
  • Adjustable opening angle and signal resolution
  • 3D point cloud output as ROS Topic via DDS or via UDP in Velodyne format
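
A consumer of the UDP output can decode the packets directly. The sketch below parses one Velodyne-style data block; the layout (block flag, azimuth in hundredths of a degree, 32 channels of distance in 2 mm units plus reflectivity) follows the published VLP-16 packet structure as an assumption about the wire format, and is not a complete driver:

```python
import struct

def parse_data_block(block: bytes):
    """Parse one 100-byte Velodyne-style data block:
    bytes 0-1: block flag 0xFF 0xEE,
    bytes 2-3: azimuth, little-endian, in hundredths of a degree,
    bytes 4-99: 32 channels of (uint16 distance in 2 mm units,
                uint8 reflectivity)."""
    if block[0:2] != b"\xff\xee":
        raise ValueError("not a data block")
    (azimuth_raw,) = struct.unpack_from("<H", block, 2)
    azimuth_deg = azimuth_raw / 100.0
    returns = []
    for ch in range(32):
        dist_raw, refl = struct.unpack_from("<HB", block, 4 + 3 * ch)
        returns.append((dist_raw * 0.002, refl))  # metres, reflectivity
    return azimuth_deg, returns

# Synthetic block: azimuth 90.00°, channel 0 at 10 m, reflectivity 80
block = b"\xff\xee" + struct.pack("<H", 9000)
block += struct.pack("<HB", 5000, 80) + b"\x00" * (3 * 31)
azimuth, returns = parse_data_block(block)
print(azimuth, returns[0])
```

Combining azimuth, the per-channel elevation angles and distance yields the 3D point cloud; via DDS the same data arrives pre-decoded as a ROS topic.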

Radar

  • Scattered radar waves based on object geometry and material properties
  • Consideration of different antenna characteristics (Short-, Mid-, Long-Range)
  • Adjustable opening angle and signal resolution
  • Output of raw data such as relative velocity and distance to the object as well as the electric field intensity; optional GPU-based Fourier transform to generate range-Doppler plots
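
The range-Doppler processing in the last bullet is a 2D Fourier transform over the chirp/sample matrix: the FFT along fast time resolves range, the FFT along slow time resolves Doppler. A minimal NumPy sketch on the CPU (DYNA4 performs this step on the GPU; matrix sizes and target bins here are illustrative):

```python
import numpy as np

def range_doppler_map(iq: np.ndarray) -> np.ndarray:
    """Range-Doppler map from an FMCW data matrix (chirps x samples):
    2D FFT, with the Doppler axis shifted so zero velocity is centered."""
    return np.abs(np.fft.fftshift(np.fft.fft2(iq), axes=0))

# Synthetic single target: range bin 20 (fast-time frequency),
# Doppler bin 5 (slow-time frequency)
n_chirps, n_samples = 64, 128
n = np.arange(n_samples)
m = np.arange(n_chirps)[:, None]
iq = np.exp(2j * np.pi * (20 * n / n_samples + 5 * m / n_chirps))

rdm = range_doppler_map(iq)
doppler_bin, range_bin = np.unravel_index(np.argmax(rdm), rdm.shape)
print(range_bin, doppler_bin - n_chirps // 2)  # → 20 5
```

The peak location maps back to the distance and relative velocity listed among the raw outputs above.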

Semantic Image Segmentation

  • Automatic semantic image segmentation, representing an ideal sensor fusion
  • Consideration and classification of all objects available in the object catalog
  • Configurable object classes
  • Adjustable opening angle and signal resolution
  • Output of relevant sensor fusion data such as relative velocity and distance to the object as well as its class
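
A downstream function receives the segmentation as a per-pixel image of class IDs. A minimal sketch of consuming such a mask; the class-ID mapping below is hypothetical, since in DYNA4 the object classes are configurable:

```python
from collections import Counter

# Hypothetical class-ID mapping; the actual classes are configurable
CLASSES = {0: "background", 1: "vehicle", 2: "pedestrian", 3: "lane_marking"}

def class_histogram(mask_rows):
    """Count pixels per semantic class in a segmentation mask,
    given as a list of rows of integer class IDs."""
    counts = Counter(pid for row in mask_rows for pid in row)
    return {CLASSES.get(pid, "unknown"): n for pid, n in counts.items()}

mask = [[0, 0, 1],
        [2, 1, 1]]
print(class_histogram(mask))  # → {'background': 2, 'vehicle': 3, 'pedestrian': 1}
```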
