ISO/DIS 11010-2
ISO/TC 22/SC 33
Secretariat: DIN
Date: 2025-12-09
Passenger cars — Simulation model classification —
Part 2:
Perception sensor models for ADAS/AD
Voitures particulières — Classification des modèles de simulation —
Partie 2: Modèles de capteurs de perception pour ADAS/AD
DIS stage
© ISO 2025
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below or ISO’s member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: + 41 22 749 01 11
E-mail: copyright@iso.org
Website: www.iso.org
Published in Switzerland
Contents
4.1 Perception sensor topology 5
4.2 Perception sensor model topology 6
4.3 Sensor physical system model 6
5 Sensor model characterization 8
5.4 Sensor model design approach 18
6 Model designation numbers 19
6.2 Sensor physical system model subclass 19
6.3 Sensor controller model subclass 20
6.4 Model designation scheme 21
ISO (the International Organization for Standardization) is a worldwide federation of national standards bodies (ISO member bodies). The work of preparing International Standards is normally carried out through ISO technical committees. Each member body interested in a subject for which a technical committee has been established has the right to be represented on that committee. International organizations, governmental and non-governmental, in liaison with ISO, also take part in the work. ISO collaborates closely with the International Electrotechnical Commission (IEC) on all matters of electrotechnical standardization.
The procedures used to develop this document and those intended for its further maintenance are described in the ISO/IEC Directives, Part 1. In particular the different approval criteria needed for the different types of ISO documents should be noted. This document was drafted in accordance with the editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).
Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. ISO shall not be held responsible for identifying any or all such patent rights. Details of any patent rights identified during the development of the document will be in the Introduction and/or on the ISO list of patent declarations received (see www.iso.org/patents).
Any trade name used in this document is information given for the convenience of users and does not constitute an endorsement.
For an explanation on the meaning of ISO specific terms and expressions related to conformity assessment, as well as information about ISO's adherence to the World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT) see the following URL: www.iso.org/iso/foreword.html.
This document was prepared by Technical Committee ISO/TC 22, Road vehicles, Subcommittee SC 33, Vehicle dynamics, chassis components and driving automation systems testing.
This is the first edition.
A list of all parts in the ISO 11010 series can be found on the ISO website.
Any feedback or questions on this document should be directed to the user’s national standards body. A complete listing of these bodies can be found at www.iso.org/members.html.
This document was developed in response to worldwide demand for the standardization of simulation models and their requirements in specific scenarios as use cases. During the development and testing of road vehicles, the question arises of how good simulation models have to be to perform certain use cases. Without standardization, it is common practice for experts in different organizations to develop their own methods and processes to answer this question. When it comes to comparability and model exchange between project partners, obstacles arise.
The main purpose of this document is to provide a framework that enables a systematic characterization of perception sensor simulation models. This edition of the document does not consider specific sensor effects for classification. The simulation models are classified into model classes, each with a designation number and related elements, characteristics and common modelling methods. The assignment is the responsibility of the user or can be specified by other regulations and standards.
Identification of patent holders: the following text shall be included if patent rights have been identified.
The International Organization for Standardization (ISO) [and/or] International Electrotechnical Commission (IEC) draw[s] attention to the fact that it is claimed that compliance with this document may involve the use of a patent.
ISO [and/or] IEC take[s] no position concerning the evidence, validity and scope of this patent right.
The holder of this patent right has assured ISO [and/or] IEC that he/she is willing to negotiate licences under reasonable and non-discriminatory terms and conditions with applicants throughout the world. In this respect, the statement of the holder of this patent right is registered with ISO [and/or] IEC. Information may be obtained from the patent database available at www.iso.org/patents.
Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights other than those in the patent database. ISO [and/or] IEC shall not be held responsible for identifying any or all such patent rights.
Passenger cars — Simulation model classification —
Part 2:
Perception sensor models for ADAS/AD
1 Scope
This document specifies a classification framework and terminology of simulation models for perception sensors in Advanced Driver Assistance Systems and Automated Driving (ADAS/AD) use cases. It builds upon the overall framework defined in ISO 11010-1:2022, which is applicable to vehicle dynamic models.
This document focuses on perception sensor models for radar, camera, lidar and ultrasonic sensors.
This document enables a structured approach for perception sensor model selection and provides a foundation for consistent sensor model comparison. The model classification is based on the modelling approach of the sensor model and its environment as well as the necessary inputs and outputs of the model.
2 Normative references
The following documents are referred to in the text in such a way that some or all of their content constitutes requirements of this document. For dated references, only the edition cited applies. For undated references, the latest edition of the referenced document (including any amendments) applies.
ISO 11010-1:2022, Passenger cars — Simulation model classification — Part 1: Vehicle dynamics
3 Terms and definitions
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminological databases for use in standardization at the following addresses:
- IEC Electropedia: available at http://www.electropedia.org/
- ISO Online browsing platform: available at http://www.iso.org/obp
3.1
simulation model
mathematical model for the calculation of the system state variables based on equations describing a vehicle or vehicle sub-system
[SOURCE: ISO 11010-1:2022, 3.1 - Note to entry deleted]
3.2
model class
category of simulation models of a vehicle or a vehicle sub-system
EXAMPLE Radar perception sensor model class.
3.3
model subclass
subclassification of a model class (3.2), including the separation of sensor physical system model and controller model parts of the overall model class into individual subclasses
EXAMPLE Radar perception sensor physical system model subclass.
3.4
model designation number
gradation of model type and depth with associated model characteristics, represented effects and minimal model inputs and outputs
[SOURCE: ISO 11010-1:2022, 3.3]
3.5
core environment model
core part of the simulation model (3.1) of the physical world outside of the vehicle required to keep track of the state of the world, excluding any rendering or signal propagation (3.10) modelling
Note 1 to entry: Not the vehicle-internal model of its surroundings (cf. ISO 23150).
3.6
perception sensor
hardware and software components embedded in a vehicle to gather information about its surrounding environment, including any initial processing of this information in terms of feature (3.30), detection (3.31), or object (3.29) identification, tracking, and classification
3.7
perception sensor model
simulation model (3.1) of a perception sensor (3.6)
3.8
sensor physical system
part of the perception sensor (3.6) that comprises the internal signal propagation (3.12) and sensing technology (3.14) parts of the overall sensor including all analogue processing and the initial conversion into the digital domain
3.9
sensor physical system model
simulation model (3.1) of the sensor physical system (3.8)
3.10
signal propagation
emitter–receiver transmission path of signals from the environment to the perception sensor (3.6) and vice-versa for active sensors
3.11
external signal propagation
part of the signal propagation (3.10) that occurs external to the perception sensor (3.6) component
EXAMPLE The propagation of radio waves of a radar perception sensor once they have left the sensor component and their return path until entering the sensor component form the external signal propagation.
3.12
internal signal propagation
part of the signal propagation (3.10) that occurs inside the perception sensor (3.6) component
EXAMPLE The optical train of a camera perception sensor forms part of the internal signal propagation.
3.13
signal propagation model
simulation model (3.1) of the signal propagation (3.10) path between environment and perception sensor (3.6) or a part of the signal propagation path
3.14
sensing technology
physical signal conversion used by the perception sensor (3.6) to acquire needed signals including conversion into the digital domain
3.15
sensing technology model
simulation model (3.1) of the sensing technology (3.14) part of a perception sensor (3.6)
3.16
sensor controller
part of the perception sensor (3.6) that performs the signal processing and sensor functions in the digital domain
3.17
sensor controller model
simulation model (3.1) of the sensor controller (3.16) part of a perception sensor (3.6)
3.18
digital signal processing
DSP
processing of the sensing technology (3.14) output signals performed in the digital domain with the intent of providing improved signal-based input to the higher-level sensor function (3.21)
3.19
field-programmable gate array
FPGA
type of configurable integrated circuit that can be repeatedly programmed after manufacturing, and is often used in digital signal processing (3.18) applications requiring configurable massively parallel operation
3.20
digital signal processing model
simulation model (3.1) of the digital signal processing (3.18) part of a perception sensor (3.6)
3.21
sensor function
object (3.29) detection, identification, classification and tracking and localization functions performed on the converted physical signals
Note 1 to entry: The sensor function defined here is a specific set of functions not necessarily reflective of the broader terms sensor or perception.
3.22
sensor function model
simulation model (3.1) of the sensor function (3.21) part of a perception sensor (3.6)
3.23
vehicle function
specific component of a vehicle that relies on information gained from one or more perception sensors to perform its intended function
EXAMPLE Automated lane keeping system (ALKS): Based on the images of one or more cameras the position and orientation of the host vehicle is derived. The ALKS is requesting steering inputs to let the host vehicle follow the lane.
3.24
vehicle function model
simulation model (3.1) of a vehicle function (3.23)
3.25
vehicle model
overall simulation model (3.1) of a vehicle
3.26
input interface
data required by a simulation model (3.1) to perform its function
3.27
output interface
data provided by a simulation model (3.1) as a result of performing its function
3.28
field of view
FoV
characteristic restricted spatial view of the perception sensor (3.6) into its environment
3.29
object
representation of a real-world entity with defined boundaries and characteristics in the vehicle coordinate system
Note 1 to entry: The geometric description of the object is in the vehicle coordinate system.
Note 2 to entry: Object signals are basically sensor technology independent. Sensor technology specific signals may extend the object signals.
[SOURCE: ISO 23150, modified — Examples and references to other terms have been removed.]
3.30
feature
sensor technology specific entity represented in the vehicle coordinate system based on multiple measurements
Note 1 to entry: Multiple measurements can originate from a sensor cluster.
Note 2 to entry: Multiple measurements can originate from multiple measurement cycles.
Note 3 to entry: The term feature is used in this document not as function or group of functions as specified in ISO/SAE PAS 22736.
[SOURCE: ISO 23150, modified — References to other terms have been removed.]
3.31
detection
sensor technology specific entity represented in the sensor coordinate system based on a single measurement of a sensor
Note 1 to entry: A small amount of history can be used for some detection signals, for example, model-free filtering may be used in track-before-detect algorithms.
Note 2 to entry: Raw data at this level of information can be referred to using the more specific terms reflection (3.32) or image (3.33), depending on context, data formats and sensor technology.
[SOURCE: ISO 23150, modified — References to other terms have been removed.]
3.32
reflection
single intensity, phase, and/or polarization measurement of an active sensor, including radar and lidar sensors, that corresponds to a return signal of an emitted signal represented in a sensor detector-specific coordinate system
Note 1 to entry: See detection (3.31) for the more generic term for this level of information, as used in ISO 23150.
3.33
image
two-dimensional rectilinear grid representation of the environment where the information at each point in the representation, given by brightness, colour, depth, or other information, is related to the output signal from the sensing technology (3.14) detector directly or through some form of digital signal processing (3.18)
Note 1 to entry: See detection (3.31) for the more generic term for this level of information, as used in ISO 23150.
4 Model topology
4.1 Perception sensor topology
The following stylized structure of perception sensors forms the basis of the definition of a corresponding sensor model topology in 4.2.
A perception sensor typically involves a number of distinct parts that together perform the function of the perception sensor as shown in Figure 1:
- A signal propagation path from the surrounding environment to the actual sensing technology detection mechanism of the sensor: Parts of this path will reside outside of the sensor itself (e.g. the air), constituting the external signal propagation, while some parts like an optical train of a camera will reside inside the sensor, constituting the internal signal propagation. For an active sensor like radar or lidar the signal propagation path will be bidirectional, for a passive sensor like a camera it will be unidirectional.
- The actual sensing technology part of the perception sensor that detects the signals and converts them, potentially via other physical domains, into the digital realm.
- Any digital signal processing steps that are focussed on improving the signal for the higher-level sensor functions.
- The higher-level sensor function, including object extraction, tracking and any other further processing that is part of the overall perception sensor function. This includes the handling of all external communication interfaces, whether at detection, feature, or object level.
The internal signal propagation and sensing technology parts constitute the sensor physical system, whereas the digital signal processing and sensor function parts constitute the sensor controller.
The data produced by a perception sensor are consumed by one or more vehicle functions, either directly or indirectly via sensor fusion or other intermediary processing steps, which are subsumed in the vehicle function for purposes of the overall topology.
Similarly, the signal propagation interacts with the environment in which the vehicle operates.
Figure 1 — Stylized structure of a perception sensor in its real-world context
4.2 Perception sensor model topology
Based on the sensor topology of 4.1 a corresponding nominal perception sensor model topology can be defined as shown in Figure 2:
Figure 2 — Nominal topology of a perception sensor model in its simulation context
The nominal model topology can be separated into two major simulation model blocks, the sensor physical system model, comprising the modelled parts of the signal propagation and the sensing technology, and the sensor controller model, comprising the digital signal processing and sensor function aspects.
The nominal model topology is defined purely for the purpose of model classification. It does not prescribe the actual separation into separate model components: neither the blocks in the topology nor the overall "Perception Sensor Model" box imply the actual boundaries of the models used to model the perception sensor. The perception sensor model can be split into multiple models for individual parts thereof without violating the nominal topology; conversely, many parts can be modelled in a combined fashion in an overall model.
The overall perception sensor model interacts with one or more vehicle function models that consume the data provided by the perception sensor models. In closed-loop simulations, these form part of an overall vehicle model, which feeds back into the core environment model for state updates of the vehicle in its environment.
For access to the state of the environment the signal propagation model interacts with the core environment model.
The interfaces between the individual parts of the model topology are characterized in the technology-specific sections of Clause 5. They form part of the model classification scheme of Clause 6, since the level-of-detail classifications are based on the input and output interfaces. Besides the major data interfaces indicated in the topology, additional control interfaces can be present that feed control information from later processing stages back into preceding models. Where relevant, these are also specified in the technology-specific sections of Clause 5.
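To make the notional decomposition concrete, the nominal topology can be sketched as a chain of composable stages. The following Python sketch is purely illustrative: the stage names and dictionary-based interfaces are hypothetical assumptions, and this document does not prescribe any software decomposition.

```python
# Hypothetical sketch of the nominal perception sensor model topology as
# composable stages; interfaces and names are illustrative only.
from typing import Callable

# Each stage maps its input interface (a dict) to its output interface.
Stage = Callable[[dict], dict]

def perception_sensor_model(signal_propagation: Stage,
                            sensing_technology: Stage,
                            dsp: Stage,
                            sensor_function: Stage) -> Stage:
    """Compose the four notional sub-models into one perception sensor model."""
    def run(environment_state: dict) -> dict:
        propagated = signal_propagation(environment_state)
        digitized = sensing_technology(propagated)
        detections = dsp(digitized)
        return sensor_function(detections)
    return run

# Trivial stand-in stages that only record which stage the data passed through.
def tag(name: str) -> Stage:
    return lambda data: {**data, "stages": data.get("stages", []) + [name]}

model = perception_sensor_model(tag("propagation"), tag("sensing"),
                                tag("dsp"), tag("function"))
out = model({"objects": []})  # out["stages"] lists all four stages in order
```

The composition mirrors the nominal topology only; as noted above, real implementations may merge or further split these blocks freely.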
4.3 Sensor physical system model
4.3.1 General
The sensor physical system model covers the parts of the perception sensor model that are concerned with physical signal propagation and conversion into the digital domain.
In the model classification scheme of Clause 6 this forms a model subclass.
4.3.2 Signal propagation model
The signal propagation model covers the signal propagation path from the surrounding environment to the actual sensing technology detection mechanism of the sensor. For active sensors this also includes the signal propagation from the sensor into the environment (bidirectional signal propagation), for passive sensors the signal propagation will be unidirectional.
Parts of the signal propagation path will reside outside of the sensor itself (e.g. the air), termed the external signal propagation, while some parts like an optical train of a camera will reside inside the sensor, termed internal signal propagation.
The extent to which the signal propagation model is included in the overall perception sensor model, and to which extent it employs models outside the overall perception sensor is an important aspect of model classification, as defined in Clause 6. This also affects the way these models interact with the core environment model.
4.3.3 Sensing technology model
The sensing technology model covers the part of the perception sensor that detects the signals as propagated and converts them, potentially via conversions to other physical domains, into the digital realm. For active sensors this also covers the generation of signals emitted from the sensor.
Example 1 For a camera perception sensor, the sensing technology model would include the conversion of light rays to electrical signals in a CCD imager, their amplification and readout, as well as their conversion to the digital domain via analogue to digital conversion (ADC).
Example 2 For a radar perception sensor, the sensing technology model would include the generation of radar waves via oscillators, modulators, and amplifiers, including their transmission via antenna, as well as their reception via antenna, amplification, any potential demodulation, and conversion to the digital domain via analogue to digital conversion (ADC).
4.4 Sensor controller model
4.4.1 General
The sensor controller model is concerned with all computational processing aspects of the perception sensor, including processing hardware like DSP or FPGA systems, as well as all software aspects. This model can be separated into the following notional sub-models.
In the model classification scheme of Clause 6 this forms a model subclass.
4.4.2 Digital signal processing model
The digital signal processing model comprises the modelling of any digital signal processing steps that are focussed on improving the signal coming from the sensing technology model for the higher-level sensor functions modelled in the sensor function model.
This includes all generic filtering or other transformations performed on the signal to improve signal detection capabilities, as well as the initial stages of detection or feature extraction. It depends on the sensor implementation and modelling choices how far the extraction process itself is handled as part of DSP, and to what extent it is processed as part of the sensor function.
Usually final classification of detections and features, as well as object extraction and tracking are considered part of the sensor function and hence modelled as part of the sensor function model.
4.4.3 Sensor function model
This model represents the higher-level sensor function, including object extraction, tracking and any other further processing that is part of the overall perception sensor function. This most commonly comprises the final classification of detections and features, as well as object extraction, classification, and tracking functionality.
The sensor function includes the handling of all external communication interfaces, whether at detection, feature, or object level, and is part of the functionality modelled in the sensor function model. While at least the initial stages of detection or feature extraction are usually handled as part of the digital signal processing modelled in the DSP model, any further processing, including final classification, the preparation of this data for communication to the outside world, as well as the actual communication handling are part of the sensor function model.
5 Sensor model characterization
5.1 General
Sensor models are characterised according to the following aspects:
- Sensor model class, i.e. physical sensing technique;
- Sensor model fidelity;
- Sensor model design approach.
These aspects are elaborated in the following sub-clauses.
5.2 Sensor model classes
5.2.1 General
This document considers four sensor model classes, reflecting the physical mode of operation (sensing technology) of the sensors being modelled, namely radar, lidar, camera and ultrasonic sensors.
The sensor model classes are elaborated in the following sub-clauses.
5.2.2 Radar model (RM)
5.2.2.1 Radar sensor topology
The radar sensor topology shown in Figure 3 presents the sensing signal flow of a radar sensor, including the sensor physical system and the sensor controller. The sensor physical system is divided into the frontend, which contains all components necessary to handle the received and transmitted signals, and the antennas, which act as the interface to the external signal propagation within the environment. As radar sensors are active sensors, the sensor physical system comprises a transmit path and is accordingly bidirectional.
Figure 3 — Radar sensor topology
Once the received signals are handled by the sensor physical system, the digitalized baseband data are handled by the sensor controller. In the radar digital signal processing block, relevant perception parameters like range, velocity, azimuth and elevation angles as well as the radar cross-section are extracted and stored in the detection list. The sensor function block contains post-processing steps like clustering, tracking, and classification of objects as well as geometry tracking, while the resulting objects are stored in the object list.
The object list is used by the vehicle function, which can refer to safe-driving functionalities supporting the driver while driving, during parking manoeuvres or in a self-driving state. Radar sensors utilize different interfaces, such as detection list and object list, for sensor data fusion with other sensors, typically resulting in an output object list.
5.2.2.2 Radar sensor model topology
Figure 4 shows the radar sensor simulation topology within XiL (X-in-the-loop) simulation frameworks. The component models are derived from the generic radar sensor topology introduced in 5.2.2.1. The high-level signal and control data exchanges between each component model are depicted. Solid-line arrows represent simulated signal flows while dashed-line arrows indicate simulation control flows.
Figure 4 — Radar sensor simulation topology
Radar model classes may be differentiated by different input and output data domains. Based on the simulation topology shown in Figure 4, the following interfaces are defined:
Radar external signal propagation
The electromagnetic wave emitted by the radar sensor propagates through the environment, is reflected, and is received in a delayed version, where the propagation delay corresponds to the distance between the radar sensor and the reflection point. The electromagnetic wave may also be Doppler frequency shifted and phase shifted, corresponding to the velocity and the angle of a radar object, respectively.
Depending on weather conditions, the electromagnetic wave is attenuated and backscattering occurs, e.g. due to reflections from rain drops and snowflakes.
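The delay and Doppler relationships described above can be sketched numerically. The following Python sketch uses illustrative target and carrier values (150 m, 30 m/s, 77 GHz) that are assumptions for demonstration, not values specified by this document.

```python
# Illustrative radar external propagation relations (assumed target and
# carrier values; not specified by this document).
C = 299_792_458.0  # speed of light in m/s

def propagation_delay(distance_m: float) -> float:
    """Two-way propagation delay for a reflection point at the given distance."""
    return 2.0 * distance_m / C

def doppler_shift(radial_velocity_mps: float, carrier_hz: float) -> float:
    """Doppler frequency shift for a target with the given radial velocity."""
    return 2.0 * radial_velocity_mps * carrier_hz / C

# Hypothetical example: target at 150 m closing at 30 m/s, 77 GHz carrier.
tau = propagation_delay(150.0)  # about 1.0e-6 s
fd = doppler_shift(30.0, 77e9)  # about 15.4 kHz
```

These two relations are what a signal propagation model must reproduce at minimum for each reflection point; attenuation and backscatter effects come on top.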
Radar internal signal propagation
The electromagnetic wave propagates through transmission lines between the components of the frontend (e.g. oscillator and mixer), which may cause phase shifts and distortions. The transmit antennas convert the guided waves on the transmission lines into radio frequency radiation, while the receive antennas convert the radio frequency radiation back into guided waves on the transmission lines. The radar housing and the radome may cause further unwanted effects on the radio frequency radiation.
The down-converted signals are often amplified and then sampled resulting in the digitalized baseband data.
Radar digital signal processing
The digitalized baseband data contain various types of information in the range, velocity, and angle dimension as frequency shifts. The radar DSP model is dependent on the modulation scheme as well as the antenna layout and it describes how the detection list is obtained from the digitalized baseband data.
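As an illustration of how the radar DSP can recover a range from digitalized baseband data, the sketch below simulates the noiseless beat signal of a single FMCW chirp for one point target and locates the corresponding range bin with a discrete Fourier transform. The chirp parameters are illustrative assumptions; real modulation schemes and antenna layouts add velocity and angle processing not shown here.

```python
import numpy as np

# Illustrative FMCW range extraction (assumed chirp parameters, not from
# this document).
C = 299_792_458.0  # speed of light in m/s
B = 300e6          # chirp bandwidth in Hz
T = 40e-6          # chirp duration in s
N = 1024           # samples per chirp
FS = N / T         # sample rate in Hz

def beat_signal(distance_m: float) -> np.ndarray:
    """Ideal noiseless beat signal for a single point target."""
    f_beat = 2.0 * distance_m * B / (C * T)  # beat frequency is proportional to range
    t = np.arange(N) / FS
    return np.cos(2.0 * np.pi * f_beat * t)

def estimate_range(samples: np.ndarray) -> float:
    """Estimate target range from the peak of the range FFT."""
    spectrum = np.abs(np.fft.rfft(samples))
    peak_bin = int(np.argmax(spectrum[1:])) + 1  # skip the DC bin
    f_beat = peak_bin * FS / N
    return f_beat * C * T / (2.0 * B)

est = estimate_range(beat_signal(60.0))  # close to 60 m, quantized to range bins
```

The range resolution of this sketch is C/(2B), i.e. 0.5 m for the assumed 300 MHz bandwidth, which bounds the quantization error of the estimate.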
Radar sensor function
The sensor function model handles the detection lists of consecutive radar measurements and implements post-processing steps like clustering, tracking, and classification of objects, as well as geometry tracking. By filtering the detection lists over time, the accumulated amount of information can be increased. The resulting objects are stored in the object list.
5.2.3 Lidar model (LM)
5.2.3.1 Lidar sensor topology
The lidar sensor topology shown in Figure 5 represents the sensor signal flow of a lidar sensor, including the sensor physical system and the sensor controller. The sensor physical system is divided into the frontend, which contains all the necessary components to generate, emit, receive, and pre-process the optical signals, and the receiver and emitter optics as the interface to the external signal propagation in the environment. Since lidar sensors are active sensors, the sensor physical system includes a transmission path and is bidirectional. However, the receiver optics also collect signals from the environment, such as sunlight and/or emissions from other lidars.
Figure 5 — Lidar sensor topology
Once the received optical signals are pre-processed within the sensor physical system, the digitized data are handled by the sensor controller. In the lidar digital signal processing (DSP) block, relevant perception parameters such as distance, azimuth, elevation, and intensity are extracted and stored in the detection list, often called "point cloud". The sensor function block can include post-processing steps such as clustering, object tracking, classification, and geometric filtering, while the final processed objects are stored in the object list.
The object list is then utilized by the vehicle function, which supports various applications such as autonomous driving, driver assistance systems, and parking manoeuvres. Lidar sensors utilize different interfaces, such as detection list and object list, for sensor data fusion with other sensors, typically resulting in an output object list.
5.2.3.2 Lidar sensor model topology
Figure 6 shows the lidar sensor simulation topology within XiL (X-in-the-loop) simulation frameworks. The component models are derived from the generic lidar sensor topology introduced in 5.2.3.1. The high-level signal and control data exchanges between each component model are depicted. Solid-line arrows represent simulated signal flows while dashed-line arrows indicate simulation control flows.
Figure 6 — Lidar sensor simulation topology
Lidar model classes may be differentiated by different input and output data domains. Based on the simulation topology shown in Figure 6, the following interfaces are defined:
Lidar external signal propagation
The laser pulses emitted by the lidar sensor propagate through the environment, interact with objects, and are reflected back to the receiver. Depending on the material properties of objects, the returning signal may be partially absorbed, transmitted, scattered, or diffused, affecting the intensity of the received signal. In most automotive lidars, the time of flight (ToF) delay between emission and reception is measured and corresponds to the distance between the lidar sensor and the reflection point. In contrast to ToF lidars, frequency-modulated continuous wave (FMCW) lidars operate by emitting a continuous, frequency-modulated laser beam. The distance and velocity of objects are determined by analysing the frequency shift of the returning signal, similar to radar principles. This approach enables direct Doppler velocity measurement.
Environmental factors such as fog, rain, and snow can attenuate the laser pulses, leading to signal degradation. Partial absorption and scattering effects occur due to small particles like dust, water droplets, and aerosols, which may introduce additional noise in the received data and reduce overall detection accuracy.
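The ToF distance relationship and the attenuation effects described above can be sketched as follows. The attenuation coefficient is a hypothetical fog-like value chosen for illustration; it is not specified by this document.

```python
import math

# Illustrative lidar ToF distance and two-way attenuation (assumed values;
# the attenuation coefficient is a hypothetical fog-like example).
C = 299_792_458.0  # speed of light in m/s

def tof_distance(delay_s: float) -> float:
    """Distance corresponding to a measured two-way time-of-flight delay."""
    return C * delay_s / 2.0

def attenuated_intensity(i0: float, distance_m: float,
                         alpha_per_m: float) -> float:
    """Beer-Lambert attenuation of the return intensity over the two-way path."""
    return i0 * math.exp(-2.0 * alpha_per_m * distance_m)

d = tof_distance(667e-9)                    # roughly 100 m
i = attenuated_intensity(1.0, 100.0, 0.01)  # exp(-2) of the emitted intensity
```

An external signal propagation model for lidar has to reproduce at least these range and intensity effects per return; scattering and multi-path effects come on top.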
Lidar internal signal propagation
The emitted laser pulses pass through optical components such as beam splitters, mirrors, and lenses before being transmitted into the environment. These components can introduce distortions, dispersion, or unwanted internal reflections that may impact signal quality. The emitted laser beams are shaped and directed using scanning mechanisms, such as rotating mirrors, MEMS mirrors, or solid-state beam steering techniques.
Upon reception, the reflected signals pass through optical filters and lenses before reaching the photodetector. The received optical signals are then converted into electrical signals using photodetectors such as avalanche photodiodes (APDs) or single-photon avalanche diodes (SPADs). These electrical signals are amplified and processed to extract range, intensity, and other relevant features. Finally, the digitized data are generated, representing the spatial distribution of detected objects.
Lidar digital signal processing
The digitized, sorted data contain various types of information, including range, intensity, and angular position, depending on the underlying measurement principle. Often, more than one detection per beam or receiving-angle combination is possible, since multiple returns can be recorded. The lidar digital signal processing model is influenced by factors such as the scanning mechanism, the detection method (e.g. ToF or FMCW), and the optical system configuration. It describes how the data are filtered and further processed to obtain structured information in the form of a detection list (a so-called point cloud) or an object list. The signal-to-noise ratio (SNR) plays a key role in this process, as a higher SNR improves the reliability of detections. Low-SNR returns, which may be caused by weak reflections or environmental interference, are often discarded to minimize false detections.
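The SNR gating step described above can be sketched as follows (a minimal illustration, not part of the standard; the field names and threshold value are assumptions):

```python
# Hypothetical sketch of SNR-based gating: returns whose SNR falls below a
# configurable threshold are discarded before the point cloud is assembled.
def filter_detections(detections, snr_threshold_db=10.0):
    """Keep only detections whose SNR meets the threshold."""
    return [d for d in detections if d["snr_db"] >= snr_threshold_db]

raw = [
    {"range_m": 12.4, "snr_db": 23.1},   # strong return, kept
    {"range_m": 57.9, "snr_db": 4.2},    # weak return (e.g. fog), discarded
    {"range_m": 33.0, "snr_db": 11.8},   # kept
]
point_cloud = filter_detections(raw)
```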
For FMCW lidars, the digital signal processing model includes additional steps such as frequency demodulation and Doppler shift analysis to extract velocity information directly from the returning signal. This aims to enhance object detection capabilities, especially in dynamic environments.
Lidar sensor function
The sensor function model processes the detection lists from consecutive lidar measurements and implements post-processing steps such as clustering, object tracking, and classification. By analysing multiple frames over time, it can filter out noise, improve object detection consistency, and enhance geometric tracking of detected objects.
For lidar sensors, clustering algorithms are used to group neighbouring points that likely belong to the same object. Object tracking methods, such as Kalman filtering or particle filtering, predict object movement across frames, while classification techniques help distinguish between different object types, such as vehicles, pedestrians, or static infrastructure.
By filtering and accumulating detection lists over time, the amount of useful information can be increased, reducing false positives and improving sensor reliability. The final processed objects are stored in the object list, which serves as the primary output for sensor fusion and driving functions.
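The grouping of neighbouring points mentioned above can be illustrated with a deliberately simple distance-threshold clustering (a sketch only, not the algorithm prescribed by this document; production systems typically use grid- or tree-based variants):

```python
# Minimal sketch: points within max_gap of an existing cluster member are
# merged into that cluster; otherwise a new cluster is started. This greedy
# scheme is only illustrative and can miss late merges between clusters.
import math

def cluster_points(points, max_gap=1.0):
    clusters = []
    for p in points:
        placed = False
        for c in clusters:
            if any(math.dist(p, q) <= max_gap for q in c):
                c.append(p)
                placed = True
                break
        if not placed:
            clusters.append([p])
    return clusters
```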
5.1.5 Camera Model (CM)
Camera sensor topology
Figure 7 shows the generic topology of a camera sensor based on the generic topology of a sensor as described in Clause 4, depicting only the sensing signal flow. Control signals are omitted in the high-level topology. The signal path for a camera sensor is uni-directional, i.e. receive only, since the mode of operation of a camera sensor is as a passive receiver of a light signal.
Figure 7 — Camera sensor topology
For the camera sensor topology, the element signal propagation block in the generic sensor model is divided into external signal propagation and the camera lens blocks, with the camera lens block corresponding to the internal signal propagation.
The external signal propagation constitutes environmental effects that affect the propagation of light to the sensor, e.g. weather conditions or air pollution.
The camera lens also has inherent optical properties that affect the reception of light from the external environment.
Camera sensor model
Figure 8 shows the topology of camera sensor simulation for XiL simulation approaches. The camera sensor model could be implemented as a single entity, or as separate camera lens and imager models.
The high-level signal and control data flows between each component model are identified. Data flows depicted with solid-line arrows represent simulated signal flows. Data flows depicted with dashed-line arrows represent simulation control flows.
Figure 8 — Camera sensor simulation topology
Figure 9 shows the components of the camera sensor model and identifies input and output interfaces in the simulation environment.
Two input interfaces (II) are defined for camera sensor models, II-1 and II-2. Which one is used in a simulation use case depends on the configuration of the simulation system at hand.
II-1 is used to inject image frames into a lens model or into a combined lens and imager model.
II-2 is used to inject image frames into a stand-alone imager model.
Figure 9 — Camera sensor model simulation interfaces
Camera external signal propagation
The image rendering function delivers images according to the environment, scenario under test, and any external effects to the camera sensor model. The image data are the final result of the simulated optical signal that is provided as input to the camera sensor via either of the input interfaces.
Camera lens model
The camera lens has inherent optical properties that affect the reception of light from the external environment.
Simulated image data are input to the lens model via input interface II-1.
The adjusted image data are image data that have been adapted due to aberrations caused by the camera lens. The adjusted image data are provided to the imager model.
Camera imager model
Depending on the simulation use case at hand, the imager model takes as input either:
- the adjusted simulated image data provided by the lens model (with the image data being injected at interface II-1), or
- simulated image data injected at interface II-2.
The imager model should replicate relevant sensor imager properties that affect the conversion of the simulated optical signal into the output image data.
The image control data interface is used to apply configuration settings to the imager in the same way that the imager inside a physical camera sensor can be configured and controlled.
Camera digital signal processing
The digital signal processing model takes as input the raw image data from the imager model.
In a dynamic simulation use case, the digital signal processing model should implement the same imager control functions as those implemented in the modelled camera sensor. These can include adjustment of the shutter mode of operation and exposure time.
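One such imager control function, exposure-time adjustment, can be sketched as a simple proportional control loop (purely illustrative; the control law, gain, and limits are assumptions, not behaviour defined by this document):

```python
# Hedged sketch of an auto-exposure loop: nudge the exposure time toward a
# target mean frame brightness, clamped to the imager's exposure limits.
def adjust_exposure(exposure_us, mean_brightness, target=128.0,
                    gain=0.5, min_us=10.0, max_us=33000.0):
    """Return a new exposure time based on the current frame's mean brightness."""
    error = (target - mean_brightness) / target
    new_exposure = exposure_us * (1.0 + gain * error)
    return min(max(new_exposure, min_us), max_us)
```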
The output of the digital signal processing model is the processed image data, output at interface OI.
Camera sensor function
The sensor function model takes as input the processed image data from the signal processing model.
The sensor function model interprets the processed image data in order to recognize objects in the FoV.
The sensor function model can be implemented based on the processing of image data from a single camera sensor imager, from multiple camera image sensors, or it can process the outputs of different types of sensor in a sensor fusion context.
The output of the sensor function model is the instantaneous object list.
5.1.6 Ultrasonic Model (UM)
Ultrasonic sensor topology
The ultrasonic sensor works by transmitting ultrasonic pulses of known characteristics and receiving back echoes of these pulses reflected by objects surrounding the host vehicle. Typically, the time delay of the echo return is the most important characteristic. However, some ultrasonic systems can also evaluate other return signal characteristics such as amplitude or Doppler shift.
The ultrasonic sensor topology is shown in Figure 10. The most obvious difference from other sensor technologies is that the ultrasonic sensing principle is based on a set of individual ultrasonic sensors working together. The sensors are controlled and evaluated by an overall ultrasonic system logic. A further difference from other active sensors such as radar and lidar is that the signals transmitted by one ultrasonic sensor are received not only by the transmitting sensor but also by other, neighbouring sensors.
Note * Architecture dependent - actual presence depends on implementation choices.
Figure 10 — Ultrasonic sensor topology
Looking at the topology, the components at the system level (listed in downstream order) are:
- Sensor function - processing of received ultrasonic responses, provision of detected objects/features/freespace to higher vehicle function;
- Signal processing - optional step (* architecture dependent), conditioning and filtering of received signals (cf. digital signal processing);
- Sensor controller - handles start-up and coordination of individual ultrasonic sensors and controls the ultrasonic pulse transmission sequence.
Looking at the topology, the components at the sensor level (listed in downstream order) are:
- Signal processing - optional step (* architecture dependent), conditioning and filtering of received signals;
- Tx driver - power amplifier driving the excitation of the ultrasonic membrane. The excitation can be done on the membrane's resonance frequency or at a different frequency to improve discrimination on signal reception;
- Rx amplifier - reception amplifier to amplify the signals received by the sensor membrane.
The ultrasonic system function (typically running on an ECU) is connected to the ultrasonic sensors (typically installed around the vehicle in the bumpers and possibly the sides/doors) using either a star or a bus communication topology.
Ultrasonic sensor model topology
Figure 11 shows the topology of ultrasonic sensor simulation for XiL simulation approaches. The component models are derived from the generic ultrasonic sensor topology introduced in 5.2.5.1. The high-level signal and control data between each component model are shown. Data flows depicted with solid-line arrows represent simulated signal flows. Only the raw data and object list interfaces are evaluable sensor quantities. All other quantities are purely simulation interfaces or internal interfaces of the ultrasonic model function. Data flows depicted with dashed-line arrows represent simulated sensor control flows.
Figure 11 — Ultrasonic model topology
The simulation system performs the task of coordinating the exchange of data between the component models such that the simulation reflects the operation of the equivalent physical sensing components as realistically as possible, or as realistically as the simulation is chosen to be, depending on the effects being simulated. The performance capability of the computing platform hosting the simulation system determines the execution speed of the simulation, in particular whether real-time performance can be attained while keeping required simulation fidelity.
Ultrasonic external signal propagation
The transmitted pulse of acoustical energy is spatially radiated. When it hits an outside object in the vicinity of the sensor, part of the energy is reflected back and detected by the ultrasonic sensor(s). The signal reflection is determined by several properties:
- spatial radiation/reception efficiency of the ultrasonic sensor;
- target object material (acoustical reflectivity at given frequency);
- target object geometry (surfaces perpendicular to the acoustic wave propagation reflect with higher efficiencies);
- environmental conditions - air temperature, humidity and absolute pressure.
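The influence of air temperature listed above can be made concrete with the first-order approximation of the speed of sound in air, c ≈ 331.3 + 0.606·T m/s (T in °C). The sketch below (illustrative only; humidity and pressure effects are neglected) converts an echo time delay into a distance:

```python
# Illustrative echo-distance computation: the measured delay covers the
# round trip, so the distance is half the path travelled by the pulse.
def speed_of_sound(temp_c: float) -> float:
    """First-order temperature dependence of the speed of sound in air."""
    return 331.3 + 0.606 * temp_c

def echo_distance(delay_s: float, temp_c: float = 20.0) -> float:
    return speed_of_sound(temp_c) * delay_s / 2.0
```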
Upon reception, the ultrasonic system typically evaluates the time delay of the echo return and, depending on the implementation, additional return signal characteristics such as amplitude or Doppler shift.
Ultrasonic internal signal propagation
The main active element of an ultrasonic sensor is the front membrane, typically driven by the piezoelectric principle, converting electrical to acoustical energy when transmitting and detecting the received signal when receiving. Automotive sensors employ only a single membrane for both the transmission and reception functions, which are mutually exclusive.
When in use, a sensor operates in one of two possible modes:
- Transmission (active) mode. In this mode the sensor transmits the ultrasonic pulse upon command from the sensor controller block and then listens for the reflected sound echoes. Some settling time is needed between pulse transmission and reception of the reflected signal, due to the mechanical limitations of the ultrasonic membrane.
- Reception (passive) mode. In this mode the sensor listens for ultrasonic echoes originating from another sensor that actually transmitted the ultrasonic pulse. The position of the object reflecting the echo is determined using triangulation from the relative positions of the transmitting and receiving sensors and the measured echo distance.
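The triangulation idea can be sketched geometrically (illustrative only, not a normative algorithm): if the direct echo gives the distance r1 from the transmitting sensor and the cross echo gives the distance r2 from a neighbouring receiving sensor (e.g. derived by subtracting r1 from the measured cross-echo path length), the reflection point follows from intersecting two circles. The 2D layout and conventions below are assumptions.

```python
# Sketch: sensors on the x-axis, transmitter at (0, 0) and receiver at
# (b, 0); return the intersection in front of the bumper (y >= 0), or None
# when the measurements are inconsistent (circles do not intersect).
import math

def triangulate(r1: float, r2: float, b: float):
    x = (r1 * r1 - r2 * r2 + b * b) / (2.0 * b)
    y2 = r1 * r1 - x * x
    if y2 < 0.0:
        return None
    return x, math.sqrt(y2)
```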
Ultrasonic digital signal processing
Digital signal processing of the detected ultrasonic returns usually consists of filtering and discrimination of valid versus invalid returns.
Ultrasonic systems typically employ heavy filtering and averaging of the signals from individual sensors due to their relatively noisy nature. Typically, several ultrasonic returns from a target object are needed before it is entered into a map or presented as a detected feature to the upper layers of the driving assistance algorithms.
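The confirmation logic described above can be sketched as a simple hit-count filter (illustrative only; the cell representation and threshold are assumptions):

```python
# Minimal sketch: a target cell is promoted to a detected feature only after
# a configurable number of corroborating ultrasonic returns.
def confirm_detections(return_counts, min_returns=3):
    """return_counts maps a map cell to its hit count; keep confirmed cells."""
    return {cell for cell, n in return_counts.items() if n >= min_returns}
```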
Ultrasonic sensor function
The ultrasonic sensor system usually provides detected features (points, lines, freespace, etc.) to downstream driving automation layers. These detections can have associated properties, for example existence probability, spatial accuracy, classification as static or moving, or classification of object height (low/high).
5.2 Sensor model fidelity
Model fidelity is a measure of how accurately a model replicates the physical sensor. The required level of model fidelity depends on the simulation use case at hand, and it influences the required level of detail of the model.
One aspect of model fidelity is the degree to which all characteristics of the sensor are modelled, for example in the number of perception sensor effects that are taken into account. For a camera sensor, examples of sensor effects are motion blur and flare. The level of detail of the model can also be differentiated, for example a combined model for the whole camera sensor as opposed to the provision of distinct lens and imager models.
5.4 Sensor model design approach
5.4.1 General
The following four approaches are considered for perception sensor simulation models:
- the ideal model approach;
- the statistical model approach;
- the physical model approach;
- the phenomenological model approach.
The four model approaches are elaborated in the following sub-clauses.
5.4.2 Ideal model
This approach assumes an ideal sensor without any weaknesses, distortions or failures.
The output of the model is according to the ground truth.
Example The position and orientation of the obstacles to be detected within the field of view of the sensor are converted into sensor coordinates and reported without taking into account any occlusion of objects by other obstacles.
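The example above can be sketched as follows (illustrative only; the 2D sensor frame, field-of-view limits, and names are assumptions). Ground-truth positions are transformed into sensor coordinates and reported whenever they fall inside the field of view, with occlusion deliberately ignored:

```python
# Sketch of an ideal sensor model: transform ground-truth obstacle positions
# into the sensor frame and report everything inside the FoV, without
# modelling occlusion, noise, or failures.
import math

def ideal_sensor(obstacles, sensor_pos, sensor_yaw, fov_deg=120.0, max_range=200.0):
    out = []
    for ox, oy in obstacles:
        # world -> sensor frame: translate, then rotate by -sensor_yaw
        dx, dy = ox - sensor_pos[0], oy - sensor_pos[1]
        xs = dx * math.cos(-sensor_yaw) - dy * math.sin(-sensor_yaw)
        ys = dx * math.sin(-sensor_yaw) + dy * math.cos(-sensor_yaw)
        rng = math.hypot(xs, ys)
        bearing = math.degrees(math.atan2(ys, xs))
        if rng <= max_range and abs(bearing) <= fov_deg / 2.0:
            out.append((xs, ys))
    return out
```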
5.4.3 Statistical model
This approach is based on real measured data of the sensor to be modelled. Based on statistics of the measured data, the model output is generated according to the model input. Knowledge about the sensor can be used to set up, for example, neural networks. Sensor-internal data are not necessarily generated.
Example Based on the positions of the traffic participants and obstacles and on the environmental conditions, the point cloud with the highest probability is generated as output.
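A minimal illustration of the statistical idea (not part of the standard; the Gaussian error model and its parameters are assumptions) perturbs a ground-truth range with a bias and noise term estimated from measurements of the real sensor:

```python
# Hedged sketch: measured error statistics of the real sensor (here a simple
# bias plus Gaussian noise on range) parameterize the model output. A real
# statistical model would condition on scenario and environment, or use
# learned models such as neural networks.
import random

def statistical_range(true_range_m, bias_m=0.02, sigma_m=0.05, rng=None):
    rng = rng or random.Random()
    return true_range_m + bias_m + rng.gauss(0.0, sigma_m)
```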
5.4.4 Physical model
The physical approach attempts to model the physical principle of the sensor by applying formulae. Usually the components of the sensor are modelled as well, so that intermediate results are available within the model. This approach usually provides the highest level of fidelity.
Example The light distribution is modelled using ray tracing and the artificial point cloud is computed from the received rays. The model contains sub-models of the components of the sensor, such as the lens, laser, photodiode, amplifier, and AD converter.
5.4.5 Phenomenological model
This approach is based on the ideal model, but deviations from the ideal behaviour are modelled depending on the situation. Known phenomena of the sensor's behaviour are described without directly modelling the components of the sensor and their impact on the output.
Example The position and orientation of the obstacles to be detected that are within the field of view of the sensor are converted into sensor coordinates and reported. Known phenomena such as occlusion and visibility are modelled:
- occluded obstacles are not reported;
- obstacles with poor visibility due to fog and range are also not reported.
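The two bullet points above can be sketched as a filter applied on top of ideal-model output (illustrative only; the occlusion flag and visibility-range parameter are assumed to be supplied by the simulation):

```python
# Sketch of the phenomenological approach: start from ideal detections and
# overlay situation-dependent phenomena, here occlusion and a fog-limited
# visibility range.
def phenomenological_filter(ideal_detections, visibility_range_m):
    """ideal_detections: list of dicts with 'range_m' and 'occluded' keys."""
    return [
        d for d in ideal_detections
        if not d["occluded"] and d["range_m"] <= visibility_range_m
    ]
```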
6 Model designation numbers
6.1 General
The model designation numbers are defined with reference to the nominal sensor model topology as laid out in 4.2, based on the model classes and subclasses. Each sensor model class defined in this document consists of two model subclasses: sensor physical system model and sensor controller model subclasses. See Figure 12 for an overview of the model designation scheme and its elements.
6.2 Sensor physical system model subclass
Based on the definitions given in this document the following classification attributes and their values are to be considered as elements of the model classification of the sensor physical system model subclass:
- Input interface (II), according to Table 1;
- Signal propagation (SP), according to Table 2;
- Sensing technology (ST), according to Table 3.
Table 1 — Input interface options of a sensor physical system model
| Input Interface (II) | Description |
|---|---|
| 1 | object |
| 2 | feature |
| 3 | detection (reflection / image) |
| X | Other^a |

^a "Other" input interface "X" addresses cases where a sensor model input requires different input interface options than the listed ones. In the case of a sensor requiring multiple options from the table, these are listed using a "+" separator, as defined in 6.4.
Table 2 — Signal Propagation options of a sensor physical system model
| Signal Propagation (SP) | Description |
|---|---|
| 1 | Depends on signal propagation model outside the sensor model |
| 2 | Partial signal propagation model integrated into sensor model (depends on additional signal propagation model outside the sensor model) |
| 3 | Full signal propagation model integrated into sensor model |
Table 3 — Sensing technology options of a sensor physical system model
| Sensing Technology (ST) | Description |
|---|---|
| 1 | Ideal (see 5.4.2) |
| 2 | Statistical (see 5.4.3) |
| 3 | Phenomenological (see 5.4.5) |
| 4 | Physical (see 5.4.4) |
| X | Other |
6.3 Sensor controller model subclass
Based on the definitions given in this document the following classification attributes and their values are to be considered as elements of the model classification of the sensor controller model subclass:
- Controller model (CM), according to Table 4;
- Output interface (OI), according to Table 5.
Table 4 — Model characteristic and designation number of sensor controller model subclass
| Controller | Description |
|---|---|
| 0 | None |
| 1 | Simplified placeholder model |
| 2.1 | Principal logic controller model |
| 2.2 | Principal logic controller software |
| 3.1 | Target software: application software only |
| 3.2 | Target software: full-function virtual ECU (vECU) – application + simulation base software |
| 3.3 | Target software: full-function virtual ECU (vECU) – application + production base software |
| 3.4 | Target software: target binary virtual ECU (vECU) – application + production base software and drivers |
| 4 | Target ECU: Hardware ECU |

Note Designation numbers 2.1 and 2.2 correspond to vECU Level 0, with designation numbers 3.1, 3.2, 3.3, and 3.4 corresponding to vECU Levels 1, 2, 3, and 4, respectively, of the prostep ivip SmartSE Recommendation V4 vECU classification scheme[3].
Table 5 — Output interface options of a sensor controller model
| Output Interface (OI) | Description |
|---|---|
| 1 | object |
| 2 | feature |
| 3 | detection (reflection / image) |
| X | Other^a |

^a "Other" output interface "X" addresses cases where a sensor model output provides different interface options than the listed ones. In the case of a sensor providing multiple options from the table, these are listed using a "+" separator, as defined in 6.4.
6.4 Model designation scheme
Figure 12 presents a visualization of the classification attributes of a perception sensor model.
Figure 12 — Model Designation Scheme Overview
A perception sensor model designation number is defined by the following scheme:
<II> / <SP> / <ST> / <CM> / <OI>
This scheme enables the designation of various types of perception sensor models.
When multiple options are relevant for a designation number element, the options are joined with a "+" separator. Multiple options shall only be used for the II and OI designation number elements.
Example 1 A sensor model requiring object and detection inputs, including signal propagation in the model, using an ideal sensing technology model, a placeholder controller model, and providing object, feature and detection outputs, would be designated as 1+3 / 3 / 1 / 1 / 1+2+3.
Example 2 A radar sensor model delivering a tracked object list output based on reflection inputs generated by ray tracing outside the sensor model could be classified as 3 / 1 / 4 / 1 / 1.
Example 3 A sensor model with focus on sensing technology without a controller and signal propagation handled outside the model would be classified as 3 / 1 / 4 / 0 / 3. In this case the input interface of the sensor model is directly the input of sensing technology model.
Example 4 A camera sensor model that relies on fully rendered image input which is then processed through the actual controller software could be classified as 3 / 1 / 1 / 3.1 / 1+2.
Example 5 The classical ideal sensor model working purely on object inputs and outputs would be classified as 1 / 3 / 1 / 1 / 1.
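The formatting rule of the scheme, including the "+" separator for multiple II and OI options, can be sketched as a small helper (illustrative only; it formats the designation string and does not validate attribute values against Tables 1 to 5):

```python
# Illustrative helper for the designation scheme <II>/<SP>/<ST>/<CM>/<OI>.
# Lists or tuples are joined with "+"; per 6.4, multiple options should only
# be used for the II and OI elements.
def designation(ii, sp, st, cm, oi):
    def fmt(v):
        if isinstance(v, (list, tuple)):
            return "+".join(str(x) for x in v)
        return str(v)
    return " / ".join([fmt(ii), fmt(sp), fmt(st), fmt(cm), fmt(oi)])

# Reproduces Example 1 above:
print(designation([1, 3], 3, 1, 1, [1, 2, 3]))  # prints "1+3 / 3 / 1 / 1 / 1+2+3"
```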
[1] ISO 23150:2023, Road vehicles — Data communication between sensors and data fusion unit for automated driving functions — Logical interface
[2] ISO/SAE PAS 22736:2021, Taxonomy and definitions for terms related to driving automation systems for on-road motor vehicles
[3] prostep ivip SmartSE Recommendation V4 (vECU classification scheme)
