WO2022184127A1 - Vehicle and sensor simulation method and apparatus

Vehicle and sensor simulation method and apparatus

Info

Publication number
WO2022184127A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
vehicle
information
target
relative
Prior art date
Application number
PCT/CN2022/078997
Other languages
English (en)
French (fr)
Inventor
刘荣 (Liu Rong)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022184127A1

Classifications

    • G — PHYSICS
    • G01M17/007 — Testing of vehicles; wheeled or endless-tracked vehicles
    • G01C25/00 — Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 — Initial alignment, calibration or starting-up of inertial devices
    • G01D18/00 — Testing or calibrating apparatus or arrangements provided for in groups G01D1/00–G01D15/00
    • G01S19/23 — Satellite radio beacon positioning systems (e.g. GPS, GLONASS, GALILEO); receivers; testing, monitoring, correcting or calibrating of receiver elements
    • G01S7/40 — Details of radio-wave radar systems (G01S13/00); means for monitoring or calibrating
    • G01S7/497 — Details of lidar systems (G01S17/00); means for monitoring or calibrating
    • G06F30/20 — Computer-aided design [CAD]; design optimisation, verification or simulation
    • G08G1/017 — Traffic control systems for road vehicles; detecting movement of traffic to be counted or controlled; identifying vehicles

Definitions

  • the present application relates to the technical field of intelligent networked vehicles, and in particular, to a simulation method and device for a vehicle and a sensor.
  • Autonomous driving is a technology in which a computer system replaces humans to drive a motor vehicle, which includes functional modules such as environmental perception, location positioning, path planning, decision control, and power systems.
  • the present application provides a simulation method and device for a vehicle and a sensor, which are used to improve the simulation effect of the vehicle.
  • the present application provides a vehicle simulation method, which can be applied to a test device; the test device can include hardware devices that support running simulation software, such as personal computers, servers, vehicle-mounted mobile terminals, industrial computers, embedded devices, and the like.
  • the test device can be implemented by a server or a virtual machine in the cloud.
  • the test device can also be a chip that supports running simulation software.
  • the test device is used for testing a simulated vehicle, and the test device may include a sensor model for simulating sensors in the simulated vehicle.
  • the test device may be a server for testing the simulated vehicle or a chip on the server.
  • the method can include:
  • inputting the position information and speed information of a first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle into the sensor model to obtain a sensor feature prediction value of the first target vehicle, where the sensor feature prediction value includes at least one of the following: a radar cross section (RCS) prediction value and a signal-to-noise ratio (SNR) prediction value; the first target vehicle is a vehicle in the test environment where the simulated vehicle is located, the position information and speed information of the first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle are determined according to the test environment, and the sensor model is obtained by training based on the measurement data of the sensor and the marked road environment information; and inputting the sensor feature prediction value of the first target vehicle into the decision-making module of the simulated vehicle to obtain the simulation decision result of the simulated vehicle, where the decision-making module is used to output a vehicle travel decision determined based on the sensor feature prediction value.
  • the sensor model can output predicted values of the sensor characteristics of a target object in the test environment, for example, the RCS prediction value and the SNR prediction value, so that the information input to the decision-making module on behalf of the sensor is closer to the real output of a millimeter-wave radar sensor. This improves the fidelity with which the sensor model simulates the sensor, which helps the decision-making module better simulate the decisions the vehicle might make based on the information collected by the sensor in a real scene, thereby improving the simulation effect of the vehicle.
  • In addition, since the sensor model is obtained by training on the measurement data of the sensor and the marked road environment information, it can output corresponding sensor feature prediction values for different test environments (which may correspond to different road environment information), effectively improving the vehicle simulation performance and the robustness of the simulation results. The flow is sketched below.
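  • As an illustration of this flow, the following sketch wires a stand-in sensor model to a stand-in decision module. This is a minimal sketch only: the names (TargetState, sensor_model, decision_module) and all constants are hypothetical and are not taken from the application.

```python
# Minimal sketch of the simulation loop described above (hypothetical names and constants).
from dataclasses import dataclass

@dataclass
class TargetState:
    rel_position: tuple  # (x, y) of the target relative to the simulated vehicle, in meters
    rel_speed: float     # relative speed in m/s
    road_env: str        # labeled road environment, e.g. "urban" or "highway"

def sensor_model(state: TargetState) -> dict:
    """Stand-in for the trained sensor model: maps relative position, relative
    speed and road environment to predicted RCS and SNR."""
    # A real model would be learned from recorded radar data; these constants are placeholders.
    dist = (state.rel_position[0] ** 2 + state.rel_position[1] ** 2) ** 0.5
    snr_db = 40.0 - 40.0 * (dist / 150.0)  # SNR falls off with range (placeholder curve)
    rcs_dbsm = 10.0                        # car-sized RCS placeholder
    return {"rcs": rcs_dbsm, "snr": snr_db}

def decision_module(predictions: list) -> str:
    """Stand-in for the decision module: outputs a driving decision
    from the sensor feature prediction values."""
    return "brake" if any(p["snr"] > 15.0 for p in predictions) else "keep_lane"

targets = [TargetState((30.0, 1.5), -5.0, "highway")]
print(decision_module([sensor_model(t) for t in targets]))  # -> "brake"
```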
  • the first target vehicle is a vehicle within the detection range of the sensor, determined from the candidate vehicles according to the position information and speed information of the candidate vehicles relative to the simulated vehicle;
  • the position information and speed information of the candidate vehicles relative to the simulated vehicle are determined according to the test environment, and the candidate vehicles are vehicles in the test environment where the simulated vehicle is located.
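  • A minimal sketch of this screening step, assuming a fan-shaped detection area in front of the simulated vehicle; the function name and the range/field-of-view values are illustrative, not values from the application.

```python
import math

def in_detection_range(rel_x: float, rel_y: float,
                       max_range_m: float = 150.0,
                       fov_deg: float = 60.0) -> bool:
    """True if a candidate vehicle at (rel_x, rel_y) in the sensor frame
    (x pointing forward) lies inside a fan-shaped detection area."""
    dist = math.hypot(rel_x, rel_y)
    bearing = math.degrees(math.atan2(rel_y, rel_x))
    return dist <= max_range_m and abs(bearing) <= fov_deg / 2.0

# Screen candidate vehicles from the test environment down to first target vehicles.
candidates = [(30.0, 2.0), (80.0, -60.0), (200.0, 0.0)]
first_targets = [c for c in candidates if in_detection_range(*c)]
print(first_targets)  # [(30.0, 2.0)] -- the others are out of range or outside the field of view
```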
  • candidate vehicles can be screened for vehicles within the detection range of the sensor of the simulated vehicle, so that the determined first target vehicles can include all possible candidate vehicles within the detection range of the sensor. Even when one target is occluded by other vehicles relative to the simulated vehicle, a real sensor can still collect measurement data of multiple first target vehicles because of its multipath effect; screening in this way therefore allows the multipath effect of the sensor to be simulated, so that the sensor model can also output the sensor feature prediction values of multiple first target vehicles in this case. The sensor model can thus reflect the multipath effect of the sensor, which improves the simulation effect of the sensor model.
  • the predicted SNR value of the first target vehicle is greater than a visibility threshold.
  • the sensor can determine whether the collected measurement data is a target object or noise by determining whether the signal-to-noise ratio is greater than a preset threshold.
  • When a target is present and the sensor judges that a target is present, the judgment is correct; this case is called a "detection". When a target is present but the sensor judges that there is no target, the judgment is wrong; this case is called a "missed detection (false negative)". When no target is present and the sensor judges that there is no target, the judgment is correct; this case is called a "correct non-detection". When no target is present but the sensor judges that a target is present, the judgment is wrong; this case is called a "false alarm (false positive)".
  • the introduction of this method into the sensor model enables the sensor model to simulate the physical characteristics of the sensor that may cause misjudgment of the target object based on the signal-to-noise ratio.
  • the SNR prediction value of the candidate vehicle is screened to determine whether the candidate vehicle is the first target vehicle. When the SNR prediction value of the candidate vehicle is greater than the visibility threshold, the candidate vehicle may be determined to be the first target vehicle; when the SNR prediction value of the candidate vehicle is less than or equal to the visibility threshold, the candidate vehicle may be considered to be judged as noise by the sensor. In this way, the physical characteristics of the sensor that may cause misjudgment can be reflected, and the effect of the sensor model in simulating the sensor is improved.
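  • A minimal sketch of this SNR screening; the threshold value is a hypothetical placeholder.

```python
def screen_by_snr(candidates: list, visibility_threshold_db: float = 10.0) -> list:
    """Keep only candidates whose predicted SNR exceeds the visibility threshold;
    the rest are treated as noise, mimicking the sensor's detection logic."""
    return [c for c in candidates if c["snr"] > visibility_threshold_db]

predicted = [{"id": 1, "snr": 22.5}, {"id": 2, "snr": 6.3}]
print(screen_by_snr(predicted))  # only vehicle 1 is kept; vehicle 2 is judged to be noise
```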
  • the first target vehicle includes a first candidate vehicle and a second candidate vehicle; the sensor feature prediction value of the first target vehicle is determined based on the sensor feature prediction value of the first candidate vehicle and the sensor feature prediction value of the second candidate vehicle. The first candidate vehicle and the second candidate vehicle satisfy: the relative position of the first position with respect to the second position is less than a first position threshold, where the first position is the position of the first candidate target vehicle relative to the simulated vehicle and the second position is the position of the second candidate target vehicle relative to the simulated vehicle.
  • In practice, a sensor may output two or more vehicles as one vehicle. When the first candidate vehicle and the second candidate vehicle satisfy that the relative position of the first position with respect to the second position is smaller than the first position threshold, it can be determined that the sensor would misjudge the first candidate vehicle and the second candidate vehicle as a single first target vehicle. Therefore, when it is determined that this condition is satisfied, the sensor model outputs the first candidate vehicle and the second candidate vehicle as one first target vehicle, thereby simulating the physical characteristic that the sensor may be unable to distinguish between multiple candidate vehicles, and improving the effect of the sensor model in simulating the sensor.
  • the first candidate vehicle and the second candidate vehicle also satisfy: the relative speed of the first speed with respect to the second speed is less than a first speed threshold; the first speed is the speed of the first candidate target vehicle relative to the simulated vehicle, and the second speed is the speed of the second candidate target vehicle relative to the simulated vehicle.
  • The situation in which the sensor may output two or more vehicles as one vehicle can be determined based on both relative position and relative speed. Therefore, in the present application, after it is determined that the first candidate vehicle and the second candidate vehicle satisfy that the relative position of the first position with respect to the second position is smaller than the first position threshold and that the relative speed of the first speed with respect to the second speed is smaller than the first speed threshold, the first candidate vehicle and the second candidate vehicle are output as one first target vehicle (see the sketch below), thereby better simulating the physical characteristic that the sensor may be unable to distinguish between multiple candidate vehicles, and improving the effect of the sensor model in simulating the sensor.
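  • The merging rule can be sketched as follows. The threshold values and the choice to average the merged position and speed are illustrative assumptions; the application only specifies that the two candidates are output as one first target vehicle.

```python
import math

def merge_if_unresolvable(first: dict, second: dict,
                          pos_threshold_m: float = 1.0,
                          speed_threshold_mps: float = 0.5) -> list:
    """If two candidate vehicles are closer than the position threshold and their
    speeds differ by less than the speed threshold, the simulated sensor cannot
    resolve them and they are output as a single first target vehicle."""
    dx = first["pos"][0] - second["pos"][0]
    dy = first["pos"][1] - second["pos"][1]
    close = math.hypot(dx, dy) < pos_threshold_m
    similar = abs(first["speed"] - second["speed"]) < speed_threshold_mps
    if close and similar:
        merged = {"pos": ((first["pos"][0] + second["pos"][0]) / 2,
                          (first["pos"][1] + second["pos"][1]) / 2),
                  "speed": (first["speed"] + second["speed"]) / 2}
        return [merged]
    return [first, second]

a = {"pos": (40.0, 0.2), "speed": -3.0}
b = {"pos": (40.5, 0.6), "speed": -3.2}
print(merge_if_unresolvable(a, b))  # one merged target: the sensor cannot tell them apart
```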
  • the sensor model is obtained by training according to the measurement data of the sensor and the marked road environment information, which includes: acquiring the measurement data of the sensor, where the measurement data includes the position information and speed information of the second target vehicle relative to the sensor and the sensor characteristic values of the second target vehicle collected by the sensor, the sensor characteristic values including an RCS measurement value and an SNR measurement value; the sensor is located in a measurement vehicle, and the second target vehicle is a vehicle near the measurement vehicle; and training according to the measurement data of the sensor and the obtained labeling information to obtain the sensor model, where the labeling information includes at least one of the following: the yaw angle of the second target vehicle relative to the sensor, the road environment information marked when the sensor collects data, and the information of the vehicle where the sensor is located. The input of the sensor model is the position information and speed information of the first target vehicle relative to the sensor and the labeling information, and the output of the sensor model is the sensor feature prediction value of the first target vehicle.
  • the sensor characteristic values of the second target vehicle collected by the sensor can be used as training samples, so that the trained sensor model can output the sensor feature prediction value of the target vehicle; because the prediction is trained on the sensor characteristic values actually collected by the sensor, the sensor model can be closer to the measurement data really output by the sensor.
  • In addition, the road environment information at the time the sensor collected the measurement data is also included in the training samples, so that the sensor feature prediction value output by the sensor model can better reflect the sensor output under different road environment information, improving the effect of the sensor model in simulating the sensor.
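  • A training sketch under these assumptions is shown below. The feature layout and the choice of a gradient-boosting regressor are illustrative (the application does not specify a model class), and the few rows of data stand in for real drive-log measurements.

```python
# Sketch: fitting a sensor model on recorded radar measurements plus label information.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Each row: [rel_x, rel_y, rel_speed, yaw_angle, road_env_id], where road_env_id
# encodes the labeled road environment (e.g. 0 = urban, 1 = highway).
X = np.array([
    [30.0,  1.5, -5.0,  0.05, 1],
    [60.0, -2.0,  2.0, -0.10, 1],
    [15.0,  0.5, -1.0,  0.00, 0],
])
y_snr = np.array([28.0, 18.0, 35.0])  # measured SNR in dB for each sample

snr_model = GradientBoostingRegressor().fit(X, y_snr)
# A second model would be fitted the same way on the measured RCS values.
print(snr_model.predict([[45.0, 1.0, -3.0, 0.02, 1]]))
```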
  • the present application provides a method for simulating a sensor, including:
  • the measurement data includes: position information and speed information of the second target vehicle relative to the sensor, and the sensor characteristic measurement values of the second target vehicle collected by the sensor; the sensor characteristic measurement values include an RCS measurement value and an SNR measurement value; the sensor is located in a measurement vehicle, and the second target vehicle is a vehicle near the measurement vehicle. Training is performed according to the measurement data of the sensor and the obtained annotation information to obtain the sensor model; the sample input of the sensor model is the position information and speed information of the second target vehicle relative to the sensor and the label information, and the output of the sensor model is the sensor feature prediction value of the second target vehicle;
  • the sensor feature prediction value of the second target vehicle includes at least one of the following: RCS prediction value and SNR prediction value;
  • the label information includes at least one of the following: the yaw angle of the second target vehicle relative to the sensor, the road environment information marked when the sensor collects data, and the vehicle information where the sensor is located.
  • With this method, the sensor characteristic values of the second target vehicle collected by the sensor can be used as training samples, so that the trained sensor model can output the sensor feature prediction value of the target vehicle; because the prediction is trained on the sensor characteristic values actually collected by the sensor, the sensor model can be closer to the measurement data really output by the sensor.
  • In addition, the road environment information at the time the sensor collected the measurement data is also included in the training samples, so that the sensor feature prediction value output by the sensor model can better reflect the output of the sensor under different road environment information, improving the effect of the sensor model in simulating the sensor, which in turn is beneficial to improving the simulation effect of the vehicle.
  • the application provides a simulation device for a vehicle, comprising:
  • a sensor feature prediction module configured to input the position information and speed information of the first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle into the sensor model to obtain the sensor feature prediction value of the first target vehicle;
  • the sensor feature prediction value includes at least one of the following: a radar cross section (RCS) prediction value and a signal-to-noise ratio (SNR) prediction value; the sensor model is used to simulate the sensor in the simulated vehicle, the first target vehicle is a vehicle in the test environment where the simulated vehicle is located, the position information and speed information of the first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle are determined according to the test environment, and the sensor model is obtained by training according to the measurement data of the sensor and the marked road environment information;
  • an output module configured to input the sensor feature prediction value of the first target vehicle into the decision module of the simulated vehicle and obtain the simulation decision result of the simulated vehicle, where the decision module is used to output a vehicle driving decision determined based on the sensor feature prediction value.
  • the device may further include:
  • a first determination module configured to determine, from among the candidate vehicles, a vehicle within the detection range of the sensor as the first target vehicle according to the position information and speed information of the candidate vehicles relative to the simulated vehicle; the position information and speed information of the candidate vehicles relative to the simulated vehicle are determined according to the test environment; the candidate vehicles are vehicles in the test environment where the simulated vehicle is located.
  • the apparatus may further include: a second determination module, configured to determine that the predicted SNR value of the first target vehicle is greater than a visibility threshold.
  • In a possible implementation, the device further includes: a third determination module, configured to determine the sensor feature prediction value of the first target vehicle according to the sensor feature prediction value of the first candidate vehicle and the sensor feature prediction value of the second candidate vehicle;
  • the first target vehicle includes a first candidate vehicle and a second candidate vehicle; the first candidate vehicle and the second candidate vehicle satisfy: the relative position of the first position relative to the second position is less than a first position threshold; the The first position is the position of the first candidate target vehicle relative to the simulated vehicle, and the second position is the position of the second candidate target vehicle relative to the simulated vehicle.
  • the first candidate vehicle and the second candidate vehicle also satisfy:
  • the relative speed of the first speed with respect to the second speed is less than a first speed threshold; the first speed is the speed of the first candidate target vehicle relative to the simulated vehicle, and the second speed is the speed of the second candidate target vehicle relative to the simulated vehicle.
  • the device further includes: a sensor model training module, where the sensor model training module includes:
  • an acquisition module configured to acquire measurement data of a sensor;
  • the measurement data includes: position information and speed information of the second target vehicle relative to the sensor, and sensor characteristic values of the second target vehicle collected by the sensor;
  • the sensor characteristic values include: RCS measurement value and SNR measurement value; the sensor is located in the measurement vehicle, and the second target vehicle is a vehicle near the measurement vehicle;
  • a training module configured to perform training according to the measurement data of the sensor and the obtained label information to obtain a sensor model;
  • the label information includes at least one of the following: the yaw angle of the second target vehicle relative to the sensor, the road environment information marked when the sensor collects data, and the vehicle information of the measurement vehicle;
  • the input of the sensor model is the position information and speed information of the first target vehicle relative to the sensor and the marked information, and the output of the sensor model is the sensor feature prediction value of the first target vehicle.
  • the present application provides a sensor simulation device, including:
  • an acquisition module configured to acquire measurement data of the sensor;
  • the measurement data includes: position information, speed information of the second target vehicle relative to the sensor, and sensor characteristic measurement values of the second target vehicle collected by the sensor;
  • the sensor characteristic measurement values include: RCS measurement value and SNR measurement value; the sensor is located in the measurement vehicle; the second target vehicle is a vehicle near the measurement vehicle;
  • a training module configured to perform training according to the measurement data of the sensor and the obtained label information to obtain a sensor model;
  • the sample input of the sensor model is the position information and speed information of the second target vehicle relative to the sensor and the label information, and the output of the sensor model is the sensor feature prediction value of the second target vehicle;
  • the predicted value of the sensor feature of the second target vehicle includes at least one of the following: a predicted RCS value and a predicted SNR value;
  • the labeling information includes at least one of the following: a yaw angle of the second target vehicle relative to the sensor, road environment information labelled when the sensor collects data, and vehicle information where the sensor is located.
  • the present application provides a vehicle simulation device, comprising a processor and an interface circuit, wherein the processor is coupled to a memory through the interface circuit, and the processor is configured to execute program code in the memory to implement the method described in the first aspect or any possible implementation manner of the first aspect.
  • the present application provides a sensor simulation device, comprising a processor and an interface circuit, wherein the processor is coupled to a memory through the interface circuit, and the processor is configured to execute program code in the memory to implement the method described in the implementation manners of the second aspect.
  • the present application provides a computer-readable storage medium, comprising computer instructions that, when executed by a processor, cause a simulation device of the vehicle to execute the method described in any one of the implementations of the first aspect or the method described in the second aspect.
  • the present application provides a computer program product that, when run on a processor, causes the vehicle simulation device to execute the method described in any one of the implementations of the first aspect or the method described in the second aspect.
  • an embodiment of the present application provides a vehicle networking communication system, the system includes an in-vehicle system and the device according to the third aspect or the fourth aspect, wherein the in-vehicle system is communicatively connected to the device.
  • an embodiment of the present application provides a chip system, where the chip system includes a processor for invoking a computer program or computer instructions stored in a memory, so that the processor executes the method described in any one of the possible implementations of the first aspect or the second aspect.
  • the processor is coupled to the memory through an interface.
  • the chip system further includes a memory, and the memory stores computer programs or computer instructions.
  • Embodiments of the present application further provide a processor, where the processor is configured to invoke a computer program or computer instructions stored in a memory, so that the processor executes the method described in any one of the possible implementations of the first aspect or the second aspect.
  • FIG. 1a is a schematic diagram of a system architecture of a vehicle according to an embodiment of the present application.
  • FIG. 1b is a schematic diagram of an application scenario provided by an embodiment of the present application.
  • FIG. 1c is a schematic diagram of an application scenario provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of the principle of a radar sensor provided by an embodiment of the present application.
  • FIG. 3a is a schematic flowchart of a vehicle simulation method provided by an embodiment of the present application.
  • FIG. 3b is a schematic diagram of a test environment of a vehicle simulation method provided by an embodiment of the present application.
  • FIG. 3c is a schematic diagram of a vehicle occlusion scene provided by an embodiment of the present application.
  • FIG. 4a is a schematic diagram of a scene in which a vehicle collects measurement data according to an embodiment of the present application.
  • FIG. 4b is a schematic flowchart of a sensor simulation method according to an embodiment of the present application.
  • FIG. 4c is a schematic diagram of measurement data collected by a vehicle according to an embodiment of the present application.
  • FIG. 4d is a schematic diagram of a sensor simulation method according to an embodiment of the present application.
  • FIG. 4e is a schematic diagram of a sensor simulation method according to an embodiment of the present application.
  • FIG. 5a is a schematic diagram of a simulation structure of a vehicle according to an embodiment of the present application.
  • FIG. 5b is a schematic flowchart of a vehicle simulation method provided by an embodiment of the present application.
  • FIG. 6a is a schematic diagram of a detection range of a sensor of a vehicle according to an embodiment of the present application.
  • FIG. 6b is a schematic diagram of determining a target object according to an embodiment of the present application.
  • FIGS. 7a-7d are schematic diagrams of determining a target object according to an embodiment of the present application.
  • FIG. 8a is a schematic diagram of a simulation structure of a vehicle according to an embodiment of the present application.
  • FIG. 8b is a schematic flowchart of a vehicle simulation method provided by an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of a vehicle simulation device provided by an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a vehicle simulation device provided by an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of a sensor simulation device according to an embodiment of the present application.
  • FIG. 12 is a schematic structural diagram of a sensor simulation device according to an embodiment of the present application.
  • At least one (item) refers to one or more, and "a plurality” refers to two or more.
  • “And/or” describes the relationship between associated objects and indicates that three relationships can exist; for example, “A and/or B” can mean: only A exists, only B exists, or both A and B exist, where A and B can be singular or plural.
  • the character “/” generally indicates that the associated objects are an “or” relationship.
  • “At least one item(s) of the following” or similar expressions refer to any combination of these items, including any combination of a single item or plural items.
  • For example, “at least one (item) of a, b or c” can mean: a, b, c, “a and b”, “a and c”, “b and c”, or “a and b and c”, where a, b and c can be single or multiple.
  • FIG. 1a is an exemplary functional block diagram of the vehicle 100 according to the embodiment of the present application.
  • the vehicle 100 may be configured in a fully or partially autonomous driving mode.
  • the vehicle 100 may control itself while in an autonomous driving mode: it may determine the current state of the vehicle and its surrounding environment, determine the possible behavior of at least one other vehicle in the surrounding environment, determine a confidence level corresponding to the likelihood that the other vehicle performs the possible behavior, and control the vehicle 100 based on the determined information.
  • When in the autonomous driving mode, the vehicle 100 may be configured to operate without human interaction.
  • components coupled to or included in vehicle 100 may include propulsion system 110 , sensor system 120 , control system 130 , peripherals 140 , power supply 150 , computer system 160 , and user interface 170 .
  • the components of the vehicle 100 may be configured to operate in interconnection with each other and/or with other components coupled to the various systems.
  • power supply 150 may provide power to all components of vehicle 100 .
  • Computer system 160 may be configured to receive data from and control propulsion system 110 , sensor system 120 , control system 130 , and peripherals 140 .
  • Computer system 160 may also be configured to generate a display of images on user interface 170 and receive input from user interface 170 .
  • vehicle 100 may include more, fewer, or different systems, and each system may include more, fewer, or different components.
  • the illustrated systems and components may be combined or divided in any manner, which is not specifically limited in this application.
  • Propulsion system 110 may provide powered motion for vehicle 100 .
  • the propulsion system 110 may include an engine/motor 114 , an energy source 113 , a transmission 112 and wheels/tires 111 .
  • the propulsion system 110 may additionally or alternatively include other components than those shown in Figure 1a. This application does not specifically limit this.
  • the sensor system 120 may include several sensors for sensing information about the environment in which the vehicle 100 is located. As shown in FIG. 1a, the sensors of the sensor system 120 include a global positioning system (GPS) 126, an inertial measurement unit (IMU) 125, a lidar 122, a camera sensor 123, a millimeter-wave radar 124, and an actuator 121 for modifying the position and/or orientation of the sensors.
  • the millimeter wave radar 124 may utilize radio signals to sense objects within the surrounding environment of the vehicle 100 . In some embodiments, in addition to sensing the target, the millimeter wave radar 124 may be used to sense the speed and/or heading of the target.
  • the lidar 122 may utilize laser light to sense objects in the environment in which the vehicle 100 is located.
  • lidar 122 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
  • the camera sensor 123 may be used to capture multiple images of the surrounding environment of the vehicle 100 .
  • the camera sensor 123 may be a still camera or a video camera.
  • GPS 126 may be any sensor used to estimate the geographic location of vehicle 100 .
  • the GPS 126 may include a transceiver that estimates the position of the vehicle 100 relative to the earth based on satellite positioning data.
  • the computer system 160 may be used to estimate the road on which the vehicle 100 is traveling using the GPS 126 in conjunction with map data.
  • the IMU 125 may be used to sense position and orientation changes of the vehicle 100 based on inertial acceleration and any combination thereof.
  • the combination of sensors in IMU 125 may include, for example, an accelerometer and a gyroscope. Additionally, other combinations of sensors in IMU 125 are possible.
  • the sensor system 120 may also include sensors that monitor the internal systems of the vehicle 100 (eg, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding properties (position, shape, orientation, velocity, etc.). This detection and identification is a critical function for the safe operation of the vehicle 100 . Sensor system 120 may also include other sensors. This application does not specifically limit this.
  • the control system 130 controls the operation of the vehicle 100 and its components.
  • Control system 130 may include various elements including steering unit 136 , throttle 135 , braking unit 134 , sensor fusion algorithms 133 , computer vision system 132 , route control system 131 , and obstacle avoidance system 137 .
  • the steering unit 136 is operable to adjust the heading of the vehicle 100 .
  • it may be a steering wheel system.
  • the throttle 135 is used to control the operating speed of the engine 114 and thus the speed of the vehicle 100 .
  • the control system 130 may additionally or alternatively include other components than those shown in Figure 1a. This application does not specifically limit this.
  • the braking unit 134 is used to control the deceleration of the vehicle 100 .
  • the braking unit 134 may use friction to slow the wheels 111 .
  • the braking unit 134 may convert the kinetic energy of the wheels 111 into electrical current.
  • the braking unit 134 may also take other forms to slow the wheels 111 to control the speed of the vehicle 100 .
  • Computer vision system 132 is operable to process and analyze images captured by camera sensor 123 in order to identify objects and/or features in the environment surrounding vehicle 100 .
  • the objects and/or features may include traffic signals, road boundaries and obstacles.
  • Computer vision system 132 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision techniques.
  • the computer vision system 132 may be used to map the environment, track objects, estimate the speed of objects, and the like.
  • the route control system 131 is used to determine the travel route of the vehicle 100 .
  • route control system 131 may combine data from sensor system 120, GPS 126, and one or more predetermined maps to determine a driving route for vehicle 100.
  • the obstacle avoidance system 137 is used to identify, evaluate and avoid or otherwise traverse potential obstacles in the environment of the vehicle 100 .
  • control system 130 may additionally or alternatively include components other than those shown and described. Alternatively, some of the components shown above may be reduced.
  • Peripherals 140 may be configured to allow vehicle 100 to interact with external sensors, other vehicles, and/or a user.
  • peripheral devices 140 may include, for example, a wireless communication system 144 , a touch screen 143 , a microphone 142 and/or a speaker 141 .
  • Peripherals 140 may additionally or alternatively include other components than those shown in Figure 1a. This application does not specifically limit this.
  • peripherals 140 provide a means for a user of vehicle 100 to interact with user interface 170 .
  • touch screen 143 may provide information to a user of vehicle 100 .
  • the user interface 170 may also operate the touch screen 143 to receive user input.
  • peripheral device 140 may provide a means for vehicle 100 to communicate with other devices located within the vehicle.
  • the microphone 142 may receive audio (eg, voice commands or other audio input) from a user of the vehicle 100 .
  • speakers 141 may output audio to a user of vehicle 100 .
  • Wireless communication system 144 may wirelessly communicate with one or more devices, either directly or via a communication network.
  • wireless communication system 144 may use 3G cellular communications, such as code division multiple access (CDMA), EVDO, or global system for mobile communications (GSM)/general packet radio service (GPRS); 4G cellular communications, such as long term evolution (LTE); or 5G cellular communications.
  • the wireless communication system 144 may utilize wireless fidelity (WiFi) to communicate with a wireless local area network (WLAN).
  • wireless communication system 144 may communicate directly with devices using wireless protocols such as infrared link, Bluetooth, or ZigBee.
  • Other wireless protocols are also possible, such as various vehicle communication systems; for example, wireless communication system 144 may include one or more dedicated short range communications (DSRC) devices, which may include public and/or private data communications between vehicles and/or roadside stations.
  • Power supply 150 may be configured to provide power to some or all components of vehicle 100 .
  • power source 150 may include, for example, a rechargeable lithium-ion or lead-acid battery.
  • one or more battery packs may be configured to provide power.
  • Other power supply materials and configurations are also possible.
  • power source 150 and energy source 113 may be implemented together, as in some all-electric vehicles.
  • Components of the vehicle 100 may be configured to operate in interconnection with other components within and/or outside of their respective systems. To this end, the components and systems of the vehicle 100 may be communicatively linked together through a system bus, network, and/or other connection mechanisms.
  • Computer system 160 may include processor 161 , transceiver 162 and memory 163 . Therein, processor 161 executes instructions 1631 stored in a non-transitory computer-readable medium such as memory 163 . Computer system 160 may also be multiple computing devices that control individual components or subsystems of vehicle 100 in a distributed fashion.
  • the processor 161 may be any conventional processor, such as a commercially available central processing unit (CPU). Alternatively, the processor may be a dedicated device such as application specific integrated circuits (ASIC) or other hardware-based processors.
  • Although FIG. 1a functionally illustrates the processor, memory, and other elements of the computer system 160 in the same block, one of ordinary skill in the art will understand that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing.
  • the memory may be a hard drive or other storage medium located within an enclosure other than computer system 160 .
  • References to a processor or computer will be understood to include references to a collection of processors, computers, or memories that may or may not operate in parallel.
  • some components, such as the steering and deceleration components, may each have their own processor that only performs computations related to component-specific functions.
  • a processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are performed on a processor disposed within the vehicle while others are performed by a remote processor, including taking steps necessary to perform a single maneuver.
  • memory 163 may include instructions 1631 (eg, program logic) executable by processor 161 to perform various functions of vehicle 100 , including those described above.
  • Memory 163 may also contain additional instructions, including sending data to, receiving data from, interacting with, and/or controlling one or more of propulsion system 110 , sensor system 120 , control system 130 , and peripherals 140 . instruction.
  • memory 163 may store data such as road maps, route information, vehicle location, direction, speed, and other such vehicle data, among other information. Such information may be used by the vehicle 100 and the computer system 160 during operation of the vehicle 100 in autonomous, semi-autonomous and/or manual modes.
  • User interface 170 for providing information to or receiving information from a user of vehicle 100 .
  • user interface 170 may include one or more input/output devices within the set of peripheral devices 140 , such as wireless communication system 144 , touch screen 143 , microphone 142 and speaker 141 .
  • Computer system 160 may control the functions of vehicle 100 based on input received from various subsystems (eg, propulsion system 110 , sensor system 120 , and control system 130 ) and from user interface 170 .
  • computer system 160 may utilize input from control system 130 in order to control steering unit 136 to avoid obstacles detected by sensor system 120 and obstacle avoidance system 137 .
  • computer system 160 is operable to provide control of various aspects of vehicle 100 and its subsystems.
  • one or more of these components described above may be installed or associated with the vehicle 100 separately.
  • memory 163 may exist partially or completely separate from vehicle 100 .
  • the above-described components may be communicatively coupled together in a wired and/or wireless manner.
  • FIG. 1a should not be construed as a limitation on the embodiments of the present application.
  • An autonomous vehicle traveling on a road can recognize objects within its surroundings to determine adjustments to current speed.
  • the targets may be other vehicles, traffic control devices, or other types of targets.
  • each identified target may be considered independently, and based on the target's respective characteristics, such as its current speed, acceleration, distance from the vehicle, etc., may be used to determine the speed at which the autonomous vehicle is to adjust.
  • the autonomous vehicle 100, or a computing device associated with the autonomous vehicle 100, may predict the behavior of the identified target based on the characteristics of the identified target and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.).
  • each of the identified objects is dependent on the behavior of each other, so it is also possible to predict the behavior of a single identified object by considering all of the identified objects together.
  • the vehicle 100 can adjust its speed based on the predicted behavior of the identified target.
  • the self-driving car can determine what steady state the vehicle will need to adjust to (eg, accelerate, decelerate, or stop) based on the predicted behavior of the target.
  • other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 in the road being traveled, the curvature of the road, the proximity of static and dynamic objects, and the like.
  • the computing device may also provide instructions to modify the steering angle of the vehicle 100 so that the self-driving car follows a given trajectory and/or maintains safe lateral and longitudinal distances from targets in the vicinity of the self-driving car (e.g., cars in adjacent lanes on the road).
  • the above-mentioned vehicle 100 can be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, a playground vehicle, construction equipment, a tram, a golf cart, a train, a cart, etc.
  • This is not specifically limited in the embodiments of the present application.
  • the radar systems described in the embodiments of the present application can be applied to various fields.
  • the radar systems in the embodiments of the present application include but are not limited to vehicle-mounted radar, roadside traffic radar, and unmanned aerial vehicle (drone) radar.
  • the sensor system is described in detail below.
  • the sensors on the car can be divided into two categories according to the sensing method: passive sensing sensors and active sensing sensors.
  • passive perception sensors rely on the radiation information of the external environment.
  • a typical passive perception sensor is a camera.
  • the perception of the camera is not in the form of transmitting and receiving energy waves, and the accuracy of its perception results mainly depends on image processing and classification algorithms.
  • Camera sensor 123 may include any camera (eg, still camera, video camera, etc.) used to acquire images of the environment in which vehicle 100 is located. To this end, the camera sensor 123 may be configured to detect visible light, or may be configured to detect light from other parts of the spectrum, such as infrared light or ultraviolet light. Other types of camera sensors 123 are also possible. The camera sensor 123 may be a two-dimensional detector, or may have a three-dimensional spatial range detection function. In some examples, camera sensor 123 may be, for example, a distance detector configured to generate a two-dimensional image indicative of distances from camera sensor 123 to several points in the environment. To this end, camera sensor 123 may use one or more distance detection techniques.
  • the camera sensor 123 may be configured to use structured light technology, wherein the vehicle 100 illuminates objects in the environment with a predetermined light pattern, such as a grid or checkerboard pattern, and the camera sensor 123 is used to detect the predetermined light pattern from the object reflection. Based on the distortion in the reflected light pattern, the vehicle 100 may be configured to detect the distance to a point on the object.
  • the predetermined light pattern may include infrared light or other wavelengths of light.
  • When the camera sensor senses that there is a target in the image sensing area, the image information is transmitted to the processing module for further processing.
  • the camera sensor 123 can be one or more of the following camera sensors, for example: 1) an infrared camera sensor (IR-RGB image sensor), which uses a CCD (charge-coupled device) unit or a standard CMOS (complementary metal-oxide semiconductor) unit together with a filter that passes only light in the visible color bands and in a set infrared band; the image signal processor then separates the output into an IR (infrared) image data stream and an RGB (red-green-blue) image data stream, where the IR image data stream is the image data stream obtained in low-light environments, and the two separated image data streams are used for further application processing; 2) a visible-light camera sensor, which uses a CCD unit or a standard CMOS unit to obtain visible-light images.
  • An active sensing sensor senses the environment by actively emitting energy waves.
  • the active perception type sensor may be a radar sensor.
  • the vehicle-mounted radar sensor transmits a detection signal (electromagnetic wave) outward through the antenna and receives the signal reflected by the target; it amplifies and down-converts the reflected signal to obtain the relative distance, relative speed and angle between the vehicle and the target, then carries out target tracking, recognition and classification according to the obtained information, and, after reasonable decision-making, can realize functions such as obstacle measurement, collision prediction, and adaptive cruise control.
  • After the radar sensor performs target tracking, identification and classification according to the obtained information and makes a reasonable decision, it informs or warns the driver in various ways, such as sound, light and touch, or actively intervenes in the car in time. This can effectively reduce driving difficulty, lighten the burden on the driver and reduce the incidence of accidents, thereby ensuring the safety and comfort of the driving process; radar sensors have therefore been widely used in the automotive field.
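  • For context, the textbook radar range equation (standard radar theory, not quoted from the application) shows how the two sensor characteristics discussed in this application, SNR and RCS, are related:

$$\mathrm{SNR} = \frac{P_t \, G^2 \, \lambda^2 \, \sigma}{(4\pi)^3 \, R^4 \, k \, T_s \, B \, L}$$

  • where P_t is the transmitted power, G the antenna gain, λ the wavelength, σ the target's RCS, R the range, k Boltzmann's constant, T_s the system noise temperature, B the receiver bandwidth, and L the system losses. The received SNR falls off as the fourth power of range and grows linearly with the target's RCS, which is why both quantities are natural outputs for a radar sensor model.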
  • Vehicle-mounted radar sensors can be divided by detection range into long range radar (LRR), middle range radar (MRR) and short range radar (SRR).
  • LRR has ranging and anti-collision functions, and is widely used in fields such as adaptive cruise control (ACC), forward collision warning (FCW), and automatic emergency braking (AEB).
  • The LRR is installed at the center of the front bumper of the vehicle with an azimuth angle of 0°; the elevation angle is set to 1.5° when the mounting height is lower than 50 cm and to 0° when the height exceeds 50 cm, so as to achieve moving-target detection ranges of 150 meters for trucks, 100 meters for cars, and 60 meters for pedestrians.
  • LRR's ACC, FCW, AEB and other functions have a significant safety prompting effect when the driver is distracted, tired or sleepy, or fails to notice the situation ahead when using a mobile phone.
  • MRR and SRR provide functions such as blind spot detection (BSD), lane change assistance (LCA), rear cross traffic alert (RCTA), exit assistant function (EAF), and forward cross traffic alert (FCTA), and can accurately detect targets within a certain range of the vehicle.
  • In fields such as BSD and LCA, SRR can effectively reduce the risk caused by the driver's limited visibility in bad weather conditions such as night, fog and heavy rain, and can avoid, during lane merging, the danger of possible collisions with vehicles in adjacent lanes and in "visual" blind spots.
  • LRR, MRR and SRR all play important functions in the Advanced Driving Assistant System (ADAS).
  • the following is an example of a specific radar sensor.
  • Ultrasonic radar sensor: ultrasound refers to mechanical waves with frequencies higher than 20 kHz. To use ultrasound as a detection method, ultrasonic waves must be generated and received; the device that accomplishes this function is the ultrasonic radar. An ultrasonic radar has a transmitter and a receiver, but a single ultrasonic radar can also serve the dual role of transmitting and receiving sound waves. Ultrasonic radar uses the piezoelectric effect to convert between electrical energy and ultrasonic waves: when transmitting, electrical energy is converted into emitted ultrasonic waves; when receiving echoes, ultrasonic vibrations are converted back into electrical signals.
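  • For reference, ultrasonic ranging follows the standard time-of-flight relation (textbook physics, not quoted from the application):

$$d = \frac{v_{\text{sound}} \, \Delta t}{2}$$

  • where Δt is the round-trip echo delay. For example, with v_sound ≈ 343 m/s in air at 20 °C, a delay of Δt = 5.8 ms corresponds to d ≈ 343 × 0.0058 / 2 ≈ 1 m.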
  • A millimeter-wave radar sensor is a radar that works in the millimeter-wave band.
  • millimeter waves refer to the 30 to 300 gigahertz (GHz) frequency domain (wavelengths of 1 to 10 millimeters).
  • the wavelength of millimeter wave is between microwave and centimeter wave, so millimeter wave radar has some advantages of microwave radar and photoelectric radar. It has the characteristics of small size, light weight and high spatial resolution, and has strong ability to penetrate fog, smoke and dust, and is widely used in navigation systems such as vehicles and aircraft.
  • the measurement value of the millimeter-wave radar sensor carries depth information, which can provide the distance of the target; in addition, because the millimeter-wave radar sensor exhibits an obvious Doppler effect, it is very sensitive to speed, and the speed of the target can be extracted directly from the measured Doppler frequency shift.
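  • The standard Doppler relation behind this (textbook physics) is

$$f_d = \frac{2 v_r}{\lambda} \quad\Longrightarrow\quad v_r = \frac{f_d \, \lambda}{2} = \frac{f_d \, c}{2 f_c}$$

  • where f_d is the measured Doppler frequency shift, v_r the radial speed of the target, λ the radar wavelength and f_c the carrier frequency. For example, at 77 GHz (λ ≈ 3.9 mm), a Doppler shift of about 5.1 kHz corresponds to a radial speed of roughly 10 m/s.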
  • the two mainstream automotive millimeter-wave radar application frequency bands are 24GHz and 77GHz.
  • the wavelength of the former is about 1.25 cm, and it is mainly used for short-range sensing, such as sensing the environment around the vehicle body, blind spots, parking assistance and lane change assistance; the wavelength of the latter is about 4 mm, and it is used for medium- and long-range measurement, such as automatic following, adaptive cruise control (ACC) and automatic emergency braking (AEB).
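  • These wavelength figures follow directly from λ = c/f:

$$\lambda_{24\,\mathrm{GHz}} = \frac{3 \times 10^8\ \mathrm{m/s}}{24 \times 10^9\ \mathrm{Hz}} = 12.5\ \mathrm{mm} \approx 1.25\ \mathrm{cm}, \qquad \lambda_{77\,\mathrm{GHz}} = \frac{3 \times 10^8}{77 \times 10^9} \approx 3.9\ \mathrm{mm}$$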
  • a lidar sensor can be thought of as an object detection system that uses light sensing to detect objects in the environment in which the vehicle 100 is located.
  • Radar that works in the infrared and visible light bands and uses a laser as its working beam is called lidar.
  • the working principle of lidar is to transmit a detection signal (a laser beam) toward the target, then compare the received signal (the target echo) reflected from the target with the transmitted signal; after suitable processing, relevant information about the target can be obtained, such as target distance, bearing, altitude, speed, attitude, and even shape.
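  • The range measurement itself follows the standard time-of-flight relation (textbook physics):

$$R = \frac{c \, \Delta t}{2}$$

  • where Δt is the round-trip travel time of the laser pulse; for example, Δt = 1 μs corresponds to R = 3 × 10⁸ × 10⁻⁶ / 2 = 150 m.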
  • lidar sensors can measure the distance to the target or other properties of the target by illuminating the target with light.
  • a lidar sensor may include a laser source and/or a laser scanner configured to emit laser pulses, and a detector for receiving reflections of the laser pulses.
  • a lidar sensor may include a laser rangefinder whose beam is reflected by a rotating mirror; the laser is scanned around the scene being digitized in one or two dimensions, collecting distance measurements at specified angular intervals.
  • a lidar sensor may include components such as light (eg, laser) sources, scanners and optical systems, light detectors and receiver electronics, and position and navigation systems.
  • Lidar sensors determine the distance of an object by scanning the laser light reflected back from an object, and can form a 3D map of the environment with centimeter-level accuracy.
  • a lidar sensor can be thought of as an object detection system that illuminates a target with light to measure the distance to it.
  • radar sensors can be installed on vehicles; for example, the sensors in this application can be applied to advanced driving assistance systems (ADAS) (such as autonomous driving), robots, drones, connected vehicles, security monitoring and other fields.
  • radar sensors can be installed on mobile devices.
  • for example, radar sensors can be installed on motor vehicles (such as unmanned vehicles, smart vehicles, electric vehicles, digital vehicles, etc.) to be used as in-vehicle radar; radars can also be mounted on drones as airborne radar, and so on.
  • the radar sensor deployed at the front of the vehicle can perceive the fan-shaped area shown by the solid line frame, and the fan-shaped area can be the radar sensing area.
  • the radar signal information is transmitted to the processing module for further processing.
  • after receiving the information from the radar sensor, the processing module outputs the measurement information of the target (for example, the relative distance, angle, and relative speed of the target object).
  • the processing module here can be either a computer independent of the radar sensor or a software module in such a computer (for example, the processing module in the computer system 160), or it can be a computer or a software module deployed in the radar sensor, which is not limited here.
  • measurement information such as the latitude and longitude, speed, orientation, and distance of surrounding objects sensed by the sensor can be obtained in real time or periodically, and then assisted driving or unmanned driving of the vehicle can be realized according to these measurement information.
  • for example, the latitude and longitude can be used to determine the position of the vehicle, the speed and orientation can be used to determine the vehicle's future direction and destination, and the distances of surrounding objects can be used to determine the number and density of obstacles around the vehicle.
  • the radar sensor involved in the present application can also be installed on a fixed device, for example, the radar sensor can be installed on a roadside unit (RSU), a roof or a base station.
  • Radar 1, Radar 2, Radar 3 and Radar 4 as shown in Figure 1c.
  • the radar needs the assistance of other devices in the fixed device to determine its current position and steering information, which can ensure the availability of measurement data.
  • the fixed device may also include a global positioning system (GPS) device and an inertial measurement unit (IMU) device, and the radar can combine the measurement data of the GPS device and the IMU device to obtain the position and speed of the target Equal feature quantities.
  • the radar can provide the location information of the fixed device through the GPS device in the fixed device, and record the attitude and steering information of the fixed device through the IMU device.
  • after determining the distance to the target according to the echo signal and the emitted laser beam, at least one of the geographic location information provided by the GPS device and the attitude and steering information provided by the IMU device can be used to convert the measurement point of the target from the relative coordinate system to a position point in the absolute coordinate system, thereby obtaining the geographic position information of the target, so that the radar can be applied on the fixed device.
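  • The relative-to-absolute conversion described above could look like the following minimal 2D sketch (the function and parameter names are illustrative assumptions, not from this application; a real implementation would also use the IMU's full attitude and a proper geodetic projection):

```python
import math

def sensor_to_absolute(r, theta, device_xy, device_yaw):
    """Convert a target measured in the radar's polar frame (range r, bearing theta)
    into a point in an absolute Cartesian frame, using the fixed device's position
    (from the GPS device, already projected to x/y) and heading (from the IMU
    device). Angles are in radians."""
    # target in the sensor's local Cartesian frame
    x_local = r * math.cos(theta)
    y_local = r * math.sin(theta)
    # rotate by the device heading, then translate by the device position
    x0, y0 = device_xy
    x_abs = x0 + x_local * math.cos(device_yaw) - y_local * math.sin(device_yaw)
    y_abs = y0 + x_local * math.sin(device_yaw) + y_local * math.cos(device_yaw)
    return x_abs, y_abs

# example: a target 50 m away at a 10 degree bearing, seen by a roadside unit
# located at (100, 200) with a 30 degree heading
print(sensor_to_absolute(50.0, math.radians(10), (100.0, 200.0), math.radians(30)))
```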
  • the radar sensor in the present application may be a laser radar, a microwave radar, or a millimeter-wave radar, which is not limited in this embodiment of the present application.
  • in the following introduction, for convenience of description, the working process of the radar sensor is described by taking lidar as an example.
  • the electromagnetic waves emitted by lidar are called laser beams
  • the electromagnetic waves emitted by microwave radars are called microwaves
  • the electromagnetic waves emitted by millimeter-wave radars are called millimeter waves. That is to say, the lidar below can be replaced by a millimeter-wave radar, with the electromagnetic wave replaced by a millimeter wave; the lidar below can also be replaced by a microwave radar, with the electromagnetic wave replaced by a microwave.
  • the application does not limit the number of radar sensors and targets included in each scene.
  • the scene may include a plurality of radar sensors and a plurality of movable targets, and the present application may also be applied to other possible scenes.
  • another example is the automated guided vehicle (AGV) scenario, in which the AGV is a transporter equipped with automatic navigation devices such as electromagnetic or optical guidance, which can travel along a prescribed navigation path and has safety protection and various transfer functions.
  • remote interaction and real scene reproduction such as telemedicine or remote training, game interaction (such as multiple people playing games, training or participating in other activities in a virtual scene) or dangerous scene training.
  • scenes such as face recognition. Not listed here.
  • the radar may include a transmitter and a receiver.
  • the transmitter is used to transmit the electromagnetic wave energy beam.
  • the electromagnetic wave is transmitted to the antenna through the transceiver switch.
  • the antenna then transmits the electromagnetic wave into the air along a certain direction and angle. If there is a target within a certain distance along the emission direction of the electromagnetic wave energy beam, the beam is reflected by the target: when the electromagnetic wave encounters the target object, part of the energy is reflected back, received by the antenna of the millimeter-wave radar, and then transmitted to the receiver through the transceiver switch.
  • when the electromagnetic wave energy beam emitted by the transmitter reaches the target, it is reflected on the surface of the target, and the reflected signal returns to the receiver as an echo signal.
  • the receiver is used to determine the information related to the target according to the received echo signal and the transmitted electromagnetic wave energy beam. For example, the distance to the target, the point cloud density of the target, etc.
  • the radar sensor transmits an electromagnetic wave energy beam through the transmitter, and the signal processor further processes the echo to obtain the relative distance, angle and relative speed of the target object.
  • a millimeter-wave radar sensor may include devices such as oscillators, transmit antennas, receive antennas, mixers, processors, and controllers. Specific steps can include:
  • Step 1 The waveform generator in the radar generates the transmit signal, which is then transmitted through the transmit antenna.
  • an oscillator produces a radar signal whose frequency increases linearly with time, which is typically a frequency-modulated continuous wave.
  • the radar detection device generally transmits radar signals of multiple frequency sweep cycles in a continuous period of time.
  • the frequency sweep cycle here refers to the cycle of transmitting a radar signal with a complete waveform.
  • the frequency at which the radar detection device begins to transmit the radar signal is called the initial frequency of the radar detection device.
  • the transmission frequency of the radar detection device changes within the transmission period based on the initial frequency.
  • a part of the radar signal is output to the mixer as a local oscillator signal through a directional coupler, and a part is transmitted through a transmitting antenna.
  • the transmitted signal is usually a chirp signal with a carrier frequency; with the quantities defined below, the transmitted signal $s_T(t)$ can be expressed as:
  • $s_T(t) = \exp\left(j 2\pi \left(f_T t + \frac{B_{sw}}{2 T_{CPI}} t^2\right)\right), \quad 0 \le t \le T_{CPI}$
  • where $f_T$ represents the carrier frequency, $B_{sw}$ represents the bandwidth of the transmitted signal, and $T_{CPI}$ represents the duration of the transmitted signal
  • Step 2 After the transmitted signal is reflected by the obstacle, it is received by the receiving antenna.
  • the receiving antenna receives the reflected radar signal after the transmitted radar signal encounters the object in front of the vehicle.
  • the received signal is a delayed copy of the transmitted signal, and the delayed signal $s_R(t)$ can be expressed as:
  • $s_R(t) = s_T\big(t - \tau(t)\big)$
  • where $\tau(t)$ represents the delay between the transmitted signal leaving the transmitting antenna, being reflected by the obstacle, and being received by the receiving antenna.
  • Step 3 Perform frequency mixing/down-conversion on the delayed signal of the transmitted signal and the transmitted signal, and then obtain the received signal through sampling.
  • the mixer mixes the received radar signal with the local oscillator signal to obtain an intermediate frequency (IF) signal.
  • IF intermediate frequency
  • part of the FM continuous wave signal generated by the oscillator is used as a local oscillator signal, and part is transmitted through the transmitting antenna as the transmitted signal; the reflected signal of the transmitted signal received by the receiving antenna is mixed with the local oscillator signal to obtain the IF signal.
  • the intermediate frequency signal contains the relative distance, speed, and angle of the target object and the radar system.
  • the intermediate frequency signal is sent to the processor after being amplified and processed by a low-pass filter.
  • the processor processes the received signal, generally performing a fast Fourier transform and spectrum analysis on it, so as to obtain information such as the range, speed and angle of the target object relative to the radar system.
  • the distance between the target (surface object) and the radar can be determined by the difference between the transmission time of the transmitted signal and the receiving time of echo scattering of different ground objects, thereby determining the position of the target.
  • the position information may be the position information of the target object relative to the current radar
  • the speed information may be the speed information of the target object relative to the current radar
  • the angle information may be the angle information of the target object relative to the current radar.
  • the frequency of the intermediate frequency signal is called the intermediate frequency.
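  • Steps 1-3 can be illustrated end to end with a small numeric sketch: for a static target the intermediate frequency is $f_{IF} = 2 R B_{sw} / (c\, T_{CPI})$, so the range follows from the FFT peak of the sampled IF signal. The parameter values below are assumptions chosen only for the example:

```python
import numpy as np

C = 3.0e8        # speed of light, m/s
B_SW = 300e6     # sweep bandwidth B_sw, Hz (assumed)
T_CPI = 1e-3     # sweep duration T_CPI, s (assumed)
FS = 2e6         # ADC sampling rate, Hz (assumed)
R_TRUE = 45.0    # simulated target range, m

# beat (IF) frequency produced by the mixer for a static target
f_if = 2 * R_TRUE * B_SW / (C * T_CPI)

# simulate the sampled IF signal, then recover the range via the
# "fast Fourier transform and spectrum analysis" step described above
t = np.arange(int(FS * T_CPI)) / FS
if_signal = np.cos(2 * np.pi * f_if * t)
spectrum = np.abs(np.fft.rfft(if_signal))
f_peak = np.fft.rfftfreq(len(t), 1 / FS)[np.argmax(spectrum)]
r_est = f_peak * C * T_CPI / (2 * B_SW)
print(f"true range {R_TRUE} m, estimated {r_est:.1f} m")  # ~45.0 m
```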
  • Step 4 The processor can output the obtained information to the controller to control the behavior of the vehicle.
  • these quantities serve as input parameters in the decision control simulation. That is to say, in the process of simulating the sensor, the sensor model can take the traffic participants in the test environment as input, and output the relative distance, relative speed, and angle of each object detectable by the sensor within its sensing range (determined based on the geometric occlusion screening method). The output parameters of the sensor model can therefore be used as the input parameters corresponding to the sensor modules required in the decision control simulation of automatic driving.
  • the specific process can include:
  • Step 301 Determine traffic participants in the test environment.
  • the test environment may be determined according to the scene to be tested. For example, as shown in Figure 3b, it includes: simulated vehicles (including sensors to be simulated), other vehicles, non-motor vehicles, pedestrians, road environment, traffic environment, buildings, bridges, roadblocks, etc.
  • Step 302 Use the parameters of the traffic participants in the test environment as input parameters of the sensor model.
  • Traffic participants can include: vehicles, pedestrians, roads, roadblocks, etc.
  • the parameters of the traffic participant may include modeling data such as positioning position, moving speed, and size of the traffic participant.
  • Step 303 Screen the perceptible targets of the sensors on the simulated vehicle by using the geometric occlusion method.
  • the maximum ranging distance of the sensor can be determined, from which the detection range of the radar sensor can be determined.
  • the data may be determined based on the factory data of the sensor, or may be obtained by experience, which is not limited herein.
  • the maximum ranging distance of the radar detection device is a parameter related to the configuration of the radar detection device (for example, related to the factory setting parameters of the radar detection device).
  • the radar detection device is radar
  • the maximum ranging distance of long-range adaptive cruise control (ACC) radar is 142m
  • the maximum ranging distance of medium-range radar is 70-150m.
  • the radar sensor deployed at the front end of the vehicle 1 can detect the sector-shaped area shown by the solid line frame, and the sector-shaped area is the detection range of the radar. Vehicles within this detection range can serve as perceptible targets for the sensor.
  • occluded vehicles may be culled based on geometric occlusion relationships between vehicles.
  • vehicle 1 is the vehicle to be tested, and vehicle 2 and vehicle 3 are vehicles in front of vehicle 1.
  • based on the geometric occlusion relationship, it can be determined that vehicle 3 is blocked by vehicle 2.
  • vehicle 3 can therefore be deleted from the perceptible targets.
  • vehicle 2 can be determined to be a perceivable target of vehicle 1.
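  • A minimal 2D sketch of the geometric occlusion screening of step 303 (the sector parameters and target descriptions are illustrative assumptions; the application does not prescribe a specific algorithm):

```python
import math

def perceptible_targets(targets, fov=math.radians(60), max_range=150.0):
    """Each target is (name, distance_m, center_angle_rad, half_width_rad),
    given in the sensor's frame. A target is culled if it lies outside the
    sensor's sector, or if a nearer target's angular interval fully covers it."""
    in_sector = [t for t in targets
                 if t[1] <= max_range and abs(t[2]) <= fov / 2]
    in_sector.sort(key=lambda t: t[1])  # nearest first
    visible, blocked = [], []
    for name, dist, ang, half_w in in_sector:
        lo, hi = ang - half_w, ang + half_w
        if not any(blo <= lo and hi <= bhi for blo, bhi in blocked):
            visible.append(name)
            blocked.append((lo, hi))
    return visible

# vehicle 2 is nearer and covers vehicle 3's angular interval, so vehicle 3 is culled
targets = [("vehicle 3", 80.0, 0.00, 0.02), ("vehicle 2", 40.0, 0.00, 0.05)]
print(perceptible_targets(targets))  # ['vehicle 2']
```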
  • Step 304 Determine the output parameters of the sensor model according to the parameters of the perceptible target of the sensor.
  • the parameters of the perceptible target of the sensor may be determined according to the parameters of the traffic participants in the test environment.
  • the output parameters of the sensor model may include relevant parameters of vehicle 2 relative to vehicle 1, for example, the position of vehicle 2 relative to vehicle 1, the speed of vehicle 2 relative to vehicle 1, the angular velocity of vehicle 2 relative to vehicle 1, and the angle of vehicle 2 relative to vehicle 1.
  • the output parameters of the sensor model may also be output parameters after adding corresponding noise, which is used to simulate measurement errors.
  • the output parameters of the sensor model may include relative position and relative velocity parameters of vehicle 2 relative to vehicle 1.
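  • The noise injection mentioned above could be as simple as the following sketch (zero-mean Gaussian noise with assumed standard deviations; the application does not specify the noise model):

```python
import random

def add_measurement_noise(rel_pos, rel_speed, pos_sigma=0.1, speed_sigma=0.05):
    """Perturb the ideal relative position (x, y in metres) and relative speed
    (m/s) with zero-mean Gaussian noise to simulate measurement error."""
    noisy_pos = tuple(p + random.gauss(0.0, pos_sigma) for p in rel_pos)
    noisy_speed = rel_speed + random.gauss(0.0, speed_sigma)
    return noisy_pos, noisy_speed

# ideal output of the sensor model for vehicle 2 relative to vehicle 1
print(add_measurement_noise((40.0, 1.5), 3.2))
```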
  • Step 305 Input the parameters of the perceptible target to the decision-making module.
  • the above method has low requirements on the sensor model and a simple structure, which can ensure high efficiency during simulation.
  • however, this considers only the ideal behavior of the sensor. Since an active perception sensor perceives the environment by actively transmitting energy waves, the accuracy of this type of sensor's perception result also depends on factors such as the reflection intensity of the target object, the propagation of the energy waves, and their transmission and reception; that is to say, the sensing results of active perception sensors are affected by multiple factors such as the material of the sensed target, azimuth and distance, and environmental weather.
  • in the above method, the sensor can detect the corresponding detectable objects within the perceptible range determined by the geometric method, but the relative distance, relative speed and angle of each detectable object are not obtained by a real sensor; they are set based on the scene simulation.
  • the influence of different environments on the sensor's measurement results therefore cannot be reflected.
  • Directly using ideal data as the output of the sensor model will lead to a large deviation between the simulation results and the real results.
  • for example, in a real environment the sensor may be able to measure the vehicle ahead, but based on the above model, the sensor model determines that the vehicle ahead is blocked, because the perceptible area is determined purely by geometric partitioning. This may bring further unpredictable influences to the subsequent decision-making control algorithm of automatic driving, and the intended effect of simulation testing of smart cars cannot be achieved.
  • the quality of the sensor model determines the fidelity of the intelligent vehicle perceiving the environmental target object in the simulation test. That is to say, whether the sensor model can truly reflect the influence of the measurement results of the sensor in different environments directly affects the reliability of the simulation test results of the smart car.
  • another possible sensor simulation method is to model the radar sensor based on the above physical characteristics; for example, the transmission and reception of the millimeter-wave radar's energy waves, the propagation of the energy waves, and the reflection of the energy waves by the target are modeled in detail according to their real physical working process, and a mathematical model is established for each hardware module in the working process of the millimeter-wave radar to simulate the entire working process.
  • the modeling of the transceiver loop involves oscillators, filters, amplifiers, mixers/frequency converters, etc. Such a model can reflect the details of the internal workflow of the millimeter-wave radar and the propagation of electromagnetic waves, and can obtain high-precision simulation results.
  • the modeling process is complex, consumes a lot of computing resources, and has poor real-time performance, which makes it difficult to ensure simulation efficiency and meet the needs of real-time simulation testing of smart cars.
  • this sensor model will consume a lot of computing resources and cannot guarantee the simulation efficiency.
  • it is therefore not suitable for the development of intelligent-vehicle decision control algorithms; especially in the early stage of such development, the parameters of the sensor model that need to be considered are limited, and it is difficult to make effective use of all the parameters of a sensor simulated by the above method (simulation of each module of the millimeter-wave radar), which also leads to a waste of resources.
  • the present application provides a method for simulating a sensor.
  • the application scenario shown in FIG. 4a may include a measurement device and a test device, wherein the measurement device may be a vehicle with sensors, and the sensors may include millimeter-wave radar, camera, lidar and the like. The scenario may also include a test device in the cloud, and the test device may include hardware devices that support running simulation software, such as personal computers, servers, vehicle-mounted mobile terminals, industrial computers, embedded devices, and the like.
  • the test device can be implemented by a server or a virtual machine in the cloud.
  • the test device can also be a chip that supports running simulation software.
  • the modeling of millimeter-wave radar sensor is taken as an example to illustrate.
  • the sensor is a radar sensor
  • the measurement device is a vehicle
  • the test device is a server as an example.
  • the present application provides a method for simulating a sensor, as shown in Figure 4b, which may include:
  • S401 Acquire measurement data of a sensor.
  • the measurement data includes: position information and speed information of the second target vehicle relative to the sensor, and sensor characteristic measurement values of the second target vehicle collected by the sensor;
  • the sensor characteristic measurements include: RCS measurements and SNR measurements; the sensor is located in a measurement vehicle; and the second target vehicle is a vehicle near the measurement vehicle.
  • S402 Perform training according to the measurement data of the sensor and the obtained annotation information to obtain a sensor model.
  • the sample input of the sensor model is the position information, speed information and label information of the second target vehicle relative to the sensor, and the output of the sensor model is the predicted value of the sensor feature of the second target vehicle;
  • the sensor feature prediction value of the second target vehicle includes at least one of the following: RCS prediction value and SNR prediction value;
  • the labeling information includes at least one of the following: a yaw angle of the second target vehicle relative to the sensor, road environment information labelled when the sensor collects data, and vehicle information where the sensor is located.
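  • One way the training records of S401/S402 could be laid out, as a sketch (all field names are illustrative assumptions; only the quantities themselves are taken from the description above):

```python
from dataclasses import dataclass

@dataclass
class SensorTrainingSample:
    # sample inputs: position/speed of the second target vehicle relative to the sensor
    rel_distance_m: float
    rel_angle_rad: float
    rel_speed_mps: float
    # annotation information
    yaw_angle_rad: float   # yaw of the second target vehicle relative to the sensor
    weather: str           # labelled road environment information, e.g. "rainy"
    road_type: str         # e.g. "asphalt"
    vehicle_info: str      # information on the vehicle where the sensor is located
    # labels: the sensor characteristic measurements to be predicted
    rcs_measured: float
    snr_measured: float

sample = SensorTrainingSample(42.0, 0.12, -1.8, 0.05, "rainy", "asphalt",
                              "measurement vehicle A", 9.5, 18.0)
```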
  • the physical characteristics of the radar sensor when measuring the target are considered, and the output result of the sensor model is optimized, thereby effectively improving the simulation effect.
  • the following example illustrates the physical properties of a target when a radar sensor measures it.
  • when the millimeter-wave radar detects a target, the distance and speed between the moving target and the radar sensor can be obtained. If the millimeter-wave radar is set on a vehicle and the target is another vehicle, information such as the speed of the target vehicle relative to the own vehicle, the relative position, the relative distance and the azimuth angle of the target vehicle relative to the own vehicle can be determined according to the echo signal collected by the radar.
  • the RCS information of the target can be obtained, and the RCS information can be used to express the backscattering characteristics of the target under the action of the radar.
  • the RCS sequence of a space target is related to the shape and structure of the target, the frequency of the electromagnetic wave, the polarization form of the incident field, the polarization form of the receiving antenna, and the angular position (attitude angle) of the target relative to the incoming wave direction.
  • when the frequency of the electromagnetic wave, the polarization form of the incident field, the polarization form of the receiving antenna, and the angular position (attitude angle) of the target relative to the incoming wave direction can be determined, the average RCS value of the target can be related to the target structure and the target pose.
  • the information of the target object output by the sensor may also include structural information such as width information of the target object.
  • when the target object is a vehicle, the relative pose between the sensor and the target vehicle is usually relatively stable; for example, the rear, front, and side of the vehicle body can be detected. Therefore, the average RCS value of the target can be used as a feature for identifying the structure of the target, so that a reflection-intensity classification of different targets can be obtained and the structure of the target can be classified.
  • the types of target vehicles such as cars, trucks, buses, etc., can be distinguished according to their length and shape structure.
  • the posture of a target in space is usually relatively stable, and multiple RCS measurements of such a target are stable. Therefore, the average RCS value of the target can be used as a feature to identify the structure of the target, so that a reflection-intensity classification of different targets can be obtained and the structure of the target can be classified. For example, objects can be distinguished as lane boundaries, lane lines or curbs, road obstacles, tunnels, bridges, etc.
  • the sensor model needs to have at least the following physical characteristics:
  • the information of the target object output by the sensor model may include: pose state information of the target object relative to the sensor, and feature information of the target object.
  • the pose state information between the target object and the sensor may include: the relative distance between the target object and the sensor, the relative speed between the target object and the sensor, the azimuth angle between the target object and the sensor, structural information such as the width of the target object, the yaw angle of the target object relative to the sensor, and other information.
  • the feature information of the target object may include: RCS information of the target object, SNR information of the target object, polarization information of the target object, and the like.
  • the measurement data of the sensor may be measurement information collected when the radar sensor of the vehicle is actually used.
  • the measurement information in this application may include at least one of the measurement data collected by sensors, environmental information and positioning information, wherein the environmental information may include the number and location of pedestrians in the surrounding environment, pedestrian density, vehicle density, road information, weather information, etc.; the positioning information may include the latitude and longitude of the current location or a label of that latitude and longitude on the map.
  • the sensor can periodically measure, and then report the measurement information to the test device.
  • the preset area of the sensor of the vehicle is shown as a circle with a dotted line in FIG. 4a, which is an area with the vehicle as the center and the preset distance as the radius.
  • the preset distance may be a value smaller than or equal to the radius of the coverage area of the radar signal emitted by the vehicle A. It can also be an area determined according to other methods, for example, a fan-shaped area as shown in FIG. 1b, which is not limited here.
  • the sensor of the vehicle may be in a preset area, and the measured target object may be a vehicle, an obstacle, a lane line, and the like.
  • the measurement information of the target object within the preset range can be determined by the sensor of the vehicle.
  • the measurement data of the target object output by the sensor is collected.
  • the measurement data may include the position information of the target object relative to the sensor (for example, as shown in Figure 4c, the distance r of the target object relative to sensor 1 and the angle θ of the target object relative to sensor 1), and the speed information of the target object relative to the sensor (for example, the speed of the target object relative to the sensor and the angular velocity of the target object relative to the sensor).
  • the yaw angle ⁇ data of the target can also be collected.
  • the position information of the target object relative to the sensor may further include yaw angle ⁇ data of the target object relative to the sensor.
  • the position information of the vehicle 2 relative to the sensor 1 may also include data of the yaw angle ⁇ of the vehicle 2 relative to the sensor 1 .
  • the yaw angle of the target may be manually marked or obtained by measuring other sensors, which is not limited here.
  • the yaw angle of the target object itself can reflect the difference in the intensity of radar reflections from different parts of the target object, so that the sensor model trained from the measurement information is more accurate.
  • the measurement data may further include: measurement values of characteristic information.
  • the measured value of the feature information may be the sensor feature value of the target object collected by the sensor, for example, the measured value of the signal-to-noise ratio (SNR) information of the target object collected by the sensor, the RCS information of the target object collected by the sensor. measurement value, measurement value of the polarization information of the target object collected by the sensor, etc.
  • the measured value of the SNR information, the measured value of the RCS information, and the measured value of the polarization information in the echo signal can be stored by imaging; that is, imaging information can be generated according to the echo signal, and the imaging information can be understood as the target's reflection of the emitted signal, mainly the image information formed by the backscattering of the target.
  • the imaging information may include various information, such as RCS information, phase information, amplitude information, polarization information, etc. in the echo signal.
  • a possible implementation of generating imaging information according to the echo signal reflected by the target is to process the echo signal after receiving it, for example by down-conversion and analog-to-digital conversion, and then generate the imaging information from the processed echo signal.
  • the imaging information can be obtained by using a synthetic aperture radar (SAR) imaging algorithm.
  • the imaging information can be stored in the form of point cloud data.
  • the point cloud data can include the target's distance, azimuth, pitch angle, target speed and other radar characteristic information.
  • the measurement data may be data transmitted over the CAN line to the processor of the vehicle so that the processor can make decisions based on the measurement data obtained.
  • the sensor of the measuring vehicle may collect characteristic information of the echo signal returned by the second target vehicle.
  • the environmental information can reflect the influence of environmental factors such as rain, snow and road material on the radar reflection intensity
  • therefore, when the test device collects the measurement data of the target information, the corresponding environmental information (for example, weather and road information) and other measurement information can also be obtained, and a sensor model trained with this environmental information added to the measurement information is more accurate.
  • this measurement information may be manually marked or obtained by other means, for example, according to the road information stored in the current map server.
  • the weather in the environmental information can be divided into four categories: sunny days, rainy days, haze, and snowy days, and of course other types of information can also be included.
  • the sensor of the measuring vehicle can collect the characteristic information of the echo signal on the road.
  • occluders and stagnant water on the target can be determined according to the polarization information of the echo signal. For example, the polarization characteristics of water or snow can be used to determine an occluder on the target and the boundary and material characteristics of the water or snow, so as to determine the influence of the occluder, of water, or of snow conditions on the echo signal of the target, and then more accurately identify the target as an occluded vehicle.
  • the polarization information collected by the vehicle can be used to train the sensor model, so that the sensor model can also predict the corresponding feature information in different scenarios, providing more information for the subsequent decision-making module so that the simulation is closer to the real scene and the simulation effect of the decision module is improved.
  • the decision-making module can also perform de-occlusion processing according to the polarization characteristics of the occluders, water or snow measured by the sensor, so as to improve decision-making.
  • when the sensor model simulates the sensor, the polarization information collected by the sensor can be used as an output parameter predicted by the sensor model, so that the decision-making module can obtain more simulation information of the real sensor from the predicted polarization information, and the simulation effect of the decision-making module can be improved accordingly.
  • the boundary of the target may change under different road conditions; for example, on rainy or snowy days, water or snow on a vehicle may change the echo signal of the target vehicle. According to the polarization information in the echo signal, it can then be determined whether the material features of the target vehicle are affected by rain or snow, whether the vehicle has accumulated water or snow, and further the boundary characteristics of the accumulated water and of the road, so as to improve the decision-making effect of the decision-making module.
  • the size of a stagnant-water area may change which roads are passable. According to the polarization information in the echo signal, the stagnant water on the road and the material characteristics of the lane can be determined, it can be identified whether there is water in the lane, and the boundary characteristics of the water and of the road can be further determined, so that the characteristic information predicted by the sensor model under wet-road conditions is more accurate. The subsequent decision module can then determine the current ponding situation of the lane (for example, the boundary information of the ponding water) from the feature information predicted by the sensor model, so as to better simulate navigation or path planning.
  • for example, when the polarization information of the echo signal detected on one lane corresponds to the polarization characteristics of a lane under stagnant water (for example, the boundary features and material characteristics of stagnant water), and the polarization information generated on the other lanes corresponds to the polarization features of lanes without water, it can be determined that the first lane is covered by water and the other lanes are passable.
  • road types can be divided into four categories: ordinary asphalt pavement, ordinary concrete pavement, bridge deck, and tunnel. Of course other types of information may also be included.
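  • Since the weather and road types above are categorical, they must be encoded numerically before being fed to a sensor model; a common choice (an assumption here, not mandated by the application) is one-hot encoding:

```python
WEATHER_TYPES = ["sunny", "rainy", "haze", "snowy"]
ROAD_TYPES = ["asphalt", "concrete", "bridge", "tunnel"]

def one_hot(value, categories):
    """Encode one categorical environment attribute as a one-hot vector."""
    return [1.0 if value == c else 0.0 for c in categories]

# environment feature vector for a rainy bridge scene
features = one_hot("rainy", WEATHER_TYPES) + one_hot("bridge", ROAD_TYPES)
print(features)  # [0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0]
```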
  • the environment can be further divided according to the attributes of the environmental objects, which helps provide more training information (environmental information) when building the sensor model, so that the simulation results the trained sensor model produces for environmental objects in different scenes are closer to the environmental objects in the real scene. This improves the effect of the sensor model and benefits the subsequent use of the prediction data obtained by the sensor model for decision-making, so as to achieve the purpose of the simulation and improve the simulation effect.
  • the environmental objects are distinguished according to lanes and non-lanes, the boundary information of the environmental objects is determined, and then the environmental objects are identified.
  • the boundary information of the environmental object may refer to key points or lines for describing boundary information of obstacles in the road, or boundary information for describing lanes.
  • the lane can be divided into various environmental objects, and the types of environmental-object boundaries can include, but are not limited to, any one or more of the following: lane lines, road edges, road obstacles, etc. Lanes can be divided into: single lane, dual lane, multi-lane, starting lane, terminating lane, middle lane, merging lane, bifurcation lane, intersection, etc.
  • the starting lane may be: a lane corresponding to several lane lines including the starting point on a road.
  • the boundary of the starting lane may be the starting line of the lane.
  • the termination lane may be: a lane corresponding to several lane lines including the termination point on a road.
  • the boundary of the terminating lane is the stop line of the lane.
  • the starting line of the lane and the stop line of the opposite lane are in a straight line in practical applications.
  • the merging lane and the bifurcation lane can be marked by the lane change point on the lane.
  • the lane change point can be the bifurcation point created when additional turning lanes are added as a road approaches an intersection, or the merge point created when lanes are reduced upon entering a new road through the intersection; it can also be the fork of the exit lanes of an expressway/viaduct, or the junction of the entry lanes of an expressway/viaduct.
  • the lanes can be further classified according to the obstacles existing in the lanes.
  • the lanes can also include: tunnel lanes, elevated entry lanes, elevated exit lanes, bridges, and the like.
  • different scenarios may be set to obtain measurement information of different sensors.
  • the hypothetical scenarios include a downtown scene, a suburban scene, a highway scene, and a special weather scene.
  • the parameters of the sensor corresponding to the downtown scene may include: the millimeter-wave radar sensor works in SRR mode. Therefore, when the sensor works in SRR mode, the distance r, angle θ and speed of the corresponding target relative to the sensor, and energy characteristic information such as SNR and RCS, are obtained.
  • the parameters of the sensor corresponding to the highway scene may include: the millimeter-wave radar sensor works in the LRR mode. Therefore, when the sensor works in LRR mode, the distance r, angle ⁇ , speed, and sensor characteristic information such as SNR and RCS of the corresponding target relative to the sensor are obtained.
  • for the special weather scene, the parameters of the sensor can include: the millimeter-wave radar sensor works in SRR mode. Therefore, when the sensor works in SRR mode under special weather, the distance r, angle θ and speed of the corresponding target object relative to the sensor, and energy characteristic information such as SNR, RCS, and polarization information, are obtained.
  • the period for collecting measurement information can also be set as required to obtain a better modeling effect.
  • the vehicle may combine multiple types of sensors to make decisions. Therefore, when collecting the measurement information of a sensor, the measurement information of the various types of sensors can also be collected per scenario, so as to obtain more accurate environmental information, which helps the model better simulate different scenarios.
  • Different measurement information categories are represented by different scene names, and the hypothetical scenes include downtown scenes, suburban scenes, and highway scenes.
  • the parameters corresponding to the downtown scene can include GPS working in high-precision positioning mode, the IMU and camera sensors reporting measurement information at fixed intervals with a set period, and the lidar and millimeter-wave radar sensors working in SRR mode. Therefore, the determined measurement information includes: the positioning information of the sensor, the measurement information reported by the IMU and the camera sensor, and the measurement information reported by the radar sensor.
  • in this scenario, measurement data collected by MRR-type or LRR-type sensors can also be collected to provide more training samples and improve the accuracy and robustness of the model.
  • the parameters corresponding to the suburban scene can include GPS working in low-precision positioning mode, the IMU reporting measurement information at fixed intervals with a set period, the camera sensor reporting measurement information when pedestrians are detected within the set range, and the lidar and millimeter-wave radar sensors working in MRR mode. Therefore, the determined measurement information includes: the positioning information of the sensor, the measurement information reported by the IMU and the camera sensor, and the measurement information reported by the radar sensor.
  • in this scenario, measurement data collected by SRR-type or LRR-type sensors can also be collected to provide more training samples and improve the accuracy and robustness of the model.
  • the parameters corresponding to the highway scene can include GPS working in low-precision positioning mode, IMU and camera sensors reporting measurement information when pedestrians or vehicles are detected within the set range, and lidar sensors and millimeter-wave radar sensors working in LRR mode. Therefore, the determined measurement information includes: the positioning information of the sensor, the measurement information reported by the IMU and the camera sensor, and the measurement information reported by the radar sensor.
  • in this scenario, measurement data collected by SRR-type or MRR-type sensors can also be collected to provide more training samples and improve the accuracy and robustness of the model.
  • the test device can be modeled accordingly based on different sensor types, and obtain more scene-related parameters through other sensors, thereby facilitating the subsequent decision-making module to use more information Make decisions and improve the simulation effect of the verification decision-making module.
  • the measurement information collected by the sensor during use is used as training samples for the sensor model, so as to obtain corresponding sensor models under different position information (for example, relative distance, relative angle, yaw angle) and speed information of the target vehicle relative to the sensor, and under different environmental information (for example, different weather, different road conditions, different road types).
  • the output of the sensor model is the predicted value of the sensor's characteristic information (for example, SNR, RCS, polarization information, etc.), and the other measurement information (for example, the measurement data other than the sensor's characteristic information, the positioning information, and the environmental information) serves as the input of the sensor model, which is trained by supervised learning. Therefore, in the training process, a training sample can include training data and validation data.
  • the training data is: input data of the sensor model, that is, measurement information, such as measurement data, positioning information, and environmental information, other than the characteristic information of the sensor.
  • the validation data are the measured values of the characteristic information of the sensor in the training samples.
  • the output parameters of the millimeter-wave radar sensor model can be the predicted value of the characteristic information of the millimeter-wave radar sensor, for example, the predicted value of SNR and the predicted value of RCS.
  • the input parameters of the millimeter-wave radar sensor model may include: position information of the target relative to the sensor (distance r, angle ⁇ , yaw angle), velocity information, environmental information, positioning information and other measurement information other than characteristic information.
  • the environmental information may include: weather type, road type, and the like.
  • the environmental information may also include parameters obtained by other sensors, for example, within the perceptible range of the sensor, whether there is occlusion by fallen leaves, occlusion by rain, occlusion by snow, etc.
  • the corresponding sensor models can be trained separately.
  • a supervised learning algorithm using a support vector regression (SVR) model can be used to train on the measurement information collected by this type of sensor.
  • the input data of the SVR model may include other measurement information in the measurement information except the characteristic information of the sensor.
  • the output data of the SVR model may include: the predicted value of the characteristic information of the sensor, for example, the predicted value of SNR, the predicted value of RCS.
  • training can be performed on the feature information of each sensor separately; for example, training can first target the predicted value of SNR, and after the SNR feature training meets the accuracy requirements of the model, training can then target the predicted value of RCS.
  • training can also be performed on the predicted value of the RCS, and after the feature information training of the RCS meets the accuracy requirements of the model, the training can be performed on the predicted value of the SNR.
  • training can also be performed for all feature information together, which is not limited here.
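  • A minimal sketch of the SVR training described above, assuming scikit-learn and synthetic placeholder data (a real pipeline would use the collected measurement information, and a second SVR trained the same way would predict SNR):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# X: one row per measurement, e.g. [rel_distance, rel_angle, rel_speed,
#    yaw_angle, weather_id, road_id]; y: the measured RCS for each row.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 6))                        # placeholder features
y = 10.0 - 5.0 * X[:, 0] + rng.normal(0.0, 0.5, 200)  # toy distance/RCS relation

rcs_model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
rcs_model.fit(X, y)
print(rcs_model.predict(X[:3]))  # predicted RCS values for the first samples
```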
  • the measurement information of the same sensor type can also be trained separately for the measurement information collected in different scenarios, assuming that the scenarios include a downtown scene, a suburban scene, and a highway scene.
  • the parameters corresponding to the downtown scene may include that the GPS 126 works in a high-precision positioning mode, the IMU 125 and the camera sensor 123 report measurement information at fixed intervals with a set period, and the lidar sensor and the millimeter-wave radar sensor work in SRR mode; thus, for the sensor model of the lidar sensor or millimeter-wave radar sensor, the collected measurement information can be stored under the SRR type and the downtown scene, so that the subsequent sensor model can call the corresponding measurement information as training data for training.
  • measurement data collected by other types of sensor models can also be collected in this scenario to provide more training samples and improve the accuracy and robustness of the model.
  • the configuration parameters corresponding to the suburban scene may include that the GPS 126 works in a low-precision positioning mode, the IMU 125 reports the measurement information at a fixed time with a set period, the camera sensor 123 reports the measurement information when it detects pedestrians within the set range, and the lidar Sensors and millimeter-wave radar sensors work in MRR mode; thus, for the sensor model of a lidar sensor or millimeter-wave radar sensor, the measured measurement information can be stored in the MRR type and suburban scene, so that subsequent sensor models can call the corresponding The measurement information is used as training data for training.
  • the configuration parameters corresponding to the highway scene can include that the GPS 126 works in a low-precision positioning mode, the IMU 125 and the camera sensor 123 report measurement information when a pedestrian or vehicle is detected within the set range, and the lidar sensor and millimeter-wave radar sensor work in LRR mode. Therefore, for the sensor model of the lidar sensor or the millimeter-wave radar sensor, the collected measurement information can be stored under the LRR type and the highway scene, so that the subsequent sensor model can call the corresponding measurement information as training data for training.
  • a variety of scenarios can be trained. For example, when training the downtown scenario, training samples of measurement information collected by an SRR-type sensor in the downtown scenario may be selected for training. When training the suburban scenario, training samples of measurement information collected by an SRR-type, MRR-type, or LRR-type sensor in the suburban scenario may be selected for training.
  • the trained sensor model can be used in different scenarios.
  • MRR-type sensor models in scenes such as a downtown scene, a suburban scene, and a highway scene can be trained.
  • LRR-type sensor models in scenes such as downtown scenes, suburban scenes, and highway scenes can also be trained.
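  • Organizationally, this amounts to keeping one trained model per (sensor mode, scene) pair; a sketch with stubbed helpers (load_samples and train_svr are hypothetical placeholders for the data collection and the SVR fit sketched earlier):

```python
def load_samples(mode, scene):
    """Hypothetical loader returning the measurement information recorded
    for one (sensor mode, scene) pair; stubbed as empty here."""
    return []

def train_svr(samples):
    """Placeholder for the SVR training step sketched earlier."""
    return ("model trained on", len(samples), "samples")

models = {(mode, scene): train_svr(load_samples(mode, scene))
          for mode in ("SRR", "MRR", "LRR")
          for scene in ("downtown", "suburban", "highway")}

# at simulation time, select the model matching the scenario under test
model = models[("LRR", "highway")]
```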
  • the above sensor model is only an example of the SVR model, and the sensor model may also be determined by other models or algorithms.
  • the sensor model includes but is not limited to regression models, NN models, random forests, deep neural networks, the autoregressive moving average (ARMA) model, the gradient boosting decision tree (GBDT) model, the XGBoost model, etc.
  • FIG. 5a is an exemplary functional block diagram of a sensor testing system according to an embodiment of the present application.
  • the system can be applied in a test device or in other use carriers.
  • the cloud server is used as a carrier for description.
  • the system includes at least one sensor model, a decision module and a scenario module, wherein the sensor model can be a sensor model for simulating any one or more sensors in the sensor system 120 shown in FIG. 1a. The test device, the sensor model, the decision module and the scenario module can be integrated in the test device as a whole, or they can be independent modules that share the memory of the test environment.
  • the application scenario of the test system shown in FIG. 5a may include a test device, wherein the test device may be a test device with a sensor model, and the network elements of the test device include hardware devices that support running simulation software, such as personal computers, servers, vehicle-mounted Mobile terminals, industrial computers, embedded devices, etc.
  • the test device can be implemented by a server or a virtual machine in the cloud.
  • the test device can also be a chip that supports running simulation software.
  • a method for simulating a vehicle provided in an embodiment of the present application specifically includes:
  • S501 Input the position information and speed information of the first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle into a sensor model to obtain a sensor feature prediction value of the first target vehicle.
  • the sensor feature prediction value includes at least one of the following: RCS prediction value and SNR prediction value; the sensor model is used to simulate the sensors in the simulated vehicle, and the first target vehicle is a vehicle in the test environment where the simulated vehicle is located.
  • the position information and speed information of the target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle are determined according to the test environment, and the sensor model is obtained by training according to the measurement data of the sensor and the marked road environment information.
  • S502 Input the sensor feature prediction value of the first target vehicle into a decision-making module of the simulated vehicle to obtain a simulation decision-making result of the simulated vehicle.
  • the decision module is used to output the vehicle driving decision determined based on the predicted value of the sensor feature.
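  • The S501/S502 flow can be pictured as one simulation step feeding the sensor model's predictions into the decision module; the stand-in functions, formulas and thresholds below are assumptions for illustration only:

```python
def sensor_model_predict(rel_pos, rel_speed, environment):
    """Stand-in for the trained sensor model of S501: maps relative pose/speed
    and road environment information to (RCS, SNR) predictions (toy numbers)."""
    distance = rel_pos[0]
    rcs = 10.0 - 0.02 * distance
    snr = 30.0 - 0.1 * distance - (3.0 if environment["weather"] == "rainy" else 0.0)
    return rcs, snr

def decision_module(rel_pos, rcs, snr):
    """Stand-in for the decision module of S502: brakes when a close target is
    detected with sufficient confidence (here proxied by the predicted SNR)."""
    if snr > 10.0 and rel_pos[0] < 30.0:
        return "brake"
    return "keep_lane"

# one simulation step: test environment -> sensor model -> decision module
rel_pos, rel_speed = (25.0, 0.1), -2.0
env = {"weather": "rainy", "road_type": "asphalt"}
rcs, snr = sensor_model_predict(rel_pos, rel_speed, env)
print(decision_module(rel_pos, rcs, snr))  # 'brake'
```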
  • the output result of the sensor model is optimized by considering the physical characteristics of the sensor, thereby effectively improving the simulation effect.
  • the present application establishes the output parameters of the corresponding radar sensor model by combining the physical characteristics of the target and the radar sensor, so that they are closer to the output parameters of a real millimeter-wave radar sensor.
  • the simulation device of the vehicle can determine the parameters of the sensor and the target object in the test environment.
  • the sensor is the sensor to be tested. The following description takes a sensor located on the simulated vehicle as an example; when the sensor is located on another device to be tested, this embodiment can be referred to.
  • the testing device may acquire the testing information of the target object involved in the testing environment relative to the sensor in the testing environment.
  • the target object may not be limited to the target object near the sensor. It can also be a target object within a preset area near the sensor. The preset area may be determined according to the detectable range of the sensor, or may be determined according to other methods, which are not limited herein.
  • the target object is not limited to vehicles, but can also be various objects in the test environment, such as roadside buildings, pedestrians, lanes, bridges, tunnels, etc.
  • test information may include: position and attitude state information of the target object relative to the sensor, and test data such as environmental information.
  • the pose state information may include: position information and velocity information.
  • Environmental information may include information such as weather, roads, traffic signs, and traffic light data.
  • the pose state information may include, for example, the relative angle of the target object to the sensor, the relative distance of the target object to the sensor, the relative velocity of the target object to the sensor, the relative angular velocity of the target object to the sensor, the relative acceleration of the target object to the sensor, the relative angular acceleration of the target object to the sensor, and structural information such as the size of the target object.
  • according to the test environment, it may be determined that the first target vehicle is a vehicle in the test environment where the simulated vehicle is located. Furthermore, the test information of the first target vehicle may also be determined according to the test environment. For example, the test information of the first target vehicle may include: the pose state of the first target vehicle relative to the simulated vehicle, and the environmental information of the simulated vehicle.
  • test information may be determined according to the collected measurement information, or may be determined in other ways.
  • the test environment may be provided by intelligent-vehicle simulation test software for simulating real-world traffic scene data, and the test information of the simulated traffic objects can be extracted from the test environment. For example, the simulation software may be vehicle test software (e.g., VTD software), and the test environment is then provided by the vehicle test software.
  • different scenarios may correspond to test information of different types of sensors, and hypothetical scenarios include downtown scenarios, suburban scenarios, and highway scenarios.
  • the parameters corresponding to the downtown scene may include that the GPS 126 works in a high-precision positioning mode, the IMU 125 and the camera sensor 123 report measurement information at fixed intervals with a set period, and the lidar sensor and the millimeter-wave radar sensor work in SRR mode; thus, the corresponding test information of the SRR-type radar sensor can be called, so that the subsequent sensor model can call the corresponding test information for prediction.
  • the configuration parameters corresponding to the suburban scene may include that the GPS 126 works in a low-precision positioning mode, the IMU 125 reports measurement information at fixed intervals with a set period, the camera sensor 123 reports measurement information when it detects pedestrians within the set range, and the lidar sensor and the millimeter-wave radar sensor work in MRR mode; thus, in this scenario, the corresponding test information of the MRR-type radar sensor can be called, so that the subsequent sensor model can call the corresponding test information for prediction.
  • The configuration parameters corresponding to the highway scene may include: the GPS 126 works in a low-precision positioning mode, the IMU 125 and the camera sensor 123 report measurement information when a pedestrian or vehicle is detected within a set range, and the lidar sensor and the millimeter-wave radar sensor work in long-range radar (LRR) mode. Thus, in this scenario, the test information corresponding to an LRR-type radar sensor can be called, so that the subsequent sensor model can use the corresponding test information for prediction. One way such scene-dependent configurations could be organized is sketched below.
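As a concrete illustration of the scene-dependent configurations above, the following is a minimal Python sketch. All names here (SensorConfig, SCENE_CONFIGS, the mode strings) are hypothetical conveniences invented for illustration, not identifiers from this application:

```python
# Illustrative sketch only: one way to organize the scene-dependent
# sensor configurations described above.
from dataclasses import dataclass

@dataclass
class SensorConfig:
    gps_mode: str          # "high_precision" or "low_precision"
    imu_periodic: bool     # True: report at a fixed, set period
    camera_trigger: str    # "periodic" or "on_detection"
    radar_mode: str        # "SRR", "MRR" or "LRR"

SCENE_CONFIGS = {
    "downtown": SensorConfig("high_precision", True, "periodic", "SRR"),
    "suburban": SensorConfig("low_precision", True, "on_detection", "MRR"),
    "highway":  SensorConfig("low_precision", False, "on_detection", "LRR"),
}

def test_info_for_scene(scene: str) -> str:
    """Select which radar test-information set the sensor model should load."""
    return SCENE_CONFIGS[scene].radar_mode

print(test_info_for_scene("downtown"))  # -> "SRR"
```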
  • the position and attitude state information of the first target vehicle relative to the sensor and the environmental information of the simulated vehicle may be input into the sensor model to obtain the predicted value of the sensor feature of the first target vehicle;
  • the sensor feature prediction value includes at least one of the following: RCS prediction value and SNR prediction value;
  • The first target vehicle is a vehicle in the test environment where the simulated vehicle is located, and the pose state information of the first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle are determined according to the test environment.
  • the sensor model is obtained by training on the measurement information collected by the sensor, by means of supervised learning.
  • test information from the test environment can be used as input.
  • the test information may include other test information except the characteristic information of the sensor.
  • For example, the input may be the environmental information determined in the test environment and the pose state information of the target vehicle relative to the sensor; the model then outputs the predicted value of the feature information of the corresponding target.
  • the pose state information and environmental information of the target object within the detection range of the sensor model are obtained through the communication interface provided by the test environment, as the input of the sensor model. Therefore, the predicted value of the feature information of the target object can be obtained through the predicted data output by the sensor model. That is, the predicted value of the feature information of the target object predicted by each target object under different pose state information and different environmental information can be obtained through the sensor model.
  • the prediction information of the target object can be determined according to the predicted value of the characteristic information of the target object and the test information of the target object.
  • the prediction information of the target object includes: test information of the target object (for example, test data such as the pose state information and environmental information of the target object), and the predicted value of the feature information of the target object (for example, the predicted value of RCS, SNR predicted value).
  • the simulation device of the vehicle may input the prediction information of the target object to the decision-making module.
  • The prediction information of the target object can be used as the input of the decision-control (or fusion-perception) algorithm to verify that algorithm, just as the algorithm would receive the target-object information output by a real radar sensor as the input for its calculations in order to arrive at decision results.
  • The predicted value of the characteristic information of the target object in the test environment can thus be obtained, making the prediction information of the target object closer to the output of a real millimeter-wave radar sensor. This reflects the physical characteristics of the sensor, makes it easier to assess how well the decision algorithm would perform in actual use, and improves the simulation effect.
  • the output result of the sensor model can also be optimized by considering the effect produced by the physical characteristics of the sensor, thereby effectively improving the simulation effect.
  • the following example illustrates the physical properties of a target when a radar sensor measures it.
  • For example, the radar may be unable to distinguish two objects that are at the same distance and in close proximity.
  • In this case, the sensor model should likewise output the two objects at the same distance and in close proximity as one target object, which is beneficial for subsequently verifying whether the decision-control module of automatic driving can handle scenes in which the sensor makes recognition errors.
  • the sensor can sometimes detect occluded objects due to the phenomenon of multipath propagation. Therefore, the target object output by the sensor model should also include objects that may be occluded.
  • The attitude of a target in space is usually relatively stable, and repeated measurements of the RCS of a spatial target are therefore stable. The average RCS of the target can thus be used as a feature to identify the target's structure, so that targets can be classified by reflection intensity. For example, objects can be distinguished as lane boundaries (lane lines or curbs), road obstacles, tunnels, bridges, etc., as in the sketch below.
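The following minimal sketch shows what classification by average RCS could look like. The dBsm bands and class labels are invented placeholders for illustration, not values from this application:

```python
import numpy as np

# Hypothetical mapping from mean-RCS bands (dBsm) to coarse structure classes.
RCS_BANDS = [(-15.0, "curb/lane boundary"), (0.0, "pedestrian-scale obstacle"),
             (10.0, "vehicle"), (np.inf, "tunnel/bridge structure")]

def classify_by_mean_rcs(rcs_samples_dbsm):
    """Use the average of repeated RCS measurements as a structure feature."""
    mean_rcs = float(np.mean(rcs_samples_dbsm))
    for upper, label in RCS_BANDS:
        if mean_rcs < upper:
            return mean_rcs, label

print(classify_by_mean_rcs([8.2, 11.5, 9.7]))  # -> (9.8, 'vehicle')
```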
  • The transmitted signal can include polarization information. Polarization reflects the time-varying rule of the end point of the electric-field vector of the wave, and can be divided into linear, circular, and elliptical polarization, and into left-handed and right-handed polarization.
  • the polarization state of the electromagnetic wave reflects the time-varying characteristics of the electric field orientation of the electromagnetic wave received by the radar.
  • the polarization parameters of the received signal can be estimated by using a polarization antenna or polarization-sensitive array at the receiving end.
  • The transmitted signal interacts with the target, resulting in different echo scattering. Both wavelength and polarization will affect the received signal.
  • the polarization information in the received signal may include: the polarization scattering matrix of the target and the polarization state of the electromagnetic wave.
  • the polarization scattering matrix of the target is the polarization scattering effect of the target on the electromagnetic wave under a certain attitude and observation frequency.
  • The polarization scattering matrix of the target represents the change the radar target imposes on the polarization state of the electromagnetic-wave signal: when the target is irradiated by the radar electromagnetic wave, the polarization state of the scattered wave may differ from that of the incident wave. This change of polarization state by the target can be called the depolarization characteristic of the target.
  • the radar target changes the polarization state of the electromagnetic wave, and the change of the polarization state is determined by the shape, structure and material of the target. Therefore, the polarization information in the target echo signal can be used to identify the target. That is, the polarization information can obtain the scattering characteristics of different targets, and can be used to calibrate the surface characteristics, shape, roughness and other surface feature information of the target. Further, through the combination of different polarization modes and wavelengths, different and complementary polarization information of the target can be determined, which is beneficial to obtain more accurate surface feature information such as the structure and material of the target.
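A toy numerical illustration of this depolarization effect, using an invented 2×2 scattering matrix (the values are placeholders, not measured data):

```python
import numpy as np

# The target's polarization scattering matrix S maps the incident field
# [E_H, E_V] to the scattered field; off-diagonal terms are the
# cross-polarized (depolarizing) response.
S = np.array([[0.9, 0.2],    # S_HH, S_HV
              [0.2, 0.4]])   # S_VH, S_VV  (reciprocity: S_HV == S_VH)

e_incident = np.array([1.0, 0.0])   # purely horizontal polarization
e_scattered = S @ e_incident        # [0.9, 0.2]: a vertical component appears

# The cross-polar ratio hints at target structure/material.
cross_polar_ratio = abs(e_scattered[1]) / abs(e_scattered[0])
print(cross_polar_ratio)  # ~0.22
```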
  • The main sources of noise may be noise generated by the transmitter, noise received by the receiver, or interference from other radars. If the power of a jamming signal is greater than the receiver sensitivity, the jamming signal will interfere with the current radar; otherwise, it will not interfere with the current radar and will be treated as noise.
  • the radar sensor needs to pass the corresponding threshold to determine whether the received signal is noise or a target object.
  • the transmit power of the radar signal and the sensitivity of the receiver are different, therefore, the corresponding thresholds are also different.
  • the results of measuring the target object may have false negative and false positive results.
  • A false negative means that, in the process of radar detection, due to the ubiquitous presence and fluctuation of noise, a target that actually exists has signal energy below a certain threshold and cannot be detected.
  • A false positive means that, in the process of radar detection, the signal energy of noise is not lower than, or even exceeds, that of a target echo. Under the threshold detection method, due to the ubiquitous presence and fluctuation of noise, if the threshold is set too small, the millimeter-wave radar may judge that a target is present when no target actually exists.
  • the sensor model can have at least one of the following physical characteristics:
  • the resolving power when measuring target objects is considered.
  • For example, the radar may be unable to distinguish two objects at the same distance and in close proximity; the sensor model should likewise output such two objects as one target object, which helps subsequently verify whether the decision-control module of automatic driving can handle scenes in which the sensor makes recognition errors.
  • the sensor can sometimes detect occluded objects. Therefore, the target object output by the sensor model should also include objects that may be occluded.
  • the results of measuring the target object may have false negative and false positive results.
  • Compared with using only the relative speed, distance, and angle data in the test environment as the output parameters of the radar sensor, the present application establishes the output parameters of the corresponding radar sensor model by combining the physical characteristics of the target and the radar sensor, so that the output is closer to the output parameters of a real millimeter-wave radar sensor.
  • the target object may be screened according to the detectable range of the sensor.
  • For example, the vehicle simulation apparatus determines, according to the position information and speed information of candidate vehicles relative to the simulated vehicle, which candidate vehicles are within the detection range of the sensor, and takes those vehicles as the first target vehicle; the position information and speed information of the candidate vehicles relative to the simulated vehicle are determined according to the test environment, and the candidate vehicles are vehicles in the test environment where the simulated vehicle is located.
  • each target object may be a target object within the detectable range of the sensor.
  • The detectable range of the sensor can be determined according to the sensor parameters obtained when the radar sensor model was built. Considering that the detectable range may vary with the environment, it can also be determined based on the measurement information collected by the sensor and the environmental information in the current test environment, which is not limited here.
  • the detection range of a sensor provided in this embodiment is a cone-shaped area.
  • the cone-shaped region can be determined by the following parameters.
  • the left detectable angle α of the sensor and the right detectable angle β of the sensor; the near-end detectable distance of the sensor may be a first distance, and the far-end detectable distance of the sensor may be a second distance.
  • the specific process may include: determining the detection range according to the detectable distance range and detectable angle range of the radar, removing the target objects that are not in the detectable range, and retaining the target objects that intersect the boundary of the detectable range.
  • For example, target objects 4 and 5, which are not within the detectable range of the radar of vehicle 1, are eliminated, while target object 3, which intersects the boundary of the detectable range, is retained.
  • the target object 2 shown in FIG. 6b is completely occluded by the target object 1, but in this embodiment of the present application, the target object 2 is not determined as an undetectable target object.
  • Retaining the occluded target object reflects the physical characteristic that the radar sensor may still detect occluded objects due to multipath propagation, and thus provides a basis for the sensor model to detect objects in the non-line-of-sight range. That is, in step 502, only the target objects completely outside the detection range are deleted, according to the detectable-range parameters of the millimeter-wave radar.
  • A target object that is occluded but still within the detection range is also used as a target object of the sensor. A minimal screening sketch follows.
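The following is a minimal sketch of the cone-shaped range test described above, assuming a 2-D sensor frame; the function name and the default angle/distance values are placeholders, not parameters from this application:

```python
import math

def in_detection_range(x, y, alpha_deg=60.0, beta_deg=60.0,
                       near_m=0.5, far_m=200.0):
    """Cone-shaped detectable-range test in the sensor frame.

    (x, y): target position with the sensor at the origin and x along the
    boresight. alpha/beta are the left/right detectable angles; near_m and
    far_m are the near-end and far-end detectable distances.
    """
    dist = math.hypot(x, y)
    if not (near_m <= dist <= far_m):
        return False
    angle = math.degrees(math.atan2(y, x))  # + left, - right of boresight
    return -beta_deg <= angle <= alpha_deg

# Targets fully outside the cone are removed; occlusion is deliberately NOT
# tested here, so occluded targets inside the cone are kept (multipath).
candidates = {"t3": (150.0, 20.0), "t4": (250.0, 0.0), "t5": (10.0, -30.0)}
kept = {k: p for k, p in candidates.items() if in_detection_range(*p)}
print(kept)  # t4 is beyond far_m; t5 is ~71.6° off boresight, outside the cone
```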
  • the target object and the prediction information of the target object may be screened by using physical characteristics.
  • The target objects are further screened, so as to better approximate the targets that a real radar sensor would output and to obtain prediction information close to the sensor's actual measurement information.
  • According to the predicted SNR value of each target object, it is determined whether the target object is visible relative to the sensor.
  • the simulation apparatus of the vehicle determines that the predicted SNR value of the first target vehicle is greater than a visible threshold.
  • The specific manner of making this determination is not limited in this application.
  • the following uses SNR as an example to illustrate an example of judging whether the target object is visible relative to the sensor.
  • The predicted SNR value of the target object is compared with a corresponding visibility threshold.
  • the visibility threshold is 1, that is, when the signal strength of the RCS in the echo signal is greater than the signal strength of the noise, it is considered that the target object exists. That is, when the predicted value of the SNR is greater than or equal to 1, it is considered that the target object exists.
  • the target objects whose predicted value of SNR is less than 1 can be deleted.
  • Otherwise, the sensor will consider that there is no target; that is, a false negative occurs.
  • Another possible scenario is that excessive noise is identified by the sensor as a target object; that is, a false positive occurs. The screening can therefore reflect the false-negative and false-positive characteristics of millimeter-wave radar, while still preserving the physical characteristic that objects outside the line of sight may be detected.
  • Optionally, corresponding probability thresholds may be set. For example, a discovery-probability threshold may be set; when the SNR is greater than the discovery-probability threshold, the discovery probability of the target object output by the sensor model can be output.
  • a false negative probability threshold is set, and when the SNR is greater than the false negative probability threshold, the false negative probability of the target object output through the sensor model can be output.
  • Optionally, a correct-non-discovery probability threshold is set; when the SNR is greater than the correct-non-discovery probability threshold, the correct-non-discovery probability of the target object output through the sensor model can be output.
  • a false alarm probability threshold is set, and when the SNR is greater than the false alarm probability threshold, the false alarm probability of the target object output through the sensor model can be output. Therefore, the decision-making model can also obtain the probability of misjudgment by the sensor based on the corresponding probability, thereby improving the accuracy of decision-making.
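A minimal sketch of the SNR-based visibility screening described above; the linear-SNR threshold of 1.0 follows the earlier example, while the dict layout and function name are assumptions for illustration:

```python
def screen_by_snr(targets, visible_threshold=1.0):
    """Keep targets whose predicted SNR (linear scale) reaches the visibility
    threshold; below it, the echo is indistinguishable from noise, which
    reproduces the radar's false-negative behavior."""
    visible, dropped = [], []
    for t in targets:
        (visible if t["snr_pred"] >= visible_threshold else dropped).append(t)
    return visible, dropped

targets = [{"id": 1, "snr_pred": 3.2}, {"id": 2, "snr_pred": 0.6}]
visible, dropped = screen_by_snr(targets)
print([t["id"] for t in visible], [t["id"] for t in dropped])  # [1] [2]
```

The same comparison pattern could be extended with the discovery, false-negative, correct-non-discovery, and false-alarm probability thresholds mentioned above, each gating a probability value reported alongside the target.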
  • the target object and the prediction information of the target object can be updated according to the physical characteristics of the sensor and the pose state information of the target object.
  • Optionally, the testing device may determine whether there are multiple indistinguishable target objects based on at least one item, or a combination of items, of the pose state information of the target objects: the relative angle of the target object to the sensor, the relative distance of the target object to the sensor, the relative velocity of the target object to the sensor, the relative angular velocity of the target object to the sensor, the relative acceleration of the target object to the sensor, the relative angular acceleration of the target object to the sensor, etc.
  • For example, the target objects determined above can be used as candidate objects. In this case, it can be determined, according to the pose state information of a first candidate object and the pose state information of a second candidate object, whether the first candidate object and the second candidate object are indistinguishable to the sensor, that is, whether to output the first candidate object and the second candidate object as one target object or as two target objects.
  • For example, the vehicle simulation apparatus determines that the first target vehicle includes a first candidate vehicle and a second candidate vehicle; the predicted sensor-feature value of the first target vehicle is determined according to the predicted sensor-feature value of the first candidate vehicle and the predicted sensor-feature value of the second candidate vehicle;
  • the vehicle simulation apparatus determines that the first candidate vehicle and the second candidate vehicle satisfy: the relative position of the first position with respect to the second position is less than a first position threshold, where the first position is the position of the first candidate target vehicle relative to the simulated vehicle, and the second position is the position of the second candidate target vehicle relative to the simulated vehicle.
  • Optionally, the first candidate vehicle and the second candidate vehicle further satisfy: the relative speed of the first speed with respect to the second speed is less than a first speed threshold, where the first speed is the speed of the first candidate target vehicle relative to the simulated vehicle, and the second speed is the speed of the second candidate target vehicle relative to the simulated vehicle.
  • the two target objects include a first candidate object and a second candidate object.
  • The difference between the first position information of the first candidate object relative to the simulated vehicle and the second position information of the second candidate object relative to the simulated vehicle is less than the first position threshold.
  • the first position information may be the position information of the center position of the first candidate object
  • the second position information may be the position information of the center position of the second candidate object.
  • it can also be other location information.
  • For example, the first position information may be the position information of the point of the first candidate object closest to vehicle 1, and the second position information may be the position information of the point of the second candidate object closest to vehicle 1. The positions may also be determined according to the characteristics of the candidate objects, so as to better simulate the situation in which a real radar sensor treats different candidate objects as the same target object, which is not limited in this application.
  • the first candidate object and the second candidate object can be output as one target object.
  • The prediction information obtained for the first candidate object and the prediction information of the second candidate object may be output as the prediction information of one target object.
  • the predicted value of the sensor feature of the first target object is determined according to the predicted value of the sensor feature of the first candidate object and the predicted value of the sensor feature of the second candidate object.
  • an average value, or a weighted average value, of the sensor feature predicted value of the first candidate object and the sensor feature predicted value of the second candidate object can be used as the sensor feature predicted value of the first target object.
  • The weighting method may be determined according to the characteristics of the first candidate object and the second candidate object, or based on the relationship of the first candidate object to the sensor and the relationship of the second candidate object to the sensor, or according to other factors, which is not limited here.
  • Condition 2: after it is determined that the first position information of the first candidate object relative to the simulated vehicle and the second position information of the second candidate object relative to the simulated vehicle differ by less than the first position threshold, the angle information can further be used to judge whether the first candidate object and the second candidate object meet the proximity conditions and will be regarded by the sensor as the same target object.
  • For example, the first candidate object and the second candidate object further satisfy: the difference between the first angle information of the first candidate object relative to the sensor and the second angle information of the second candidate object relative to the sensor is smaller than a first angle threshold.
  • Condition 3: after it is determined that the position condition above is met, the speed information can also be used to judge whether the first candidate object and the second candidate object meet the proximity conditions and will be regarded by the sensor as the same target object. For example, the first candidate object and the second candidate object further satisfy: the difference between the first speed information of the first candidate object relative to the simulated vehicle and the second speed information of the second candidate object relative to the simulated vehicle is less than the first speed threshold.
  • Condition 4: after it is determined that the position condition above is met, the acceleration information can also be used to judge whether the first candidate object and the second candidate object meet the proximity conditions and will be regarded by the sensor as the same target object. For example, the first candidate object and the second candidate object further satisfy: the difference between the first acceleration information of the first candidate object relative to the simulated vehicle and the second acceleration information of the second candidate object relative to the simulated vehicle is less than a first acceleration threshold.
  • the specific threshold value may be set according to the resolution parameter of the sensor, and may also be determined in other ways, for example, determined by the collected measurement information of the sensor, which is not limited herein.
  • the conditions for whether the first candidate object and the second candidate object will be mistakenly regarded as the same target object by the sensor can also be determined in other ways.
  • For example, conditions for judging whether the first candidate object and the second candidate object will be mistaken by the sensor for the same target object under different weather conditions may additionally be added.
  • Taking Condition 1 as an example: under the influence of snow, whether the influence of snow needs to be considered can be determined according to the feature value output by the sensor model, so as to select a second position threshold that applies under the influence of snow.
  • The second position threshold may be larger than the first position threshold, because snow makes it more likely that the sensor cannot distinguish the two candidate objects.
  • For example, if snow characteristic information is predicted at the rear of the first candidate object and the rear of the second candidate object, the second position threshold can be selected to judge whether the sensor is unable to distinguish the two candidate objects.
  • In that case, the first candidate object and the second candidate object satisfy: the difference between the first position information of the first candidate object relative to the simulated vehicle and the second position information of the second candidate object relative to the simulated vehicle is smaller than the second position threshold.
  • Condition 2, Condition 3, and Condition 4 can likewise be adapted to different weather conditions by reference to the corresponding setting of Condition 1; details are not repeated here.
  • Optionally, the two candidate objects are regarded by the sensor as the same target object only when at least several of the above conditions are met. For example, they may be regarded as the same target object only when all conditions are met, or when at least three of the conditions are met.
  • the number of satisfying conditions can be set according to the accuracy of the sensor, which is not limited here.
  • Optionally, a priority can also be set for the above conditions; for example, Condition 1 has the highest priority and Condition 4 the lowest. In this way, scenes in which the sensor misjudges different target objects as one when outputting target objects can be better simulated, as in the merging sketch below.
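The following sketch shows a merge under Conditions 1 and 3 (relative position and relative speed), with the feature predictions combined by a plain average; the thresholds and dict layout are placeholders — a real model would derive the thresholds from the sensor's resolution, and could add the angle/acceleration conditions (2 and 4) in the same way:

```python
import numpy as np

def maybe_merge(c1, c2, pos_thresh=1.0, speed_thresh=0.5):
    """Merge two candidates the radar could not resolve into one target."""
    d_pos = np.linalg.norm(np.subtract(c1["pos"], c2["pos"]))
    d_vel = np.linalg.norm(np.subtract(c1["vel"], c2["vel"]))
    if d_pos < pos_thresh and d_vel < speed_thresh:
        return {
            "pos": tuple(np.add(c1["pos"], c2["pos"]) / 2),
            "vel": tuple(np.add(c1["vel"], c2["vel"]) / 2),
            "rcs_pred": (c1["rcs_pred"] + c2["rcs_pred"]) / 2,  # plain average
            "snr_pred": (c1["snr_pred"] + c2["snr_pred"]) / 2,
        }
    return None  # resolvable: keep them as two target objects

a = {"pos": (50.0, 1.0), "vel": (10.0, 0.0), "rcs_pred": 6.0, "snr_pred": 4.0}
b = {"pos": (50.4, 1.2), "vel": (10.2, 0.0), "rcs_pred": 8.0, "snr_pred": 2.0}
print(maybe_merge(a, b))  # merged into one target object
```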
  • In summary, the sensor may be unable to distinguish two objects that are close together.
  • noise simulation can also be added to the prediction information output by the sensor model to simulate errors caused by the real sensor being affected by external environmental noise.
  • Gaussian white noise can be added to the output pose state information of the target object and the feature information output by the sensor model respectively.
  • the noise power is selected according to the real sensor parameters, which is not limited here.
  • Through this error simulation, the characteristics of real sensor data affected by environmental noise can be simulated, as sketched below.
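A minimal sketch of adding Gaussian white noise to the model's outputs; the sigma values are placeholders, since in practice the noise power would be chosen from the real sensor's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def add_measurement_noise(pred, pos_sigma=0.1, vel_sigma=0.05, snr_sigma=0.2):
    """Perturb the predicted pose state and feature values with white
    Gaussian noise to mimic environmental noise on a real sensor."""
    noisy = dict(pred)
    noisy["pos"] = tuple(np.asarray(pred["pos"]) + rng.normal(0, pos_sigma, 2))
    noisy["vel"] = tuple(np.asarray(pred["vel"]) + rng.normal(0, vel_sigma, 2))
    noisy["snr_pred"] = pred["snr_pred"] + rng.normal(0, snr_sigma)
    return noisy
```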
  • FIG. 8a is an exemplary functional block diagram of a sensor testing system according to an embodiment of the present application.
  • the system can be applied in a test device or in other use carriers.
  • the cloud server is used as a carrier for description.
  • The system includes at least one sensor module (which can be the sensor model obtained by the training above), a sensor detection-range screening module, a physical-characteristic screening module, a noise simulation module, a decision module, and a scene module. The sensor module may be the sensor model shown in FIG. 4d or FIG. 4e, used to simulate any one or more sensors in the sensor system 120 shown in FIG. 1a. The decision module and the scene module can be integrated in one test device, for example in the computer system 160 of that test device.
  • Alternatively, the sensor model, the decision module, and the scene module can be independent modules that share the memory of the test environment.
  • the sensor module, the decision module and the scene module of the present application may be implemented in any achievable combination, which is not specifically limited in the present application.
  • The application scenario of the test system shown in FIG. 8a may include a test device, where the test device may be a test device with a sensor model, and the network elements of the test device include hardware devices that support running simulation software, such as personal computers, servers, vehicle-mounted mobile terminals, industrial computers, and embedded devices.
  • test device can be implemented by a server or a virtual machine in the cloud.
  • the test device can also be a chip that supports running simulation software.
  • The following describes the vehicle simulation method provided by the present application with a specific example, as shown in FIG. 8b, including:
  • Step 801 Determine the parameters of the sensor and target object in the test environment.
  • Step 802 Determine whether the target object is visible relative to the sensor through the detection range of the sensor. If yes, go to step 803, if not, go to step 808.
  • Step 803 Determine the prediction data of the target object output by the radar sensor model according to the test information of the sensor and the test information of the target object in the test environment.
  • Step 804 Determine whether the target object is visible relative to the sensor according to the predicted value of the SNR of the target object. If yes, go to step 805, if not, go to step 808.
  • Step 805 Determine whether there are at least two indistinguishable target objects according to the physical characteristics of the sensor and the pose state information of the target object. If yes, go to step 808, if not, go to step 806.
  • Step 806: Merge the at least two indistinguishable target objects into one target object and update the prediction information of that target object accordingly.
  • Step 807 Output the prediction information of the target object to the decision-making module.
  • Step 808 Delete the prediction information of the target object.
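Read end to end, steps 801 to 808 amount to a screening pipeline. The following sketch shows one possible shape of it; `sensor`, `model`, and `decision_module` are assumed interfaces invented for illustration, not this application's API:

```python
def simulate_radar_frame(sensor, targets, model, decision_module):
    """Sketch of steps 801-808 above, under assumed interfaces:
    - sensor.in_range(t): detectable-range test (step 802)
    - model.predict(t): prediction data incl. SNR (step 803)
    - model.visible(pred): SNR-vs-threshold test (step 804)
    - merge_indistinguishable(...): steps 805/806 (see earlier sketch)
    """
    predictions = []
    for t in targets:                       # step 801: targets from the scene
        if not sensor.in_range(t):          # step 802 -> step 808: drop
            continue
        pred = model.predict(t)             # step 803
        if not model.visible(pred):         # step 804 -> step 808: drop
            continue
        predictions.append(pred)
    predictions = merge_indistinguishable(predictions)  # steps 805-806
    return decision_module(predictions)     # step 807

def merge_indistinguishable(preds):
    # Placeholder: pairwise merging as in the earlier maybe_merge() sketch.
    return preds
```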
  • the vehicle simulation apparatus 900 may include: a sensor feature prediction module 901 and an output module 902 .
  • the vehicle simulation device 900 can be applied to a test device, wherein the test device can be a test device with a sensor model, and the network elements of the test device can include hardware devices that support running simulation software, such as personal computers, servers, vehicle-mounted mobile terminals, Industrial computers, embedded devices, etc.
  • the test device can be implemented by a server or a virtual machine in the cloud.
  • the test device can also be a chip that supports running simulation software.
  • the vehicle simulation apparatus 900 may further include: a first determination module, a second determination module, and a third determination module.
  • the vehicle simulation apparatus 900 may further include a sensor model training module, the sensor model training module is used for training the sensor model, and the sensor model training module may include: an acquisition module and a training module.
  • The sensor feature prediction module 901 can be used to input the position information and speed information of the first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle into the sensor model, to obtain the predicted sensor-feature value of the first target vehicle. The predicted sensor-feature value includes at least one of the following: a radar cross section (RCS) predicted value and a signal-to-noise ratio (SNR) predicted value. The sensor model is used to simulate the sensor in the simulated vehicle; the first target vehicle is a vehicle in the test environment where the simulated vehicle is located; the position information and speed information of the first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle are determined according to the test environment; and the sensor model is obtained by training according to the measurement data of the sensor and the labeled road environment information;
  • The output module 902 is configured to input the predicted sensor-feature value of the first target vehicle to the decision module of the simulated vehicle, to obtain a simulation decision result of the simulated vehicle, where the decision module is used to output a vehicle driving decision determined based on the predicted sensor-feature value.
  • the decision-making module may be a module in the vehicle simulation device 900, or may be a module provided separately, which is not limited herein.
  • the vehicle simulation device 900 may further include:
  • a first determination module configured to determine a vehicle within the detection range of the sensor as the first target vehicle in the candidate vehicle according to the position information and speed information of the candidate vehicle relative to the simulated vehicle; the The position information and speed information of the candidate vehicle relative to the simulated vehicle are determined according to the test environment; the candidate vehicle is a vehicle in the test environment where the simulated vehicle is located.
  • the vehicle simulation apparatus 900 may further include: a second determination module, configured to determine that the predicted SNR value of the first target vehicle is greater than a visible threshold.
  • a possible implementation further includes: a third determination module, configured to determine the predicted value of the sensor feature of the first target vehicle according to the predicted value of the sensor feature of the first candidate vehicle and the predicted value of the sensor feature of the second candidate vehicle;
  • the first target vehicle includes a first candidate vehicle and a second candidate vehicle; the first candidate vehicle and the second candidate vehicle satisfy: the relative position of the first position relative to the second position is less than a first position threshold; the The first position is the position of the first candidate target vehicle relative to the simulated vehicle, and the second position is the position of the second candidate target vehicle relative to the simulated vehicle.
  • the first candidate vehicle and the second candidate vehicle also satisfy: the relative speed of the first speed relative to the second speed is less than a first speed threshold; the first speed is the first candidate The speed of the target vehicle relative to the simulated vehicle, and the second speed is the speed of the second candidate target vehicle relative to the simulated vehicle.
  • the vehicle simulation device 900 further includes: a sensor model training module, where the sensor model training module includes:
  • an acquisition module configured to acquire measurement data of a sensor;
  • the measurement data includes: position information and speed information of the second target vehicle relative to the sensor, and sensor characteristic values of the second target vehicle collected by the sensor;
  • the sensor characteristic values include: RCS measurement value and SNR measurement value; the sensor is located in the measurement vehicle, and the second target vehicle is a vehicle near the measurement vehicle;
  • a training module configured to perform training according to the measurement data of the sensor and the obtained label information to obtain a sensor model;
  • The label information includes at least one of the following: the yaw angle of the second target vehicle relative to the sensor, the road environment information labeled when the sensor collects data, and the vehicle information of the measuring vehicle. The input of the sensor model is the position information and speed information of the first target vehicle relative to the sensor together with the label information, and the output of the sensor model is the predicted sensor-feature value of the first target vehicle.
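As a concrete picture of what such a training module could look like, here is a minimal supervised-learning sketch using synthetic stand-in data; the feature layout, the random-forest choice, and all numbers are assumptions for illustration, not this application's design:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 1000
# Feature columns: rel_x, rel_y, rel_vx, rel_vy, yaw_angle, road_env_id
X = np.column_stack([
    rng.uniform(1, 200, n), rng.uniform(-20, 20, n),
    rng.uniform(-30, 30, n), rng.uniform(-5, 5, n),
    rng.uniform(-np.pi, np.pi, n), rng.integers(0, 3, n),
])
# Targets: measured (RCS in dBsm, SNR) - synthetic stand-ins for drive logs
y = np.column_stack([
    10 - 0.02 * X[:, 0] + rng.normal(0, 1, n),
    40 - 0.15 * X[:, 0] + rng.normal(0, 2, n),
])

# Multi-output regressor mapping pose state + labels -> (RCS, SNR)
sensor_model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
rcs_pred, snr_pred = sensor_model.predict([[60.0, 2.0, -5.0, 0.0, 0.1, 1]])[0]
print(rcs_pred, snr_pred)
```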
  • The division of modules in the above embodiments of the present application is illustrative and is only a logical function division; in actual implementation, there may be other division methods. The functional modules in the various embodiments of the present application can be integrated in one processing module, each module can exist physically alone, or two or more modules can be integrated into one module. One or more of the above modules may be implemented in software, hardware, firmware, or a combination thereof.
  • the software or firmware includes, but is not limited to, computer program instructions or code, and can be executed by a hardware processor.
  • the hardware includes, but is not limited to, various types of integrated circuits, such as a central processing unit (CPU), a digital signal processor (DSP), a field programmable gate array (FPGA), or an application specific integrated circuit (ASIC).
  • If the integrated modules are implemented in the form of software functional modules and sold or used as independent products, they can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, can be embodied in the form of a software product. The computer software product can be stored in a storage medium and includes several instructions to cause a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute all or part of the steps of the methods in the various embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.
  • the vehicle simulation apparatus 1000 includes: a communication interface 1010 , a processor 1020 , and a memory 1030 .
  • the communication interface 1010 and the memory 1030 and the processor 1020 are connected to each other.
  • the communication interface 1010 and the memory 1030 and the processor 1020 may be connected to each other through a bus; the bus may be a peripheral component interconnect standard (peripheral component interconnect, PCI) bus or an extended industry standard architecture (extended industry standard architecture, EISA) bus, etc.
  • the bus can be divided into address bus, data bus, control bus and so on. For ease of presentation, only one thick line is used in FIG. 10, but it does not mean that there is only one bus or one type of bus.
  • The communication interface 1010 is used to implement communication in the vehicle simulation apparatus. For example, the position information and speed information of the first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle are input into the sensor model to obtain the predicted sensor-feature value of the first target vehicle, where the predicted value includes at least one of the following: a radar cross section (RCS) predicted value and a signal-to-noise ratio (SNR) predicted value. The sensor model is used to simulate the sensor in the simulated vehicle; the first target vehicle is a vehicle in the test environment where the simulated vehicle is located; the position information and speed information of the first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle are determined according to the test environment; and the sensor model is obtained by training according to the measurement data of the sensor and the labeled road environment information. The predicted sensor-feature value of the first target vehicle is then input into the decision module of the simulated vehicle to obtain the simulation decision result of the simulated vehicle, where the decision module is used to output a vehicle driving decision determined based on the predicted sensor-feature value.
  • the communication interface 1010 may also be used to implement communication between the simulation device of the vehicle and other devices.
  • the processor 1020 is configured to implement the vehicle simulation method shown in FIG. 4b to FIG. 8b. For details, refer to the description in the embodiments shown in FIG. 4b to FIG. 8b, which will not be repeated here.
  • the processor 1020 may be a central processing unit (central processing unit, CPU), or other hardware chips.
  • the above-mentioned hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD) or a combination thereof.
  • the above-mentioned PLD can be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), a general-purpose array logic (generic array logic, GAL) or any combination thereof.
  • the memory 1030 is used to store program instructions, data, and the like.
  • the program instructions may include program code, which includes instructions for computer operation.
  • the memory 1030 may include random access memory (RAM), and may also include non-volatile memory (non-volatile memory), such as at least one disk storage.
  • the processor 1020 executes the program stored in the memory 1030, and implements the above functions through the above components, so as to finally implement the methods provided by the above embodiments.
  • the present application provides a schematic structural diagram of a sensor simulation device, the device may include: an acquisition module and a training module. The device can be applied to a test device.
  • the acquisition module 1101 is used to acquire measurement data of the sensor;
  • The measurement data includes: the position information and speed information of the second target vehicle relative to the sensor, and the sensor-feature measurement values of the second target vehicle collected by the sensor. The sensor-feature measurement values include: an RCS measurement value and an SNR measurement value. The sensor is located in the measuring vehicle, and the second target vehicle is a vehicle near the measuring vehicle.
  • the training module 1102 is used for training according to the measurement data of the sensor and the obtained annotation information to obtain a sensor model;
  • The sample input of the sensor model is the position information and speed information of the second target vehicle relative to the sensor together with the label information, and the output of the sensor model is the predicted sensor-feature value of the second target vehicle;
  • the predicted value of the sensor feature of the second target vehicle includes at least one of the following: a predicted value of RCS and a predicted value of SNR;
  • the labeling information includes at least one of the following: a yaw angle of the second target vehicle relative to the sensor, road environment information labelled when the sensor collects data, and vehicle information where the sensor is located.
  • The division of modules in the above embodiments of the present application is illustrative and is only a logical function division; in actual implementation, there may be other division methods. The functional modules in the various embodiments of the present application can be integrated in one processing module, each module can exist physically alone, or two or more modules can be integrated into one module. One or more of the above modules may be implemented in software, hardware, firmware, or a combination thereof.
  • the software or firmware includes, but is not limited to, computer program instructions or code, and can be executed by a hardware processor.
  • the hardware includes, but is not limited to, various types of integrated circuits, such as a central processing unit (CPU), a digital signal processor (DSP), a field programmable gate array (FPGA), or an application specific integrated circuit (ASIC).
  • If the integrated modules are implemented in the form of software functional modules and sold or used as independent products, they can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, can be embodied in the form of a software product. The computer software product can be stored in a storage medium and includes several instructions to cause a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute all or part of the steps of the methods in the various embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.
  • the present application provides a schematic structural diagram of a sensor simulation device.
  • the sensor simulation device 1200 may include: a communication interface 1210 , a processor 1220 , and a memory 1230 .
  • the communication interface 1210 and the memory 1230 and the processor 1220 are connected to each other.
  • the communication interface 1210, the memory 1230 and the processor 1220 can be connected to each other through a bus;
  • the bus can be a peripheral component interconnect (PCI) bus or an extended industry standard architecture (extended industry standard architecture, EISA) bus, etc.
  • the bus can be divided into address bus, data bus, control bus and so on. For ease of representation, only one thick line is shown in FIG. 12, but it does not mean that there is only one bus or one type of bus.
  • The communication interface 1210 may be used to enable communication between the sensor simulation apparatus and other devices (for example, with the vehicle simulation apparatus 1000), for instance, to enable the vehicle simulation apparatus to obtain the sensor model.
  • the processor 1220 is configured to implement the simulation method of the sensor shown in FIG. 4b. For details, reference may be made to the description in the embodiment shown in FIG. 4b, which will not be repeated here.
  • the processor 1220 may be a central processing unit (central processing unit, CPU), or other hardware chips.
  • the above-mentioned hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD) or a combination thereof.
  • the above-mentioned PLD can be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), a general-purpose array logic (generic array logic, GAL) or any combination thereof.
  • the memory 1230 is used to store program instructions, data, and the like.
  • the program instructions may include program code, which includes instructions for computer operation.
  • the memory 1230 may include random access memory (RAM), and may also include non-volatile memory (non-volatile memory), such as at least one disk storage.
  • the processor 1220 executes the program stored in the memory 1230, and implements the above functions through the above components, thereby finally implementing the methods provided by the above embodiments.
  • the present application provides a computer-readable storage medium, including computer instructions, which, when executed by a processor, cause a simulation device of the vehicle to execute any of the possible methods described in the foregoing embodiments.
  • the present application provides a computer-readable storage medium, including computer instructions, which, when executed by a processor, cause the sensor simulation apparatus to execute any of the possible methods described in the foregoing embodiments.
  • the present application provides a computer program product that, when the computer program product runs on a processor, causes the vehicle simulation apparatus to execute any of the possible methods described in the above embodiments.
  • the present application provides a computer program product that, when the computer program product runs on a processor, causes the sensor simulation device to execute any of the possible methods described in the above embodiments.
  • the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.


Abstract

A vehicle simulation method, including: inputting position information and speed information of a first target vehicle relative to a simulated vehicle and road environment information of the simulated vehicle into a sensor model to obtain a predicted sensor-feature value of the first target vehicle, where the sensor model is used to simulate a sensor in the simulated vehicle, the first target vehicle is a vehicle in the test environment where the simulated vehicle is located, the position information and speed information of the first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle are determined according to the test environment, and the sensor model is obtained by training according to measurement data of the sensor and labeled road environment information; and inputting the predicted sensor-feature value of the first target vehicle into a decision module of the simulated vehicle to obtain a simulation decision result of the simulated vehicle, where the decision module is used to output a vehicle driving decision determined based on the predicted sensor-feature value. A sensor simulation method, vehicle and sensor simulation apparatuses, a computer-readable storage medium, an Internet-of-Vehicles communication system, and a chip system are also provided.

Description

Vehicle and sensor simulation method and apparatus
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 202110238478.8, filed with the China National Intellectual Property Administration on March 4, 2021 and entitled "Vehicle and sensor simulation method and apparatus", which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
This application relates to the field of intelligent connected vehicles, and in particular to a vehicle and sensor simulation method and apparatus.
BACKGROUND
Autonomous driving is a technology in which a computer system drives a motor vehicle in place of a human. It includes functional modules such as environment perception, positioning, path planning, decision control, and the powertrain. Environment perception can be implemented in two ways: through high-precision, low-dimensional sensors such as lidar and millimeter-wave radar, or through high-dimensional, low-precision sensors such as monocular/multi-camera high-definition cameras.
To ensure the safety of autonomous driving, intelligent vehicles need to be fully validated through extensive road testing, which incurs enormous time and economic costs. Therefore, before road testing, intelligent vehicles can be tested and verified through virtual simulation, improving test efficiency and reducing test cost. In intelligent-vehicle simulation, sensor simulation is a crucial link, and the simulation data obtained from sensor simulation affects the credibility of the vehicle simulation results.
However, current sensor testing processes are complex and have poor real-time performance, making it difficult to meet the needs of real-time simulation of intelligent vehicles.
SUMMARY
This application provides a vehicle and sensor simulation method and apparatus, used to improve the simulation effect of a vehicle.
In a first aspect, this application provides a vehicle simulation method. The simulation method can be applied to a testing apparatus, which may include hardware devices that support running simulation software, such as personal computers, servers, vehicle-mounted mobile terminals, industrial computers, and embedded devices. For example, the testing apparatus may be implemented by a server or a virtual machine in the cloud. The testing apparatus may also be a chip that supports running simulation software. The testing apparatus is used to test a simulated vehicle and may include a sensor model, where the sensor model is used to simulate a sensor in the simulated vehicle. For example, the testing apparatus may be a server that tests the simulated vehicle, or a chip on that server. The method may include:
inputting position information and speed information of a first target vehicle relative to the simulated vehicle and road environment information of the simulated vehicle into the sensor model to obtain a predicted sensor-feature value of the first target vehicle, where the predicted sensor-feature value includes at least one of the following: a radar cross section (RCS) predicted value and a signal-to-noise ratio (SNR) predicted value; the first target vehicle is a vehicle in the test environment where the simulated vehicle is located; the position information and speed information of the first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle are determined according to the test environment; and the sensor model is obtained by training according to measurement data of the sensor and labeled road environment information; and inputting the predicted sensor-feature value of the first target vehicle into a decision module of the simulated vehicle to obtain a simulation decision result of the simulated vehicle, where the decision module is used to output a vehicle driving decision determined based on the predicted sensor-feature value.
In the above method, the sensor model can obtain predicted sensor-feature values of target objects in the test environment, for example, RCS and SNR predicted values, so that the information input to the decision module to simulate what the sensor would obtain is closer to the output of a real millimeter-wave radar sensor, improving the fidelity with which the sensor model simulates the sensor. This helps the decision module better simulate the decision results that a vehicle might make in a real scenario based on the information collected by the sensor, thereby improving the vehicle simulation effect. In addition, because the sensor model is trained according to the measurement data of the sensor and labeled road environment information, corresponding predicted sensor-feature values can be output for different test environments (corresponding to different road environment information), effectively improving vehicle simulation performance and the robustness of the simulation results.
In a possible implementation, the first target vehicle is a vehicle that is determined, from candidate vehicles, to be within the detection range of the sensor according to position information and speed information of the candidate vehicles relative to the simulated vehicle; the position information and speed information of the candidate vehicles relative to the simulated vehicle are determined according to the test environment, and the candidate vehicles are vehicles in the test environment where the simulated vehicle is located.
Through the above method, candidate vehicles can be screened based on whether they are within the detection range of the simulated vehicle's sensor, so that the determined first target vehicles can include all possible candidate vehicles within the detection range. This accounts for the case in which, even when multiple first target vehicles occlude one another relative to the simulated vehicle, the sensor may still collect measurement data of those vehicles due to the multipath effect. The multipath effect of the sensor can thus be better simulated, so that the sensor model can also output predicted sensor-feature values of multiple first target vehicles in this case, reflecting the multipath effect of the sensor and improving the simulation effect of the sensor model.
In a possible implementation, it is determined that the SNR predicted value of the first target vehicle is greater than a visibility threshold.
Considering that the sensor can judge whether collected measurement data represents a target object or noise by determining whether the signal-to-noise ratio is greater than a preset threshold, four different situations may occur: when a target exists and is judged to exist, the judgment is correct, which is called "detection"; when a target exists but is judged not to exist, the judgment is wrong, which is called a "missed detection (false negative)"; when no target exists and none is judged to exist, the judgment is correct, which is called "correct non-detection"; and when no target exists but one is judged to exist, the judgment is wrong, which is called a "false alarm (false positive)".
Therefore, introducing this method into the sensor model allows the model to simulate the physical characteristic that the sensor may misjudge target objects based on the signal-to-noise ratio. In this application, the SNR predicted values of candidate vehicles are screened to determine whether a candidate vehicle is the first target vehicle. When the SNR predicted value of a candidate vehicle is greater than the visibility threshold, the candidate vehicle can be determined to be the first target vehicle; when it is less than or equal to the visibility threshold, it can be determined that the sensor judges the candidate vehicle to be noise. This reflects the physical characteristic that the sensor may make misjudgments, improving the effect of the sensor model in simulating the sensor.
In a possible implementation, the first target vehicle includes a first candidate vehicle and a second candidate vehicle; the predicted sensor-feature value of the first target vehicle is determined according to the predicted sensor-feature value of the first candidate vehicle and the predicted sensor-feature value of the second candidate vehicle; the first candidate vehicle and the second candidate vehicle satisfy: the relative position of a first position with respect to a second position is less than a first position threshold, where the first position is the position of the first candidate vehicle relative to the simulated vehicle, and the second position is the position of the second candidate vehicle relative to the simulated vehicle.
Considering that, when outputting measurement data of target vehicles, the sensor may output two or more vehicles as one vehicle, it can be determined, for example, that when the first candidate vehicle and the second candidate vehicle satisfy that the relative position of the first position with respect to the second position is less than the first position threshold, the sensor would misjudge the first candidate vehicle and the second candidate vehicle as one first target vehicle.
Therefore, in this application, through the above method, when the sensor model determines that the first candidate vehicle and the second candidate vehicle satisfy that the relative position of the first position with respect to the second position is less than the first position threshold, the first candidate vehicle and the second candidate vehicle are output as one first target vehicle, thereby simulating the physical characteristic that the sensor may be unable to distinguish multiple candidate vehicles and improving the effect of the sensor model in simulating the sensor.
In a possible implementation, the first candidate vehicle and the second candidate vehicle further satisfy: the relative speed of a first speed with respect to a second speed is less than a first speed threshold, where the first speed is the speed of the first candidate vehicle relative to the simulated vehicle, and the second speed is the speed of the second candidate vehicle relative to the simulated vehicle.
Considering that whether the sensor outputs two or more vehicles as one vehicle may be determined based on both relative position and relative speed, in this application the first candidate vehicle and the second candidate vehicle can be output as one first target vehicle after determining both that the relative position of the first position with respect to the second position is less than the first position threshold and that the relative speed of the first speed with respect to the second speed is less than the first speed threshold, thereby better simulating the physical characteristic that the sensor may be unable to distinguish multiple candidate vehicles and improving the effect of the sensor model in simulating the sensor.
In a possible implementation, the sensor model being obtained by training according to the measurement data of the sensor and labeled road environment information includes: acquiring measurement data of the sensor, where the measurement data includes position information and speed information of a second target vehicle relative to the sensor and sensor-feature values of the second target vehicle collected by the sensor; the sensor-feature values include an RCS measurement value and an SNR measurement value; the sensor is located in a measuring vehicle, and the second target vehicle is a vehicle near the measuring vehicle; and performing training according to the measurement data of the sensor and obtained label information to obtain the sensor model, where the label information includes at least one of the following: a yaw angle of the second target vehicle relative to the sensor, road environment information labeled when the sensor collects data, and vehicle information of the vehicle where the sensor is located; the input of the sensor model is the position information and speed information of the first target vehicle relative to the sensor together with the label information, and the output of the sensor model is the predicted sensor-feature value of the first target vehicle.
Through the above method, measurement data such as the sensor-feature values of the second target vehicle collected by the sensor and its position and speed information relative to the sensor, together with labeled road environment information, can be used as training samples, so that the trained sensor model can output predicted sensor values of a target vehicle. Because these predicted values are trained on sensor-feature values actually collected by the sensor, the sensor model can be closer to the measurement data actually output by the sensor. In addition, because the training samples also take into account the road environment information at the time the sensor collected the measurement data, the predicted sensor-feature values output by the sensor model can better reflect the sensor's output under different road environments, improving the effect of the sensor model in simulating the sensor.
In a second aspect, this application provides a sensor simulation method, including:
acquiring measurement data of a sensor, where the measurement data includes position information and speed information of a second target vehicle relative to the sensor and sensor-feature measurement values of the second target vehicle collected by the sensor; the sensor-feature measurement values include an RCS measurement value and an SNR measurement value; the sensor is located in a measuring vehicle, and the second target vehicle is a vehicle near the measuring vehicle; and performing training according to the measurement data of the sensor and obtained label information to obtain a sensor model, where the sample input of the sensor model is the position information and speed information of the second target vehicle relative to the sensor together with the label information, and the output of the sensor model is a predicted sensor-feature value of the second target vehicle; the predicted sensor-feature value of the second target vehicle includes at least one of the following: an RCS predicted value and an SNR predicted value; the label information includes at least one of the following: a yaw angle of the second target vehicle relative to the sensor, road environment information labeled when the sensor collects data, and vehicle information of the vehicle where the sensor is located.
Through the above method, measurement data such as the sensor-feature values of the second target vehicle collected by the sensor and its position and speed information relative to the sensor, together with labeled road environment information, can be used as training samples, so that the trained sensor model can output predicted sensor values of a target vehicle. Because these predicted values are trained on sensor-feature values actually collected by the sensor, the model can be closer to the measurement data actually output by the sensor. In addition, because the training samples also take into account the road environment information at the time of data collection, the predicted sensor-feature values output by the sensor model can better reflect the sensor's output under different road environments, improving the effect of the sensor model in simulating the sensor and, in turn, the vehicle simulation effect.
In a third aspect, this application provides a vehicle simulation apparatus, including:
a sensor feature prediction module, configured to input position information and speed information of a first target vehicle relative to a simulated vehicle and road environment information of the simulated vehicle into a sensor model to obtain a predicted sensor-feature value of the first target vehicle, where the predicted sensor-feature value includes at least one of the following: a radar cross section (RCS) predicted value and a signal-to-noise ratio (SNR) predicted value; the sensor model is used to simulate a sensor in the simulated vehicle; the first target vehicle is a vehicle in the test environment where the simulated vehicle is located; the position information and speed information of the first target vehicle relative to the simulated vehicle and the road environment information of the simulated vehicle are determined according to the test environment; and the sensor model is obtained by training according to measurement data of the sensor and labeled road environment information; and
an output module, configured to input the predicted sensor-feature value of the first target vehicle into a decision module of the simulated vehicle to obtain a simulation decision result of the simulated vehicle, where the decision module is used to output a vehicle driving decision determined based on the predicted sensor-feature value.
一种可能的实现方式,该装置还可以包括:
第一确定模块,用于根据候选车辆相对所述仿真车辆的位置信息和速度信息,在所述候选车辆中将在所述传感器的探测范围内的车辆确定为所述第一目标车辆;所述候选车辆相对所述仿真车辆的位置信息和速度信息为根据所述测试环境确定的;所述候选车辆为所述仿真车辆所在的测试环境中的车辆。
一种可能的实现方式,该装置还可以包括:第二确定模块,用于确定所述第一目标车辆的SNR预测值大于可见阈值。
一种可能的实现方式,还包括:第三确定模块,用于根据第一候选车辆的传感器特征预测值和第二候选车辆的传感器特征预测值确定所述第一目标车辆的传感器特征预测值;所述第一目标车辆包括第一候选车辆和第二候选车辆;所述第一候选车辆和所述第二候选车辆满足:第一位置相对第二位置的相对位置小于第一位置阈值;所述第一位置为所述第一候选目标车辆相对所述仿真车辆的位置,所述第二位置为所述第二候选目标车辆相对所述仿真车辆的位置。
一种可能的实现方式,所述第一候选车辆和所述第二候选车辆还满足:
第一速度相对第二速度的相对速度小于第一速度阈值;所述第一速度为所述第一候选目标车辆相对所述仿真车辆的速度,所述第二速度为所述第二候选目标车辆相对所述仿真车辆的速度。
一种可能的实现方式,该装置还包括:传感器模型训练模块,所述传感器模型训练模块包括:
获取模块,用于获取传感器的测量数据;所述测量数据包括:第二目标车辆相对所述传感器的位置信息、速度信息以及所述传感器采集的所述第二目标车辆的传感器特征值;所述传感器特征值包括:RCS测量值和SNR测量值;所述传感器位于测量车辆中,所述第二目标车辆为所述测量车辆附近的车辆;
训练模块,用于根据所述传感器的测量数据及获得的标注信息进行训练,得到传感器模型;所述标注信息包括以下至少一项:所述第二目标车辆相对所述传感器的横摆角、所述传感器采集数据时标注的道路环境信息、所述测量车辆的车辆信息;所述传感器模型的输入为所述第一目标车辆相对所述传感器的位置信息、速度信息及所述标注信息,所述传感器模型的输出为所述第一目标车辆的传感器特征预测值。
第四方面,本申请提供一种传感器的仿真装置,包括:
获取模块,用于获取传感器的测量数据;所述测量数据包括:第二目标车辆相对所述传感器的位置信息、速度信息以及所述传感器采集的所述第二目标车辆的传感器特征测量值;所述传感器特征测量值包括:RCS测量值和SNR测量值;所述传感器位于测量车辆中;所述第二目标车辆为所述测量车辆附近的车辆;
训练模块,用于根据所述传感器的测量数据及获得的标注信息进行训练,得到传感器模型;所述传感器模型的样本输入为所述第二目标车辆相对所述传感器的位置信息、速度信息及标注信息,所述传感器模型的输出为所述第二目标车辆的传感器特征预测值;所述第二目标车辆的传感器特征预测值包括以下至少一项:RCS预测值和SNR预测值;
所述标注信息包括以下至少一项:所述第二目标车辆相对所述传感器的横摆角、所述传感器采集数据时标注的道路环境信息、所述传感器所在的车辆信息。
第五方面,本申请提供一种车辆的仿真装置,包括:处理器和接口电路;其中,所述处理器通过所述接口电路与存储器耦合,所述处理器用于执行所述存储器中的程序代码,实现上述第一方面或第一方面的任一可能的实现方式所描述的方法。
第六方面,本申请提供一种传感器的仿真装置,包括:处理器和接口电路;其中,所述处理器通过所述接口电路与存储器耦合,所述处理器用于执行所述存储器中的程序代码,实现上述第二方面的实现方式所描述的方法。
第七方面,本申请提供一种计算机可读存储介质,包括计算机指令,当所述计算机指令在被处理器运行时,使得所述车辆的仿真装置执行第一方面中任一项所述的方法或第二方面所述的方法。
第八方面,本申请提供一种计算机程序产品,当所述计算机程序产品在处理器上运行时,使得所述车辆的仿真装置执行第一方面任一项所述的方法或第二方面所述的方法。
第九方面,本申请实施例提供了一种车联网通信系统,该系统包含车载系统和如第三方面或第四方面所述的装置,其中,车载系统与该装置通信连接。
第十方面,本申请实施例提供了一种芯片系统,该芯片系统包括处理器,用于调用存储器中存储的计算机程序或计算机指令,以使得该处理器执行如第一方面或第二方面中任意一种可能的实现方式所述的方法。
在一种可能的实现方式中,该处理器通过接口与存储器耦合。
在一种可能的实现方式中,该芯片系统还包括存储器,该存储器中存储有计算机程序或计算机指令。
本申请实施例还提供了一种处理器,该处理器用于调用存储器中存储的计算机程序或计算机指令,以使得该处理器执行如第一方面或第二方面中任意一种可能的实现方式所述的方法。
另外,第三方面至第十方面中任一种实现方式所带来的技术效果可参见第一方面至第二方面中不同实现方式所带来的技术效果,此处不再赘述。
附图说明
图1a为本申请实施例提供的一种车辆的系统架构示意图;
图1b为本申请实施例提供的一种应用场景的示意图;
图1c为本申请实施例提供的一种应用场景的示意图;
图2为本申请实施例提供的一种雷达传感器的原理示意图;
图3a为一种车辆仿真方法的流程示意图;
图3b为本申请实施例提供的一种车辆仿真方法的测试环境示意图;
图3c为本申请实施例提供的一种车辆遮挡场景示意图;
图4a为本申请实施例提供的一种车辆采集测量数据的场景示意图;
图4b为本申请实施例提供的一种传感器的仿真方法的流程示意图;
图4c为本申请实施例提供的一种车辆采集的测量数据的示意图;
图4d为本申请实施例提供的一种传感器的仿真方法的示意图;
图4e为本申请实施例提供的一种传感器的仿真方法的示意图;
图5a为本申请实施例提供的一种车辆的仿真结构示意图;
图5b为本申请实施例提供的一种车辆的仿真方法的流程示意图;
图6a为本申请实施例提供的一种车辆的传感器的探测范围的示意图;
图6b为本申请实施例提供的一种确定目标对象的示意图;
图7a-图7d为本申请实施例提供的一种确定目标对象的示意图;
图8a为本申请实施例提供的一种车辆的仿真结构示意图;
图8b为本申请实施例提供的一种车辆的仿真方法的流程示意图;
图9为本申请实施例提供的一种车辆的仿真装置的结构示意图;
图10为本申请实施例提供的一种车辆的仿真装置的结构示意图;
图11为本申请实施例提供的一种传感器的仿真装置的结构示意图;
图12为本申请实施例提供的一种传感器的仿真装置的结构示意图。
具体实施方式
本申请的说明书实施例和权利要求书及附图中的术语“第一”、“第二”等仅用于区分描述的目的,而不能理解为指示或暗示相对重要性,也不能理解为指示或暗示顺序。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元。方法、系统、产品或设备不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。
应当理解,在本申请中,“至少一个(项)”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,用于描述关联对象的关联关系,表示可以存在三种关系,例如,“A和/或B”可以表示:只存在A,只存在B以及同时存在A和B三种情况,其中A,B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达, 是指这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b或c中的至少一项(个),可以表示:a,b,c,“a和b”,“a和c”,“b和c”,或“a和b和c”,其中a,b,c可以是单个,也可以是多个。
图1a是本申请实施例车辆100的一种示例性功能框图。在一个实施例中,车辆100可以配置为完全或部分地自动驾驶模式。例如,车辆100在处于自动驾驶模式中时可以控制自身,并且可在无需人为操作的情况下确定车辆及其周边环境的当前状态,确定周边环境中的至少一个其他车辆的可能行为,确定与该其他车辆执行可能行为的可能性相对应的置信水平,并基于所确定的信息来控制车辆100。在车辆100处于自动驾驶模式中时,可以将车辆100置为在没有和人交互的情况下操作。
如图1a所示,耦合到车辆100或包括在车辆100中的组件可以包括推进系统110、传感器系统120、控制系统130、外围设备140、电源150、计算机系统160以及用户接口170。车辆100的组件可以被配置为以与彼此互连和/或与耦合到各系统的其它组件互连的方式工作。例如,电源150可以向车辆100的所有组件提供电力。计算机系统160可以被配置为从推进系统110、传感器系统120、控制系统130和外围设备140接收数据并对它们进行控制。计算机系统160还可以被配置为在用户接口170上生成图像的显示并从用户接口170接收输入。
需要说明的是,在其它示例中,车辆100可以包括更多、更少或不同的系统,并且每个系统可以包括更多、更少或不同的组件。此外,示出的系统和组件可以按任意种的方式进行组合或划分,本申请对此不做具体限定。
推进系统110可以为车辆100提供动力运动。如图1a所示,推进系统110可以包括引擎/发动机114、能量源113、传动装置(transmission)112和车轮/轮胎111。另外,推进系统110可以额外地或可替换地包括除了图1a所示出的组件以外的其他组件。本申请对此不做具体限定。
传感器系统120可以包括用于感测关于车辆100所位于的环境的信息的若干个传感器。如图1a所示,传感器系统120的传感器包括全球定位系统(Global Positioning System,GPS)126、惯性测量单元(Inertial Measurement Unit,IMU)125、激光雷达122、相机传感器123、毫米波雷达124以及用于修改传感器的位置和/或朝向的制动器121。毫米波雷达124可利用无线电信号来感测车辆100的周边环境内的目标。在一些实施例中,除了感测目标以外,毫米波雷达124还可用于感测目标的速度和/或前进方向。激光雷达122可利用激光来感测车辆100所位于的环境中的目标。在一些实施例中,激光雷达122可包括一个或多个激光源、激光扫描器以及一个或多个检测器,以及其他系统组件。相机传感器123可用于捕捉车辆100的周边环境的多个图像。相机传感器123可以是静态相机或视频相机。
GPS 126可以为用于估计车辆100的地理位置的任何传感器。为此,GPS 126可以包括收发器,基于卫星定位数据估计车辆100相对于地球的位置。在示例中,计算机系统160可以用于结合地图数据使用GPS 126来估计车辆100行驶的道路。IMU125可以用于基于惯性加速度及其任意组合来感测车辆100的位置和朝向变化。在一些示例中,IMU125中传感器的组合可包括例如加速度计和陀螺仪。另外,IMU 125中传感器的其它组合也是可能的。
传感器系统120还可包括用于监视车辆100的内部系统的传感器(例如,车内空气质量监测器、燃油量表、机油温度表等)。来自这些传感器中的一个或多个的传感器数据可用于检测对象及其相应特性(位置、形状、方向、速度等)。这种检测和识别是车辆100安全操作的关键功能。传感器系统120还可以包括其它传感器。本申请对此不做具体限定。
控制系统130用于控制车辆100及其组件的操作。控制系统130可包括各种元件,其中包括转向单元136、油门135、制动单元134、传感器融合算法133、计算机视觉系统132、路线控制系统131以及障碍规避系统137。转向单元136可操作来调整车辆100的前进方向,例如在一个实施例中可以为方向盘系统。油门135用于控制引擎114的操作速度并进而控制车辆100的速度。控制系统130可以额外地或可替换地包括除了图1a所示出的组件以外的其他组件。本申请对此不做具体限定。
制动单元134用于控制车辆100减速。制动单元134可使用摩擦力来减慢车轮111。在其他实施例中,制动单元134可将车轮111的动能转换为电流。制动单元134也可采取其他形式来减慢车轮111转速从而控制车辆100的速度。计算机视觉系统132可以操作来处理和分析由相机传感器123捕捉的图像以便识别车辆100周边环境中的目标和/或特征。所述目标和/或特征可包括交通信号、道路边界和障碍物。计算机视觉系统132可使用目标识别算法、运动中恢复结构(structure from motion,SFM)算法、视频跟踪和其他计算机视觉技术。在一些实施例中,计算机视觉系统132可以用于为环境绘制地图、跟踪目标、估计目标的速度等等。路线控制系统131用于确定车辆100的行驶路线。在一些实施例中,路线控制系统131可结合来自传感器系统120、GPS 126和一个或多个预定地图的数据以为车辆100确定行驶路线。障碍规避系统137用于识别、评估和避免或者以其他方式越过车辆100的环境中的潜在障碍物。当然,在一个实例中,控制系统130可以增加或替换地包括除了所示出和描述的那些以外的组件。或者也可以减少一部分上述示出的组件。
外围设备140可以被配置为允许车辆100与外部传感器、其它车辆和/或用户交互。为此,外围设备140可以包括例如无线通信系统144、触摸屏143、麦克风142和/或扬声器141。外围设备140可以额外地或可替换地包括除了图1a所示出的组件以外的其他组件。本申请对此不做具体限定。
在一些实施例中,外围设备140提供车辆100的用户与用户接口170交互的手段。例如,触摸屏143可向车辆100的用户提供信息。用户接口170还可操作触摸屏143来接收用户的输入。在其他情况中,外围设备140可提供用于车辆100与位于车内的其它设备通信的手段。例如,麦克风142可从车辆100的用户接收音频(例如,语音命令或其他音频输入)。类似地,扬声器141可向车辆100的用户输出音频。
无线通信系统144可以直接地或者经由通信网络来与一个或多个设备无线通信。例如,无线通信系统144可使用3G蜂窝通信,例如码分多址(code division multiple access,CDMA)、EVDO、全球移动通信系统(global system for mobile communications,GSM)/通用分组无线服务技术(general packet radio service,GPRS),或者4G蜂窝通信,例如长期演进(long term evolution,LTE),或者5G蜂窝通信。无线通信系统144可利用无线保真(wireless fidelity,WiFi)与无线局域网(wireless local area network,WLAN)通信。在一些实施例中,无线通信系统144可利用红外链路、蓝牙或ZigBee等无线协议与设备直接通信。也可以使用其他无线协议,例如各种车辆通信系统:无线通信系统144可包括一个或多个专用短程通信(dedicated short range communications,DSRC)设备,这些设备可支持车辆和/或路边台站之间的公共和/或私有数据通信。
电源150可以被配置为向车辆100的一些或全部组件提供电力。为此,电源150可以 包括例如可再充电锂离子或铅酸电池。在一些示例中,一个或多个电池组可被配置为提供电力。其它电源材料和配置也是可能的。在一些示例中,电源150和能量源113可以一起实现,如一些全电动车中那样。车辆100的组件可以被配置为以与在其各自的系统内部和/或外部的其它组件互连的方式工作。为此,车辆100的组件和系统可以通过系统总线、网络和/或其它连接机制通信地链接在一起。
车辆100的部分或所有功能受计算机系统160控制。计算机系统160可包括处理器161、收发器162和存储器163。其中,处理器161执行存储在例如存储器163这样的非暂态计算机可读介质中的指令1631。计算机系统160还可以是采用分布式方式控制车辆100的个体组件或子系统的多个计算设备。
处理器161可以是任何常规的处理器,诸如商业可获得的中央处理器(central processing unit,CPU)。替选地,该处理器可以是诸如专用集成电路(application specific integrated circuits,ASIC)或其它基于硬件的处理器的专用设备。尽管图1a功能性地图示了处理器、存储器、和在相同块中的计算机系统160的其它元件,但是本领域的普通技术人员应该理解该处理器、计算机、或存储器实际上可以包括可以或者可以不存储在相同的物理外壳内的多个处理器、计算机、或存储器。例如,存储器可以是硬盘驱动器或位于不同于计算机系统160的外壳内的其它存储介质。因此,对处理器或计算机的引用将被理解为包括对可以或者可以不并行操作的处理器或计算机或存储器的集合的引用。不同于使用单一的处理器来执行此处所描述的步骤,诸如转向组件和减速组件的一些组件每个都可以具有其自己的处理器,所述处理器只执行与特定于组件的功能相关的计算。
在此处所描述的各个方面中,处理器可以位于远离该车辆并且与该车辆进行无线通信。在其它方面中,此处所描述的过程中的一些在布置于车辆内的处理器上执行而其它则由远程处理器执行,包括采取执行单一操纵的必要步骤。
在一些实施例中,存储器163可包含指令1631(例如,程序逻辑),指令1631可被处理器161执行来执行车辆100的各种功能,包括以上描述的那些功能。存储器163也可包含额外的指令,包括向推进系统110、传感器系统120、控制系统130和外围设备140中的一个或多个发送数据、从其接收数据、与其交互和/或对其进行控制的指令。
除了指令1631以外,存储器163还可存储数据,例如道路地图、路线信息,车辆的位置、方向、速度以及其它这样的车辆数据,以及其他信息。这种信息可在车辆100在自主、半自主和/或手动模式中操作期间被车辆100和计算机系统160使用。
用户接口170,用于向车辆100的用户提供信息或从其接收信息。可选地,用户接口170可包括在外围设备140的集合内的一个或多个输入/输出设备,例如无线通信系统144、触摸屏143、麦克风142和扬声器141。
计算机系统160可基于从各种子系统(例如,推进系统110、传感器系统120和控制系统130)以及从用户接口170接收的输入来控制车辆100的功能。例如,计算机系统160可利用来自控制系统130的输入以便控制转向单元136来避免由传感器系统120和障碍规避系统137检测到的障碍物。在一些实施例中,计算机系统160可操作来对车辆100及其子系统的许多方面提供控制。
可选地,上述这些组件中的一个或多个可与车辆100分开安装或关联。例如,存储器163可以部分或完全地与车辆100分开存在。上述组件可以按有线和/或无线方式来通信地耦合在一起。
可选地,上述组件只是一个示例,实际应用中,上述各个模块中的组件有可能根据实际需要增添或者删除,图1a不应理解为对本申请实施例的限制。
在道路行进的自动驾驶汽车,如上面的车辆100,可以识别其周围环境内的目标以确定对当前速度的调整。所述目标可以是其它车辆、交通控制设备、或者其它类型的目标。在一些示例中,可以独立地考虑每个识别的目标,并且基于目标的各自的特性,诸如它的当前速度、加速度、与车辆的间距等,可以用来确定自动驾驶汽车所要调整的速度。
可选地,自动驾驶汽车车辆100或者与自动驾驶车辆100相关联的计算设备(如图1a的计算机系统160、计算机视觉系统132、存储器163)可以基于所识别的目标的特性和周围环境的状态(例如,交通、雨、道路上的冰等等)来预测所述识别的目标的行为。可选地,每一个所识别的目标都依赖于彼此的行为,因此还可以将所识别的所有目标全部一起考虑来预测单个识别的目标的行为。车辆100能够基于预测的所述识别的目标的行为来调整它的速度。换句话说,自动驾驶汽车能够基于所预测的目标的行为来确定车辆将需要调整到(例如,加速、减速、或者停止)什么稳定状态。在这个过程中,也可以考虑其它因素来确定车辆100的速度,诸如,车辆100在行驶的道路中的横向位置、道路的曲率、静态和动态目标的接近度等等。
除了提供调整自动驾驶汽车的速度的指令之外,计算设备还可以提供修改车辆100的转向角的指令,以使得自动驾驶汽车遵循给定的轨迹和/或维持与自动驾驶汽车附近的目标(例如,道路上的相邻车道中的轿车)的安全横向和纵向距离。
上述车辆100可以为轿车、卡车、摩托车、公共汽车、船、飞机、直升飞机、割草机、娱乐车、游乐场车辆、施工设备、电车、高尔夫球车、火车、和手推车等,本申请实施例不做特别的限定。
此外,同样需要说明的是,本申请实施例中所述的雷达系统可以应用于多种领域,示例性地,本申请实施例中的雷达系统包括但不限于车载雷达、路边交通雷达,无人机雷达。
下面具体介绍传感器系统。
汽车上的传感器可根据感知方式分为两大类:被动感知类传感器和主动感知类传感器。
其中,被动感知类传感器依赖外界环境的辐射信息。
例如,典型的被动感知类传感器是摄像头,摄像头的感知不是通过发射和接收能量波的形式,它的感知结果准确性主要是取决于图像处理和分类算法。
相机传感器123可以包括用于获取车辆100所位于的环境的图像的任何相机(例如,静态相机、视频相机等)。为此,相机传感器123可以被配置为检测可见光,或可以被配置为检测来自光谱的其它部分(诸如红外光或紫外光)的光。其它类型的相机传感器123也是可能的。相机传感器123可以是二维检测器,或可以具有三维空间范围检测功能。在一些示例中,相机传感器123例如可以是距离检测器,其被配置为生成指示从相机传感器123到环境中的若干点的距离的二维图像。为此,相机传感器123可以使用一种或多种距离检测技术。例如,相机传感器123可以被配置为使用结构光技术,其中车辆100利用预定光图案,诸如栅格或棋盘格图案,对环境中的物体进行照射,并且使用相机传感器123检测从物体的预定光图案的反射。基于反射的光图案中的畸变,车辆100可以被配置为检测到物体上的点的距离。预定光图案可以包括红外光或其它波长的光。当相机传感器感知到图像感知区域中存在目标时,将图像信息传输至处理模块,由处理模块进行进一步处理。
其中,相机传感器123可以为下列相机传感器中的一种或多种,例如:1)红外线相机传感器(infrared radiation-red green blue image sensor,IR-RGB image sensor),采用CCD单元(charge-coupled device,电荷耦合器件)或标准CMOS单元(complementary metal-oxide semiconductor,互补金属氧化物半导体),通过滤波片滤波,只允许透过彩色波长段和设定的红外波长段的光,在图像信号处理器中分离IR(infrared radiation,红外)图像数据流以及RGB(red green blue,三原色)图像数据流,IR图像数据流为微光环境下得到的图像数据流,分离得到的该两个图像数据流用做其他应用处理。2)可见光相机传感器,采用CCD单元(charge-coupled device,电荷耦合器件)或标准CMOS单元(complementary metal-oxide semiconductor,互补金属氧化物半导体),获得可见光数据图像。
主动感知类传感器是通过主动发射能量波进行环境感知。例如,主动感知类传感器可以是雷达传感器。车载的雷达传感器通过天线向外发射检测信号(电磁波)以及接收目标反射的信号,对目标反射的信号进行放大以及下变频等处理,得到汽车与目标之间的相对距离、相对速度以及角度等信息,然后根据得到的信息进行目标跟踪和识别分类,并经过合理决策后,能够实现障碍物测量、碰撞预测、自适应巡航控制等功能。例如,雷达传感器根据得到的信息进行目标跟踪和识别分类后,经合理决策后,以声、光及触觉等多种方式告知或警告驾驶员,或者及时对汽车做出主动干预,可以有效地降低驾驶难度、减少驾驶员负担以及减少事故的发生率,从而保证驾驶过程的安全性和舒适性,因而在汽车领域得到了广泛应用。
雷达传感器基于不同的测量范围可以分为长距雷达(Long Range Radar,LRR)、中距雷达(Middle Range Radar,MRR)和短距雷达(Short Range Radar,SRR)。
其中,LRR具有测距与防碰撞功能,广泛应用于自适应巡航控制(Adaptive Cruise Control,ACC)、前向碰撞警告(Forward Collision Warning,FCW)、自动紧急刹车(Automatic Emergency Brake,AEB)等领域。示例性的,将LRR安装在车辆前方保险杠的正中心位置,方位角为0°,当高度低于50cm时仰角设置为1.5°,当高度超过50cm时仰角设置为0°,这样可以实现卡车150米,汽车100米,行人60米的运动目标检测能力。LRR的ACC、FCW、AEB等功能在驾驶者分神、疲劳犯困或者使用手机等未能注意到前方状况时具有显著的安全提示效果。
MRR和SRR具有盲点检测(Blind Spot Detection,BSD)、车道变换辅助(Lane Change Assistance,LCA)、后向目标横穿警告(Rear Cross Traffic Alert,RCTA)、开门辅助(Exit Assistant Function,EAF)、前向目标横穿警告(Forward Cross Traffic Alert,FCTA)等功能,能精确探测车辆前后左右一定范围内的目标。而作为ADAS系统中的典型应用,SRR在BSD、LCA等领域可以有效降低驾驶员在夜晚、雾天、大雨等气候恶劣条件下观察不便导致的危险系数,以及避免驾驶员在并道操作过程中,相邻车道和“视野”盲区可能碰撞的险境。
不同的应用场景对雷达的检测距离有不同的需求,LRR、MRR和SRR均在高级驾驶辅助系统(Advanced Driving Assistant System,ADAS)中承担重要的功能。
下面以具体的雷达传感器举例说明。
超声波雷达传感器,超声波是指频率高于20千赫兹的机械波。为了以超声波作为检测手段,必须产生超声波和接收超声波。完成这种功能的装置就是超声波雷达。超声波雷达有发送器和接收器,但一个超声波雷达也可具有发送和接收声波的双重作用。超声波雷达是利用压电效应的原理将电能和超声波相互转化,即在发射超声波的时候,将电能转换,发射超声波;而在收到回波的时候,则将超声振动转换成电信号。
毫米波雷达传感器(Millimeter-Wave Radar),是工作在毫米波波段(millimeter wave)探测的雷达。通常毫米波是指30~300吉赫(GHz)频域(波长为1~10毫米)的电磁波。毫米波的波长介于微波和厘米波之间,因此毫米波雷达兼有微波雷达和光电雷达的一些优点,具有体积小、质量轻和空间分辨率高的特点,穿透雾、烟、灰尘的能力强,广泛应用于车辆、飞机等的导航系统中。毫米波雷达传感器的测量值具备深度信息,可以提供目标的距离;其次,由于毫米波雷达传感器有明显的多普勒效应,对速度非常敏感,可以直接获得目标的速度,通过检测其多普勒频移可将目标的速度提取出来。目前主流的两种车载毫米波雷达应用频段分别为24GHz和77GHz,前者波长约为1.25cm,主要用于短距离感知,如车身周围环境、盲点、泊车辅助、变道辅助等;后者波长约为4mm,用于中长距离测量,如自动跟车、自适应巡航(ACC)、紧急制动(AEB)等。
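作为上文多普勒测速的补充说明,多普勒频移f_d与目标径向相对速度v_r之间的常用换算关系为(雷达领域的通用公式,并非本申请特有):

f_d = 2·v_r/λ,即 v_r = f_d·λ/2

例如,对于77GHz毫米波雷达,波长λ≈3.9mm;若测得多普勒频移f_d≈5.13kHz,则径向相对速度v_r≈5.13×10³×3.9×10⁻³/2≈10m/s。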
激光雷达传感器可以被看作物体检测系统,该传感器使用光感测检测车辆100所位于的环境中的物体。激光雷达,工作在红外和可见光波段的,以激光为工作光束的雷达称为激光雷达。而激光雷达的工作原理是向目标发射探测信号(激光束),然后将接收到的从目标反射回来的信号(目标回波)与发射信号进行比较,作适当处理后,就可获得目标的有关信息,如目标距离、方位、高度、速度、姿态、甚至形状等参数。通常激光雷达传感器可以通过利用光照射目标来测量到目标的距离或目标的其它属性的光学遥感技术。作为示例,激光雷达传感器可以包括被配置为发射激光脉冲的激光源和/或激光扫描仪,和用于为接收激光脉冲的反射的检测器。例如,激光雷达传感器可以包括由转镜反射的激光测距仪,并且以一维或二维围绕数字化场景扫描激光,从而以指定角度间隔采集距离测量值。在示例中,激光雷达传感器可包括诸如光(例如,激光)源、扫描仪和光学系统、光检测器和接收器电子器件之类的组件,以及位置和导航系统。激光雷达传感器通过扫描一个物体上反射回来的激光确定物体的距离,可以形成精度高达厘米级的3D环境图。激光雷达传感器可以被看作物体检测系统,该传感器利用光照射目标来测量到目标的距离。
如图1b所示,为本申请提供的一种可能的应用场景示意图。该场景中,雷达传感器可以被安装在车辆上,例如,本申请中的传感器可应用于高级驾驶辅助系统(advanced driving assistant system,ADAS)(例如自动驾驶)、机器人、无人机、网联车、安防监控等领域。该场景中,雷达传感器可被安装在移动设备上,例如,雷达传感器可以安装在机动车辆(例如无人车、智能车、电动车、数字汽车等)上,用作车载雷达;再比如雷达可以安装在无人机上,作为机载雷达,等等。结合图1a中的车辆的示例,如图1b所示,部署于车辆前端的雷达传感器可感知如实线框所示的扇形区域,该扇形区域可以为雷达感知区域,当雷达传感器感知到雷达感知区域中存在目标时,将雷达信号信息传输至处理模块,由处理模块进行进一步处理。处理模块在接收到雷达传感器的信息后,输出目标雷达的测量信息(例如,目标对象的相对距离、角度、相对速度)。需要说明的是,此处中的处理模块既可以是独立于雷达传感器的计算机或计算机中的软件模块,例如,计算机系统160中的处理模块,还可以是部署于雷达传感器中的计算机或计算机中的软件模块,此处不作 限定。
可见,将上述传感器安装在车身上,可以实时或周期性地获取传感器感测到车辆的经纬度、速度、朝向、周围物体的距离等测量信息,再根据这些测量信息实现车辆的辅助驾驶或无人驾驶。例如,利用经纬度确定车辆的位置,或利用速度和朝向确定车辆在未来一段时间的行驶方向和目的,或利用周围物体的距离确定车辆周围的障碍物数量、密度等。
如图1c所示,为本申请提供的另一种可能的应用场景。本申请涉及的雷达传感器还可以安装在固定装置,例如,雷达传感器可以安装于路侧单元(road side unit,RSU)、屋顶或基站等。例如,如图1c所示的雷达1,雷达2,雷达3和雷达4。对于雷达安装于固定装置的场景中,雷达需要固定装置中的其它装置的协助以确定自身当前的位置和转向信息,这样可保证测量数据的可用性。例如,固定装置中还可以包括全球定位系统(global positioning system,GPS)装置和惯性测量单元(inertial measurement unit,IMU)装置,雷达可以结合GPS装置和IMU装置的测量数据进而得到目标的位置、速度等特征量。例如,雷达可以通过固定装置中的GPS装置提供固定装置的地理位置信息,通过IMU装置记录固定装置的姿态和转向信息。再根据回波信号和发射激光束确定与目标之间的距离后,可以通过GPS装置提供的地理位置信息或IMU装置提供的姿态和转向信息中的至少一种,将目标的测量点由相对坐标系转换为绝对坐标系上的位置点,得到目标的地理位置信息,从而使雷达可以应用于固定装置中。
本申请中的雷达传感器可以是激光雷达、也可以是微波雷达、或者也可以是毫米波雷达,本申请实施例对此不作限定。
在下文的介绍中,为了便于说明,如下以激光雷达为例来说明雷达传感器的工作过程。需要说明的是,激光雷达发射的电磁波称为激光束,微波雷达发射的电磁波称为微波,毫米波雷达发射的电磁波称为毫米波。也就是说,下文中的激光雷达可用毫米波雷达替换,电磁波可用毫米波替换;下文中的激光雷达也可用微波雷达替换,电磁波可用微波替换。
需要说明的是,本申请对各场景中包括的雷达传感器的数量、目标的数量均不做限定。例如,场景中可包括多个雷达传感器和可移动的目标,本申请还可应用于其它可能的场景。例如图1c所示的车路协同(或称为智能车路协同系统)场景。再比如,自动导引运输车(automated guided vehicle,AGV)小车场景,其中,AGV小车指装备有电磁或光学等自动导航装置,能够沿规定的导航路径行驶,具有安全保护以及各种移载功能的运输车。再比如,远程交互及真实场景再现,该场景例如可以是远程医疗或者远程培训、游戏交互(如多人在虚拟场景中共同玩游戏、训练或参与其它活动)或危险场景训练等。再比如,人脸识别等场景。此处不再一一列举。
如图2所示,为本申请提供的一种雷达探测目标的原理示意图。该雷达可包括发射机和接收机。发射机用于发射电磁波能量束,电磁波经过收发转换开关传给天线,天线再将电磁波沿着某一方向和角度发射到空中,若在沿电磁波能量束的发射方向的一定距离内存在目标,则该电磁波能量束被目标反射,电磁波遇到目标对象会有一部分能量得到反射并被毫米波雷达的天线接收到,进而通过收发转换开关传给接收机。以电磁波能量束的发射方向存在目标(如图1a中的汽车)为例,发射机发射的电磁波能量束在到达目标后,在目标的表面发生反射,被反射的信号作为回波信号返回至接收机,接收机用于根据接收到的回波信号和发射的电磁波能量束,确定出与目标相关的信息,例如与目标的距离、目标的点云密度等。雷达传感器通过发射机发射电磁波能量束,进一步经信号处理机处理得到目标对象的相对距离、角度、相对速度。
下面以毫米波雷达传感器为例说明雷达传感器的实现方式。例如,毫米波雷达传感器可以包括振荡器、发射天线、接收天线、混频器、处理器和控制器等装置。具体步骤可以包括:
步骤一、雷达中的波形生成器(waveform generation)产生发射信号,然后通过发射天线(transmit antenna)发射该发射信号。
例如,振荡器会产生一个频率随时间线性增加的雷达信号,该雷达信号一般是调频连续波。雷达探测装置一般会在一段连续的时长内进行多个扫频周期的雷达信号发送。此处的扫频周期指进行一个完整波形的雷达信号发射的周期。在一个发射周期的开始,雷达探测装置会以一个频率发射雷达信号,该频率称为雷达探测装置的初始频率。并且雷达探测装置的发射频率以该初始频率为基础在发射周期内变化。
该雷达信号的一部分经过定向耦合器输出至混频器作为本振信号,一部分通过发射天线发射出去。发射信号通常为带有载频的线性调频信号,发射信号s_T(t)的表达式可以为:

s_T(t) = exp{j2π[f_T·t + (B_sw/(2T_CPI))·t²]},0 ≤ t ≤ T_CPI   (1)

其中f_T表示载频,B_sw表示发射信号带宽,T_CPI表示发射信号的持续时间。

步骤二、发射信号经过障碍物反射后,被接收天线(receive antenna)接收,例如,接收天线接收发射出去的雷达信号遇到车辆前方的物体后反射回来的雷达信号。接收到的信号为发射信号的延时信号,其表达式s_R(t)为:

s_R(t) = s_T[t − τ(t)]   (2)
其中τ(t)表示发射信号从发射天线发送,经过障碍物反射,被接收天线接收的延时。
步骤三、将发射信号的延时信号和发射信号进行混频/下变频(down-conversion),然后通过采样(sampling)得到接收信号。
例如,混频器将接收的雷达信号与本振信号进行混频,得到中频(intermediate frequency,IF)信号。具体来说,通过振荡器产生的调频连续波信号,一部分作为本振信号,一部分作为发射信号通过发射天线发射出去,而接收天线接收的发射信号的反射信号,会与本振信号混频,得到中频信号。中频信号包含了目标对象与该雷达系统的相对距离、速度、以及角度等信息。中频信号经过低通滤波器并经过放大处理后输送到处理器,处理器对接收的信号进行处理,一般是对接收的信号进行快速傅里叶变换,以及频谱分析等,以得到目标对象相对于该雷达系统的距离、速度和角度等信息。
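为帮助理解上述“混频—采样—FFT—测距”流程,下面给出一个简化的示意代码(Python/NumPy)。其中的带宽、周期、采样率、目标距离等参数均为假设值,信号忽略载频与幅度衰减,仅演示由中频频率换算距离的原理,并非真实雷达的实现:

import numpy as np

# 假设的FMCW参数(示意值)
c = 3e8            # 光速 (m/s)
B_sw = 150e6       # 发射信号带宽 (Hz)
T_cpi = 40e-6      # 单个调频周期时长 (s)
fs = 10e6          # 中频采样率 (Hz)
R_true = 30.0      # 假设的目标距离 (m)

slope = B_sw / T_cpi                 # 调频斜率 (Hz/s)
tau = 2 * R_true / c                 # 回波延时
t = np.arange(0, T_cpi, 1 / fs)

# 去载频后的基带线性调频信号与其延时回波(忽略幅度衰减)
s_t = np.exp(1j * np.pi * slope * t ** 2)
s_r = np.exp(1j * np.pi * slope * (t - tau) ** 2)

# 混频(去斜):中频信号的频率 f_if = slope * tau
s_if = s_t * np.conj(s_r)

# FFT提取中频频率,并换算为距离 R = f_if * c / (2 * slope)
spectrum = np.abs(np.fft.fft(s_if))
freqs = np.fft.fftfreq(len(s_if), 1 / fs)
f_if = abs(freqs[np.argmax(spectrum)])
R_est = f_if * c / (2 * slope)
print(f"估计距离: {R_est:.1f} m")     # 约等于 R_true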
其中,通过发射信号的发射时间和不同地物的回波散射的接收时间之差,可以确定目标(地物)与雷达的距离,从而确定目标的位置。
其中,位置信息可以是目标对象相对于当前的雷达的位置信息,速度信息可以是目标对象相对于当前的雷达的速度信息,角度信息可以是目标对象相对于当前的雷达的角度信息。进一步的,中频信号的频率称为中频频率。
步骤四、处理器可以将得到的信息输出给控制器,以控制车辆的行为。
在智能汽车的仿真测试中,一个重要的部分是对自动驾驶的决策控制算法进行验证。例如,对车辆的变道能力进行验证,对前方车辆是否过近进行识别。在验证过程中,需要 构建不同的场景,在每个不同的场景中,验证车辆是否可以达到相应能力的自动驾驶的决策控制。
因此,在验证自动驾驶的决策控制的过程中,需要获得通过传感器确定出的目标车辆相对本车的车辆速度,以及目标车辆相对本车的相对车辆位置、相对距离和方位角等信息,作为自动驾驶的决策控制模拟中的输入参数。也即在对传感器的模拟过程中,传感器模型可以以测试环境中的交通参与物为输入,该传感器模型的输出参数可以是传感器可能得到的、在传感器感知范围(基于几何遮挡筛选方法确定)内的可探测物体的相对距离、相对速度、角度。从而,该传感器模型的输出参数可以作为自动驾驶的决策控制模拟中所需的传感器模块所对应的输入参数。
由于主要是对自动驾驶的决策控制算法进行验证,因此,对传感器模型构建的一种可能的方式为:将场景中构建的可感知目标,及可感知目标相对本车的速度、相对车辆位置、相对距离和方位角等信息,作为传感器模块对可感知目标的输出。如图3a所示,具体过程可以包括:
步骤301、确定测试环境中的交通参与物。
其中,测试环境可以是根据需要测试的场景确定的。例如,如图3b所示,包括:仿真车辆(包括待仿真的传感器),其他车辆,非机动车,行人,道路环境,交通环境,建筑物,桥梁、路障等。
步骤302、将测试环境中的交通参与物的参数作为传感器模型的输入参数。
交通参与物可以包括:车辆,行人,道路,路障等。
交通参与物的参数可以包括:定位位置,移动速度,交通参与物的大小等建模数据。
步骤303、通过几何遮挡方法,筛选仿真车辆上的传感器的可感知目标。
一种可能的实现方式,根据不同型号的传感器,可以确定出该传感器的最大测距距离,从而可以确定出雷达传感器的探测范围。该数据可以是基于传感器的出厂数据确定的,也可以是根据经验得到的,在此不做限定。
其中,雷达探测装置的最大测距距离,或称雷达探测装置的最大探测距离,是与雷达探测装置的配置有关的参数(例如,与雷达探测装置的出厂设置参数相关)。例如雷达探测装置为雷达,长距自适应巡航控制(adaptive cruise control,ACC)雷达的最大测距距离为142m,中距雷达的最大测距距离为70~150m。
例如,如图3b所示,部署于车辆1前端的雷达传感器可探测如实线框所示的扇形区域,该扇形区域为雷达的探测范围。在该探测范围内的车辆可以作为传感器的可感知目标。
在一些实施例中,可以根据车辆之间的几何遮挡关系,剔除被遮挡的车辆。
例如,如图3c所示,车辆1为待测试车辆,车辆2和车辆3为车辆1的前方车辆。根据车辆2、车辆3与车辆1之间的几何位置关系,可以确定车辆3被车辆2遮挡。此时,可以将车辆3删除。从而,可以确定车辆2为车辆1的可感知目标。
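对于上述通过几何遮挡关系剔除被遮挡车辆的基线方法(图3a所示流程中的步骤303),下面给出一个基于方位角区间的简化示意(Python)。其中以目标中心方位角加减近似半宽对应的角度来近似目标占据的视场,函数名、数据结构与数值均为假设;另需注意,本申请后文的方案为体现多径效应并不剔除被遮挡目标,此处仅示意基线做法:

import math

def angular_interval(x, y, half_width):
    """返回目标(中心(x, y),近似半宽half_width)相对本车的方位角区间(弧度)。"""
    center = math.atan2(y, x)
    half = math.atan2(half_width, math.hypot(x, y))
    return center - half, center + half

def filter_occluded(targets):
    """targets: [(id, x, y, half_width)];按距离从近到远剔除完全被遮挡的目标。"""
    targets = sorted(targets, key=lambda t: math.hypot(t[1], t[2]))
    covered = []          # 已被较近目标占据的角度区间
    visible = []
    for tid, x, y, hw in targets:
        lo, hi = angular_interval(x, y, hw)
        fully_blocked = any(c_lo <= lo and hi <= c_hi for c_lo, c_hi in covered)
        if not fully_blocked:
            visible.append(tid)
        covered.append((lo, hi))
    return visible

# 示例:车辆3在车辆2正后方且角度区间被完全覆盖时,将被剔除
print(filter_occluded([(2, 20.0, 0.0, 1.0), (3, 40.0, 0.0, 0.9)]))  # 输出 [2]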
步骤304、根据传感器的可感知目标的参数,确定传感器模型的输出参数。
其中,传感器的可感知目标的参数可以是根据测试环境中的交通参与物的参数确定的。结合步骤303中的示例,传感器模型的输出参数可以包括车辆2相对车辆1的相关参数。例如,车辆2相对车辆1的位置,车辆2相对车辆1的速度,和车辆2相对车辆1的角速度、车辆2相对车辆1的角度。
可选的,传感器模型的输出参数还可以是添加相应的噪声后的输出参数,用于模拟测 量误差。例如,传感器模型的输出参数可以包括车辆2相对车辆1的相对位置和相对速度参数。
步骤305、将可感知目标的参数输入至决策模块。
上述方法,对传感器模型的要求较低,结构简单,能够保证仿真时的高效率。
但是,上述方法中,考虑的是传感器的理想场景,而考虑到由于主动感知类传感器是通过主动发射能量波进行环境感知,因此,该类传感器的感知结果准确性还取决于目标对象的反射强度、能量波的传播、能量波发射和接收等因素,也就是说,主动感知类传感器的感知结果受到感知目标的材料、方位距离、环境天气等多因素的影响。但是,上述方法中,仅考虑了理想情况下,传感器可以在几何方式确定的可感知范围内检测到相应的可探测物体,而可探测物体的相对距离、相对速度、角度,并不是真实传感器得到的,而是基于场景模拟时设置的,因此,在该传感器模拟的过程中,不能反映出传感器在不同环境下对测量结果的影响。而直接采用理想情况的数据作为传感器模型的输出,则会导致模拟结果与真实结果的较大偏差。例如,在一个可能的场景中,真实环境下,传感器可以测量到前方车辆,但是,基于上述模型,通过几何区域划分确定的可感知区域的方式,传感器模型判定前方车辆被遮挡。从而,可能对后续自动驾驶的决策控制算法也带来更多的难以预测的影响,无法达到对智能汽车的仿真测试的作用。
因此,传感器模型的质量决定了仿真测试中智能汽车感知环境目标对象的逼真度。也就是说,传感器模型是否可以真实的反映出传感器在不同环境下的测量结果的影响,直接影响到对智能汽车的仿真测试结果的可信度。
基于上述问题,另一种可能的传感器模拟的方法,可以是基于上述物理特性,对雷达传感器进行建模,例如,对毫米波雷达的能量波的发射和接收、能量波的传播、目标对能量波的反射这些真实物理工作过程进行详细建模,对毫米波雷达工作过程中的各个硬件模块建立数学模型,来模拟整个工作过程,如收发回路的建模涉及到振荡器、滤波器、放大器、混频器等。从而,该模型能反映毫米波雷达内部和电磁波传播的工作流程细节,得到高精度的仿真结果。但是,该方法中,建模过程复杂,消耗大量计算资源,且实时性差,难以保证仿真效率,难以满足智能汽车实时仿真测试的需求。尤其对于基于云平台的大批量场景仿真来说,这种传感器模型会带来大量计算资源的消耗,不能保证仿真效率。另外,该方法也不适用于智能汽车决策控制算法的开发过程,尤其是开发早期:此时考虑的传感器模型参数有限,难以有效利用上述方法(针对毫米波雷达的每个模块都进行模拟)模拟出的传感器参数,也导致了资源的浪费。
因此,本申请提供一种传感器的仿真方法,图4a所示的应用场景中可以包括测量装置和测试装置,其中,测量装置可以是具有传感器的车辆,传感器可以包括:毫米波雷达、摄像头、激光雷达等传感器。还可以包括云端的测试装置,测试装置可以包括支持运行仿真软件的硬件设备,如个人计算机、服务器、车载移动终端、工控机、嵌入式设备等。例如,测试装置可以由云端的服务器或虚拟机实现。测试装置还可以是支持运行仿真软件的芯片。
以毫米波雷达传感器的建模为例进行说明。以图4a所示的场景中传感器为雷达传感器,测量装置为车辆,测试装置为服务器为例。本申请提供一种传感器的仿真方法,如图4b所示,可以包括:
S401:获取传感器的测量数据。
其中,所述测量数据包括:第二目标车辆相对所述传感器的位置信息、速度信息以及所述传感器采集的所述第二目标车辆的传感器特征测量值;
所述传感器特征测量值包括:RCS测量值和SNR测量值;所述传感器位于测量车辆中;所述第二目标车辆为所述测量车辆附近的车辆。
S402:根据所述传感器的测量数据及获得的标注信息进行训练,得到传感器模型。
其中,所述传感器模型的样本输入为所述第二目标车辆相对所述传感器的位置信息、速度信息及标注信息,所述传感器模型的输出为所述第二目标车辆的传感器特征预测值;所述第二目标车辆的传感器特征预测值包括以下至少一项:RCS预测值和SNR预测值;
所述标注信息包括以下至少一项:所述第二目标车辆相对所述传感器的横摆角、所述传感器采集数据时标注的道路环境信息、所述传感器所在的车辆信息。
本申请在对传感器建模时,无需对传感器的每个模块进行建模的前提下,考虑了雷达传感器测量目标时的物理特性,优化传感器模型的输出结果,从而,有效提高模拟的效果。
下面举例说明,雷达传感器测量目标时的物理特性。
毫米波雷达对目标进行检测,可以得到运动的目标与雷达传感器之间的距离和速度,若毫米波雷达设置于车辆上,目标为另一车辆,则根据雷达采集的回波信号,可以确定目标车辆相对本车的车辆速度,目标车辆相对本车的相对车辆位置、相对距离和方位角等信息。
进一步的,通过接收信号,可以获得目标的RCS信息,RCS信息可以用于表述目标在雷达作用下的后向散射特征。空间目标的RCS序列,与目标的形状结构、电磁波的频率、入射场的极化形式、接收天线的极化形式、目标对于来波方向的角向位置(姿态角)等因素相关。而对于同一测量雷达来说,电磁波的频率、入射场的极化形式、接收天线的极化形式、目标对于来波方向的角向位置(姿态角)可以确定,因此,目标的RCS均值可以与目标结构和目标姿态建立关系。
例如,传感器输出的目标对象的信息还可以包括目标对象的宽度信息等结构信息。
考虑目标对象为车辆时,由于传感器与目标车辆之间的相对目标姿态通常相对稳定,例如,可以探测到车辆的车身后方,前方,侧方等位置。因此,可以通过目标的RCS均值作为识别目标的结构的特征,从而可以得到不同目标的反射强度分类,从而对目标的结构进行分类。例如,可以根据车辆的长度和形状结构区分目标车辆的类型,如小轿车,卡车,公交车等。
在地图场景中,空间中的目标姿态通常相对稳定,空间目标RCS的多次测量结果具有稳定性,因此,可以通过目标的RCS均值作为识别目标的结构的特征,从而可以得到不同目标的反射强度分类,从而对目标的结构进行分类。例如,可以区分目标为车道边界、车道线或路沿、道路的障碍物、隧道、桥梁等。
因此,考虑上述雷达传感器测量目标对象时的物理特性后,可以确定传感器模型需要具备至少以下物理特征:
传感器模型输出的目标对象的信息可以包括:目标对象相对传感器之间的位姿状态信息,及目标对象的特征信息。
其中,目标对象相对传感器之间的位姿状态信息可以包括:目标对象与传感器之间的 相对距离、目标对象与传感器之间的相对速度、目标对象与传感器之间的方位角、目标对象的宽度信息等结构信息、目标对象相对传感器之间的横摆角等信息。
目标对象的特征信息可以包括:目标对象的RCS信息、目标对象的SNR信息、目标对象极化信息等。
在S401的一些实施例中,传感器的测量数据可以是车辆的雷达传感器实际使用时,采集到的测量信息。本申请中测量信息可以包括传感器采集的测量数据,环境信息和定位信息中的至少一种,其中,环境信息可以包括周围环境中的行人数量和位置,行人密度,车辆密度,道路信息,天气信息等,定位信息可以包括当前所处位置的经纬度或该经纬度在地图上的标注等。传感器可以周期性的进行测量,然后将测量信息上报测试装置。
举例来说,车辆的传感器的预设区域如图4a中虚线框圆圈所示,以车辆为中心,以预设距离为半径的区域。该预设距离可以是小于或等于车辆A所发出的雷达信号的覆盖范围的半径的一个值。也可以是根据其他方法确定的区域,例如,如图1b所示的扇形区域,在此不做限定。如图4a所示,车辆的传感器可以在预设区域中,测量到的目标对象可以是车辆、障碍物、车道线等。通过车辆的传感器可以确定预设范围内的目标对象的测量信息。
例如,在通过车辆的传感器工作时,采集传感器输出的目标对象的测量数据。
以目标对象为车辆为例,例如,测量数据可以包括目标对象相对传感器的位置信息(例如,如图4c所示,目标对象为车辆2相对车辆1上的传感器1的距离r、目标车辆2相对传感器1的角度θ)、目标对象相对传感器的速度信息(例如,目标对象相对传感器的速度,目标对象相对传感器的角速度)。
可选的,考虑到目标对象的不同位置对雷达信号的反射强度不同(如车辆尾部反射雷达信号能力比侧面更强),因此,还可以采集目标的横摆角α数据。目标对象相对传感器的位置信息还可以包括目标对象相对传感器的横摆角α数据。如图4c所示,车辆2相对传感器1的位置信息还可以包括车辆2相对传感器1的横摆角α数据。目标的横摆角可以是人工标注的方式,也可以是通过其他传感器测量获得,在此不做限定。目标对象自身横摆角可以反映自身不同部位对雷达反射强度的不同,这样由测量信息训练得到的传感器模型精度更高。
在一些实施例中,测量数据还可以包括:特征信息的测量值。其中,例如,特征信息的测量值可以是传感器采集到的目标对象的传感器特征值,例如,传感器采集到的目标对象的信噪比SNR信息的测量值、传感器采集到的目标对象的RCS信息的测量值、传感器采集到的目标对象的极化信息的测量值等。
一种可能的实现方式,回波信号中的SNR信息的测量值、RCS信息的测量值和极化信息的测量值可以通过成像方式存储,即可以根据回波信号生成成像信息,成像信息可以理解为目标对发射信号的反映,主要是目标的后向散射形成的图像信息。成像信息可以包括多种信息,比如回波信号中的RCS信息、相位信息、幅度信息、极化信息等。根据目标反射的回波信号生成成像信息的一种可能的实现方式为,接收到回波信号后,通过回波信号进行处理,比如对回波信号进行下变频、模数转换等,进而根据处理得到的信号,采用合成孔径雷达(synthetic aperture radar,SAR)成像算法,可以得到成像信息。一种可能的实现方式,成像信息可以为点云数据的形式存储。点云数据可以包括目标的距离,方位角,俯仰角,目标速度等雷达特征信息。例如,该测量数据可以是在CAN线上传输给车辆的 处理器的数据,以便处理器根据获得的测量数据进行决策。
在一些实施例中,以目标对象为车辆为例,测量车辆的传感器可以采集到第二目标车辆返回的回波信号的特征信息。
考虑到环境信息可反映雨雪和道路材质等环境因素对雷达反射强度的影响,可选的,还可以获得在测试装置采集目标的测量数据时对应的环境信息(例如,天气、道路)等测量信息。因此,通过添加环境信息的测量信息训练得到的传感器模型精度更高。这些测量信息可以是通过人工标注的方式,也可以通过其他方式获得,例如,根据当前地图服务器中存储的道路信息等。
其中,环境信息中的天气可以分为晴天,雨天,雾霾,雪天四类,当然还可以包括其他类型的信息。
以目标对象为道路为例,测量车辆的传感器可以采集到道路上的回波信号的特征信息。例如,在不同的道路环境下,例如,有遮挡物(例如,落叶)条件下,有积水或积雪条件下,可以根据回波信号的极化信息,确定目标上的遮挡物、积水或积雪的极化特征,以确定目标上的遮挡物、积水或积雪的边界特征和材质特征,从而确定遮挡物条件下对目标的回波信号的影响、有积水下对目标的回波信号的影响或积雪条件下对目标的回波信号的影响,进而更准确的确定目标为有遮挡的车辆。因此,通过车辆采集的极化信息,可以用于训练传感器模型,使得传感器模型也可以相应的预测不同场景下的特征信息,从而为后续决策模块提供更多的信息,从而,更接近真实场景,以提高决策模块的模拟效果。
再比如,在车道有遮挡物覆盖时,可能遮挡车道线、车道边界等道路标识,此时,决策模块还可以根据传感器测量得到的遮挡物、积水或积雪的极化特征,进行去除遮挡物的处理,以提高决策效果。相应的,在传感器模型模拟传感器时,可以将传感器采集到的极化信息,作为传感器模型预测的输出参数,从而,决策模块可以根据该预测的极化信息,获得更多真实传感器的模拟信息,为相应的提高决策模块的模拟效果提供可能。
在另一种可能的方式中,由于不同的道路环境下,可能会改变目标的边界,例如,在有雨天或下雪天时,在车辆上的水或雪可能导致目标车辆的回波信号发生变化,从而可以根据回波信号中的极化信息,确定目标车辆上的材质特征是否受到雨水或积雪的影响,对车辆是否有无积水或积雪进行识别,并进一步确定积水的边界特征、道路的边界特征,以提高决策模块的决策效果。
再比如,在有积水的路面中,积水面积的大小可能导致可通行的道路发生变化,从而可以根据回波信号中的极化信息,确定道路上的积水和车道的材质特征,对车道是否有无积水进行识别,并进一步确定积水的边界特征、道路的边界特征,从而更准确的确定道路在积水路况下,根据传感器模型预测的特征信息,以便后续决策模块可以根据传感器模型预测的特征信息,确定车道当前的积水情况,例如,积水的边界信息,从而,更好的进行导航或规划路径的模拟。比如,若积水面积占用了一条车道,根据在一条车道上检测到的回波信号的极化信息对应积水下的车道的极化特征(例如,积水的边界特征、积水的材质特征),及其他车道上产生的极化信息对应无积水下的车道的极化特征(例如,积水的边界特征、积水的材质特征),因此,可以确定该车道被积水覆盖,其他车道可以通行。
另外,道路类型可以分为普通沥青路面、普通混凝土路面、桥面、隧道四类。当然还可以包括其他类型的信息。
在一些实施例中,还可以根据环境对象的属性对环境进一步划分,从而,有利于在构 建传感器模型时,提供更多的训练信息(环境信息),从而,训练出来的传感器模型可以获得不同的场景中的环境对象下更接近真实场景中的环境对象的模拟结果,提高传感器模型的效果,从而,有利于后续使用传感器模型得到的预测数据进行决策,以达到仿真的目的,提高仿真效果。
例如,根据车道、非车道区分环境对象,确定环境对象的边界信息,进而对环境对象进行识别。例如,环境对象的边界信息可以是指用于描述道路中的障碍物的边界信息的关键点或线,或者是指用于描述车道的边界信息。例如,根据车道的边界,可以将车道分为多种环境对象,环境对象边界的类型可以但不限于包括以下任意一种或多种:车道线、路沿、道路的障碍物等。车道可以分为:单车道,双车道,多车道,起始车道,中间车道,汇合车道,分叉车道,路口等。起始车道可以为:一条道路上的包括起始点的若干条车道线对应的车道。起始车道的边界可以是车道的起始线。终止车道可以为:一条道路上的包括终止点的若干条车道线对应的车道。终止车道的边界是车道的停止线。一般来说,实际应用中车道的起始线与逆向车道的停止线在一条直线上。汇合车道和分叉车道可以通过车道上的车道变换点来标定,车道变换点可以是有些道路快到路口时增设的转弯车道而产生的分叉点,也可以是通过路口进入一条新路车道减少一个而产生的汇合点,还可以是高速路/高架桥的驶出车道的分叉口,或高速路/高架桥的驶入车道的汇合点。根据车道存在的障碍物还可以进一步对车道进行分类,例如,车道还可以包括:隧道车道、高架入口车道、高架出口车道、桥等。
可选的,在另一些实施例中,可以设置在不同的场景下,获得不同传感器的测量信息。
以传感器为毫米波雷达传感器为例,假设场景包括闹市场景、郊区场景、高速路场景和特殊天气场景。
其中,闹市场景对应的传感器的参数可以包括:毫米波雷达传感器以SRR模式工作。从而,在该传感器以SRR模式工作时,得到相应的目标对象相对传感器距离r、角度θ、速度以及SNR、RCS等能量特征信息。
高速路场景对应的传感器的参数可以包括:毫米波雷达传感器以LRR模式工作。从而,在该传感器以LRR模式工作时,得到相应的目标相对传感器的距离r、角度θ、速度以及SNR、RCS等传感器特征信息。
特殊天气场景,例如雨天场景,传感器的参数可以包括:毫米波雷达传感器以SRR模式工作。在该传感器在特殊天气下以SRR模式工作时,得到相应的目标对象相对传感器距离r、角度θ、速度以及SNR、RCS、极化信息等能量特征信息。
相应的,采集测量信息的周期也可以根据需要设置,以获得更好的建模效果。
在另一些实施例中,考虑到车辆可能联合多种类型的传感器进行决策。因此,在采集传感器的测量信息时,也可以基于多种类型的传感器的场景,采集多种类型的传感器的测量信息,从而,获得更准确的环境信息,有利于模型更好的模拟不同的场景。
以不同的场景名称表示不同的测量信息类别,假设场景包括闹市场景、郊区场景和高速路场景。
其中,闹市场景对应的参数可以包括GPS以高精度定位模式工作,IMU和相机传感器以设定周期每隔固定时间上报测量信息,激光雷达传感器和毫米波雷达传感器以SRR模式工作。从而,确定的测量信息包括:传感器的定位信息,IMU和相机传感器上报的测量信息,及雷达传感器上报的测量信息。
当然,还可以在该场景下,采集MRR类型或LRR类型的传感器模型采集到的测量数据,以提供更多的训练样本,提高模型的精度和鲁棒性。
郊区场景对应的参数可以包括GPS以低精度定位模式工作,IMU以设定周期每隔固定时间上报测量信息,相机传感器在检测到设定范围内出现行人时上报测量信息,激光雷达传感器和毫米波雷达传感器以MRR模式工作。从而,确定的测量信息包括:传感器的定位信息,IMU和相机传感器上报的测量信息,及雷达传感器上报的测量信息。
当然,还可以在该场景下,采集SRR类型或LRR类型的传感器模型采集到的测量数据,以提供更多的训练样本,提高模型的精度和鲁棒性。
高速路场景对应的参数可以包括GPS以低精度定位模式工作,IMU和相机传感器在检测到设定范围内出现行人或车辆时上报测量信息,激光雷达传感器和毫米波雷达传感器以LRR模式工作。从而,确定的测量信息包括:传感器的定位信息,IMU和相机传感器上报的测量信息,及雷达传感器上报的测量信息。
当然,还可以在该场景下,采集SRR类型或MRR类型的传感器模型采集到的测量数据,以提供更多的训练样本,提高模型的精度和鲁棒性。
基于传感器类别和传感器的参数之间的对应关系,测试装置可以基于不同的传感器的类型相应建模,并通过其他传感器获得更多场景相关的参数,从而,有利于后续决策模块使用更多的信息进行决策,提高验证决策模块的模拟效果。
在S402中,通过传感器在使用过程中采集到的测量信息,作为传感器模型的训练样本进行训练,得到目标车辆在与传感器之间相距不同位置信息(例如,相对距离、相对角度、横摆角)、速度信息及在不同的环境信息(例如,不同的天气、不同的路况、不同的道路类型)下时对应的传感器模型。
该传感器模型的输出为传感器的特征信息(例如,SNR和RCS,极化信息等)的预测值,其他测量信息(例如,除传感器的特征信息之外的测量数据、定位信息和环境信息等)作为监督学习训练的传感器模型输入。因此,在训练过程中,一个训练样本中,可以包括训练数据和验证数据。其中,训练数据为:传感器模型输入数据,即除传感器的特征信息之外的测量数据、定位信息和环境信息等测量信息。验证数据为训练样本中的传感器的特征信息的测量值。
以传感器模型训练一个毫米波雷达传感器模型为例,此时,毫米波雷达传感器模型的输出参数可以为毫米波雷达传感器的特征信息的预测值,例如,SNR的预测值和RCS的预测值,极化信息的预测值等。毫米波雷达传感器模型的输入参数可以包括:目标相对传感器的位置信息(距离r、角度θ、横摆角)、速度信息、环境信息、定位信息等除特征信息之外的测量信息。
其中,环境信息可以包括:天气类型、道路类型等。环境信息还可以包括通过其他传感器获得的参数,例如,在传感器的可感知范围内,是否有落叶遮挡,雨水遮挡,积雪遮挡等情况。
如图4d所示,针对不同类型的传感器,可以分别训练相应的传感器模型。
在一些实施例中,可以采用支持向量回归(support vector regression,SVR)模型的监督学习算法来对该类型的传感器采集到的测量信息进行训练。其中,SVR模型的输入数据可以包括:测量信息中除传感器的特征信息之外的其他测量信息。
SVR模型的输出数据可以包括:传感器的特征信息的预测值,例如,SNR的预测值,RCS的预测值。
其中,在训练过程中,可以针对每个传感器的特征信息进行训练,例如,针对SNR的预测值进行训练,在SNR的特征信息训练达到模型的精度要求后,可以再针对RCS的预测值进行训练。或者,还可以针对RCS的预测值进行训练,在RCS的特征信息训练达到模型的精度要求后,可以再针对SNR的预测值进行训练。当然,还可以针对所有的特征信息一起进行训练,在此不做限定。
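作为示意,下面基于scikit-learn的SVR给出一个最简训练流程(Python)。特征列的选取与编码方式、样本数值均为假设,仅说明“以除特征信息之外的测量信息为输入、以SNR/RCS测量值为监督信号”的训练方式,并非本申请限定的实现:

import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# 假设的训练样本:每行 [r, theta, yaw, v_rel, weather_id, road_id]
X = np.array([
    [30.0, 0.10, 0.0, -2.0, 0, 0],
    [55.0, -0.05, 0.2, 5.0, 1, 2],
    [12.0, 0.30, -0.1, 0.5, 3, 1],
    # ... 实际应为大量实测数据
])
y_snr = np.array([18.0, 9.5, 23.0])   # SNR测量值(监督信号,示意)
y_rcs = np.array([8.0, 12.5, 6.0])    # RCS测量值(监督信号,示意)

# 按上文描述,可对每项特征信息分别训练一个回归模型
snr_model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
rcs_model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
snr_model.fit(X, y_snr)
rcs_model.fit(X, y_rcs)

# 仿真阶段:输入测试环境给出的位姿与环境信息,输出传感器特征预测值
x_test = np.array([[25.0, 0.05, 0.1, -1.0, 0, 0]])
print(snr_model.predict(x_test), rcs_model.predict(x_test))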
可选的,相同传感器类型的测量信息还可以针对不同的场景下采集到的测量信息分别进行训练,假设场景包括闹市场景、郊区场景和高速路场景。
其中,闹市场景对应的参数可以包括GPS 126以高精度定位模式工作,IMU 125和相机传感器123以设定周期每隔固定时间上报测量信息,激光雷达传感器和毫米波雷达传感器以SRR模式工作;从而,针对激光雷达传感器或毫米波雷达传感器的传感器模型而言,可以将测量到的测量信息存储在SRR类型和闹市场景下,以便后续传感器模型调用相应的测量信息作为训练数据进行训练。当然,还可以在该场景下,采集其他类型的传感器模型采集到的测量数据,以提供更多的训练样本,提高模型的精度和鲁棒性。
郊区场景对应的配置参数可以包括GPS 126以低精度定位模式工作,IMU 125以设定周期每隔固定时间上报测量信息,相机传感器123在检测到设定范围内出现行人时上报测量信息,激光雷达传感器和毫米波雷达传感器以MRR模式工作;从而,针对激光雷达传感器或毫米波雷达传感器的传感器模型而言,可以将测量到的测量信息存储在MRR类型和郊区场景下,以便后续传感器模型调用相应的测量信息作为训练数据进行训练。
高速路场景对应的配置参数可以包括GPS 126以低精度定位模式工作,IMU 125和相机传感器123在检测到设定范围内出现行人或车辆时上报测量信息,激光雷达传感器和毫米波雷达传感器以LRR模式工作。从而,针对激光雷达传感器或毫米波雷达传感器的传感器模型而言,可以将测量到的测量信息存储在LRR类型和高速路场景下,以便后续传感器模型调用相应的测量信息作为训练数据进行训练。
在一些实施例中,在训练SRR类型的传感器模型时,可以训练多种场景。例如,在训练闹市场景时,可以选择采用闹市场景下的SRR类型的传感器采集到的测量信息的训练样本进行训练。例如,在训练郊区场景时,可以选择采用郊区场景下的SRR类型的传感器采集到的测量信息的训练样本进行训练。例如,在训练郊区场景时,可以选择采用郊区场景下的MRR类型的传感器采集到的测量信息的训练样本进行训练。例如,在训练郊区场景时,可以选择采用郊区场景下的LRR类型的传感器采集到的测量信息的训练样本进行训练。从而,训练出的传感器模型可以在不同场景下都可以使用。相应的,在训练MRR类型的传感器模型时,可以训练闹市场景、郊区场景和高速路场景等场景下的MRR类型的传感器模型。在训练LRR类型的传感器模型时,可以训练闹市场景、郊区场景和高速路场景等场景下的LRR类型的传感器模型。
需要说明的是,上述传感器模型为SVR模型仅为举例,传感器模型还可以是通过其他模型或算法确定的,例如,其中,传感器模型包括但不限定于回归模型、NN模型、随机森林、深度神经网络、自回归滑动平均模型(autoregressive moving average model,ARMA)、梯度提升迭代决策树(gradient boosting decision tree,GBDT)模型、或XGBoost模型等。
图5a为本申请实施例传感器的测试系统的一种示例性功能框图。如图5a所示,该系统可以应用在测试装置中,也可以应用在其他的使用载体中。下面以云服务器为载体来进行说明。该系统包括至少一个传感器模型、决策模块和场景模块,其中,传感器模型可以是对图1a所示的传感器系统120中的任一种或多种传感器进行模拟的传感器模型;决策模块和场景模块可以作为一个整体集成在一个测试装置中,传感器模型、决策模块和场景模块也可以是彼此独立的模块,并共享测试环境的存储器。需要说明的是,本申请传感器模型、决策模块和场景模块可以采用任意可实现的组合方式实现,本申请不做具体限定。为了更好地理解本申请实施例,以下以与图5a所示的系统相同或相似的系统为例对本申请实施例进行说明。图5a所示的测试系统的应用场景中可以包括测试装置,其中,测试装置可以是具有传感器模型的测试装置,测试装置的网元包括支持运行仿真软件的硬件设备,如个人计算机、服务器、车载移动终端、工控机、嵌入式设备等。例如,测试装置可以由云端的服务器或虚拟机实现。测试装置还可以是支持运行仿真软件的芯片。如图5b所示,为本申请实施例提供的一种车辆的仿真方法,具体包括:
S501:将第一目标车辆相对仿真车辆的位置信息和速度信息及该仿真车辆的道路环境信息,输入至传感器模型中,得到该第一目标车辆的传感器特征预测值。
其中,传感器特征预测值包括以下至少一项:RCS预测值和SNR预测值;传感器模型用于仿真该仿真车辆中的传感器,第一目标车辆为仿真车辆所在的测试环境中的车辆,该第一目标车辆相对仿真车辆的位置信息和速度信息及仿真车辆的道路环境信息为根据测试环境确定的,传感器模型为根据传感器的测量数据及标注的道路环境信息训练得到的。
S502:将该第一目标车辆的传感器特征预测值输入至仿真车辆的决策模块,获得该仿真车辆的仿真决策结果。
其中,决策模块用于输出基于传感器特征预测值确定的车辆行驶决策。
本申请实施例中,通过考虑传感器的物理特性,优化传感器模型的输出结果,从而,有效提高模拟的效果。相比仅通过测试环境中的相对速度、相位距离、角度数据,作为雷达传感器的输出参数,本申请结合雷达传感器测量目标时的物理特性后,建立相应的雷达传感器模型的输出参数。从而更加接近真实毫米波雷达传感器的输出参数。
在S501之前,车辆的仿真装置可以确定测试环境中传感器和目标对象的参数。
其中,传感器为待测试的传感器,下文以该传感器位于仿真车辆上为例进行说明,传感器位于其他待测装置上,可以参考该实施例。
在一些实施例中,测试装置可以获取测试环境中涉及到的目标对象在测试环境中相对传感器的测试信息。需要说明的是,目标对象可以不限于传感器附近的目标对象。也可以是传感器附近预设区域内的目标对象。预设区域可以是根据传感器的可探测范围确定的,还可以是根据其他方式确定的,在此不做限定。目标对象不限于车辆,还可以是测试环境中的各种对象,例如,路边的建筑,行人,车道,桥梁,隧道等。
其中,测试信息可以包括:目标对象相对传感器的位姿状态信息以及环境信息等测试数据。
其中,位姿状态信息可以包括:位置信息和速度信息。环境信息可以包括:天气、道路、交通标志和交通灯数据等信息。
例如,目标对象相对传感器的相对角度、目标对象相对传感器的相对距离、目标对象 相对传感器的相对速度、目标对象相对传感器的相对角速度、目标对象相对传感器的相对加速度、目标对象相对传感器的相对角加速度、目标对象的尺寸等结构信息。
在一些实施例中,可以根据测试环境,确定出第一目标车辆为仿真车辆所在的测试环境中的车辆。进而,还可以根据测试环境,确定第一目标车辆的测试信息。例如,第一目标车辆的测试信息可以包括:第一目标车辆相对所述仿真车辆的位姿状态信息及所述仿真车辆的环境信息等。
需要说明的是,测试信息可以是根据采集到的测量信息确定的,还可以是通过其他方式确定的。例如,在一些实施例中,测试环境可以由智能汽车仿真测试软件提供,用于模拟真实世界的交通场景数据,从测试环境中能够提取到仿真交通对象的测试信息。例如,仿真软件可以是车辆测试软件(例如,VTD软件),测试环境由该车辆测试软件提供。
例如,在不同的场景下可以对应有不同类型传感器的测试信息,假设场景包括闹市场景、郊区场景和高速路场景。
其中,闹市场景对应的参数可以包括GPS 126以高精度定位模式工作,IMU 125和相机传感器123以设定周期每隔固定时间上报测量信息,激光雷达传感器和毫米波雷达传感器以SRR模式工作;从而,在该场景下,可以调用SRR类型的雷达传感器相应的测试信息,以便后续传感器模型调用相应的测试信息进行预测。
郊区场景对应的配置参数可以包括GPS 126以低精度定位模式工作,IMU 125以设定周期每隔固定时间上报测量信息,相机传感器123在检测到设定范围内出现行人时上报测量信息,激光雷达传感器和毫米波雷达传感器以MRR模式工作;从而,在该场景下,可以调用MRR类型的雷达传感器相应的测试信息,以便后续传感器模型调用相应的测试信息进行预测。
高速路场景对应的配置参数可以包括GPS 126以低精度定位模式工作,IMU 125和相机传感器123在检测到设定范围内出现行人或车辆时上报测量信息,激光雷达传感器和毫米波雷达传感器以LRR模式工作。从而,在该场景下,可以调用LRR类型的雷达传感器相应的测试信息,以便后续传感器模型调用相应的测试信息进行预测。
在S501的一些实施例中,可以将第一目标车辆相对传感器的位姿状态信息及所述仿真车辆的环境信息输入至传感器模型中,得到所述第一目标车辆的传感器特征预测值;所述传感器特征预测值包括以下至少一项:RCS预测值和SNR预测值;所述第一目标车辆为所述仿真车辆所在的测试环境中的车辆,所述第一目标车辆相对所述仿真车辆的位姿状态信息及所述仿真车辆的道路环境信息为根据所述测试环境确定的。
其中,传感器模型是由传感器采集的测量信息经过监督学习的方式训练得到的。
在传感器模型的使用过程中,可以将测试环境中的测试信息作为输入。其中,测试信息可以包括:除传感器的特征信息之外的其他测试信息。例如,测试环境中确定出的环境信息、目标车辆相对传感器的位姿状态信息等。从而,预测输出目标对应的传感器的特征信息的预测值。例如,传感器的传感器模型输出的目标对象的SNR的预测值和RCS的预测值、极化信息的预测值等。
通过测试环境提供的通信接口获得传感器模型探测范围内的目标对象的位姿状态信息以及环境信息,作为传感器模型的输入。从而,通过传感器模型输出的预测数据,可以得到目标对象的特征信息的预测值。即可以通过传感器模型得到各个目标对象在不同的位姿状态信息以及不同的环境信息下,预测到的目标对象的特征信息的预测值。
从而,根据预测到的目标对象的特征信息的预测值和目标对象的测试信息,可以确定目标对象的预测信息。
其中,目标对象的预测信息包括:目标对象的测试信息(例如,目标对象的位姿状态信息及环境信息等测试数据),及目标对象的特征信息的预测值(例如,RCS的预测值,SNR的预测值)。
在S502中,车辆的仿真装置可以将目标对象的预测信息输入至决策模块。
在一些实施例中,可以将目标对象的预测信息作为决策控制(或融合感知)算法的输入,用于验证决策控制(或融合感知)算法;该算法接收雷达传感器输出的目标对象信息作为输入进行计算,以便得出决策结果。
通过传感器模型可以得到在测试环境中的目标对象的特征信息的预测值,使得目标对象的预测信息更加接近真实毫米波雷达传感器的输出,体现了传感器的物理特性,更有利于模拟出决策算法在实际使用过程中的性能的好坏,提高模拟的效果。
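结合S501与S502,下面给出一个串联“传感器模型预测—组装预测信息—送入决策模块”的流程示意(Python)。其中sensor_model、decision_module及字典字段名均为假设的占位接口,仅用于说明数据流向:

def simulate_one_step(sensor_model, targets, env_info, decision_module):
    """targets: 测试环境中各第一目标车辆的测试信息(位姿状态等,示意结构)。"""
    predictions = []
    for t in targets:
        features = [t["r"], t["theta"], t["yaw"], t["v_rel"],
                    env_info["weather_id"], env_info["road_id"]]
        snr_pred, rcs_pred = sensor_model.predict(features)   # 特征信息预测值
        # 预测信息 = 测试信息(位姿状态等) + 特征信息预测值(SNR/RCS)
        predictions.append({**t, "snr": snr_pred, "rcs": rcs_pred})
    # 将目标对象的预测信息输入决策模块,得到仿真决策结果
    return decision_module(predictions, env_info)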
本申请实施例中,还可以通过考虑传感器的物理特性产生的效果,优化传感器模型的输出结果,从而,有效提高模拟的效果。
下面举例说明,雷达传感器测量目标时的物理特性。
(1)考虑到传感器对目标对象的分辨能力,对于距离相同且挨得较近的两个物体,雷达可能出现无法区分的情况,此时,传感器模型也应当能够将距离相同且挨得较近的两个物体输出为一个目标对象,从而,有利于后续验证自动驾驶的决策控制模块是否可以处理该传感器识别错误的场景。在一种可能的场景中,由于多径传播现象,传感器有时可以探测到被遮挡的物体,因此,传感器模型输出的目标对象也应当包括可能被遮挡的物体。
(2)发射信号可以包括极化信息,极化反映了波的电场矢量端点随时间变化的规律,可以按照其形成的空间轨迹形状和旋向可以分为线、圆、椭圆极化和左旋、右旋极化。电磁波的极化状态反映了雷达接收电磁波电场取向的时变特性,可以通过接收端为极化天线或极化敏感阵列,对接收信号进行极化参数估计。根据发射和接收极化方式的不同,发射信号与目标相互作用,回波散射也有差异。波长和极化方式都会影响获取的接收信号。因此,接收信号中的极化信息可以包括:目标的极化散射矩阵和电磁波的极化状态。其中,目标的极化散射矩阵为目标在一定的姿态和观测频率下对电磁波的极化散射效应。目标的极化散射矩阵表征了雷达目标对电磁波的信号的极化状态的改变,即目标受到雷达电磁波的照射,被散射电磁波的极化状态与入射电磁波的极化状态可能不同。通过目标改变电磁波的极化状态可以称为目标的去极化特性。此时,雷达目标改变了电磁波的极化状态,该极化状态的改变由目标的形状、结构和材料决定,因此,可以利用目标回波信号中的极化信息来识别目标。即极化信息可以得到不同目标的散射特征,可以用于标定目标的表面特征、形状、粗糙度等表面特征信息。进一步的,通过不同极化方式和波长的组合,可以确定目标的不同且彼此互补的极化信息,有利于获得更准确的目标的结构、材质等表面特征信息。
(3)考虑到在雷达传感器对目标进行测量时,还需考虑噪声的影响。噪声的来源主可能是发射机产生的噪声,还可能是接收机接收时的噪声,还可能来自其他雷达的干扰等。如果干扰信号的功率大于接收机灵敏度,则所述干扰信号会对当前雷达产生干扰,如果干扰信号的功率不大于接收机灵敏度,则所述干扰信号不会对当前雷达产生干扰,所述干扰信号会被处理为噪声。
因此,在雷达传感器还需通过相应的门限,判断接收到的信号是噪声还是目标对象。
考虑到不同雷达传感器本身的属性或参数不同,例如,雷达信号的发射功率、接收机的灵敏度不同,因此,相应的门限也不同。测量目标对象的结果可能会出现假阴性、假阳性的结果。
其中,假阴性是指雷达探测的过程中,采用门限检测的方法时由于噪声的普遍存在和起伏,实际存在目标,目标对象的信号能量小于某个阈值而不能被探测到,被传感器判断为没有目标。假阳性是指雷达探测的过程中,目标对象的信号能量并不高于噪声能量甚至低于噪声能量,采用门限检测的方法时由于噪声的普遍存在和起伏,但阈值设置得太小而被毫米波雷达探测到,实际不存在目标却判断为有目标。
采用门限检测的方法时,由于门限机制和噪声的普遍存在,使用雷达探测目标时,判断是否有回波信号时,会出现4种不同的情况,这4种情况分别可以用4个概率进行描述。存在目标时,判为有目标,判断正确,这种情况称为“发现”,其概率称为“发现概率”;存在目标时,判为无目标,判断错误,这种情况称为“漏报(假阴性)”,其概率称为“漏报概率”;不存在目标时,判为无目标,判断正确,这种情况称为“正确不发现”,其概率称为“正确不发现概率”;不存在目标时,判为有目标,判断错误,这种情况称为“虚警(假阳性)”,其概率称为“虚警概率”。
因此,考虑上述雷达传感器测量目标对象时的物理特性后,可以确定传感器模型可以具备以下至少一项物理特征:
结合物理特性(1),考虑测量目标对象的分辨能力。考虑到传感器的分辨率的问题,对于距离相同且挨得较近的两个物体,雷达可能出现无法区分的情况,此时,传感器模型也应当能够将距离相同且挨得较近的两个物体输出为一个目标对象,从而,有利于后续验证自动驾驶的决策控制模块是否可以处理该传感器识别错误的场景。
结合物理特性(2),由于多径传播现象,传感器有时可以探测到被遮挡的物体,因此,传感器模型输出的目标对象也应当包括可能被遮挡的物体。
结合物理特性(3),测量目标对象的结果可能会出现假阴性、假阳性的结果。
相比仅通过测试环境中的相对速度、相位距离、角度数据,作为雷达传感器的输出参数,本申请结合雷达传感器测量目标时的物理特性后,建立相应的雷达传感器模型的输出参数。从而更加接近真实毫米波雷达传感器的输出参数。
在一种可能的实现方式中,可以根据传感器的可探测范围,筛选目标对象。
在一些实施例中,在S502之前,车辆的仿真装置确定所述第一目标车辆为根据候选车辆相对所述仿真车辆的位置信息和速度信息,在所述候选车辆中确定出的在所述传感器的探测范围内的车辆;所述候选车辆相对所述仿真车辆的位置信息和速度信息为根据所述测试环境确定的,所述候选车辆为所述仿真车辆所在的测试环境中的车辆。
其中,各目标对象可以是满足传感器的传感器可探测范围内的目标对象。
传感器的传感器可探测范围,可以根据雷达传感器模型在建模时获得的传感器的参数确定,考虑到传感器的传感器可探测范围可能根据环境变化,因此,也可以是根据传感器采集到的测量信息及当前测试环境中的环境信息确定的,在此不做限定。
如图6a所示,为本实施例提供的一种传感器的探测范围,该探测范围是一个锥形区域。该锥形区域可以通过以下几个参数确定。例如,传感器的左侧可探测角度β、传感器的右侧可探测角度γ,传感器的近端可探测距离可以为第一距离,传感器的远端可探测距离为第二距离。
通过传感器的探测范围,可以将不在探测范围内的目标对象删除,以减少模拟的算量。具体过程可以包括:根据雷达的可探测距离范围和可探测角度范围确定探测范围,对于不在可探测范围目标对象予以剔除,对于与可探测范围边界相交的目标对象予以保留。
例如,如图6b所示,对于完全不在车辆1的雷达可探测范围内的目标对象4、目标对象5予以剔除,对于与可探测范围区域边界相交的目标对象3则予以保留,同样予以保留的还有完全在可探测范围内的目标对象1、目标对象2,不剔除完全被遮挡的目标对象2。
需要说明的是,可以看出,如图6b所示的目标对象2完全被目标对象1遮挡,但是本申请实施例中,并不将目标对象2确定为不可探测的目标对象。对于被其他物体遮挡的目标对象予以保留,从而反映由于多径传播现象,雷达传感器可能探测到被遮挡物体的物理特性,为传感器模型检测到非视线范围内的对象提供了基础。即在步骤502中,根据毫米波雷达的可探测范围参数,删除完全不在探测范围的目标对象,将处于遮挡情况下但位于探测范围内的目标对象也作为传感器的目标对象。
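对于上述锥形探测范围的筛选,下面给出一个简化的判断示意(Python)。其中角度β、γ与距离门限均为假设值;完全处于范围内或与范围边界相交的目标均保留,完全在范围外的目标才剔除(目标以若干角点近似表示,仅为示意):

import math

def in_detection_range(x, y, beta, gamma, r_near, r_far):
    """判断目标点(x, y)(本车坐标系,x向前)是否落在锥形探测范围内。

    beta: 左侧可探测角度(弧度), gamma: 右侧可探测角度(弧度)
    r_near/r_far: 近端/远端可探测距离
    """
    r = math.hypot(x, y)
    angle = math.atan2(y, x)          # 左正右负(示意约定)
    return (r_near <= r <= r_far) and (-gamma <= angle <= beta)

def keep_target(corner_points, **fov):
    """目标以若干角点表示;只要任一角点落入范围即保留(即与边界相交也保留)。"""
    return any(in_detection_range(x, y, **fov) for x, y in corner_points)

fov = dict(beta=math.radians(45), gamma=math.radians(45), r_near=0.5, r_far=150.0)
print(keep_target([(160.0, 0.0), (145.0, 2.0)], **fov))   # 与远端边界相交 -> True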
在一种可能的实现方式中,可以利用物理特性筛选目标对象和目标对象的预测信息。
依据测试环境中的目标对象的测试信息,及传感器模型预测出的测试环境无法提供的目标对象的特征信息,对目标对象进行进一步筛选,从而更好的获得接近雷达传感器输出的目标及接近目标的传感器的测量信息的预测信息。
针对每个目标对象的SNR的预测值,判断该目标对象相对于传感器是否可见。
在一些实施例中,在S502之前,车辆的仿真装置确定所述第一目标车辆的SNR预测值大于可见阈值。
可选的,还可以通过其他表示传感器的特征信息,判断该目标对象相对于传感器是否可见。本申请不做限定。下文中以SNR举例说明,判断该目标对象相对于传感器是否可见的示例。
在一些实施例中,通过目标对象的SNR的预测值与对应的可见阈值进行比较。例如,一种可能的方式中,可见阈值为1,即在回波信号中RCS的信号强度大于噪声的信号强度时,认为存在目标对象。即,在SNR的预测值大于或等于1时,认为存在目标对象。相应的,可以将SNR的预测值小于1的目标对象删除。
即考虑在SNR相对噪声过小,而不能检测到的场景,会被传感器认为没有目标,即出现假阴性的情况。还有一种可能的场景是,在噪声过大,被传感器认定为目标对象的情况,即出现假阳性的情况。从而,能反映毫米波雷达会出现假阴性、假阳性结果的特性。并进一步有效保证了非视线范围的对象也能被检测到的物理特性。
在另一些实施例中,考虑到假阳性和假阴性的可能,还可以通过4个概率进行描述。即存在目标时,判为有目标,判断正确,这种情况称为“发现”,其概率称为“发现概率”;存在目标时,判为无目标,判断错误,这种情况称为“漏报(假阴性)”,其概率称为“漏报概率”;不存在目标时,判为无目标,判断正确,这种情况称为“正确不发现”,其概率称为“正确不发现概率”;不存在目标时,判为有目标,判断错误,这种情况称为“虚警(假阳性)”,其概率称为“虚警概率”。
因此,还可以通过设置相应的概率阈值,例如,设置发现概率阈值,在SNR大于发现概率阈值时,可以输出通过传感器模型输出目标对象的发现概率。再比如,设置漏报概率阈值,在SNR大于漏报概率阈值时,可以输出通过传感器模型输出目标对象的漏报概率。再比如,设置正确不发现概率阈值,在SNR大于正确不发现概率阈值时,可以输出通过传感器模型输出目标对象的正确不发现概率。再比如,设置虚警概率阈值,在SNR大于虚警概率阈值时,可以输出通过传感器模型输出目标对象的虚警概率。从而,决策模型还可以基于相应的概率,获得传感器误判的概率,从而,提高决策的精度。
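对于上述通过概率描述误判情况的思路,下面给出一个将SNR预测值映射为“发现概率”的占位示意(Python)。实际雷达中,发现概率与虚警概率由门限和噪声统计(检测理论)共同决定,此处的S形映射函数及其参数均为假设,仅示意向决策模块提供概率输出的形式:

import math

def detection_probability(snr_pred, snr_ref=10.0, k=0.8):
    """示意:用S形函数将SNR预测值映射为“发现概率”,参数为假设值。

    实际的发现概率/虚警概率应由门限与噪声统计(检测理论)确定,
    此处仅为向决策模块提供概率输出的占位实现。
    """
    return 1.0 / (1.0 + math.exp(-k * (snr_pred - snr_ref)))

for snr in (5.0, 10.0, 15.0):
    print(snr, round(detection_probability(snr), 3))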
另一种可能的实现方式,可以根据传感器的物理特性和目标对象的位姿状态信息,更新目标对象和目标对象的预测信息。
测试装置还可以根据目标对象的位姿状态信息中的至少一项或者组合:目标对象相对传感器的相对角度、目标对象相对传感器的相对距离、目标对象相对传感器的相对速度、目标对象相对传感器的相对角速度、目标对象相对传感器的相对加速度、目标对象相对传感器的相对角加速度等,确定是否存在无法区分的多个目标对象。
例如,可以将上文中确定出的目标对象作为候选对象,此时,可以根据第一候选对象的位姿状态信息和第二候选对象的位姿状态信息,确定第一候选对象和第二候选对象对于传感器是否无法区分。即是否将第一候选对象和第二候选对象作为一个目标对象,还是作为2个目标对象。
在一些实施例中,在S502之前,车辆的仿真装置确定所述第一目标车辆包括第一候选车辆和第二候选车辆;所述第一目标车辆的传感器特征预测值为根据所述第一候选车辆的传感器特征预测值和第二候选车辆的传感器特征预测值确定的;
车辆的仿真装置确定所述第一候选车辆和所述第二候选车辆满足:第一位置相对第二位置的相对位置小于第一位置阈值;所述第一位置为所述第一候选目标车辆相对所述仿真车辆的位置,所述第二位置为所述第二候选目标车辆相对所述仿真车辆的位置。
可选的,所述第一候选车辆和所述第二候选车辆还满足:第一速度相对第二速度的相对速度小于第一速度阈值;所述第一速度为所述第一候选目标车辆相对所述仿真车辆的速度,所述第二速度为所述第二候选目标车辆相对所述仿真车辆的速度。
以传感器无法区分2个目标对象为例,例如,2个目标对象包括第一候选对象和第二候选对象。
如图7a所示,在一些实施例中(例如,条件1),所述第一候选对象相对所述仿真车辆的第一位置信息,与所述第二候选对象相对所述仿真车辆的第二位置信息小于第一位置阈值。此时,可以认为传感器无法区分2个候选对象,采用传感器对该目标对象进行测量时,测量结果应为1个目标对象的测量结果。其中,第一位置信息可以是第一候选对象的中心位置的位置信息,第二位置信息可以是第二候选对象的中心位置的位置信息。当然还可以是其他位置信息。例如,第一位置信息可以是第一候选对象相对车辆1最近的位置的位置信息,第二位置信息可以是第二候选对象相对车辆1最近的位置的位置信息。还可以根据候选对象的特征进行确定,以实现更好地模拟真实雷达传感器将不同候选对象确定为同一个目标对象的情况,本申请不做限定。
因此,该第一候选对象和该第二候选对象可以输出为一个目标对象。
一种可能的实现方式,可以将第一候选对象获得的预测信息和第二候选对象的预测信息输出为一个目标对象的预测信息。例如,该目标对象为第一目标对象,则第一目标对象的传感器特征预测值为根据所述第一候选对象的传感器特征预测值和第二候选对象的传感器特征预测值确定的。
在一些实施例中,可以通过第一候选对象的传感器特征预测值和第二候选对象的传感器特征预测值的平均值,或加权平均值,作为第一目标对象的传感器特征预测值。其中,加权方式可以根据第一候选对象和第二候选对象的特征确定,还可以基于第一候选对象与传感器之间的关系,及第一候选对象与传感器之间的关系确定,还可以根据其他因素确定,在此不做限定。
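对于上述“将无法区分的两个候选对象合并输出为一个目标对象,并由二者的传感器特征预测值确定合并后的预测值”的处理,下面给出一个简化示意(Python)。阈值与加权方式(此处按与仿真车辆距离的倒数加权)均为假设:

import math

def maybe_merge(c1, c2, pos_th=1.0, vel_th=0.5):
    """c1/c2: 候选对象,含相对位置(x, y)、相对速度v与特征预测值snr/rcs(示意结构)。

    当相对位置差与相对速度差均小于阈值时,合并为一个目标对象;
    其特征预测值取两者的加权平均(此处按距离倒数加权,仅为一种假设)。
    """
    d_pos = math.hypot(c1["x"] - c2["x"], c1["y"] - c2["y"])
    d_vel = abs(c1["v"] - c2["v"])
    if d_pos >= pos_th or d_vel >= vel_th:
        return None                    # 可区分,仍作为两个目标输出
    w1 = 1.0 / max(math.hypot(c1["x"], c1["y"]), 1e-6)
    w2 = 1.0 / max(math.hypot(c2["x"], c2["y"]), 1e-6)
    return {
        "x": (c1["x"] + c2["x"]) / 2, "y": (c1["y"] + c2["y"]) / 2,
        "v": (c1["v"] + c2["v"]) / 2,
        "snr": (w1 * c1["snr"] + w2 * c2["snr"]) / (w1 + w2),
        "rcs": (w1 * c1["rcs"] + w2 * c2["rcs"]) / (w1 + w2),
    }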
如图7b所示,在另一种可能的实施例中(例如,条件2),在确定所述第一候选对象相对所述仿真车辆的第一位置信息,与所述第二候选对象相对所述仿真车辆的第二位置信息小于第一位置阈值后,还可以通过角度信息,判断第一候选对象和第二候选对象是否满足接近的条件,会被传感器认为是同一个目标对象。在一些实施例中,所述第一候选对象和所述第二候选对象还满足:所述第一候选对象相对所述传感器的第一角度信息,与所述第二候选对象相对所述传感器的第二角度信息小于第一角度阈值。
如图7c所示,在另一种可能的实施例中(例如,条件3),在确定所述第一候选对象相对所述仿真车辆的第一位置信息,与所述第二候选对象相对所述仿真车辆的第二位置信息小于第一位置阈值后,还可以通过速度信息,判断第一候选对象和第二候选对象是否满足接近的条件,会被传感器认为是同一个目标对象。在一些实施例中,所述第一候选对象和所述第二候选对象还满足:所述第一候选对象相对所述仿真车辆的第一速度信息,与所述第二候选对象相对所述仿真车辆的第二速度信息小于第一速度阈值。
在另一种可能的实施例中(例如,条件4),在确定所述第一候选对象相对所述仿真车辆的第一位置信息,与所述第二候选对象相对所述仿真车辆的第二位置信息小于第一位置阈值后,还可以通过加速度信息,判断第一候选对象和第二候选对象是否满足接近的条件,会被传感器认为是同一个目标对象。在一些实施例中,所述第一候选对象和所述第二候选对象还满足:所述第一候选对象相对所述仿真车辆的第一加速度信息,与所述第二候选对象相对所述仿真车辆的第二加速度信息小于第一加速度阈值。
具体的阈值可以根据传感器的分辨率参数设置,还可以通过其他方式确定,例如,通过采集到的传感器的测量信息确定,在此不做限定。
当然,还可以通过其他方式判断第一候选对象和第二候选对象是否会被传感器误认为是同一个目标对象的条件。
在一些实施例中,结合上述条件,还可以额外增加不同天气情况下,判断第一候选对象和第二候选对象是否会被传感器误认为是同一个目标对象的条件。
以条件1为例,例如,在有积雪影响下,可以根据传感器模型输出的特征值,确定积雪影响是否需要考虑,从而,选择在考虑积雪影响下的第二位置阈值。第二位置阈值可能是比第一位置阈值更大的阈值,因为积雪导致传感器更容易无法区分2个候选对象。如图7d所示,第一候选对象的尾部和第二候选对象的尾部被预测出有积雪的特征信息,因此,可以选择第二位置阈值,确定传感器是否无法区分第一候选对象与第二候选对象。例如,所述第一候选对象相对所述仿真车辆的第一位置信息,与所述第二候选对象相对所述仿真车辆的第二位置信息小于第二位置阈值。此时,可以认为传感器无法区分2个候选对象,采用传感器对该目标对象进行测量时,测量结果应为1个目标对象的测量结果。
相应的,还可以根据不同的天气设置条件2、条件3和条件4,可以参考条件1的相应设置,在此不再赘述。
在一些实施例中,可以设置满足上述至少几个条件时,才会被传感器认为是同一个目标对象。例如,在确定满足所有条件时,才被才会被传感器认为是同一个目标对象。还可以设置为满足至少3个条件时,会被传感器认为是同一个目标对象。满足条件的数量可以根据传感器的精度设置,在此不做限定。相应的,可以设置在不满足上述至少几个条件时,会被传感器认为是不同的目标对象。
可选的,还可以为上述条件设置优先级,例如,条件1的优先级最高,条件4的优先级最低。从而,可以更好的模拟传感器在输出目标对象时不同的目标对象被误判的场景。
从而用于反映传感器对于挨得较近的两个物体,传感器可能会出现无法区分的情况这一特性。
可选的,在传感器模型输出的预测信息上,还可以添加噪声模拟,用于模拟真实传感器受到外界环境噪声影响带来的误差。
例如,可以对输出的目标对象位姿状态信息和传感器模型输出的特征信息分别添加高斯白噪声。其中,噪声功率根据真实传感器参数进行选择,在此不做限定。
考虑到目标对象的位姿状态信息是在测试环境中提取到的理想数值,因此,通过误差模拟能仿真真实传感器数据受到环境噪声影响的特性。
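对于上述噪声模拟,下面给出一个在输出量上叠加零均值高斯白噪声的示意(Python)。噪声功率(方差)应按真实传感器参数选取,此处数值仅为假设:

import numpy as np

rng = np.random.default_rng(seed=0)

def add_gaussian_noise(value, noise_power):
    """叠加零均值高斯白噪声;noise_power为噪声功率(方差),应按真实传感器参数选取。"""
    return value + rng.normal(0.0, noise_power ** 0.5)

# 示例:分别对位姿状态信息与特征信息预测值加噪(方差为假设值)
r_noisy = add_gaussian_noise(30.0, noise_power=0.04)     # 相对距离, m
snr_noisy = add_gaussian_noise(18.0, noise_power=0.25)   # SNR预测值
print(r_noisy, snr_noisy)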
图8a为本申请实施例传感器的测试系统的一种示例性功能框图。如图8a所示,该系统可以应用在测试装置中,也可以应用在其他的使用载体中。下面以云服务器为载体来进行说明。该系统包括至少一个传感器模块(可以是上述训练得到的传感器模型)、传感器探测范围筛选模块、物理特性筛选模块、噪声模拟模块、决策模块和场景模块,其中,传感器模块可以是如图4d或图4e所示的、对图1a所示的传感器系统120中的任一种或多种传感器进行模拟的传感器模型;决策模块和场景模块可以作为一个整体集成在一个测试装置(例如,计算机系统160)中,传感器模块、决策模块和场景模块也可以是彼此独立的模块,并共享测试环境的存储器。需要说明的是,本申请传感器模块、决策模块和场景模块可以采用任意可实现的组合方式实现,本申请不做具体限定。为了更好地理解本申请实施例,以下以与图8a所示的系统相同或相似的系统为例对本申请实施例进行说明。图8a所示的测试系统的应用场景中可以包括测试装置,其中,测试装置可以是具有传感器模型的测试装置,测试装置的网元包括支持运行仿真软件的硬件设备,如个人计算机、服务器、车载移动终端、工控机、嵌入式设备等。例如,测试装置可以由云端的服务器或虚拟机实现。测试装置还可以是支持运行仿真软件的芯片。下面以一个具体的示例说明本申请提供的一种车辆的仿真方法,如图8b所示,包括:
步骤801:确定测试环境中传感器和目标对象的参数。
具体可以参见步骤601。
步骤802:通过传感器的探测范围,判断目标对象相对于传感器是否可见。若是,则执行步骤803,若否,则执行步骤808。
步骤803:根据测试环境中传感器的测试信息和目标对象的测试信息,确定雷达传感器模型输出的目标对象的预测数据。
具体可以参见步骤602。
步骤804:通过目标对象的SNR的预测值,判断该目标对象相对于传感器是否可见。若是,则执行步骤805,若否,则执行步骤808。
步骤805:根据传感器的物理特性和目标对象的位姿状态信息,确定是否存在无法区分的至少两个目标对象。若是,则执行步骤806,若否,则执行步骤807。
步骤806:将无法区分的至少两个目标对象更新为一个目标对象,并更新该目标对象的预测信息。
步骤807:输出目标对象的预测信息至决策模块。
步骤808:删除该目标对象的预测信息。
如图9所示,为本申请提供一种车辆的仿真装置900的结构示意图,该车辆的仿真装置900可以包括:传感器特征预测模块901和输出模块902。该车辆的仿真装置900可以应用于测试装置,其中,测试装置可以是具有传感器模型的测试装置,测试装置的网元可以包括支持运行仿真软件的硬件设备,如个人计算机、服务器、车载移动终端、工控机、嵌入式设备等。例如,测试装置可以由云端的服务器或虚拟机实现。测试装置还可以是支持运行仿真软件的芯片。
可选的,该车辆的仿真装置900还可以包括:第一确定模块、第二确定模块、第三确定模块。可选的,该车辆的仿真装置900还可以包括传感器模型训练模块,该传感器模型训练模块用于训练传感器模型,传感器模型训练模块,可以包括:获取模块和训练模块。
其中,传感器特征预测模块901,可以用于将第一目标车辆相对仿真车辆的位置信息和速度信息及所述仿真车辆的道路环境信息,输入至传感器模型中,得到所述第一目标车辆的传感器特征预测值;所述传感器特征预测值包括以下至少一项:雷达反射截面RCS预测值和信噪比SNR预测值;其中,所述传感器模型用于仿真所述仿真车辆中的传感器,所述第一目标车辆为所述仿真车辆所在的测试环境中的车辆,所述第一目标车辆相对所述仿真车辆的位置信息和速度信息及所述仿真车辆的道路环境信息为根据所述测试环境确定的,所述传感器模型为根据所述传感器的测量数据及标注的道路环境信息训练得到的;
输出模块902,用于将所述第一目标车辆的传感器特征预测值输入至所述仿真车辆的决策模块,获得所述仿真车辆的仿真决策结果;其中,该决策模块用于输出基于所述传感器特征预测值确定的车辆行驶决策。该决策模块可以是车辆的仿真装置900中的模块,也可以是另外设置的模块,在此不做限定。
一种可能的实现方式,该车辆的仿真装置900还可以包括:
第一确定模块,用于根据候选车辆相对所述仿真车辆的位置信息和速度信息,在所述候选车辆中将在所述传感器的探测范围内的车辆确定为所述第一目标车辆;所述候选车辆相对所述仿真车辆的位置信息和速度信息为根据所述测试环境确定的;所述候选车辆为所述仿真车辆所在的测试环境中的车辆。
一种可能的实现方式,该车辆的仿真装置900还可以包括:第二确定模块,用于确定 所述第一目标车辆的SNR预测值大于可见阈值。
一种可能的实现方式,还包括:第三确定模块,用于根据第一候选车辆的传感器特征预测值和第二候选车辆的传感器特征预测值确定所述第一目标车辆的传感器特征预测值;所述第一目标车辆包括第一候选车辆和第二候选车辆;所述第一候选车辆和所述第二候选车辆满足:第一位置相对第二位置的相对位置小于第一位置阈值;所述第一位置为所述第一候选目标车辆相对所述仿真车辆的位置,所述第二位置为所述第二候选目标车辆相对所述仿真车辆的位置。
一种可能的实现方式,所述第一候选车辆和所述第二候选车辆还满足:第一速度相对第二速度的相对速度小于第一速度阈值;所述第一速度为所述第一候选目标车辆相对所述仿真车辆的速度,所述第二速度为所述第二候选目标车辆相对所述仿真车辆的速度。
一种可能的实现方式,该车辆的仿真装置900还包括:传感器模型训练模块,所述传感器模型训练模块包括:
获取模块,用于获取传感器的测量数据;所述测量数据包括:第二目标车辆相对所述传感器的位置信息、速度信息以及所述传感器采集的所述第二目标车辆的传感器特征值;所述传感器特征值包括:RCS测量值和SNR测量值;所述传感器位于测量车辆中,所述第二目标车辆为所述测量车辆附近的车辆;
训练模块,用于根据所述传感器的测量数据及获得的标注信息进行训练,得到传感器模型;所述标注信息包括以下至少一项:所述第二目标车辆相对所述传感器的横摆角、所述传感器采集数据时标注的道路环境信息、所述测量车辆的车辆信息;所述传感器模型的输入为所述第一目标车辆相对所述传感器的位置信息、速度信息及所述标注信息,所述传感器模型的输出为所述第一目标车辆的传感器特征预测值。
需要说明的是,本申请上述实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,另外,在本申请各个实施例中的各功能模块可以集成在一个处理模块中,也可以单独物理存在,也可以两个或两个以上模块集成在一个模块中。上述各个模块中的一个或多个可以以软件、硬件、固件或其结合实现。所述软件或固件包括但不限于计算机程序指令或代码,并可以被硬件处理器所执行。所述硬件包括但不限于各类集成电路,如中央处理单元(CPU)、数字信号处理器(DSP)、现场可编程门阵列(FPGA)或专用集成电路(ASIC)。
集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品可以存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器(processor)执行本申请各个实施例方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
如图10所示,为本申请提供又一种车辆的仿真装置的结构示意图,车辆的仿真装置1000中包括:通信接口1010、处理器1020,以及存储器1030。
通信接口1010和存储器1030与处理器1020之间相互连接。可选的,通信接口1010和存储器1030与处理器1020之间可以通过总线相互连接;总线可以是外设部件互连标准 (peripheral component interconnect,PCI)总线或扩展工业标准结构(extended industry standard architecture,EISA)总线等。总线可以分为地址总线、数据总线、控制总线等。为便于表示,图10中仅用一条粗线表示,但并不表示仅有一根总线或一种类型的总线。
通信接口1010用于实现车辆的仿真装置中的通信。例如,将第一目标车辆相对仿真车辆的位置信息和速度信息及所述仿真车辆的道路环境信息,输入至传感器模型中,得到所述第一目标车辆的传感器特征预测值;所述传感器特征预测值包括以下至少一项:雷达反射截面RCS预测值和信噪比SNR预测值;其中,所述传感器模型用于仿真所述仿真车辆中的传感器,所述第一目标车辆为所述仿真车辆所在的测试环境中的车辆,所述第一目标车辆相对所述仿真车辆的位置信息和速度信息及所述仿真车辆的道路环境信息为根据所述测试环境确定的,所述传感器模型为根据所述传感器的测量数据及标注的道路环境信息训练得到的;将所述第一目标车辆的传感器特征预测值输入至所述仿真车辆的决策模块,获得所述仿真车辆的仿真决策结果;其中,所述决策模块用于输出基于所述传感器特征预测值确定的车辆行驶决策。
通信接口1010还可以用于实现车辆的仿真装置与其他设备之间的通信。
处理器1020用于实现上述如图4b~图8b所示的车辆的仿真方法,具体可以参见上述图4b~图8b所示的实施例中的描述,此处不再赘述。可选的,处理器1020可以是中央处理器(central processing unit,CPU),或者其他硬件芯片。上述硬件芯片可以是专用集成电路(application-specific integrated circuit,ASIC),可编程逻辑器件(programmable logic device,PLD)或其组合。上述PLD可以是复杂可编程逻辑器件(complex programmable logic device,CPLD),现场可编程逻辑门阵列(field-programmable gate array,FPGA),通用阵列逻辑(generic array logic,GAL)或其任意组合。处理器1020在实现上述功能时,可以通过硬件实现,当然也可以通过硬件执行相应的软件实现。
存储器1030用于存放程序指令和数据等。具体地,程序指令可以包括程序代码,该程序代码包括计算机操作的指令。存储器1030可能包含随机存取存储器(random access memory,RAM),也可能还包括非易失性存储器(non-volatile memory),例如至少一个磁盘存储器。处理器1020执行存储器1030所存放的程序,并通过上述各个部件,实现上述功能,从而最终实现以上实施例提供的方法。
如图11所示,为本申请提供一种传感器的仿真装置的结构示意图,该装置可以包括:获取模块和训练模块。该装置可以应用于测试装置。
其中,获取模块1101,用于获取传感器的测量数据;所述测量数据包括:第二目标车辆相对所述传感器的位置信息、速度信息以及所述传感器采集的所述第二目标车辆的传感器特征测量值;所述传感器特征测量值包括:RCS测量值和SNR测量值;所述传感器位于测量车辆中;所述第二目标车辆为所述测量车辆附近的车辆;
训练模块1102,用于根据所述传感器的测量数据及获得的标注信息进行训练,得到传感器模型;所述传感器模型的样本输入为所述第二目标车辆相对所述传感器的位置信息、速度信息及标注信息,所述传感器模型的输出为所述第二目标车辆的传感器特征预测值;所述第二目标车辆的传感器特征预测值包括以下至少一项:RCS预测值和SNR预测值;
所述标注信息包括以下至少一项:所述第二目标车辆相对所述传感器的横摆角、所述传感器采集数据时标注的道路环境信息、所述传感器所在的车辆信息。
需要说明的是,本申请上述实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,另外,在本申请各个实施例中的各功能模块可以集成在一个处理模块中,也可以单独物理存在,也可以两个或两个以上模块集成在一个模块中。上述各个模块中的一个或多个可以以软件、硬件、固件或其结合实现。所述软件或固件包括但不限于计算机程序指令或代码,并可以被硬件处理器所执行。所述硬件包括但不限于各类集成电路,如中央处理单元(CPU)、数字信号处理器(DSP)、现场可编程门阵列(FPGA)或专用集成电路(ASIC)。
集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品可以存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器(processor)执行本申请各个实施例方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
如图12所示,为本申请提供一种传感器的仿真装置的结构示意图,传感器的仿真装置1200中可以包括:通信接口1210、处理器1220,以及存储器1230。
通信接口1210和存储器1230与处理器1220之间相互连接。可选的,通信接口1210和存储器1230与处理器1220之间可以通过总线相互连接;总线可以是外设部件互连标准(peripheral component interconnect,PCI)总线或扩展工业标准结构(extended industry standard architecture,EISA)总线等。总线可以分为地址总线、数据总线、控制总线等。为便于表示,图12中仅用一条粗线表示,但并不表示仅有一根总线或一种类型的总线。
通信接口1210可以用于实现传感器的仿真装置与其他设备(例如,与车辆的仿真装置1000)之间的通信。例如,使得车辆的仿真装置获得传感器模型。
处理器1220用于实现上述如图4b所示的传感器的仿真方法,具体可以参见上述图4b所示的实施例中的描述,此处不再赘述。可选的,处理器1220可以是中央处理器(central processing unit,CPU),或者其他硬件芯片。上述硬件芯片可以是专用集成电路(application-specific integrated circuit,ASIC),可编程逻辑器件(programmable logic device,PLD)或其组合。上述PLD可以是复杂可编程逻辑器件(complex programmable logic device,CPLD),现场可编程逻辑门阵列(field-programmable gate array,FPGA),通用阵列逻辑(generic array logic,GAL)或其任意组合。处理器1220在实现上述功能时,可以通过硬件实现,当然也可以通过硬件执行相应的软件实现。
存储器1230用于存放程序指令和数据等。具体地,程序指令可以包括程序代码,该程序代码包括计算机操作的指令。存储器1230可能包含随机存取存储器(random access memory,RAM),也可能还包括非易失性存储器(non-volatile memory),例如至少一个磁盘存储器。处理器1220执行存储器1230所存放的程序,并通过上述各个部件,实现上述功能,从而最终实现以上实施例提供的方法。
本申请提供一种计算机可读存储介质,包括计算机指令,当所述计算机指令在被处理器运行时,使得所述车辆的仿真装置执行上述实施例中所述的任一种可能的方法。
本申请提供一种计算机可读存储介质,包括计算机指令,当所述计算机指令在被处理器运行时,使得所述传感器的仿真装置执行上述实施例中所述的任一种可能的方法。
本申请提供一种计算机程序产品,当所述计算机程序产品在处理器上运行时,使得所述车辆的仿真装置执行上述实施例中所述的任一种可能的方法。
本申请提供一种计算机程序产品,当所述计算机程序产品在处理器上运行时,使得所述传感器的仿真装置执行上述实施例中所述的任一种可能的方法。
本领域内的技术人员应明白,本申请的实施例可提供为方法、系统、或计算机程序产品。因此,本申请可采用完全硬件实施例、完全软件实施例、或结合软件和硬件方面的实施例的形式。而且,本申请可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。
本申请是参照根据本申请的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
显然,本领域的技术人员可以对本申请进行各种改动和变型而不脱离本申请的精神和范围。这样,倘若本申请的这些修改和变型属于本申请权利要求及其等同技术的范围之内,则本申请也意图包含这些改动和变型在内。

Claims (19)

  1. 一种车辆的仿真方法,其特征在于,包括:
    将第一目标车辆相对仿真车辆的位置信息和速度信息及所述仿真车辆的道路环境信息,输入至传感器模型中,得到所述第一目标车辆的传感器特征预测值;所述传感器特征预测值包括以下至少一项:雷达反射截面RCS预测值和信噪比SNR预测值;
    其中,所述传感器模型用于仿真所述仿真车辆中的传感器,所述第一目标车辆为所述仿真车辆所在的测试环境中的车辆,所述第一目标车辆相对所述仿真车辆的位置信息和速度信息及所述仿真车辆的道路环境信息为根据所述测试环境确定的,所述传感器模型为根据所述传感器的测量数据及标注的道路环境信息训练得到的;
    将所述第一目标车辆的传感器特征预测值输入至所述仿真车辆的决策模块,获得所述仿真车辆的仿真决策结果;其中,所述决策模块用于输出基于所述传感器特征预测值确定的车辆行驶决策。
  2. 如权利要求1所述的方法,其特征在于,所述第一目标车辆为根据候选车辆相对所述仿真车辆的位置信息和速度信息,在所述候选车辆中确定出的在所述传感器的探测范围内的车辆;所述候选车辆相对所述仿真车辆的位置信息和速度信息为根据所述测试环境确定的,所述候选车辆为所述仿真车辆所在的测试环境中的车辆。
  3. 如权利要求1或2所述的方法,其特征在于,所述方法还包括:
    确定所述第一目标车辆的SNR预测值大于可见阈值。
  4. 如权利要求1-3任一项所述的方法,其特征在于,所述第一目标车辆包括第一候选车辆和第二候选车辆;所述第一目标车辆的传感器特征预测值为根据所述第一候选车辆的传感器特征预测值和第二候选车辆的传感器特征预测值确定的;
    所述第一候选车辆和所述第二候选车辆满足:第一位置相对第二位置的相对位置小于第一位置阈值;所述第一位置为所述第一候选目标车辆相对所述仿真车辆的位置,所述第二位置为所述第二候选目标车辆相对所述仿真车辆的位置。
  5. 如权利要求4所述的方法,其特征在于,所述第一候选车辆和所述第二候选车辆还满足:
    第一速度相对第二速度的相对速度小于第一速度阈值;所述第一速度为所述第一候选目标车辆相对所述仿真车辆的速度,所述第二速度为所述第二候选目标车辆相对所述仿真车辆的速度。
  6. 如权利要求1-5任一项所述的方法,其特征在于,所述传感器模型为根据所述传感器的测量数据及标注的道路环境信息训练得到的,包括:
    获取传感器的测量数据;所述测量数据包括:第二目标车辆相对所述传感器的位置信息、速度信息以及所述传感器采集的所述第二目标车辆的传感器特征值;所述传感器特征值包括:RCS测量值和SNR测量值;所述传感器位于测量车辆中,所述第二目标车辆为所述测量车辆附近的车辆;
    根据所述传感器的测量数据及获得的标注信息进行训练,得到传感器模型;所述标注信息包括以下至少一项:所述第二目标车辆相对所述传感器的横摆角、所述传感器采集数据时标注的道路环境信息、所述传感器所在的车辆信息;所述传感器模型的输入为所述第一目标车辆相对所述传感器的位置信息、速度信息及所述标注信息,所述传感器模型的输出为所述第一目标车辆的传感器特征预测值。
  7. 一种传感器的仿真方法,其特征在于,包括:
    获取传感器的测量数据;所述测量数据包括:第二目标车辆相对所述传感器的位置信息、速度信息以及所述传感器采集的所述第二目标车辆的传感器特征测量值;所述传感器特征测量值包括:雷达反射截面RCS测量值和信噪比SNR测量值;所述传感器位于测量车辆中;所述第二目标车辆为所述测量车辆附近的车辆;
    根据所述传感器的测量数据及获得的标注信息进行训练,得到传感器模型;所述传感器模型的样本输入为所述第二目标车辆相对所述传感器的位置信息、速度信息及标注信息,所述传感器模型的输出为所述第二目标车辆的传感器特征预测值;所述第二目标车辆的传感器特征预测值包括以下至少一项:RCS预测值和SNR预测值;
    所述标注信息包括以下至少一项:所述第二目标车辆相对所述传感器的横摆角、所述传感器采集数据时标注的道路环境信息、所述传感器所在的车辆信息。
  8. 一种车辆的仿真装置,其特征在于,包括:
    传感器特征预测模块,用于将第一目标车辆相对仿真车辆的位置信息和速度信息及所述仿真车辆的道路环境信息,输入至传感器模型中,得到所述第一目标车辆的传感器特征预测值;所述传感器特征预测值包括以下至少一项:雷达反射截面RCS预测值和信噪比SNR预测值;
    其中,所述传感器模型用于仿真所述仿真车辆中的传感器,所述第一目标车辆为所述仿真车辆所在的测试环境中的车辆,所述第一目标车辆相对所述仿真车辆的位置信息和速度信息及所述仿真车辆的道路环境信息为根据所述测试环境确定的,所述传感器模型为根据所述传感器的测量数据及标注的道路环境信息训练得到的;
    输出模块,用于将所述第一目标车辆的传感器特征预测值输入至所述仿真车辆的决策模块,获得所述仿真车辆的仿真决策结果;其中,所述决策模块用于输出基于所述传感器特征预测值确定的车辆行驶决策。
  9. 如权利要求8所述的装置,其特征在于,还包括:
    第一确定模块,用于根据候选车辆相对所述仿真车辆的位置信息和速度信息,在所述候选车辆中将在所述传感器的探测范围内的车辆确定为所述第一目标车辆;所述候选车辆相对所述仿真车辆的位置信息和速度信息为根据所述测试环境确定的;所述候选车辆为所述仿真车辆所在的测试环境中的车辆。
  10. 如权利要求8或9所述的装置,其特征在于,还包括:
    第二确定模块,用于确定所述第一目标车辆的SNR预测值大于可见阈值。
  11. 如权利要求8-10任一项所述的装置,其特征在于,还包括:
    第三确定模块,用于根据第一候选车辆的传感器特征预测值和第二候选车辆的传感器特征预测值确定所述第一目标车辆的传感器特征预测值;所述第一目标车辆包括第一候选车辆和第二候选车辆;所述第一候选车辆和所述第二候选车辆满足:第一位置相对第二位置的相对位置小于第一位置阈值;所述第一位置为所述第一候选目标车辆相对所述仿真车辆的位置,所述第二位置为所述第二候选目标车辆相对所述仿真车辆的位置。
  12. 如权利要求11所述的装置,其特征在于,所述第一候选车辆和所述第二候选车辆还满足:
    第一速度相对第二速度的相对速度小于第一速度阈值;所述第一速度为所述第一候选目标车辆相对所述仿真车辆的速度,所述第二速度为所述第二候选目标车辆相对所述仿真车辆的速度。
  13. 如权利要求8-12任一项所述的装置,其特征在于,还包括:传感器模型训练模块,所述传感器模型训练模块包括:
    获取模块,用于获取传感器的测量数据;所述测量数据包括:第二目标车辆相对所述传感器的位置信息、速度信息以及所述传感器采集的所述第二目标车辆的传感器特征值;所述传感器特征值包括:RCS测量值和SNR测量值;所述传感器位于测量车辆中,所述第二目标车辆为所述测量车辆附近的车辆;
    训练模块,用于根据所述传感器的测量数据及获得的标注信息进行训练,得到传感器模型;所述标注信息包括以下至少一项:所述第二目标车辆相对所述传感器的横摆角、所述传感器采集数据时标注的道路环境信息、所述测量车辆的车辆信息;所述传感器模型的输入为所述第一目标车辆相对所述传感器的位置信息、速度信息及所述标注信息,所述传感器模型的输出为所述第一目标车辆的传感器特征预测值。
  14. 一种传感器的仿真装置,其特征在于,包括:
    获取模块,用于获取传感器的测量数据;所述测量数据包括:第二目标车辆相对所述传感器的位置信息、速度信息以及所述传感器采集的所述第二目标车辆的传感器特征测量值;所述传感器特征测量值包括:雷达反射截面RCS测量值和信噪比SNR测量值;所述传感器位于测量车辆中;所述第二目标车辆为所述测量车辆附近的车辆;
    训练模块,用于根据所述传感器的测量数据及获得的标注信息进行训练,得到传感器模型;所述传感器模型的样本输入为所述第二目标车辆相对所述传感器的位置信息、速度信息及标注信息,所述传感器模型的输出为所述第二目标车辆的传感器特征预测值;所述第二目标车辆的传感器特征预测值包括以下至少一项:RCS预测值和SNR预测值;
    所述标注信息包括以下至少一项:所述第二目标车辆相对所述传感器的横摆角、所述传感器采集数据时标注的道路环境信息、所述传感器所在的车辆信息。
  15. 一种车辆的仿真装置,其特征在于,包括:
    处理器和接口电路;
    其中,所述处理器通过所述接口电路与存储器耦合,所述处理器用于执行所述存储器中的程序代码,以实现如权利要求1-6中任一项所述的方法。
  16. 一种传感器的仿真装置,其特征在于,包括:
    处理器和接口电路;
    其中,所述处理器通过所述接口电路与存储器耦合,所述处理器用于执行所述存储器中的程序代码,以实现如权利要求7所述的方法。
  17. 一种计算机可读存储介质,其特征在于,包括计算机指令,当所述计算机指令在被处理器运行时,使得所述车辆的仿真装置执行如权利要求1-6中任一项所述的方法或使得所述传感器的仿真装置执行权利要求7所述的方法。
  18. 一种车联网通信系统,其特征在于,包含车载系统和权利要求8-13、15中任一项所述的车辆的仿真装置,其中,车载系统与所述车辆的仿真装置通信连接;
    或包含车载系统和权利要求14、16中任一项所述的传感器的仿真装置,其中,车载系统与所述传感器的仿真装置通信连接。
  19. 一种芯片系统,其特征在于,包括:
    处理器,用于调用存储器中存储的计算机程序或计算机指令,以使得该处理器执行所述存储器中的程序代码,以实现如权利要求1-6中任一项所述的方法或如权利要求7所述的方法。
PCT/CN2022/078997 2021-03-04 2022-03-03 一种车辆、传感器的仿真方法及装置 WO2022184127A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110238478.8A CN115031981A (zh) 2021-03-04 2021-03-04 一种车辆、传感器的仿真方法及装置
CN202110238478.8 2021-03-04

Publications (1)

Publication Number Publication Date
WO2022184127A1 true WO2022184127A1 (zh) 2022-09-09
