CN117437771A - Target state estimation method, device, electronic equipment and medium - Google Patents

Target state estimation method, device, electronic equipment and medium

Info

Publication number
CN117437771A
CN117437771A (application CN202210837658.2A)
Authority
CN
China
Prior art keywords
target
loss
orientation
residual
vehicle
Prior art date
Legal status
Pending
Application number
CN202210837658.2A
Other languages
Chinese (zh)
Inventor
杨贇晨
刘伟俊
孙嘉明
王乃岩
Current Assignee
Beijing Tusimple Technology Co Ltd
Original Assignee
Beijing Tusimple Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Tusimple Technology Co Ltd filed Critical Beijing Tusimple Technology Co Ltd
Priority to CN202210837658.2A priority Critical patent/CN117437771A/en
Priority to EP23184310.3A priority patent/EP4345417A3/en
Priority to JP2023114683A priority patent/JP2024012162A/en
Priority to AU2023204626A priority patent/AU2023204626A1/en
Priority to US18/351,955 priority patent/US20240025428A1/en
Publication of CN117437771A publication Critical patent/CN117437771A/en
Pending legal-status Critical Current


Classifications

    • G08G1/0125 Traffic data processing
    • G01C21/188 Compensation of inertial measurements for accumulated errors, e.g. by coupling inertial systems with absolute positioning systems
    • B60W50/06 Improving the dynamic response of the control system, e.g. improving the speed of regulation or avoiding hunting or overshoot
    • B60W40/04 Estimation or calculation of driving parameters related to traffic conditions
    • G01C21/1652 Inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G01C21/1656 Inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • G01S13/874 Combination of several radar systems for attitude determination
    • G01S13/931 Radar systems specially adapted for anti-collision purposes of land vehicles
    • G01S15/931 Sonar systems specially adapted for anti-collision purposes of land vehicles
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G08G1/052 Detecting movement of traffic with provision for determining speed or overspeed
    • B60W2050/0022 Gains, weighting coefficients or weighting functions
    • B60W2420/403 Image sensing, e.g. optical camera
    • B60W2554/4041 Dynamic objects: position
    • B60W2554/4042 Dynamic objects: longitudinal speed
    • B60W2554/4044 Dynamic objects: direction of movement, e.g. backwards
    • B60W2556/45 External transmission of data to or from the vehicle
    • B60W60/0027 Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • G01S13/862 Combination of radar systems with sonar systems
    • G01S13/865 Combination of radar systems with lidar systems
    • G01S13/89 Radar systems specially adapted for mapping or imaging
    • G01S13/91 Radar systems specially adapted for traffic control
    • G01S15/89 Sonar systems specially adapted for mapping or imaging
    • G01S17/89 Lidar systems specially adapted for mapping or imaging
    • G01S2013/9318 Controlling the steering
    • G01S2013/93185 Controlling the brakes
    • G01S2013/9319 Controlling the accelerator
    • G01S2013/9323 Alternative operation using light waves
    • G01S2013/9324 Alternative operation using ultrasonic waves
    • G01S7/003 Transmission of data between radar, sonar or lidar systems and remote stations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Acoustics & Sound (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Testing Or Calibration Of Command Recording Devices (AREA)

Abstract

The present disclosure relates to a target state estimation method, comprising: obtaining observed quantities of a target at each moment through a plurality of sensors, at least one observed quantity being obtained through each sensor; determining state quantities of the target to be optimized based on the observed quantities; and optimizing the state quantities of the target at each moment by minimizing a loss function to obtain optimized state quantities of the target at each moment. The loss function includes at least one of a position loss, an orientation loss, a velocity loss, a size loss, and a structural constraint of the target. The disclosed target state estimation method can obtain sufficiently accurate state estimates. A target state estimation apparatus, an electronic device, and a medium are also provided.

Description

Target state estimation method, device, electronic equipment and medium
Technical Field
The present disclosure relates to the field of computers, in particular to the fields of automatic driving and data processing technologies, and more particularly to a target state estimation method, apparatus, electronic device, computer-readable storage medium, and computer program product.
Background
When identifying or observing a target, it is often necessary to accurately estimate the state of the target based on the target measurement data obtained by sensors. As a target moves, parameters such as its speed, heading angle, and acceleration change continuously, so its successive positions are strongly correlated. For example, an important link in unmanned driving is to estimate the position, speed, size, orientation, and other states of other vehicles on the road in real time; this technology largely determines the safety factor of unmanned driving. Therefore, to improve the recognition or observation performance for a target, better state estimation methods are in high demand.
The approaches described in this section are not necessarily approaches that have been previously conceived or pursued. Unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section. Similarly, the problems mentioned in this section should not be considered as having been recognized in any prior art unless otherwise indicated.
Disclosure of Invention
According to one aspect of the present disclosure, there is provided a target state estimation method including: obtaining observed quantities of a target at each moment through a plurality of sensors, wherein at least one observed quantity is obtained through each sensor; determining state quantities of the target to be optimized based on the observed quantities; and optimizing the state quantities of the target at each moment by minimizing a loss function to obtain optimized state quantities of the target at each moment; wherein the loss function includes at least one of a position loss, an orientation loss, a velocity loss, a size loss, and a structural constraint of the target.
According to another aspect of the present disclosure, there is provided a target state estimation apparatus including: an acquisition unit adapted to acquire observed quantities of a target through a plurality of sensors, wherein at least one observed quantity is acquired through each sensor; a construction unit adapted to determine state quantities of the target to be optimized based on the observed quantities; and an optimization unit adapted to optimize the state quantities of the target at each moment by minimizing a loss function to obtain optimized state quantities of the target at each moment; wherein the loss function includes at least one of a position loss, an orientation loss, a velocity loss, a size loss, and a structural constraint of the target.
According to another aspect of the present disclosure, there is provided an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the methods described in the present disclosure.
According to another aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method described in the present disclosure.
According to one or more embodiments of the present disclosure, the target states are estimated by fusing the observed data acquired by the plurality of sensors, and constraints between the target states are formed by a loss function, so that sufficiently accurate state estimates can be obtained, which is critical to subsequent target behavior analysis.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The accompanying drawings illustrate exemplary embodiments and, together with the description, serve to explain exemplary implementations of the embodiments. The illustrated embodiments are for exemplary purposes only and do not limit the scope of the claims. Throughout the drawings, identical reference numerals designate similar, but not necessarily identical, elements.
FIG. 1 is a flowchart illustrating a target state estimation method according to an exemplary embodiment;
FIG. 2 is a schematic diagram illustrating a truck motion model according to an example embodiment;
FIG. 3 is a schematic diagram illustrating a motion model of a vehicle including only a first component according to an exemplary embodiment;
FIG. 4 is a schematic diagram illustrating a sliding time window according to an example embodiment;
FIG. 5 is a block diagram showing a structure of a target state estimation apparatus according to an exemplary embodiment; and
FIG. 6 is a block diagram illustrating an exemplary computing device that may be used in connection with the exemplary embodiments.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In the present disclosure, the use of the terms "first," "second," and the like to describe various elements is not intended to limit the positional relationship, timing relationship, or importance relationship of the elements, unless otherwise indicated, and such terms are merely used to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, and in some cases, they may also refer to different instances based on the description of the context.
The terminology used in the description of the various illustrated examples in this disclosure is for the purpose of describing particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, the elements may be one or more if the number of the elements is not specifically limited. Furthermore, the term "and/or" as used in this disclosure encompasses any and all possible combinations of the listed items.
An important link in unmanned driving is to estimate the position, speed, size, orientation, and other states of other vehicles on the road in real time, which determines the safety factor of unmanned driving to a great extent. Information such as the speed and position of a vehicle can be observed through a corresponding observation model. The data observed by an observation model are often affected by noise and the like, so that the observed data deviate from the actual running data of the vehicle. The observed data therefore need to be corrected; that is, the physical state of the vehicle during running is estimated from the observed data.
Generally, in the state estimation process, each state is estimated from its corresponding observed data; for example, the vehicle speed state is estimated from observed speed information, the vehicle position state is estimated from observed vehicle center point coordinates, and so on. However, there are strong correlations among the various types of observed data for the same vehicle. To improve the accuracy of state estimation and to avoid erroneous estimates when a single observation model fails, the data observed by multiple observation models can be fused to estimate the state of the vehicle more comprehensively, thereby improving the safety factor of unmanned driving.
Accordingly, embodiments of the present disclosure provide a target state estimation method, including: obtaining observed quantities of a target at each moment through a plurality of sensors, wherein at least one observed quantity is obtained through each sensor; determining state quantities of the target to be optimized based on the observed quantities; and optimizing the state quantities of the target at each moment by minimizing a loss function to obtain optimized state quantities of the target at each moment. The loss function includes at least one of a position loss, an orientation loss, a velocity loss, a size loss, and a structural constraint of the target.
According to the embodiment of the disclosure, the target states are estimated by fusing the observation data acquired by the plurality of sensors, and the constraint among the target states is formed through the loss function, so that a sufficiently accurate and robust state estimation result can be obtained, and the method is important for subsequent target behavior analysis.
Fig. 1 shows a flowchart of a target state estimation method according to an embodiment of the present disclosure. As shown in fig. 1, in step 110, observations of the target at each time instant are obtained by a plurality of sensors, wherein at least one observation is obtained by each sensor.
In an embodiment of the present disclosure, the target may comprise a vehicle. Thus, according to some embodiments, the observed quantity may comprise: at least one of speed, position, and orientation of the target vehicle at each time instant; and a dimension of the target vehicle, the dimension may include at least one of a length, a width, and a height.
According to some embodiments, the plurality of sensors may include at least one of: an image acquisition device and a point cloud acquisition device. By way of example, the image acquisition device may include a wide variety of devices such as a vision camera, an infrared camera, or a camera for ultraviolet or X-ray imaging, each providing different detection accuracy and range. A vision camera can capture information such as the running state of a target in real time; an infrared camera can capture targets at night; and a camera for ultraviolet or X-ray imaging can image a target in various complex environments (night, bad weather, electromagnetic interference, etc.). The point cloud acquisition device may likewise include a variety of devices such as a laser radar (LiDAR), a millimeter-wave radar, or an ultrasonic sensor, again with different detection accuracies and ranges. Lidar can detect target edges and shape information for target identification and tracking; millimeter-wave radar can measure the distance to a target using the characteristics of electromagnetic waves; and an ultrasonic sensor can measure the distance to a target by exploiting the strong directivity of ultrasonic waves. Radar apparatus can also measure the velocity of a moving target by means of the Doppler effect.
According to some embodiments, the plurality of sensors may be located on at least one observation vehicle or on a roadside device. For example, various sensors may be mounted in front of, behind, or other locations of the vehicle during travel of the autonomous vehicle to enable real-time observation of surrounding vehicles. Alternatively, various sensors are located on the road side equipment to observe in real time the targets such as vehicles, pedestrians, etc. passing through the road side equipment.
In some examples, the roadside device may include an electronic device, a communication device, etc., and the electronic device may be integrated with the communication device or may be separately provided. The electronic device can acquire data observed by various sensors, so as to perform data processing and calculation, obtain corresponding observed quantity, and transmit processing and calculation results to the computing device through the communication device. Optionally, the electronic device may also be disposed at a cloud end, so as to obtain data observed by multiple sensors on the roadside device through the communication device, and obtain a corresponding observed quantity through data analysis and calculation.
According to some embodiments, a target state estimation method according to the present disclosure may be implemented in a computing device that obtains at least one observed quantity through each sensor. That is, the observed amounts of the targets acquired by the various sensors at each time point can be analyzed online or offline by the computing device. The computing device may reside on at least one observation vehicle, on a roadside device, or in the cloud, without limitation.
According to some embodiments, the observed quantity may be obtained by an observation model corresponding to each sensor. Illustratively, the observation model includes at least one of: an image-based binocular ranging algorithm, an image-based monocular ranging algorithm, a point cloud-based ranging algorithm, an image and map-based projected ranging algorithm, and a point cloud and map-based projected ranging algorithm.
In the present disclosure, the observation model may be analyzed and calculated based on data acquired by the sensor to output an observed quantity at each time corresponding to the target. Specifically, in some examples, based on a projection ranging algorithm, the center point coordinates of surrounding vehicles, the four corner point coordinates of the detection frame, and the like may be obtained; the coordinates, speed, etc. of the center points of the surrounding vehicles can be obtained based on ranging algorithms such as binocular ranging algorithm, monocular ranging algorithm, etc.
According to some embodiments, the target is a multi-level structure, and the structural constraints comprise structural constraints between the levels of the multi-level structure. In this case, the observed quantities may include observed quantities respectively corresponding to at least two levels in the multi-level structure.
For example, in embodiments where the target is a vehicle, the target vehicle may be a truck that is a two-stage structure, i.e., the truck has a tractor in a first stage and a trailer in a second stage, with a pivot (or hinge) structure connecting the tractor and the trailer forming a structural constraint therebetween.
Fig. 2 shows a schematic diagram of a truck motion model according to an embodiment of the present disclosure. As shown in fig. 2, a tractor 201 and a trailer 202 are connected by a pivot structure 203. In some embodiments, the tractor 201 may be processed based on a motion model of a vehicle that includes only a primary structure, with the motion observations of the trailer constraining the motion observations of the tractor. A vehicle including only a primary structure may be, for example, a unicycle, a general four-wheeled vehicle, or the like.
FIG. 3 illustrates a schematic diagram of a motion model of a vehicle including only a first component according to an embodiment of the present disclosure. In some examples, the speed direction of the vehicle is distinguished from the heading direction to improve the accuracy of vehicle state estimation. In the motion model shown in fig. 3, $o$ is the orientation of the vehicle (i.e., the heading direction), and $\theta$ is the speed direction of the vehicle. Suppose the vehicle moves with speed $v_i$ between time $t_i$ and time $t_{i+1}$; then the transformations shown in formulas (1) and (2) hold.
$px_{i+1} = px_i + v_i\cos\theta_i$   Formula (1)

$py_{i+1} = py_i + v_i\sin\theta_i$   Formula (2)
wherein $px_i$ and $py_i$ respectively denote the coordinates of the vehicle center point at time $t_i$; $px_{i+1}$ and $py_{i+1}$ respectively denote the coordinates of the vehicle center point at time $t_{i+1}$; and $\theta_i$ denotes the angle between the vehicle speed direction at time $t_i$ and the x-direction of the reference coordinate system.
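As an illustrative aside (not part of the original description), the propagation of formulas (1) and (2) amounts to a few lines of Python; treating $v_i$ as the per-frame displacement magnitude is an assumption read off the formulas, which carry no explicit time step.

```python
import math

def propagate_center(px: float, py: float, v: float, theta: float) -> tuple[float, float]:
    """One step of formulas (1) and (2): advance the vehicle center point.

    v is the per-frame displacement magnitude (speed already scaled by the
    frame interval, by assumption); theta is the speed direction relative to
    the x-axis of the reference coordinate system.
    """
    return px + v * math.cos(theta), py + v * math.sin(theta)

# Example: a center point at (10.0, 2.0) moving 1.5 m per frame at 5 degrees.
px1, py1 = propagate_center(10.0, 2.0, 1.5, math.radians(5.0))
```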
In the present disclosure, the reference coordinate system is a coordinate system determined based on the observation vehicle or roadside equipment on which the plurality of sensors are located. For example, when the plurality of sensors are located on an observation vehicle, the reference coordinate system is the coordinate system used to describe the relationship between objects around the vehicle and the vehicle itself. The origin differs according to different definitions; for example, the center of gravity may be taken as the origin of an extended right-hand coordinate system serving as the reference coordinate system, or the IMU (inertial measurement unit) may be taken as the origin of the defined reference coordinate system.
It will be appreciated that any suitable reference coordinate system is possible. For example, the reference coordinate system may also take the direction along the lane center line as the longitudinal axis, the horizontal direction perpendicular to the lane center line as the lateral axis, and the vertical direction as the remaining axis, which is not limited herein.
As described above, the four corner coordinates of the vehicle detection frame can be obtained based on the projection ranging algorithm, that is, vehicle contour detection is realized. Therefore, in the vehicle body coordinate system shown in fig. 3, the vector from the vehicle center to the i-th vehicle corner point can be expressed as shown in formula (3):

$\phi_i = R_{bw}\left[\delta_i \cdot \frac{L}{2},\ \epsilon_i \cdot \frac{W}{2}\right]^T$   Formula (3)

wherein $L$ and $W$ are the length and width of the vehicle, respectively; $[\delta_i, \epsilon_i]$ represents the offset of the i-th vehicle corner point relative to the vehicle center point in the reference coordinate system, which is constant for each vehicle corner point; and $R_{bw}$ is the rotation matrix from the reference coordinate system to the ENU (East-North-Up) coordinate system, expressed as formula (4):

$R_{bw} = \begin{bmatrix}\cos o & -\sin o \\ \sin o & \cos o\end{bmatrix}$   Formula (4)
Thus, a vehicle can be determined from information such as its speed, orientation, size, and center point position.
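The corner geometry of formulas (3) and (4) can be sketched as follows; reading $[\delta_i, \epsilon_i]$ as a sign pair in $\{-1, +1\}^2$ that selects one of the four detection-frame corners is an assumption consistent with the reconstruction above.

```python
import numpy as np

def corner_offset(L: float, W: float, o: float, delta: int, eps: int) -> np.ndarray:
    """Vector from the vehicle center to one contour corner, formulas (3) and (4)."""
    R_bw = np.array([[np.cos(o), -np.sin(o)],
                     [np.sin(o),  np.cos(o)]])  # rotation to ENU, formula (4)
    return R_bw @ np.array([delta * L / 2.0, eps * W / 2.0])

# The four corners of a 4.8 m x 1.9 m vehicle whose body orientation is 30 degrees.
corners = [corner_offset(4.8, 1.9, np.radians(30.0), d, e)
           for d in (-1, 1) for e in (-1, 1)]
```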
With continued reference to fig. 2, in some embodiments the trailer 202 and the pivot structure 203 typically have the same orientation and can therefore be treated as a single rigid structure. In addition, it may be assumed that the pivot structure 203 connects to the tractor 201 and the trailer 202 at the centers of their facing surfaces. Once the center point coordinate $p_0$, length $L_0$, and width $W_0$ of the tractor 201 are known, the center point coordinate $p_1$ of the trailer 202 can be obtained, as shown in formulas (5)-(7).
$p_1 = p_0 + \mathrm{offset}_0 - \mathrm{offset}_1$   Formula (5)
$\mathrm{offset}_0 = -\frac{L_0}{2}\,[\cos o,\ \sin o]^T$   Formula (6)

$\mathrm{offset}_1 = \left(L_h + \frac{L_1}{2}\right)[\cos\beta,\ \sin\beta]^T$   Formula (7)

wherein $L_1$ is the length of the trailer, $L_h$ is the length of the pivot structure, and $o$ and $\beta$ are the angles of the tractor and the trailer, respectively, relative to the x-axis direction of the reference coordinate system; $\mathrm{offset}_0$ points from the tractor center to the hinge point, and $\mathrm{offset}_1$ points from the trailer center to the hinge point.
In some examples, the detection frames of the tractor and the trailer may be obtained simultaneously by a sensor such as a lidar. Moving the detection frame of the trailer from position $h_1$ (the position of $h_i$ in fig. 2 when i is 1) to position $h_0$ (the position of $h_0$ in fig. 2) is assumed to yield another observation of the tractor, so that the observation of the trailer places a constraint on the observation of the tractor, as shown in formula (8), which inverts formula (5):

$\hat{p}_0 = \hat{p}_1 - \mathrm{offset}_0 + \mathrm{offset}_1$   Formula (8)

wherein $\hat{p}_1$ is the observed trailer center point and $\hat{p}_0$ is the resulting pseudo-observation of the tractor center point.
The angular velocity of the trailer may be as shown in formula (9):

$\dot{\beta} = \frac{v\,\sin(o - \beta)}{L_h + L_1/2}$   Formula (9)
wherein $v$ denotes the speed of the tractor and $\dot{\beta}$ denotes the angular velocity of the trailer. Knowing the speed, orientation, and size of the tractor, the length of the pivot structure, and the position of the tractor, the various states of a truck can be determined.
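A sketch of the tractor-trailer linkage follows, using the reconstructed formulas (5)-(9) above; the signs in offset_0/offset_1 and the denominator of the yaw rate are assumptions of that reconstruction, not authoritative patent algebra.

```python
import numpy as np

def trailer_center(p0, L0, L1, Lh, o, beta) -> np.ndarray:
    """Trailer center from the tractor center via the rigid hinge, formulas (5)-(7)."""
    offset0 = -(L0 / 2.0) * np.array([np.cos(o), np.sin(o)])            # tractor center -> hinge
    offset1 = (Lh + L1 / 2.0) * np.array([np.cos(beta), np.sin(beta)])  # trailer center -> hinge
    return np.asarray(p0, dtype=float) + offset0 - offset1

def tractor_pseudo_observation(p1_obs, L0, L1, Lh, o, beta) -> np.ndarray:
    """Formula (8): map an observed trailer center back to a tractor observation."""
    offset0 = -(L0 / 2.0) * np.array([np.cos(o), np.sin(o)])
    offset1 = (Lh + L1 / 2.0) * np.array([np.cos(beta), np.sin(beta)])
    return np.asarray(p1_obs, dtype=float) - offset0 + offset1

def trailer_yaw_rate(v: float, o: float, beta: float, L1: float, Lh: float) -> float:
    """Formula (9): angular velocity of a trailer towed at the hinge point."""
    return v * np.sin(o - beta) / (Lh + L1 / 2.0)
```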
According to some embodiments, the target is a vehicle, which may comprise a first component and at least one second component, the second component being rotatable relative to the first component. The position of the target may include at least one of: the position of the first component, the position of the at least one second component, and the position of the vehicle; the size of the target includes at least one of: the size of the first component, the size of the at least one second component, and the size of the vehicle; and the orientation of the target includes at least one of: the speed direction, the orientation of the first component, and the direction of the lane in which the vehicle is located.
As described above, a model in which the target vehicle includes a two-stage structure, that is, one first component and one second component, has been described. In some embodiments, there may also be a plurality of second components, for example, a train or a truck with multiple trailers; the motion model may refer to the truck model described above and is not repeated here.
In some embodiments, after the observed quantities of the target at each moment are acquired by the various sensors, the acquired observed quantities may be subjected to data preprocessing. Illustratively, abnormal observations may be deleted, available observations retained, data formats unified, and so on, which is not limited herein.
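A minimal preprocessing sketch; the field names and the speed threshold here are illustrative assumptions, not values from the disclosure.

```python
def preprocess(observations: list[dict], max_speed: float = 60.0) -> list[dict]:
    """Delete abnormal observations, keep available ones, and unify the format."""
    cleaned = []
    for obs in observations:
        speed = obs.get("speed")
        if speed is None or not (0.0 <= speed <= max_speed):
            continue  # abnormal observation: missing or physically implausible speed
        cleaned.append({
            "t": float(obs["t"]),
            "position": tuple(float(c) for c in obs["position"]),  # (x, y), reference frame
            "speed": float(speed),                                  # m/s
            "sensor": obs["sensor"],
        })
    return cleaned
```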
In step 120, a state quantity of the target to be optimized is determined based on the observed quantity.
According to some embodiments, the state quantities include at least one of: at least one of the speed, position, and orientation of the target at each moment, where the speed, position, orientation, and similar state quantities of the target at each moment are instantaneous state quantities. In addition, the state quantities may further include at least one of the average speed, average position, and average orientation of the target over a predetermined period to which each moment belongs; and the size of the target.
In embodiments where the target is a multi-level structure, the observed quantities include observed quantities respectively corresponding to at least two levels in the multi-level structure, and the state quantities accordingly include state quantities respectively corresponding to at least two levels in the multi-level structure. As described above with reference to fig. 2, observed data of the tractor and the trailer, such as their sizes, positions, and speeds, are obtained by the sensors, and one or more state quantities of the tractor and the trailer, such as size, position, orientation, and speed, are estimated.
During driving, the vehicle may observe surrounding vehicles in real time, for example through a plurality of sensors, thereby continuously generating observation data. Thus, in some embodiments, optimization of the physical state of the vehicle may be achieved by constructing a sliding time window. Specifically, the observed quantities of the target vehicle observed by at least one observation model within the sliding time window are acquired, so as to construct, based on those observed quantities, the state quantities describing the physical state of the target vehicle within the sliding time window. When the state quantities of the target to be optimized include the average speed, average position, and average orientation of the target over the predetermined period to which each moment belongs, the average speed, average position, and average orientation may be the average state quantities of the target within the current sliding time window. Fig. 4 shows a schematic diagram of a sliding time window according to an embodiment of the present disclosure. For example, the state quantities to be optimized within the sliding time window may be constructed according to formula (10):
$S = [s_0, s_1, \ldots, s_{n-1}, \bar{v}, \bar{o}]^T$   Formula (10)

wherein the i-th frame state quantity $s_i$ in the sliding time window may include, for example, the state quantities shown in formula (11):
$s_i = [\upsilon_i, \theta_i, o_i]^T$   Formula (11)

wherein $\upsilon_i$, $\theta_i$, and $o_i$ respectively denote the speed magnitude, the speed direction, and the vehicle body orientation of the target vehicle; $\bar{v}$ denotes the average speed magnitude within the sliding time window; and $\bar{o}$ denotes the average vehicle body orientation within the sliding time window.
In some examples, where the target vehicle has a two-stage structure such as a truck, the speed magnitude and speed direction of the target vehicle may be those of the tractor, and the vehicle body orientation is the tractor orientation. The i-th frame state quantity $s_i$ may then also include the trailer orientation $\beta_i$, i.e., $s_i = [\upsilon_i, \theta_i, o_i, \beta_i]^T$.
It will be appreciated that the length n of the sliding time window and the sliding step may be set according to the actual situation, and are not limited herein. In addition, the state quantities to be optimized within the sliding time window shown in formulas (10) and (11) are merely exemplary and not limiting.
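A sketch of assembling the window state of formulas (10) and (11); the flat vector layout and the dictionary field names are assumptions of this illustration.

```python
import numpy as np

def build_window_state(frames: list[dict], with_trailer: bool = False) -> np.ndarray:
    """Stack s_0 .. s_{n-1} plus the window averages into one optimization vector."""
    rows = []
    for s in frames:
        rows += [s["speed"], s["theta"], s["o"]]
        if with_trailer:
            rows.append(s["beta"])  # trailer orientation for two-stage vehicles
    v_bar = float(np.mean([s["speed"] for s in frames]))
    # Note: a plain mean of angles is only safe away from the +/-pi wrap-around;
    # a circular mean would be needed in general.
    o_bar = float(np.mean([s["o"] for s in frames]))
    return np.array(rows + [v_bar, o_bar])
```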
In step 130, the state quantity of the target at each moment is optimized by minimizing a loss function, to obtain the optimized state quantity of the target at each moment, where the loss function includes at least one of a position loss, an orientation loss, a speed loss, a size loss, and a structural constraint of the target.
In the present disclosure, optimization of the state quantities of the target at each moment is achieved by minimizing the loss function. For example, where the loss function includes a position loss, an orientation loss, a velocity loss, and a size loss, the loss function may be constructed based on formula (12).
$E = E_p + E_v + E_o + E_s$   Formula (12)
wherein $E_p$, $E_v$, $E_o$, and $E_s$ denote the position loss, the velocity loss, the orientation loss, and the size loss, respectively. The loss function is determined based on the state quantities to be optimized. Specifically, each of the position loss, orientation loss, velocity loss, and size loss may be determined based on a state quantity to be optimized, the observed quantity corresponding to that state quantity, and any other observed quantities that can provide constraints on that state quantity.
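To make the additive structure of formula (12) concrete, here is a toy minimization in which each residual stands in for one loss term; the observed values are fabricated purely for illustration, and the solver choice is an assumption of this sketch.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy single-frame state [speed, theta, o] with fabricated observations.
obs = {"speed": 12.0, "theta": 0.05, "o": 0.04, "lane_dir": 0.045}

def residuals(state: np.ndarray) -> np.ndarray:
    speed, theta, o = state
    return np.array([
        speed - obs["speed"],   # velocity loss term
        theta - obs["theta"],   # speed-direction term of the orientation loss
        o - obs["o"],           # observed body-orientation term
        o - obs["lane_dir"],    # lane-direction constraint on the orientation
    ])

# least_squares minimizes the sum of squared residuals, mirroring how the
# additive losses of formula (12) couple all state quantities in one problem.
sol = least_squares(residuals, x0=np.zeros(3))
```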
In general, state estimation and fusion algorithms may employ state estimation techniques based on Bayesian filtering. State estimation is performed each time data are acquired, and maximum a posteriori estimation of the vehicle state is carried out using the prior probability and the likelihood probability of the state at each moment. However, such techniques usually require a Markov assumption, namely that the state at the current moment is affected only by the state at the previous estimation moment, and they assume that the state quantities follow a particular distribution. Moreover, such methods do not effectively utilize all observations when used to generate trajectories offline, because state estimation for the current moment uses only the data at and before that moment, not the data after it. Methods such as Rauch-Tung-Striebel smoothing do use data at all moments, but they make Markov and linear-Gaussian-system assumptions, which introduce errors into a complex system.
In an exemplary scenario according to the present disclosure, however, the speed observations of the target vehicle, the position of the target vehicle, and the like can provide constraints on the speed magnitude and speed direction of the target vehicle; the speed prior, the average speed, and the like of the target vehicle can also provide constraints on its speed magnitude and speed direction; and the lane line direction, the speed direction, the target vehicle orientation observed by a lidar sensor, the orientation prior, the average orientation, and the like can provide constraints on the vehicle body orientation of the target vehicle. This will be described in detail below.
In the method, the observed quantities of the target at each moment are obtained through a plurality of sensors, and a corresponding loss function is constructed, realizing the transition from single-sensor identification to multi-sensor fusion. Therefore, during driving, surrounding vehicles can be modeled by combining the sensing results of multiple sensors, and their state information can be updated in real time, so that the unmanned driving system can plan a safe path based on the result and avoid traffic accidents.
According to some embodiments, the position includes the position of at least one reference point, the reference point including at least one of: the center point and contour corner points (e.g., the four corner points of a vehicle detection frame). The position loss includes at least one reference point residual, the reference point residual including at least one of: a center point residual representing the difference between the observed quantity and the state quantity of the center point, and a contour corner residual representing the difference between the observed quantity and the state quantity of a contour corner point.
Specifically, suppose the state quantities of the target vehicle are optimized based on observation data obtained by L observation models, L being a positive integer. If the l-th observation model provides a center point observation $\hat{p}^l$, a center point residual $r_p^l$ can be constructed based on the difference between the observed quantity and the state quantity of the center point. If the l-th observation model also provides a contour observation $\hat{c}^l$, a contour corner residual $r_c^l$ can be constructed based on the difference between the observed quantity and the state quantity of the contour corner points.
In some embodiments, the state quantity of the center point may be characterized in terms of speed, so that the center point residual further optimizes the speed state quantities. Specifically, when the observed quantities include the center point coordinates of the target vehicle at each moment in the sliding time window, and the state quantities include the speed of the target vehicle at each moment in the sliding time window, the center point residual may be calculated from the center point coordinates and the speeds of the target vehicle at each moment in the sliding time window.
Specifically, assume the center point observation of the l-th observation model at time $t_k$ is $\hat{p}_k^l$, and the position coordinate of the first frame of the target vehicle is $p_0$. The center point residual vector of the l-th observation model at time $t_k$ is then as shown in formula (13):

$r_{p,k}^l = \hat{p}_k^l - p_k$   Formula (13)
wherein,
v i =[υ i cos(θ i ),υ i sin(θ i )] T formula (15)
In some embodiments, the state quantity of the contour corner point may be characterized based on the state quantity of the center point to achieve further optimization of the center point state quantity by the contour corner point residual. Specifically, when the observed quantity includes coordinates of the contour corner of the target vehicle at each moment in the sliding time window, the reference point residual may be calculated according to the following: the center point coordinates of the target vehicle at the initial time in the sliding time window, the speed of the target vehicle at each time in the sliding time window, the contour angular point coordinates of the target vehicle at each time in the sliding time window, and the corresponding vector from the center point coordinates of the target vehicle at each time in the sliding time window to the contour angular point coordinates.
Specifically, if the l-th observation model also provides a contour observation ĉ_{k,m}^l, the contour corner residual may be obtained as shown in formula (16):

r_{c,k,m}^l = ĉ_{k,m}^l − (p_k + φ_m)    formula (16)

where φ_m represents the vector from the vehicle center point to the m-th vehicle contour corner point, which can be determined from the center point and contour corner observations according to formula (17).
As described above, in the truck motion model described with reference to fig. 2, the trailer places constraints on the contour observations of the tractor. Thus, when optimizing the corresponding state quantities of the tractor (such as the speed state quantity described above), the constraints of the trailer on the contour observation of the tractor can be further introduced on the basis of the reference point residuals described above.
According to some embodiments, the center point residual and the contour corner residual have corresponding weights, respectively, and the weights are diagonal matrices; each of the center point residual and the contour corner point residual includes a lateral residual component and a longitudinal residual component, and the lateral residual component and the longitudinal residual component have corresponding weights, respectively.
In examples according to the present disclosure, when the target is a vehicle, the lateral direction may be a horizontal direction perpendicular to the approximate orientation of the target vehicle; the longitudinal direction may be a horizontal direction parallel to the approximate orientation of the target vehicle. Specifically, the "approximate orientation" may include, for example, an observed body orientation of the target vehicle, a lane orientation of a lane in which the target vehicle is located (i.e., lane line heading), and the like.
Thus, according to some embodiments, when the target is a vehicle, the lateral residual component is perpendicular to the orientation of the lane in which the vehicle is located and the longitudinal residual component is parallel to that lane orientation; or the lateral residual component is perpendicular to the body orientation of the vehicle and the longitudinal residual component is parallel to the body orientation of the vehicle.
In the present disclosure, estimation of the state quantity focuses on the lateral and longitudinal directions, and the two may be decoupled for convenience of model tuning. In some examples, the body orientation or lane orientation observed by, for example, radar sensors is known, so residuals in the ENU coordinate system can be rotated into the reference coordinate system through the rotation matrix R_bw. The position loss function, including the center point residual and the contour corner point residual, may then be as shown in formula (18):

E_p = Σ_{l,k} ρ(‖W_{p,k}^l R_bw r_{p,k}^l‖²) + Σ_{l,k,m} ρ(‖W_{c,k,m}^l R_bw r_{c,k,m}^l‖²)    formula (18)

where ρ(·) is a robust function; the weight matrices W (diagonal matrices) set different weights for the lateral residual and the longitudinal residual respectively; and R_bw is the rotation matrix described above with reference to formula (4).
In the present disclosure, the robust function ρ(·) may be based on any suitable loss function, including but not limited to Cauchy (Lorentzian), Charbonnier (pseudo-Huber, L1-L2), Huber, Geman-McClure, smooth truncated quadratic, truncated quadratic, Tukey's biweight, and so forth. Illustratively, a convex loss function such as Huber may be selected to keep the problem a convex optimization. However, a convex loss function may have limited robustness to outliers, so in some examples a non-convex loss function may be selected.
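For illustration, a minimal sketch of two such robust functions applied to a squared residual s = ‖r‖² (the Ceres-style scaling is an assumption; the disclosure does not fix a specific form):

```python
import numpy as np

def huber(s, delta=1.0):
    """Convex Huber robust function on a squared residual s = ||r||^2:
    quadratic near zero, linear in ||r|| for large residuals."""
    return np.where(s <= delta**2, s, 2.0 * delta * np.sqrt(s) - delta**2)

def cauchy(s, c=1.0):
    """Non-convex Cauchy (Lorentzian) robust function: its influence is
    bounded, so it tolerates outliers better than a convex loss."""
    return c**2 * np.log1p(s / c**2)
```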
According to some embodiments, when the lateral variance of one of the center point residual and the contour corner point residual is less than a predetermined threshold, the weight of the corresponding lateral residual component takes a first fixed value; when the longitudinal variance of one of the center point residual and the contour corner point residual is smaller than a predetermined threshold, the weight of the corresponding longitudinal residual component takes a first fixed value.
In some examples, taking the center point residual as an example, if the lateral or the longitudinal component of the center point variance is less than its respective first threshold, the weight of the corresponding center point residual component takes a first fixed value; when a variance component is not less than its respective first threshold, the weight of the corresponding residual component is inversely related to that variance component.
In some examples, the contour corner residuals may be similar to the center point residuals described above, i.e., weights corresponding to the contour corner residuals are determined based on contour corner variances.
Specifically, the weight matrix is inversely related to the variance. With the lateral variance and the longitudinal variance known, the weight matrix can be expressed as shown in formula (19):

W = diag(w_long, w_lat), with w = a if σ² < s_0, and w = b/σ² otherwise    formula (19)

where w_long and w_lat are the longitudinal and lateral weights, s_0 is the variance threshold, and a and b are hyperparameters. Limited by the accuracy of the observation model, the true error cannot be accurately reflected when the variance is small; therefore, by formula (19), a fixed weight is used when the variance is below the threshold. In the present disclosure, a weight formula similar to formula (19) may be used for all observation loss terms.
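A minimal sketch of such a weight rule (the piecewise form and all values are illustrative assumptions consistent with the description of formula (19)):

```python
import numpy as np

def residual_weight(variance, a=1.0, b=0.05, var_threshold=0.09):
    """Weight for one residual component: a fixed weight a below the
    variance threshold (where the model cannot reflect the true error),
    otherwise inversely related to the variance."""
    return a if variance < var_threshold else b / variance

# Diagonal weight matrix with separate longitudinal / lateral entries.
W = np.diag([residual_weight(0.04), residual_weight(0.25)])
```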
According to some embodiments, the target is a vehicle, and the transverse residual component and the longitudinal residual component in the center point residual are a transverse residual component and a longitudinal residual component calibrated according to a vehicle size, so that the residual is minimum when the center point observed by the sensor is located in an area occupied by the vehicle.
Specifically, in some embodiments, calibrating the center point residual may include: subtracting the respective second threshold from at least one of the lateral center point residual component and the longitudinal center point residual component in response to the at least one of the lateral center point residual component and the longitudinal center point residual component being greater than the respective second threshold; in response to at least one of the lateral center point residual component and the longitudinal center point residual component being less than a respective third threshold, adding the respective third threshold to at least one of the lateral center point residual component and the longitudinal center point residual component; and setting at least one of the lateral center point residual component and the longitudinal center point residual component to 0 in response to the value of the at least one of the lateral center point residual component and the longitudinal center point residual component being between the respective second threshold and the respective third threshold.
In some embodiments, for the lateral center point residual component, the second threshold is half the lateral dimension (width) of the vehicle and the third threshold is the negative of half the lateral dimension; for the longitudinal center point residual component, the second threshold is half the longitudinal dimension (length) of the vehicle and the third threshold is the negative of half the longitudinal dimension.
Specifically, for observation models such as the projection ranging algorithm and the binocular ranging algorithm, the observed position is related to the observation angle of the device in which the sensor is located. The positions of the observation points differ across viewing angles, and it is impossible to determine which point on the target vehicle was observed. To use these observations, the observed point may be assumed to be uniformly distributed over the size of the vehicle, and the lateral and longitudinal center point residual components may be calibrated according to formulas (20)-(21).
r_l = r̃_l − L/2 if r̃_l > L/2; r_l = r̃_l + L/2 if r̃_l < −L/2; r_l = 0 otherwise    formula (20)

r_d = r̃_d − W/2 if r̃_d > W/2; r_d = r̃_d + W/2 if r̃_d < −W/2; r_d = 0 otherwise    formula (21)

where r̃_l and r̃_d are the longitudinal and lateral center point residual components before calibration, r_l and r_d are the calibrated longitudinal and lateral components, and L and W are the vehicle length and width. That is, the observation model is intended to output the body center point of the target vehicle, but the actual output deviates from it, so the lateral and longitudinal center point residuals are calibrated according to formulas (20)-(21).
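A minimal sketch of this calibration (illustrative names; the half-extent is half the vehicle length for the longitudinal component and half the vehicle width for the lateral one):

```python
def calibrate_center_residual(r, half_extent):
    """Shrink a center point residual component toward zero by half the
    vehicle extent along that axis, so the residual vanishes whenever the
    observed center falls inside the area occupied by the vehicle."""
    if r > half_extent:
        return r - half_extent
    if r < -half_extent:
        return r + half_extent
    return 0.0

r_l = calibrate_center_residual(3.1, half_extent=2.5)    # longitudinal -> 0.6
r_d = calibrate_center_residual(-0.4, half_extent=1.25)  # lateral -> 0.0
```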
According to some embodiments, optimizing the state quantity of the target at each moment by minimizing the loss function may include at least one of: if an observation model outputs a plurality of coordinate values for the same reference point, discarding the output of that observation model; and if a plurality of observation models output a plurality of observations of the same reference point at the same moment, normalizing, in the position loss, the sum of the squared weights of the lateral reference point residual components and the sum of the squared weights of the longitudinal reference point residual components corresponding to the plurality of observations, respectively.
In the state quantity optimization process, the position factor and the speed smoothing factor conflict with each other. To keep the speed smooth when there are multiple position observations, the sum of the squared weights of the position factors may be normalized.
Specifically, there may be a plurality of observation models corresponding to the at least one sensor. When the observed quantity includes a plurality of observed values of the center point of the target vehicle at the same moment output by the plurality of observation models, the sum of the squared weights of the lateral center point residual components corresponding to those observed values in the position loss term is normalized, and the sum of the squared weights of the longitudinal center point residual components is normalized likewise.

When the observed quantity includes a plurality of observed values of the same contour corner of the target vehicle at the same moment output by the plurality of observation models, the sum of the squared weights of the lateral reference point residual components corresponding to those observed values in the position loss term is normalized, and the sum of the squared weights of the longitudinal reference point residual components is normalized likewise.
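As a minimal sketch (illustrative names), the weights of redundant observations of one reference point can be rescaled so that their squares sum to one:

```python
import numpy as np

def normalize_weights(weights):
    """Rescale the weights of residual components that refer to the same
    reference point at the same moment so that the sum of their squares is 1,
    keeping the overall strength of the position factor constant."""
    w = np.asarray(weights, dtype=float)
    norm = np.sqrt(np.sum(w**2))
    return w / norm if norm > 0 else w

lat_w = normalize_weights([0.8, 0.6, 0.4])  # squared weights now sum to 1
```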
According to some embodiments, the speed loss comprises at least one of: speed prior loss and speed smoothing loss; the speed priori loss comprises a residual error between the speed of the target at each current moment and the speed after the last optimization; the speed smoothing loss includes a residual of the speed of the target at each current time and an average speed of the target within a predetermined period of time at the current time.
Specifically, in some embodiments, the observed quantity includes a speed of the target vehicle at each time within the sliding time window, and the state quantity includes the speed of the target vehicle at each time within the sliding time window. The speed penalty may be determined based on a speed residual calculated from the speed in the observables and the speed in the state quantity.
When an observation model can provide a speed observation, such as a radar model, a speed residual loss can also be flexibly added to the speed loss. Suppose the speed observation of the l-th observation model is v̂_k^l; the speed loss term e_ov in formula (22) is then added to the speed loss:

e_ov = Σ_{l=1}^{L_v} Σ_k w_k^l ‖v̂_k^l − v_k‖²    formula (22)

where L_v denotes the number of models that can provide a speed observation.
In some embodiments, the velocity vector observed by, for example, a radar model may be unreliable while its norm remains usable. If only the velocity norm is available, the speed loss term e_ov may be as shown in formula (23):

e_ov = Σ_k w_k (‖v_k‖ − υ̂_k)²    formula (23)
Additionally or alternatively, in some embodiments, the state quantity includes an average speed of the target vehicle within a sliding time window, and the speed loss term may be based on a speed smoothing loss calculated from the speed of the target vehicle at each time within the sliding time window and the average speed of the target vehicle within the sliding time window.
Specifically, to keep the speed smooth within the sliding time window, the speed at each moment within the window may be constrained toward the average value using the speed smoothing loss term shown in formula (24):

e_av = Σ_k w_a ‖v_k − v̄‖²    formula (24)

where v̄ is the average speed within the sliding time window and w_a is the weight corresponding to the speed smoothing loss.
In some embodiments, the weight value w a The determination may be based on a distance between the target vehicle and the vehicle or roadside device in which the plurality of sensors are located. Illustratively, the weight value is positively correlated with the distance when the distance is greater than a preset threshold; when the distance is not greater than the preset threshold, the weight value is a fixed value.
In some embodiments, the weight w_a of the speed smoothing loss may further be determined based on the rate of change of the speed of the target vehicle, calculated from the speed of the target vehicle at each moment in the sliding time window. Specifically, the weight value used when the rate of change of speed is greater than another preset threshold is smaller than the weight value used when the rate of change is not greater than that threshold.
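A minimal sketch combining the two weighting rules above (the thresholds, gain, and damping factor are assumed hyperparameters):

```python
def smoothing_weight(distance, speed_change_rate,
                     w_near=1.0, dist_threshold=50.0,
                     rate_threshold=2.0, damping=0.5):
    """Weight w_a of the speed smoothing loss: fixed below the distance
    threshold and positively correlated with distance beyond it; reduced
    when the speed changes quickly, so a real maneuver is not smoothed away."""
    w = w_near if distance <= dist_threshold else w_near * distance / dist_threshold
    if speed_change_rate > rate_threshold:
        w *= damping
    return w
```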
Additionally or alternatively, in some embodiments, the sliding step size of the sliding time window is less than the length of the sliding time window, and the speed penalty may be determined based on the speed prior penalty. The speed a priori loss is calculated from: the speed at each time in the overlapping area of the sliding time window and the previous sliding time window, and the optimized speed at each time in the overlapping area in the state quantity optimization process performed for the previous sliding time window.
Specifically, to preserve the optimization information obtained for each current moment in earlier optimizations, the speed prior loss term shown in formula (25) may be used to keep the speed at each moment within the sliding time window close to the speed after the last optimization at that moment:

e_pv = Σ_{k=0}^{n−2} w_p ‖v_k − v_k^prev‖²    formula (25)

where v_k^prev is the speed after the last optimization at the corresponding moment; k ranging from 0 to n−2 means that the sliding step of the sliding time window is 1, so that the optimal solutions of v_0, v_1, …, v_{n−2} were already obtained in the last optimization (the previous sliding time window); and w_p is the weight corresponding to the speed prior loss.
In some embodiments, the weight value may be determined based on a distance between the target vehicle and the vehicle or roadside device in which the plurality of sensors are located. When the distance is greater than a preset threshold, the weight value is positively correlated with the distance; when the distance is not greater than the preset threshold, the weight value is a fixed value.
In summary, the complete speed loss term can be expressed as shown in formula (26):

E_v = e_ov + e_av + e_pv    formula (26)
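As a minimal sketch of assembling the complete speed loss (names and the window layout are illustrative; a sliding step of 1 is assumed for the prior term):

```python
import numpy as np

def speed_loss(v, v_obs, v_prev_opt, w_obs=1.0, w_a=1.0, w_p=1.0):
    """Complete speed loss: observation term e_ov, smoothing term e_av
    toward the in-window average, and prior term e_pv toward the results of
    the previous window over the overlapping moments."""
    v = np.asarray(v, dtype=float)          # (n, 2) speed state quantities
    e_ov = sum(w_obs * float(np.sum((np.asarray(vo) - vk)**2))
               for vk, vo in zip(v, v_obs) if vo is not None)
    e_av = w_a * float(np.sum((v - v.mean(axis=0))**2))
    e_pv = w_p * sum(float(np.sum((vk - np.asarray(vp))**2))
                     for vk, vp in zip(v[:-1], v_prev_opt))
    return e_ov + e_av + e_pv
```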
According to some embodiments, the orientation penalty term includes at least one of: a priori loss of orientation of the first component, a smooth loss of orientation of the first component, a priori loss of orientation of the second component, a smooth loss of orientation of the second component, and an angular velocity loss; the orientation priori loss comprises residual errors of an orientation vector of the target at each current moment and an orientation vector after the last optimization; the orientation smoothing loss comprises a residual error of the orientation of the target at each current moment and the average orientation of the target within a preset period of time of the current moment; the angular velocity loss includes a residual of a first angular rate of change related to a vehicle size, a vehicle speed, an angle of the first component, and an angle of the second component, and a second angular rate of change related to an amount of angular change of the second component over a predetermined time interval.
In some embodiments, the target is a vehicle that includes only a first component (i.e., a primary structure), and the state quantity includes the orientation of the target vehicle at each moment in the sliding time window. In this case, the orientation loss may be determined based on an orientation residual calculated from the orientation of the target vehicle at each moment in the sliding time window and the orientation observations of the target vehicle at each moment in the sliding time window.
Specifically, the orientation observations may directly constitute constraints on the orientation, and thus the orientation loss term may be as shown in formula (27):

e_oθ = Σ_{l∈S} Σ_k λ_k^l (θ̂_k^l − θ_k)²    formula (27)

where S is a set of different observation sources and λ_k^l is the weight corresponding to the l-th observation source, which may be calculated as shown in formula (19). For speed-based orientation observations, the weight λ_k can be calculated according to formula (28), where w_v and a are hyperparameters.
In some embodiments, the orientation observation may be the body orientation of the target vehicle observed by the at least one observation model, the lane line heading, or the speed direction of the target vehicle. In some embodiments, when no reliable orientation observation is given, the vehicle is assumed to travel along the lane, and the lane line heading may then be treated as an orientation observation with a fixed variance; further, the speed direction may also be treated as an orientation observation, and the higher the speed, the smaller the difference between the speed direction and the vehicle orientation.
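As a minimal sketch of collecting such orientation observations (the speed-dependent weight form below is an assumption standing in for formula (28), whose exact form is not reproduced above):

```python
import numpy as np

def orientation_observations(lane_heading, velocity, w_v=1.0, a=4.0):
    """Return (heading, weight) pairs: the lane line heading with a fixed
    weight, and the speed direction with a weight that grows with speed,
    since a faster vehicle's velocity deviates less from its body orientation."""
    obs = [(lane_heading, 1.0)]             # fixed-variance lane observation
    speed = float(np.linalg.norm(velocity))
    if speed > 1e-3:                        # speed direction is meaningless at rest
        obs.append((float(np.arctan2(velocity[1], velocity[0])),
                    w_v * speed / (speed + a)))
    return obs
```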
As with the speed loss, the orientation loss also has similar smoothing and prior loss terms. In some embodiments, the state quantity further includes the average orientation of the first component of the target vehicle within the sliding time window, so the orientation loss may be calculated based on an orientation smoothing loss computed from the orientation state quantity of the target vehicle at each moment within the sliding time window and the average orientation state quantity of the target vehicle within the sliding time window.
In some embodiments, the sliding step of the sliding time window is smaller than the length of the sliding time window. In this case, the orientation loss is further calculated based on an orientation prior loss, which is calculated from: the orientation at each moment in the overlapping area of the sliding time window and the previous sliding time window, and the optimized orientation at each moment in the overlapping area obtained in the state quantity optimization performed for the previous sliding time window.
Specifically, the smoothing loss term and the prior loss term of the orientation loss may be as shown in formulas (29) and (30), respectively:

e_sθ = Σ_k w (θ_k − θ̄)²    formula (29)

e_pθ = Σ_k w_p (θ_k − θ_k^prev)²    formula (30)

where θ_k^prev is the orientation after the last optimization at the corresponding moment (assuming a sliding step of 1 for the sliding time window) and θ̄ is the average orientation within the current sliding time window.
In some embodiments, the target vehicle is a vehicle, such as a truck, comprising a first component and a second component. The first and second components may form a structural constraint between them through a rotating shaft structure (or hinge). The state quantity includes the orientation of the first component at each moment within the sliding time window and the orientation of the second component at each moment within the sliding time window.
Thus, in some embodiments, the orientation loss may be based on a first component orientation residual and a second component orientation residual, wherein the first component orientation residual is calculated from the orientation of the first component at each moment within the sliding time window and the orientation observations of the first component at those moments, and the second component orientation residual is calculated from the orientation of the second component at each moment within the sliding time window and the orientation observations of the second component at those moments. Both residuals may be constructed as described above and are not repeated here.
In some embodiments, the orientation observation of the first component is an orientation of the first component, a lane line orientation, or a speed direction of the first component observed by the at least one observation model, and the orientation observation of the second component is an orientation of the second component, a lane line orientation, or a speed direction of the second component observed by the at least one observation model.
In some embodiments, the state quantity comprises an average orientation of the first part within the sliding time window, and thus the orientation loss may comprise an orientation smoothing loss of the first part calculated from the orientation of the first part at each moment within the sliding time window and the average orientation of the first part within the sliding time window.
In some embodiments, the sliding step size of the sliding time window is less than the length of the sliding time window. Thus, the orientation loss may comprise an orientation prior loss of the first component, the orientation prior loss of the first component calculated from: the orientation of the first component at each moment in the overlapping area of the sliding time window and the previous sliding time window, and the optimized orientation of the first component at each moment in the overlapping area in the state quantity optimization process performed for the previous sliding time window.
In some embodiments, the observed quantity includes a length of the second component observed at each time within the sliding time window and a length of a rotating shaft structure (or hinge) forming the structural constraint observed at each time within the sliding time window. The state quantity includes a speed of the target vehicle at each time within the sliding time window, an orientation of the first component at each time within the sliding time window, and an orientation of the second component at each time within the sliding time window. At this time, the orientation loss may include an angular velocity loss calculated from: the speed of the target vehicle at each time within the sliding time window, the length of the second component observed at each time within the sliding time window, the length of the spindle structure observed at each time within the sliding time window, the orientation of the first component at each time within the sliding time window, and the orientation of the second component at each time within the sliding time window.
Specifically, the orientation observation of the second component is also subject to the motion constraint shown in formula (9), and the angular velocity loss may be as shown in formula (31), where L_t and L_h are the lengths of the second component (e.g., the trailer) and of the rotating shaft structure, respectively, calculated in a manner described below with reference to the size loss.
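Formula (31) itself is not reproduced above, so the following is only a sketch under the assumption of a standard tractor-trailer kinematic model; the function names, the off-axle hitch term, and the finite-difference second rate are assumptions rather than the patent's exact formulation:

```python
import numpy as np

def angular_velocity_residual(v, omega_t, theta_t, theta_h,
                              theta_h_next, dt, L_t, L_h):
    """Residual between two trailer yaw rates: the first predicted from the
    tractor speed v, tractor yaw rate omega_t, the tractor/trailer angles
    and the lengths L_t (trailer) and L_h (hinge); the second obtained as a
    finite difference of the trailer orientation over dt."""
    d = theta_t - theta_h
    rate_pred = (v / L_t) * np.sin(d) - (L_h / L_t) * omega_t * np.cos(d)
    rate_obs = (theta_h_next - theta_h) / dt
    return rate_pred - rate_obs
```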
In summary, the complete orientation loss term can be expressed as shown in formula (32):

E_θ = e_oθ + e_sθ + e_pθ + e_ω    formula (32)
In some embodiments, the orientation observed by, for example, a radar sensor may be flipped by 180 degrees; for example, the speed direction may differ from the vehicle body orientation by 180 degrees when the vehicle is reversing. Thus, when optimizing the state quantity by minimizing the loss function, the vehicle body orientation may be corrected as follows: when the difference between the orientation state quantity of the target vehicle obtained by the optimization of the previous sliding time window and the orientation observation at the corresponding moment in the current sliding time window is larger than 90 degrees, the orientation observation is flipped by 180 degrees; and when the number of consecutive flips of the orientation observations within the sliding time window exceeds a preset value, the orientation state quantity within the sliding time window is flipped by 180 degrees.
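A minimal sketch of this flip correction (illustrative names; angles in radians):

```python
import numpy as np

def correct_orientation_flips(theta_prev_opt, theta_obs_window, flip_limit=3):
    """Flip observations that differ from the previously optimized
    orientation by more than 90 degrees; if flips persist beyond flip_limit,
    signal that the orientation state itself should be flipped instead."""
    corrected, run = [], 0
    flip_state = False
    for theta in theta_obs_window:
        diff = np.arctan2(np.sin(theta - theta_prev_opt),
                          np.cos(theta - theta_prev_opt))
        if abs(diff) > np.pi / 2:
            theta += np.pi
            run += 1
            if run > flip_limit:
                flip_state = True   # flip the state quantity by 180 degrees
        else:
            run = 0
        corrected.append(float(np.arctan2(np.sin(theta), np.cos(theta))))
    return corrected, flip_state
```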
According to some embodiments, the size loss term comprises at least one of: size priori loss, and optimized size accumulated loss at each moment; the dimension prior loss comprises a residual error between the dimension of the target at each current moment and the dimension after the last optimization; the cumulative size loss includes a sum of the size losses of the target between the initial time and the last optimization time.
In some embodiments, the sliding step of the sliding time window is smaller than the length of the sliding time window. The size loss term may then include a size prior loss calculated from: the size at each moment in the overlapping area of the sliding time window and the previous sliding time window, and the optimized size obtained in the state quantity optimization performed for the previous sliding time window.
According to some embodiments, the size cumulative loss is calculated using an incremental update method; the observed quantity of the target at each time is the observed quantity of the target at each time in a sliding time window; the state quantity of the target at each time is the state quantity of the target at each time in a sliding time window; the sliding time window includes a plurality of data moments, and each moment is at least two moments in the plurality of data moments.
In particular, the size cumulative loss comprises the sum of the size losses of the target between the initial moment and the moment of the last optimization. The initial moment is the moment at which the state quantity optimization first starts, for example the moment of the first obtained frame of data. The moment of the last optimization may be, for example, the last moment within the previous sliding time window. For example, the contour observations of the target vehicle may provide size information, so the size cumulative loss may be calculated from: the size losses, determined based on reference point residuals, corresponding to the moments that fall within the previous sliding time window but not within the current sliding time window, and the size cumulative loss used in the state quantity optimization performed for the previous sliding time window.
In some embodiments, the observed quantity includes the contour corner coordinates and the center point coordinates of the target vehicle at each moment within the sliding time window. The state quantity includes the speed of the target vehicle at each moment within the sliding time window, and the reference point residual corresponding to each moment is calculated from: the speed of the target vehicle at that moment, the center point observation at that moment, the contour corner observation at that moment, and the vector from the center point observation to the contour corner observation, which can be determined according to formula (17).
Specifically, in the optimization framework, the body size, or the tractor size in the case of a truck, is treated as a global variable to be optimized. When the current state is updated, the oldest frame is removed from the window and no longer updated. Although the out-of-window state quantities are fixed, they can still provide information about the global size variables. Specifically, if a contour observation exists after the i-th frame is removed, a new size loss may be generated as shown in formula (33), where A_i and b_i are constants and the weight γ_i is calculated from the variance as in formula (19).
Since the Laplace distribution can be equivalently expressed as the product of a Gaussian distribution and an inverse Gaussian distribution, in some examples γ = diag(γ_0, γ_1) may be used to approximate the L2 term of the Huber loss function, as shown in formula (34), for better robustness, where δ represents a preset parameter and r_i is the residual in formula (33).
The number of size loss terms may grow over time; to avoid redundant computation, in embodiments according to the present disclosure they are merged into a single term in incremental form, so the loss term at time T_i may be expressed as shown in formula (35).
where A_i can be calculated by the SVD decomposition method as shown in formulas (36)-(38):

H = U Λ² V^T    formula (36)

= U Λ Λ^T V^T = (U Λ V^T)(U Λ V^T)^T    formula (37)

A_i = (U Λ V^T)^T    formula (38)

where H, the accumulated matrix being decomposed, is a symmetric matrix, so U = V. b_i is as shown in formula (39).
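As a minimal sketch of this incremental merging (the construction of the per-term A_i, b_i is taken as given; the function below only shows how many quadratic size terms collapse into one with the same normal equations, using the SVD of the symmetric normal matrix, for which U = V):

```python
import numpy as np

def merge_size_losses(A_list, b_list):
    """Merge quadratic size losses sum_i ||A_i s - b_i||^2 into one term
    ||A s - b||^2 with the same normal equations, so the number of loss
    terms does not grow with time."""
    H = sum(A.T @ A for A in A_list)        # symmetric, so U == V below
    g = sum(A.T @ b for A, b in zip(A_list, b_list))
    U, lam, Vt = np.linalg.svd(H)
    A = (U @ np.diag(np.sqrt(lam)) @ Vt).T  # A.T @ A reproduces H
    b = np.linalg.pinv(A.T) @ g             # A.T @ b reproduces g
    return A, b
```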
In some embodiments, in examples where the target includes a first component and a second component, such as the truck model shown in fig. 2, the dimensions of the trailer and of the rotating shaft structure connecting the trailer and the tractor may be calculated from the observed quantities, as shown in formulas (40)-(42) below.
Formulas (40)-(42) are the solutions to the optimization problem shown in formula (43).
In summary, given the a priori losses of L and W, the overall size loss term can be as shown in equation (44).
where the first term in E_s is the size cumulative loss and the second term is the size prior loss.
In the present disclosure, based on a loss function including at least one of a position loss, an orientation loss, a speed loss, a size loss, and a structural constraint of a target, by minimizing the loss function, a state quantity of the target at each time may be optimized, so as to obtain optimized state quantities. In the unmanned field, the method can update the state information of surrounding vehicles more accurately so that an unmanned system can make a safe path plan based on the result, thereby avoiding traffic accidents.
There is also provided, as shown in fig. 5, a target state estimating apparatus 500 according to an embodiment of the present disclosure, including: an acquisition unit 510 adapted to acquire observables for a target by a plurality of sensors, wherein at least one observables is acquired by each sensor; a construction unit 520 adapted to determine the state quantity of the target to be optimized based on the observed quantity; and an optimizing unit 530, adapted to optimize the state quantity of the target at each moment by minimizing the loss function, so as to obtain the optimized state quantity of the target at each moment. The loss function includes at least one of a position loss, an orientation loss, a velocity loss, a size loss, a structural constraint of the target.
With reference to fig. 6, a computing device 2000 will now be described, which is an example of a hardware device that may be applied to aspects of the present disclosure. The computing device 2000 may be any machine configured to perform processes and/or calculations and may be, but is not limited to, a workstation, a server, a desktop computer, a laptop computer, a tablet computer, a personal digital assistant, a smart phone, an in-vehicle computer, or any combination thereof. The above-described object state estimation apparatus may be implemented in whole or at least in part by the computing device 2000 or similar device or system.
The computing device 2000 may include elements that are connected to the bus 2002 (possibly via one or more interfaces) or that communicate with the bus 2002. For example, the computing device 2000 may include a bus 2002, one or more processors 2004, one or more input devices 2006, and one or more output devices 2008. The one or more processors 2004 may be any type of processor and may include, but are not limited to, one or more general purpose processors and/or one or more special purpose processors (e.g., special processing chips). The input device 2006 may be any type of device capable of inputting information to the computing device 2000 and may include, but is not limited to, a mouse, a keyboard, a touch screen, a microphone, and/or a remote control. The output device 2008 may be any type of device capable of presenting information and may include, but is not limited to, a display, speakers, video/audio output terminals, vibrators, and/or printers. The computing device 2000 may also include, or be connected to, a non-transitory storage device 2010, which may be any storage device that is non-transitory and enables data storage, including but not limited to a disk drive, an optical storage device, solid state memory, a floppy disk, a flexible disk, a hard disk, magnetic tape or any other magnetic medium, an optical disk or any other optical medium, a ROM (read only memory), a RAM (random access memory), a cache memory, a memory chip or cartridge, and/or any other medium from which a computer can read data, instructions, and/or code. The non-transitory storage device 2010 may be detachable from an interface and may hold data/programs (including instructions)/code for implementing the methods and steps described above. The computing device 2000 may also include a communication device 2012, which may be any type of device or system enabling communication with external devices and/or with a network, including but not limited to a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset, such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a cellular communication device, and/or the like.
Computing device 2000 may also include a working memory 2014, which may be any type of working memory that may store programs (including instructions) and/or data useful for the operation of processor 2004 and may include, but is not limited to, random access memory and/or read-only memory devices.
Software elements (programs) may reside in the working memory 2014, including but not limited to an operating system 2016, one or more application programs 2018, drivers, and/or other data and code. Instructions for performing the above-described methods and steps may be included in the one or more application programs 2018, and the units of the above-described target state estimation device may be implemented by the processor 2004 reading and executing the instructions of the one or more application programs 2018. More specifically, the acquisition unit 510 of the target state estimation device may be implemented, for example, by the processor 2004 executing the application 2018 having instructions to perform step 110; the construction unit 520, by the processor 2004 executing the application 2018 having instructions to perform step 120; and the optimization unit 530, by the processor 2004 executing the application 2018 having instructions to perform step 130. Executable code or source code of the instructions of the software elements (programs) may be stored in a non-transitory computer readable storage medium (such as the storage device 2010 described above) and, when executed, may be stored (possibly compiled and/or installed) in the working memory 2014. Executable code or source code of the instructions of the software elements (programs) may also be downloaded from a remote location.
It should also be understood that various modifications may be made according to specific requirements. For example, custom hardware may be used, and/or particular elements may be implemented in hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. For example, some or all of the disclosed methods and apparatus may be implemented by programming hardware (e.g., programmable logic circuits including field programmable gate arrays (FPGAs) and/or programmable logic arrays (PLAs)) in an assembly language or a hardware programming language such as VERILOG, VHDL, or C++, using logic and algorithms according to the present disclosure.
It should also be appreciated that the foregoing method may be implemented by a server-client mode. For example, a client may receive data entered by a user and send the data to a server. The client may also receive data input by the user, perform a part of the foregoing processes, and send the processed data to the server. The server may receive data from the client and perform the aforementioned method or another part of the aforementioned method and return the execution result to the client. The client may receive the result of the execution of the method from the server and may present it to the user, for example, via an output device.
It should also be appreciated that the components of the computing device 2000 may be distributed over a network. For example, some processes may be performed using one processor while other processes are performed by another processor remote from it. Other components of the computing device 2000 may be similarly distributed. As such, the computing device 2000 may be construed as a distributed computing system that performs processing in multiple locations.
Although embodiments or examples of the present disclosure have been described with reference to the accompanying drawings, it is to be understood that the foregoing methods, systems, and devices are merely exemplary embodiments or examples, and that the scope of the present invention is not limited by these embodiments or examples but only by the granted claims and their equivalents. Various elements of the embodiments or examples may be omitted or replaced with equivalent elements. Furthermore, the steps may be performed in a different order than described in the present disclosure, and various elements of the embodiments or examples may be combined in various ways. As technology evolves, many of the elements described herein may be replaced by equivalent elements that appear after the present disclosure.

Claims (20)

1. A method of target state estimation, comprising:
Obtaining observed quantity of the target at each moment through a plurality of sensors, wherein at least one observed quantity is obtained through each sensor;
determining a state quantity to be optimized of the target based on the observed quantity; and
optimizing the state quantity of the target at each moment by a minimized loss function to obtain the state quantity of the target after optimization at each moment;
wherein the loss function includes at least one of a position loss, an orientation loss, a velocity loss, a size loss, a structural constraint of the target.
2. The method of claim 1, wherein the observed quantity comprises at least one of:
at least one of speed, position and orientation of the target at each time; and
the dimension of the target includes at least one of a length, a width, and a height.
3. The method of claim 1, wherein the state quantity comprises at least one of:
at least one of speed, position and orientation of the target at each time;
at least one of average speed, average position and average orientation of the target in a preset period of time to which each moment belongs; and
the size of the target.
4. The method of claim 1, wherein,
the plurality of sensors are located on at least one observation vehicle or on a roadside apparatus;
the target state estimation method is implemented in a computing device, and the computing device obtains at least one observed quantity through each sensor;
the computing device resides on the at least one observation vehicle, on the roadside device, or at the cloud.
5. The method of claim 1, wherein,
the target is a multi-level structure, and the structural constraints include structural constraints between the multi-level structures;
the observed quantity comprises observed quantities respectively corresponding to at least two stages in the multi-stage structure, and the state quantity comprises state quantities respectively corresponding to at least two stages in the multi-stage structure.
6. The method of claim 2 or 3, wherein,
the target is a vehicle comprising a first component and at least one second component, the second component being rotatable relative to the first component;
the location of the target includes at least one of: a position of the first component, a position of at least one of the second components, a position of the vehicle;
the dimensions of the target include at least one of: the dimensions of the first component, the dimensions of at least one of the second components, the dimensions of the vehicle;
The orientation of the target includes at least one of: the direction of the speed, the orientation of the first component, the direction of the lane in which the vehicle is located.
7. The method of claim 1, wherein,
the plurality of sensors includes at least one of: the system comprises an image acquisition device and a point cloud acquisition device;
the observed quantity is obtained through an observation model corresponding to each sensor, and the observation model comprises at least one of the following: an image-based binocular ranging algorithm, an image-based monocular ranging algorithm, a point cloud-based ranging algorithm, an image and map-based projected ranging algorithm, and a point cloud and map-based projected ranging algorithm.
8. The method of claim 7, wherein optimizing the state quantity of the target at each time instant by minimizing a loss function comprises at least one of:
if the observation model outputs a plurality of coordinate values for the same datum point, discarding the output of the observation model; and
if a plurality of observation models output a plurality of observation amounts at the same time to the same reference point, respectively carrying out normalization processing on the weight square sum of the transverse reference point residual error components and the weight square sum of the longitudinal reference point residual error components corresponding to the plurality of observation amounts in the position loss.
9. The method of claim 2 or 3, wherein,
the location includes a location of at least one reference point, the reference point including at least one of: center points and contour corner points;
the loss of position includes at least one reference point residual, the reference point residual including at least one of: a center point residual representing a difference between the observed quantity and the state quantity for the center point and a contour corner residual representing a difference between the observed quantity and the state quantity for the contour corner.
10. The method of claim 9, wherein,
the center point residual error and the contour angular point residual error respectively have corresponding weights, and the weights are diagonal matrixes;
each of the center point residual and the contour corner residual includes a lateral residual component and a longitudinal residual component, and the lateral residual component and the longitudinal residual component have corresponding weights, respectively.
11. The method of claim 10, wherein,
when the transverse variance of one of the center point residual error and the contour angular point residual error is smaller than a preset threshold value, the weight of the corresponding transverse residual error component takes a first fixed value;
And when the longitudinal variance of one of the center point residual and the contour angular point residual is smaller than a preset threshold value, the weight of the corresponding longitudinal residual component takes the first fixed value.
12. The method of claim 10, wherein,
the target is a vehicle, and the transverse residual component and the longitudinal residual component in the center point residual are the transverse residual component and the longitudinal residual component calibrated according to the vehicle size, so that the residual is minimum when the center point observed by the sensor is located in the area occupied by the vehicle.
13. The method of claim 10, wherein,
the target is a vehicle, the transverse residual component is perpendicular to the lane orientation of the vehicle, and the longitudinal residual component is parallel to the lane orientation of the vehicle; or alternatively
The lateral residual component is perpendicular to the body orientation of the vehicle and the longitudinal residual component is parallel to the body orientation of the vehicle.
14. The method of claim 1, wherein,
the speed loss includes at least one of: speed prior loss and speed smoothing loss;
the speed prior loss comprises a residual error between the speed of the target at each current moment and the speed after the last optimization;
The speed smoothing loss comprises a residual of the speed of the target at each current moment and the average speed of the target within a preset period of time at the current moment.
15. The method of claim 6, wherein,
the orientation loss term includes at least one of: a priori loss of orientation of the first component, a smooth loss of orientation of the first component, a priori loss of orientation of the second component, a smooth loss of orientation of the second component, and an angular velocity loss;
the orientation priori loss comprises residual errors of an orientation vector of the target at each current moment and an orientation vector after the last optimization;
the orientation smoothing loss comprises a residual error of the orientation of the target at each current moment and the average orientation of the target within a preset period of time of the current moment;
the angular velocity loss includes a residual of a first angular rate of change related to a vehicle size, a vehicle speed, an angle of the first component, and an angle of the second component, and a second angular rate of change related to an amount of angular change of the second component over a predetermined time interval.
16. The method of claim 1, wherein,
The size loss term includes at least one of: size priori loss, and optimized size accumulated loss at each moment;
the dimension prior loss comprises a residual error between the dimension of the target at each current moment and the dimension after the last optimization;
the cumulative size loss includes a sum of the size losses of the target between the initial time and the last optimization time.
17. The method of claim 16, wherein the size cumulative loss is calculated using an incremental update method;
the observed quantity of the target at each time is the observed quantity of the target at each time in a sliding time window;
the state quantity of the target at each moment is the state quantity of the target at each moment in a sliding time window;
the sliding time window includes a plurality of data moments, and each moment is at least two moments in the plurality of data moments.
18. A target state estimation device comprising:
an acquisition unit adapted to acquire observables for a target by a plurality of sensors, wherein at least one observables is acquired by each sensor;
a construction unit adapted to determine a state quantity of the target to be optimized based on the observed quantity; and
The optimizing unit is suitable for optimizing the state quantity of the target at each moment by minimizing a loss function to obtain the state quantity of the target after optimization at each moment;
wherein the loss function includes at least one of a position loss, an orientation loss, a velocity loss, a size loss, a structural constraint of the target.
19. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-17.
20. A non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of any one of claims 1-17.
CN202210837658.2A 2022-07-15 2022-07-15 Target state estimation method, device, electronic equipment and medium Pending CN117437771A (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN202210837658.2A CN117437771A (en) 2022-07-15 2022-07-15 Target state estimation method, device, electronic equipment and medium
EP23184310.3A EP4345417A3 (en) 2022-07-15 2023-07-07 Method, apparatus, electronic device and medium for target state estimation
JP2023114683A JP2024012162A (en) 2022-07-15 2023-07-12 Method, electronic device and medium for target state estimation
AU2023204626A AU2023204626A1 (en) 2022-07-15 2023-07-12 Method, apparatus, electronic device and medium for target state estimation
US18/351,955 US20240025428A1 (en) 2022-07-15 2023-07-13 Method, electronic device and medium for target state estimation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210837658.2A CN117437771A (en) 2022-07-15 2022-07-15 Target state estimation method, device, electronic equipment and medium

Publications (1)

Publication Number Publication Date
CN117437771A true CN117437771A (en) 2024-01-23

Family

ID=87196180

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210837658.2A Pending CN117437771A (en) 2022-07-15 2022-07-15 Target state estimation method, device, electronic equipment and medium

Country Status (5)

Country Link
US (1) US20240025428A1 (en)
EP (1) EP4345417A3 (en)
JP (1) JP2024012162A (en)
CN (1) CN117437771A (en)
AU (1) AU2023204626A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118097027A (en) * 2024-04-19 2024-05-28 杭州欣禾圣世科技有限公司 Multi-video 3D human motion capturing method and system based on evolutionary computation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022045982A1 (en) * 2020-08-31 2022-03-03 Nanyang Technological University Unmanned aerial vehicle and localization method for unmanned aerial vehicle


Also Published As

Publication number Publication date
EP4345417A3 (en) 2024-05-29
EP4345417A2 (en) 2024-04-03
AU2023204626A1 (en) 2024-02-01
US20240025428A1 (en) 2024-01-25
JP2024012162A (en) 2024-01-25

Similar Documents

Publication Publication Date Title
Scherer et al. River mapping from a flying robot: state estimation, river detection, and obstacle mapping
EP3654064B1 (en) Apparatus and method for characterizing an object based on measurement samples from one or more location sensors
CN114599995A (en) Estimating in-plane velocity from radar returns of stationary roadside objects
CN107192409A (en) The method of automated sensor Attitude estimation
US11893496B2 (en) Method for recognizing objects in an environment of a vehicle
WO2021102676A1 (en) Object state acquisition method, mobile platform and storage medium
CN115066632A (en) System and method for tracking the expansion state of a moving object using model geometry learning
Akai Reliable Monte Carlo localization for mobile robots
US20240025428A1 (en) Method, electronic device and medium for target state estimation
CN111612818A (en) Novel binocular vision multi-target tracking method and system
EP4307244A1 (en) Method, apparatus, electronic device and medium for target state estimation
Xu et al. Dynamic vehicle pose estimation and tracking based on motion feedback for LiDARs
Or et al. Learning vehicle trajectory uncertainty
CN110637209A (en) Method, apparatus, and computer-readable storage medium having instructions for estimating a pose of a motor vehicle
CN116385997A (en) Vehicle-mounted obstacle accurate sensing method, system and storage medium
EP3654065B1 (en) Apparatus and method for characterizing an object based on measurement samples from one or more location sensors
Dichgans et al. Robust Vehicle Tracking with Monocular Vision using Convolutional Neuronal Networks
CN117405118B (en) Multi-sensor fusion mapping method, system, equipment and storage medium
Li et al. Set‐theoretic localization for mobile robots with infrastructure‐based sensing
CN114088086B (en) Multi-target robust positioning method for resisting measurement wild value interference
CN118603111A (en) Multi-source sensing information fusion and verification method and device for sweeper and computing equipment
CN115471520A (en) Measuring method and device
CN118795881A (en) Robust iteration pose tracking method and system based on laser radar and robot
CN118131212A (en) Method for identifying environment object and electronic equipment
CN114814920A (en) Unmanned positioning method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination