WO2020228242A1 - Method and apparatus for tracking target object, and storage medium - Google Patents
- Publication number
- WO2020228242A1 (PCT/CN2019/112425)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/14—Adaptive cruise control
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/02—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
- G01S15/04—Systems determining presence of a target
Definitions
- This application relates to the technical field of smart cars, in particular to a method, device and storage medium for tracking a target object.
- a car may include an adaptive cruise system and drive automatically under the control of the adaptive cruise system.
- the key to control through the adaptive cruise system is the tracking of target objects, that is, tracking target objects such as other cars and pedestrians in the current driving direction of the car.
- the object information obtained by the millimeter-wave radar may come from a genuine target object, or it may be corrupted by other noise.
- because of unstable operation of the millimeter-wave radar itself and uneven return energy, false target objects may be detected, resulting in inaccurate object tracking.
- the millimeter-wave radar measurement signal may also be temporarily lost, which leads to large fluctuations in the object information and to the acquisition of spurious target objects and object information. The amount of automatic-driving computation therefore increases, reducing the real-time performance of automatic driving, and dangerous target objects may not be detected accurately and in time, leading to unnecessary false alarms or erroneous manoeuvres by the car, reduced driving comfort, and even safety hazards.
- the embodiments of the present application provide a method, device, and storage medium for tracking a target object, to solve the problem in the related art that low target-tracking accuracy leads to poor driving safety of automobiles.
- the technical solution is as follows:
- a method for tracking a target object includes:
- the target object is tracked to control the car.
- the determining the target object that meets the tracking condition based on the radar information includes:
- the objects other than the stationary objects, false objects, empty objects, and non-hazardous objects are determined as the target objects that meet the tracking condition.
- the radar information includes object information of objects appearing in the driving direction of the car and the lateral distance and the longitudinal distance between the car and the object, and the object information includes the relative distance between the car and the object, the relative speed, and the number of occurrences of the object within the detection range of the millimeter-wave radar;
- the determining of stationary objects, false objects, empty objects and non-dangerous objects in the objects based on the object information includes:
- an object whose lateral distance is less than the lateral distance threshold and whose longitudinal distance is less than the longitudinal distance threshold is determined as the non-hazardous object.
- the acquiring the radar information of the car in the driving direction through the millimeter wave radar installed on the car includes:
- Preprocessing the radar data to obtain the radar information.
- the preprocessing the radar data to obtain the radar information includes:
- the relative distance is decomposed in the world coordinate system to obtain the lateral distance and the longitudinal distance.
- the tracking the target object to control the car includes:
- the target object is tracked in the image pixel coordinate system to control the automobile.
- the projecting the target object from the world coordinate system to the image pixel coordinate system includes:
- X1 is the abscissa of the target object in the image pixel coordinate system
- Y1 is the ordinate of the target object in the image pixel coordinate system
- the coordinate unit in the image pixel coordinate system is the pixel
- X is the lateral distance of the target object in the world coordinate system
- Y is the longitudinal distance of the target object in the world coordinate system
- a and b are the magnifications
- image.cols is the image width.
- the tracking the target object in the image pixel coordinate system to control the car includes:
- the reference position information is updated
- a device for tracking a target object includes:
- the acquisition module is used to acquire radar information of the car in the driving direction through the millimeter wave radar installed on the car;
- a determining module configured to determine a target object that meets the tracking condition based on the radar information
- the tracking module is used to track the target object to control the car.
- the determining module includes:
- the first determining sub-module is configured to determine stationary objects, false objects, empty objects and non-dangerous objects in the objects based on the radar information;
- the second determination sub-module is used to determine objects other than the stationary objects, false objects, empty objects, and non-hazardous objects as target objects that meet the tracking condition.
- the radar information includes object information of objects appearing in the driving direction of the car and the lateral distance and the longitudinal distance between the car and the object, and the object information includes the relative distance between the car and the object, the relative speed, and the number of occurrences of the object within the detection range of the millimeter-wave radar;
- the first determining submodule is used for:
- an object whose lateral distance is less than the lateral distance threshold and whose longitudinal distance is less than the longitudinal distance threshold is determined as the non-hazardous object.
- the acquisition module includes:
- the processing sub-module is used to preprocess the radar data to obtain the radar information.
- the processing sub-module is used to:
- the relative distance is decomposed in the world coordinate system to obtain the lateral distance and the longitudinal distance.
- the tracking module includes:
- Projection sub-module for projecting the target object from the world coordinate system to the image pixel coordinate system
- the tracking sub-module is used to track the target object in the image pixel coordinate system to control the car.
- the projection sub-module is used to:
- X1 is the abscissa of the target object in the image pixel coordinate system
- Y1 is the ordinate of the target object in the image pixel coordinate system
- the coordinate unit in the image pixel coordinate system is the pixel
- X is the lateral distance of the target object in the world coordinate system
- Y is the longitudinal distance of the target object in the world coordinate system
- a and b are the magnifications
- image.cols is the image width.
- the tracking sub-module is used to:
- the reference position information is updated
- a computer-readable storage medium stores a computer program, and the computer program implements the steps of the target object tracking method provided above when the computer program is executed by a processor.
- in another aspect, an automobile is provided, and the automobile includes:
- a memory for storing processor-executable instructions
- the processor is configured to execute the steps of the target object tracking method provided above.
- a computer program product containing instructions which, when run on a computer, cause the computer to execute the steps of the target object tracking method provided in the first aspect.
- the radar information in the driving direction of the car can be acquired, the target objects that meet the tracking conditions can be determined according to the acquired radar information, and these target objects can then be tracked, so that the tracking is targeted, which improves the accuracy of object tracking and ensures the driving safety of the car.
- FIG. 1 is a flowchart of a method for tracking a target object provided by an embodiment of the present application
- FIG. 2 is a flowchart of another target object tracking method provided by an embodiment of the present application.
- FIG. 3 is a schematic diagram of a position between a car and a target object provided by an embodiment of the present application
- FIG. 4 is a schematic structural diagram of a tracking device for a target object provided by an embodiment of the present application.
- FIG. 5 is a schematic structural diagram of a determining module provided by an embodiment of the present application.
- FIG. 6 is a schematic structural diagram of an acquisition module provided by an embodiment of the present application.
- FIG. 7 is a schematic structural diagram of a tracking module provided by an embodiment of the present application.
- Fig. 8 is a schematic structural diagram of an automobile provided by an embodiment of the present application.
- the key to control through the adaptive cruise system is to track the target object.
- the target object and its object information in the driving direction can be obtained in real time through the millimeter-wave radar fixed at the front of the car, the target object can be tracked according to the obtained object information, and the car can be controlled based on the tracking results.
- the object information obtained by the millimeter-wave radar may come from a genuine target object, or it may be corrupted by other noise.
- false target objects may therefore be obtained, resulting in inaccurate object tracking.
- the millimeter-wave radar measurement signal may also be temporarily lost, which leads to large fluctuations in the object information and to the acquisition of spurious target objects and object information. The amount of automatic-driving computation therefore increases, reducing the real-time performance of automatic driving, and dangerous target objects may not be detected accurately and in time, leading to unnecessary false alarms or erroneous manoeuvres by the car, reduced driving comfort, and even safety hazards.
- embodiments of the present application provide a tracking method for a target object that can improve the accuracy of object tracking.
- FIG. 1 is a flowchart of a method for tracking a target object provided by an embodiment of the application. Referring to FIG. 1, the method is applied to a car and includes the following steps.
- Step 101 Obtain the radar information of the car in the driving direction through the millimeter wave radar installed on the car.
- Step 102 Based on the radar information, determine target objects that meet the tracking conditions.
- Step 103 Track the target object to control the car.
- the radar information in the driving direction of the car can be acquired, the target objects that meet the tracking conditions can be determined according to the acquired radar information, and these target objects can then be tracked, so that the tracking is targeted, which improves the accuracy of object tracking and ensures the driving safety of the car.
- determining the target object that meets the tracking conditions includes:
- the radar information includes object information of objects appearing in the driving direction of the car, the lateral distance and the longitudinal distance between the car and the object, and the object information includes the relative distance between the car and the object. Distance, relative speed, and the number of occurrences of the object within the detection range of the millimeter wave radar;
- determining the stationary objects, false objects, empty objects and non-hazardous objects among the objects includes:
- an object whose lateral distance is less than the lateral distance threshold and whose longitudinal distance is less than the longitudinal distance threshold is determined as the non-hazardous object.
- obtaining the radar information of the car in the driving direction through the millimeter wave radar installed on the car includes:
- the radar data is preprocessed to obtain the radar information.
- preprocessing the radar data to obtain the radar information includes:
- parsing the radar data of the object according to the millimeter-wave radar protocol to obtain the object information of the object;
- the relative distance is decomposed in the world coordinate system to obtain the lateral distance and the longitudinal distance.
- tracking the target object to control the car includes:
- projecting the target object from the world coordinate system to the image pixel coordinate system includes:
- the target object is projected into the image pixel coordinate system by the following projection formula
- X1 is the abscissa of the target object in the image pixel coordinate system
- Y1 is the ordinate of the target object in the image pixel coordinate system
- the coordinate unit in the image pixel coordinate system is the pixel
- X is the lateral distance of the target object in the world coordinate system
- Y is the longitudinal distance of the target object in the world coordinate system
- a and b are the magnifications
- image.cols is the image width.
- tracking the target object in the image pixel coordinate system to control the car includes:
- the reference position information is updated
- the collision duration for the selected target object to collide with the car is determined.
- FIG. 2 is a flowchart of a method for tracking a target object provided by an embodiment of the application. Referring to FIG. 2, the method includes the following steps.
- Step 201 The car obtains the radar information of the car in the driving direction through the millimeter wave radar installed on the car.
- the car can obtain the radar information in its driving direction through the millimeter-wave radar installed on the car.
- the car can obtain the radar data in the driving direction through the millimeter wave radar; the radar data is preprocessed to obtain the radar information.
- the radar information includes object information of objects appearing in the driving direction of the car and the lateral distance and the longitudinal distance between the car and the object, and the object information includes the relative distance between the car and the object, the relative speed, and the number of occurrences of the object within the detection range of the millimeter-wave radar.
- the car preprocesses the radar data to obtain the radar information.
- the operation of obtaining the radar information may be: parsing the radar data of the object according to the millimeter-wave radar protocol to obtain the object information of the object; establishing a world coordinate system with the position of the millimeter-wave radar as the origin; and, when the object information includes the relative distance between the object and the car and the relative angle between the object and the car, decomposing the relative distance in the world coordinate system to obtain the lateral distance and the longitudinal distance.
- the position of the millimeter wave radar may be used as the origin to establish the world coordinate system, or other positions may be used as the origin to establish the world coordinate system.
- the driving direction of the car is the longitudinal direction Y
- the direction perpendicular to the driving direction is the lateral direction X
- the lateral distance is the horizontal offset between the car (position O in FIG. 3) and the object (position A in FIG. 3)
- the longitudinal distance is the distance between the car and the object along the driving direction (OA2 in FIG. 3)
- ∠AOA2 is the relative angle
- AO is the relative distance.
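The decomposition of the relative distance into the lateral and longitudinal distances described above can be sketched as follows. This is an illustrative example, not part of the patent text; the function name and the sign convention for the angle are assumptions.

```python
import math

def decompose(relative_distance, relative_angle_deg):
    """Split a radar return into lateral (X) and longitudinal (Y)
    distances in a world coordinate system whose origin is the radar,
    with Y along the driving direction and X perpendicular to it."""
    angle = math.radians(relative_angle_deg)
    lateral = relative_distance * math.sin(angle)       # X, offset AA2
    longitudinal = relative_distance * math.cos(angle)  # Y, distance OA2
    return lateral, longitudinal
```

An object 10 m straight ahead (relative angle 0°) decomposes to a lateral distance of 0 m and a longitudinal distance of 10 m.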
- the radar information can be stored: the car can define a structure for the radar information, then define an array of that structure type, and finally store the radar information in the array.
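The structure-plus-array storage just described might look like the following sketch. The field names are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class RadarObject:
    """One entry of the parsed radar information (field names assumed)."""
    relative_distance: float   # metres, between the car and the object
    relative_speed: float      # m/s, relative to the car
    lateral: float             # X in the world coordinate system
    longitudinal: float        # Y in the world coordinate system
    occurrences: int = 0       # times seen within the detection range

# The "array of the structure type" holding the radar information.
radar_info: list[RadarObject] = []
radar_info.append(RadarObject(12.5, -1.2, 0.8, 12.4, 1))
```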
- Step 202 The car determines the target object that meets the tracking condition based on the radar information.
- the car may detect some invalid objects through the millimeter-wave radar, such as empty objects, false objects, stationary objects and/or non-hazardous objects. These invalid objects do not affect the driving safety of the car, but they may interfere with the car's tracking of the target object. Therefore, the car needs to determine the target objects from the acquired objects based on the radar information, where a target object is an object that affects the driving safety of the car. That is, the car needs to determine the target objects that meet the tracking conditions based on the radar information.
- the operation of the car to determine the target objects that meet the tracking conditions may be: based on the object information, determining the stationary objects, false objects, empty objects and non-hazardous objects among the objects; and determining the objects other than the stationary objects, false objects, empty objects and non-hazardous objects as the target objects that meet the tracking conditions.
- the operation of determining the stationary, false, empty and non-hazardous objects based on the object information may be: determining an object whose relative speed has an absolute value equal to the driving speed of the car (that is, an object that is stationary relative to the ground while the millimeter-wave radar moves) as a stationary object; determining an object whose relative distance to the car is equal to 0 as an empty object; determining an object whose number of occurrences is less than the occurrence-number threshold as a false object; and determining an object whose lateral distance is less than the lateral distance threshold and whose longitudinal distance is less than the longitudinal distance threshold as a non-hazardous object.
- the occurrence-number threshold, the lateral distance threshold and the longitudinal distance threshold can be set in advance; for example, the occurrence-number threshold can be 10, 20, and so on.
- the lateral distance threshold can be 2 meters, 3 meters, etc.
- the longitudinal distance threshold can be 3 meters, 5 meters, 10 meters, and so on.
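The four screening rules above can be sketched as a single classification function. This is a hedged illustration: the threshold defaults reuse the example values from the text (10 occurrences, 2 m, 3 m), and the dict field names and returned labels are assumptions, not patent terminology.

```python
def screen(obj, car_speed, occ_thresh=10, lat_thresh=2.0, lon_thresh=3.0):
    """Classify one detected object using the screening rules above.
    `obj` is a dict with the fields used below; returns the category."""
    if abs(obj["relative_speed"]) == car_speed:
        return "stationary"      # moves with the ground, not with traffic
    if obj["relative_distance"] == 0:
        return "empty"           # no real return behind this entry
    if obj["occurrences"] < occ_thresh:
        return "false"           # seen too briefly to be trusted
    if obj["lateral"] < lat_thresh and obj["longitudinal"] < lon_thresh:
        return "non-hazardous"   # inside both distance thresholds, per the text
    return "target"              # meets the tracking condition
```

Objects classified as anything other than "target" would be eliminated before tracking.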
- the scanning period of the millimeter-wave radar can be 50 ms (milliseconds); since the millimeter-wave radar scans once every 50 ms, the car obtains one batch of radar information every 50 ms. Therefore, the objects in every N periods can be taken as a group, the nearest-neighbour data association method can be used so that each physical object corresponds to a single piece of radar information, and the number of appearances of each object can be recorded; when the number of appearances of any object is less than N, that object is determined to be a false object, where N is the occurrence-number threshold.
- the millimeter-wave radar may detect a single object multiple times and obtain multiple pieces of radar information
- the radar information of a single object can therefore be merged. That is, the car can set a distance error threshold, an angle error threshold and a speed error threshold for objects in adjacent periods, take the objects in every N periods as a group, and associate the objects whose relative-distance difference is less than the distance error threshold, whose relative-angle difference is less than the angle error threshold and whose relative-speed difference is less than the speed error threshold as the same object, so that a single object corresponds to a single piece of radar information; when such an object appears and then disappears within a preset time, it is determined to be a false object.
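The period-to-period association described above might be sketched as a greedy nearest-match pass. The error-threshold values here are hypothetical, since the text only says they are preset; the dict keys are likewise assumptions.

```python
def associate(prev, new, d_err=0.5, a_err=2.0, v_err=0.5):
    """Match each object of the new scan period to the previous period.
    `prev`/`new`: lists of dicts with 'dist', 'angle', 'speed' (and
    'count' in prev). Matched objects inherit and increment the
    appearance count, so a single object yields a single track."""
    for obj in new:
        obj["count"] = 1
        for old in prev:
            if (abs(obj["dist"] - old["dist"]) < d_err
                    and abs(obj["angle"] - old["angle"]) < a_err
                    and abs(obj["speed"] - old["speed"]) < v_err):
                obj["count"] = old["count"] + 1   # same physical object
                break
    return new
```

Objects whose count never reaches N would then be discarded as false objects.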
- stationary objects, false objects, empty objects and non-hazardous objects can thus be eliminated from the detected objects, and the remaining objects after the elimination are determined as the target objects.
- Step 203 The car tracks the target object.
- the target object is an object that affects the driving safety of the car
- the target object needs to be tracked after the car determines the target object.
- the operation of tracking the target object by the car may be: projecting the target object from the world coordinate system into the image pixel coordinate system; tracking the target object in the image pixel coordinate system.
- the operation of the car to project the target object from the world coordinate system into the image pixel coordinate system may be: according to the coordinate values of the target object in the world coordinate system, projecting the target object into the image pixel coordinate system through a projection formula. That is, according to the coordinate values of the target object in the world coordinate system, the car can project the target object into the image pixel coordinate system through the following projection formula.
- X1 is the abscissa of the target object in the image pixel coordinate system
- Y1 is the ordinate of the target object in the image pixel coordinate system
- the coordinate unit in the image pixel coordinate system is the pixel
- X is the lateral distance of the target object in the world coordinate system
- Y is the longitudinal distance of the target object in the world coordinate system
- a and b are the magnifications
- image.cols is the image width.
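The projection formula itself is not reproduced in this text, so the sketch below is only one plausible reading of the variable definitions above: the lateral axis is assumed to be centred on the image, which is the natural role of an image.cols term, and the form of Y1 is a guess.

```python
def project(X, Y, a=20.0, b=20.0, image_cols=640):
    """Map world coordinates (metres) to image pixel coordinates.
    a and b are the magnifications; image_cols is the image width.
    The centring of X1 and the linear form of Y1 are assumptions."""
    X1 = a * X + image_cols / 2.0   # abscissa: lateral axis centred on image
    Y1 = b * Y                      # ordinate: scaled longitudinal distance
    return X1, Y1
```

With these assumed defaults, an object straight ahead of the radar (X = 0) lands on the horizontal centre of the image.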
- the car can track the target object in the image pixel coordinate system through the Kalman filter algorithm. That is, the operation of the car to track the target object in the image pixel coordinate system can be: select the target object closest to the car; use the fourth-order Kalman filter algorithm to predict the information of the selected target object to obtain the predicted position information; The predicted position information is compared with the reference position information through a comparison formula.
- the reference position information is the current position information of the selected target object; when the predicted position information and the reference position information do not satisfy the comparison formula, and the number of times the comparison formula has not been satisfied is greater than or equal to the inconsistency-count threshold, the reference position information is updated; when the predicted position information and the reference position information satisfy the comparison formula, and the number of times the selected target object has been selected is greater than or equal to the selection-count threshold, the collision duration of the selected target object with the car is determined.
- when the car selects the target object closest to it, it can use the longitudinal distance, the lateral distance or the relative distance as the reference, sort the target objects from near to far or from far to near, and then select the target object closest to the car.
- the threshold of the number of inconsistencies and the threshold of the number of selections can be set in advance.
- the threshold of the number of inconsistencies can be 3, 4, etc.
- the threshold of the number of selections can be 3, 4, and so on.
- the current position information of the selected target object can be determined by the above projection formula (1), that is, the reference position information of the selected target object can be determined by the above projection formula (1).
- the car can predict the predicted position information of the selected target object through the following prediction formula.
- (x_{n+1}, y_{n+1}, Δx_{n+1}, Δy_{n+1}) is the predicted position information
- (x_n, y_n, Δx_n, Δy_n) is the reference position information
- x_n, y_n, Δx_n and Δy_n are, respectively, the abscissa, the ordinate, the rate of change along the abscissa and the rate of change along the ordinate of the selected target object in the image pixel coordinate system.
- the car can compare the predicted position information with the reference position information through the following comparison formula.
- x_t and y_t are allowable errors, and the allowable errors can be set in advance; for example, the allowable errors can be 0.1, 0.2, and so on.
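The prediction and comparison formulas themselves are not reproduced in this text. The sketch below assumes the standard constant-velocity reading of the state (x_n, y_n, Δx_n, Δy_n) and a per-coordinate gate using the allowable errors x_t and y_t; both are labelled assumptions.

```python
def predict(state):
    """One constant-velocity prediction step on the state
    (x_n, y_n, dx_n, dy_n): each coordinate advances by its rate
    of change, and the rates themselves are kept unchanged."""
    x, y, dx, dy = state
    return (x + dx, y + dy, dx, dy)

def within_gate(pred, ref, x_t=0.2, y_t=0.2):
    """Comparison gate: predicted and reference positions agree when
    each coordinate differs by less than the allowable error."""
    return abs(pred[0] - ref[0]) < x_t and abs(pred[1] - ref[1]) < y_t
```

A full fourth-order Kalman filter would also maintain covariances and a measurement update; only the prediction and gating steps named in the text are sketched here.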
- updating the reference position information by the car may refer to reselecting the target object closest to the car, and determining the current position information of the newly selected target object as the reference position information.
- the count of times the comparison formula has not been satisfied can be incremented first, and then it is determined whether this count is greater than or equal to the inconsistency-count threshold; when the count is less than the inconsistency-count threshold, the fourth-order Kalman filter algorithm is used to predict the information of the selected target object again.
- the count of times the selected target object has been selected can be incremented first, and then it is determined whether this count is greater than or equal to the selection-count threshold; when the count is less than the selection-count threshold, the information of the selected target object is predicted again through the fourth-order Kalman filter algorithm.
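The counter logic of the two paragraphs above can be sketched as one decision function. This is illustrative only: the threshold defaults reuse the example value 3, and the returned strings are assumed labels, not patent terminology.

```python
def update_counters(matched, track, miss_thresh=3, select_thresh=3):
    """Advance the two counters for one tracking iteration.
    `matched` is whether the comparison formula was satisfied;
    `track` is a dict with 'misses' and 'selections' counters."""
    if not matched:
        track["misses"] += 1
        if track["misses"] >= miss_thresh:
            return "reselect"        # update the reference position
        return "predict_again"       # run the Kalman prediction again
    track["selections"] += 1
    if track["selections"] >= select_thresh:
        return "compute_ttc"         # determine the collision duration
    return "predict_again"
```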
- Step 204 The car controls the car according to the tracking result of the target object.
- the tracking result of the target object can include the collision duration of the selected target object with the car. Therefore, the car can control itself according to the collision duration of the target object with the car.
- when the collision duration is less than or equal to the collision duration threshold, the car is controlled to brake; when the collision duration is greater than the collision duration threshold, the car can update the reference position information.
- the collision duration threshold can be set in advance, for example, the collision duration threshold can be 20 seconds, 30 seconds, 1 minute, and so on.
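The braking decision based on the collision duration can be sketched as follows. The patent does not give a formula for the collision duration, so the usual distance-over-closing-speed estimate is assumed here, along with the example 20-second threshold and the sign convention that a negative relative speed means the object is approaching.

```python
def control(longitudinal, relative_speed, ttc_thresh=20.0):
    """Decide whether to brake from an assumed time-to-collision
    estimate: longitudinal distance divided by the closing speed."""
    closing = -relative_speed        # assumed: negative speed = approaching
    if closing <= 0:
        return "keep"                # not closing in; no collision expected
    ttc = longitudinal / closing     # collision duration, in seconds
    return "brake" if ttc <= ttc_thresh else "keep"
```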
- the car can obtain radar information in its driving direction, and eliminate the empty objects, false objects, non-hazardous objects and stationary objects according to the obtained radar information, thereby determining the target objects that meet the tracking conditions, and then track those target objects, so that the tracking is targeted, avoiding the tracking of invalid objects and the interference of invalid objects with the target object, improving the accuracy of object tracking, and ensuring the driving safety of the car.
- Fig. 4 is a block diagram of a device for tracking a target object provided by an embodiment of the present disclosure.
- the device can be implemented by software, hardware or a combination of both.
- the device includes: an acquisition module 401, a determination module 402, and a tracking module 403.
- the obtaining module 401 is configured to obtain radar information of the car in the driving direction through the millimeter wave radar installed on the car;
- the determining module 402 is configured to determine a target object that meets the tracking condition based on the radar information
- the tracking module 403 is used to track the target object to control the car.
- the determining module 402 includes:
- the first determining sub-module 4021 is configured to determine stationary objects, false objects, empty objects and non-dangerous objects in the objects based on the radar information;
- the second determining sub-module 4022 is configured to determine objects other than the stationary objects, false objects, empty objects and non-hazardous objects as target objects that meet the tracking condition.
- the radar information includes object information of objects appearing in the driving direction of the car and the lateral distance and the longitudinal distance between the car and the object, and the object information includes the relative distance between the car and the object, the relative speed, and the number of occurrences of the object within the detection range of the millimeter-wave radar;
- the first determining submodule 4021 is configured to: determine an object whose relative speed has an absolute value equal to the driving speed of the car as a stationary object; determine an object whose relative distance to the car equals 0 as an empty object; determine an object whose number of occurrences is greater than or less than the occurrence threshold as a false object; and determine an object whose lateral distance is less than the lateral distance threshold and whose longitudinal distance is less than the longitudinal distance threshold as a non-hazardous object.
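As a concrete illustration, the elimination rules above can be sketched in Python. The field names and threshold values below are illustrative assumptions; the patent does not fix them, and the false-object rule is read here as "occurrence count outside an allowed range":

```python
from dataclasses import dataclass

@dataclass
class RadarObject:
    relative_distance: float      # m, car-to-object
    relative_speed: float         # m/s, relative to the car
    lateral_distance: float       # m, in the radar world coordinate system
    longitudinal_distance: float  # m
    occurrence_count: int         # detections within the radar's range

def select_target_objects(objects, car_speed,
                          min_count=3, max_count=200,
                          lateral_thresh=2.0, longitudinal_thresh=1.0):
    """Keep only objects that meet the tracking condition by eliminating
    stationary, empty, false, and non-hazardous objects (assumed thresholds)."""
    targets = []
    for obj in objects:
        if abs(obj.relative_speed) == car_speed:       # stationary object
            continue
        if obj.relative_distance == 0:                 # empty object
            continue
        if not (min_count <= obj.occurrence_count <= max_count):  # false object
            continue
        if (obj.lateral_distance < lateral_thresh and
                obj.longitudinal_distance < longitudinal_thresh):  # non-hazardous
            continue
        targets.append(obj)
    return targets
```

In a run with one valid object and one example of each eliminated category, only the valid object survives the filter.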
- the acquiring module 401 includes:
- An obtaining sub-module 4011 is configured to obtain radar data in the driving direction through the millimeter wave radar;
- the processing sub-module 4012 is used to preprocess the radar data to obtain the radar information.
- the processing submodule 4012 is configured to: parse the radar data of the object according to the millimeter wave radar protocol to obtain the object information; establish a world coordinate system with the position of the millimeter wave radar as the origin; and, when the object information includes the relative distance and the relative angle between the object and the car, decompose the relative distance in the world coordinate system to obtain the lateral distance and the longitudinal distance.
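The decomposition of the relative distance into lateral and longitudinal components can be sketched as follows. The angle convention (measured from the car's driving direction) is an assumption, since the patent does not fix it:

```python
import math

def decompose_relative_distance(relative_distance, relative_angle_deg):
    """Decompose the car-to-object relative distance into lateral (X) and
    longitudinal (Y) components in a world coordinate system whose origin
    is the millimeter wave radar. The angle is assumed to be measured
    from the driving direction."""
    angle = math.radians(relative_angle_deg)
    lateral = relative_distance * math.sin(angle)
    longitudinal = relative_distance * math.cos(angle)
    return lateral, longitudinal
```

For example, an object 10 m away at 30 degrees off the driving direction decomposes into a 5 m lateral and roughly 8.66 m longitudinal distance.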
- the tracking module 403 includes:
- the projection sub-module 4031 is used to project the target object from the world coordinate system to the image pixel coordinate system;
- the tracking sub-module 4032 is used to track the target object in the image pixel coordinate system to control the car.
- the projection sub-module 4031 is used to: project the target object into the image pixel coordinate system according to the coordinate values of the target object in the world coordinate system, through the projection formulas X1 = X*a + image.cols and Y1 = Y*b, where X1 is the abscissa and Y1 is the ordinate of the target object in the image pixel coordinate system, the coordinate unit in the image pixel coordinate system is the pixel, X is the lateral distance and Y is the longitudinal distance of the target object in the world coordinate system, a and b are magnification factors, and image.cols is the image width.
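The projection formulas stated in claim 7 translate directly into code; the magnification factors and image width are calibration inputs, and the numeric values in the usage note below are purely illustrative:

```python
def project_to_pixels(x_world, y_world, a, b, image_cols):
    """Project a target object from the world coordinate system into the
    image pixel coordinate system using the patent's projection formulas:
        X1 = X * a + image.cols
        Y1 = Y * b
    a and b are magnification factors; image_cols is the image width."""
    x_pixel = x_world * a + image_cols
    y_pixel = y_world * b
    return x_pixel, y_pixel
```

With assumed magnifications a = 20, b = 15 and a 640-pixel-wide image, a target at lateral distance 2 m and longitudinal distance 10 m projects to pixel coordinates (680, 150).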
- the tracking sub-module 4032 is used to: select the target object closest to the car; predict the position of the selected target object through a fourth-order Kalman filter algorithm to obtain predicted position information; compare the predicted position information with reference position information, the reference position information being the current position information of the selected target object; and update the reference position information when the predicted position information and the reference position information do not satisfy the comparison formula a number of times greater than or equal to the inconsistency threshold.
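A minimal sketch of the prediction-and-comparison step is given below. The patent names a fourth-order Kalman filter algorithm but does not disclose its matrices or the comparison formula, so the constant-velocity state model over [x, y, vx, vy] and the Euclidean-distance check here are assumptions:

```python
import numpy as np

def kalman_predict(state, P, dt=0.1, q=1e-2):
    """One prediction step of a fourth-order (constant-velocity) Kalman
    filter over the state [x, y, vx, vy] in image pixel coordinates.
    Returns the predicted state and covariance."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    Q = q * np.eye(4)  # assumed process noise
    state = F @ state
    P = F @ P @ F.T + Q
    return state, P

def position_consistent(predicted_state, reference_position, tol=5.0):
    """Illustrative stand-in for the patent's undisclosed comparison
    formula: positions agree when their pixel distance is within tol."""
    return float(np.linalg.norm(predicted_state[:2] - reference_position)) <= tol
```

A tracker built on these pieces would count how often `position_consistent` fails and refresh the reference position once that count reaches the inconsistency threshold, as the sub-module describes.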
- the car can obtain radar information in its driving direction and, according to the obtained radar information, eliminate empty objects, false objects, non-hazardous objects, and stationary objects, so as to determine the target objects that meet the tracking conditions; the target objects that meet the tracking conditions are then tracked, so that the tracking is targeted, the tracking of invalid objects and their interference with the target object are avoided, the accuracy of object tracking is improved, and the car's driving safety is ensured.
- when the target object tracking device provided in the above embodiment tracks a target object, the division into the above functional modules is merely illustrative.
- in practical applications, the above functions can be allocated to different functional modules as needed, that is, the internal structure of the device can be divided into different functional modules to complete all or part of the functions described above.
- the target object tracking device provided in the above embodiment and the target object tracking method embodiment belong to the same concept; for implementation details, refer to the method embodiment, which will not be repeated here.
- FIG. 8 shows a structural block diagram of a car 800 provided by an exemplary embodiment of the present application.
- the car 800 includes a processor 801 and a memory 802.
- the processor 801 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
- the processor 801 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
- the processor 801 may also include a main processor and a coprocessor.
- the main processor is a processor used to process data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor used to process data in the standby state.
- the processor 801 may be integrated with a GPU (Graphics Processing Unit), which is used to render and draw the content that needs to be displayed on the display screen.
- the processor 801 may further include an AI (Artificial Intelligence) processor, which is used to process computing operations related to machine learning.
- the memory 802 may include one or more computer-readable storage media, which may be non-transitory.
- the memory 802 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
- the non-transitory computer-readable storage medium in the memory 802 is used to store at least one instruction, and the at least one instruction is executed by the processor 801 to implement the target object tracking method provided in the method embodiments of the present application.
- the car 800 optionally further includes: a peripheral device interface 803 and at least one peripheral device.
- the processor 801, the memory 802, and the peripheral device interface 803 may be connected by a bus or a signal line.
- Each peripheral device can be connected to the peripheral device interface 803 through a bus, a signal line or a circuit board.
- the peripheral device includes at least one of a radio frequency circuit 804, a touch display screen 805, a camera 806, an audio circuit 807, a positioning component 808, and a power supply 809.
- the peripheral device interface 803 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 801 and the memory 802.
- in some embodiments, the processor 801, the memory 802, and the peripheral device interface 803 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 801, the memory 802, and the peripheral device interface 803 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
- the radio frequency circuit 804 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals.
- the radio frequency circuit 804 communicates with a communication network and other communication devices through electromagnetic signals.
- the radio frequency circuit 804 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
- the radio frequency circuit 804 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a user identity module card, and so on.
- the radio frequency circuit 804 can communicate with other terminals through at least one wireless communication protocol.
- the wireless communication protocol includes, but is not limited to: metropolitan area networks, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks.
- the radio frequency circuit 804 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
- the display screen 805 is used to display UI (User Interface).
- the UI can include graphics, text, icons, videos, and any combination thereof.
- the display screen 805 also has the ability to collect touch signals on or above the surface of the display screen 805.
- the touch signal can be input to the processor 801 as a control signal for processing.
- the display screen 805 may also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
- in some embodiments, there may be one display screen 805, provided on the front panel of the car 800; in other embodiments, there may be at least two display screens 805, respectively arranged on different surfaces of the car 800 or in a folding design; in still other embodiments, the display screen 805 may be a flexible display screen arranged on a curved or folding surface of the car 800. Furthermore, the display screen 805 may even be set in a non-rectangular irregular shape, that is, a special-shaped screen.
- the display screen 805 may be made of materials such as LCD (Liquid Crystal Display) and OLED (Organic Light-Emitting Diode).
- the camera assembly 806 is used to capture images or videos.
- the camera assembly 806 includes a front camera and a rear camera.
- the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
- the audio circuit 807 may include a microphone and a speaker.
- the microphone is used to collect sound waves from the user and the environment, convert them into electrical signals, and input them to the processor 801 for processing or to the radio frequency circuit 804 to implement voice communication. For stereo collection or noise reduction, there may be multiple microphones, respectively arranged at different parts of the car 800.
- the microphone can also be an array microphone or an omnidirectional acquisition microphone.
- the speaker is used to convert the electrical signal from the processor 801 or the radio frequency circuit 804 into sound waves.
- the speaker can be a traditional thin-film speaker or a piezoelectric ceramic speaker.
- the audio circuit 807 may also include a headphone jack.
- the positioning component 808 is used to locate the current geographic location of the car 800 to implement navigation or LBS (Location Based Service).
- the positioning component 808 may be a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
- the power supply 809 is used to supply power to various components in the car 800.
- the power source 809 may be alternating current, direct current, disposable batteries, or rechargeable batteries.
- the rechargeable battery may support wired charging or wireless charging.
- the rechargeable battery can also be used to support fast charging technology.
- the car 800 further includes one or more sensors 810.
- the embodiments of the present application further provide a car, including a processor and a memory for storing instructions executable by the processor, where the processor is configured to execute the steps in the embodiments shown in FIG. 1 and FIG. 2.
- the embodiments of the present application also provide a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a processor, the target object tracking method in the embodiments shown in FIG. 1 and FIG. 2 is implemented.
- the structure shown in FIG. 8 does not constitute a limitation on the car 800, which may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
Claims (10)
- 1. A method for tracking a target object, characterized in that the method comprises: obtaining radar information of a car in its driving direction through a millimeter wave radar installed on the car; determining, based on the radar information, a target object that meets a tracking condition; and tracking the target object to control the car.
- 2. The method according to claim 1, characterized in that determining, based on the radar information, a target object that meets the tracking condition comprises: determining stationary objects, false objects, empty objects, and non-hazardous objects among the objects based on the radar information; and determining the objects other than the stationary objects, false objects, empty objects, and non-hazardous objects as target objects that meet the tracking condition.
- 3. The method according to claim 2, characterized in that the radar information includes object information of objects appearing in the driving direction of the car and the lateral distance and longitudinal distance between the car and each object, the object information including the relative distance and relative speed between the car and the object and the number of occurrences of the object within the detection range of the millimeter wave radar; and that determining the stationary objects, false objects, empty objects, and non-hazardous objects among the objects based on the object information comprises: determining an object whose relative speed has an absolute value equal to the driving speed of the car as the stationary object; determining an object whose relative distance to the car equals 0 as the empty object; determining an object whose number of occurrences is greater than or less than an occurrence threshold as the false object; and determining an object whose lateral distance is less than a lateral distance threshold and whose longitudinal distance is less than a longitudinal distance threshold as the non-hazardous object.
- 4. The method according to claim 1, characterized in that obtaining the radar information of the car in the driving direction through the millimeter wave radar installed on the car comprises: acquiring radar data in the driving direction through the millimeter wave radar; and preprocessing the radar data to obtain the radar information.
- 5. The method according to claim 4, characterized in that preprocessing the radar data to obtain the radar information comprises: parsing the radar data of the object according to the millimeter wave radar protocol to obtain the object information of the object; establishing a world coordinate system with the position of the millimeter wave radar as the origin; and, when the object information includes the relative distance between the object and the car and the relative angle between the object and the car, decomposing the relative distance in the world coordinate system to obtain the lateral distance and the longitudinal distance.
- 6. The method according to claim 1, characterized in that tracking the target object to control the car comprises: projecting the target object from the world coordinate system into the image pixel coordinate system; and tracking the target object in the image pixel coordinate system to control the car.
- 7. The method according to claim 6, characterized in that projecting the target object from the world coordinate system into the image pixel coordinate system comprises: projecting the target object into the image pixel coordinate system according to the coordinate values of the target object in the world coordinate system through the following projection formulas: X1 = X*a + image.cols, Y1 = Y*b, where X1 is the abscissa of the target object in the image pixel coordinate system, Y1 is the ordinate of the target object in the image pixel coordinate system, the coordinate unit in the image pixel coordinate system is the pixel, X is the lateral distance of the target object in the world coordinate system, Y is the longitudinal distance of the target object in the world coordinate system, a and b are magnification factors, and image.cols is the image width.
- 8. The method according to claim 6, characterized in that tracking the target object in the image pixel coordinate system to control the car comprises: selecting the target object closest to the car; predicting the position of the selected target object through a fourth-order Kalman filter algorithm to obtain predicted position information; comparing the predicted position information with reference position information through a comparison formula, the reference position information being the current position information of the selected target object; updating the reference position information when the predicted position information and the reference position information do not satisfy the comparison formula and the number of times the comparison formula is not satisfied is greater than or equal to an inconsistency threshold; and determining the collision time at which the selected target object would collide with the car when the predicted position information and the reference position information satisfy the comparison formula and the number of times the selected target object has been selected is greater than or equal to a selection threshold.
- 9. An apparatus for tracking a target object, characterized in that the apparatus comprises: an acquisition module, configured to obtain radar information of a car in its driving direction through a millimeter wave radar installed on the car; a determining module, configured to determine, based on the radar information, a target object that meets a tracking condition; and a tracking module, configured to track the target object to control the car.
- 10. A computer-readable storage medium, characterized in that a computer program is stored in the storage medium, and when the computer program is executed by a processor, the method according to any one of claims 1-8 is implemented.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910394386.1 | 2019-05-13 | ||
CN201910394386.1A CN110077402B (en) | 2019-05-13 | 2019-05-13 | Target object tracking method, target object tracking device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020228242A1 true WO2020228242A1 (en) | 2020-11-19 |
Family
ID=67419823
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/112425 WO2020228242A1 (en) | 2019-05-13 | 2019-10-22 | Method and apparatus for tracking target object, and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110077402B (en) |
WO (1) | WO2020228242A1 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112526503A (en) * | 2020-11-20 | 2021-03-19 | 广州极飞科技有限公司 | Method for detecting object distance and related device |
CN112595338A (en) * | 2020-12-24 | 2021-04-02 | 中国联合网络通信集团有限公司 | Navigation method and navigation system |
CN112764035A (en) * | 2020-12-28 | 2021-05-07 | 南京市德赛西威汽车电子有限公司 | False target detection optimization method based on left-right communication of BSD radar |
CN114038191A (en) * | 2021-11-05 | 2022-02-11 | 青岛海信网络科技股份有限公司 | Method and device for collecting traffic data |
CN117111049A (en) * | 2023-10-23 | 2023-11-24 | 成都瑞达物联科技有限公司 | ETC channel vehicle presence detection method and system |
CN117214966A (en) * | 2023-08-01 | 2023-12-12 | 珠海微度芯创科技有限责任公司 | Image mapping method, device, equipment and medium of millimeter wave security inspection imaging equipment |
CN112526503B (en) * | 2020-11-20 | 2024-06-07 | 广州极飞科技股份有限公司 | Method for detecting object distance and related device |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110077402B (en) * | 2019-05-13 | 2021-09-28 | 奇瑞汽车股份有限公司 | Target object tracking method, target object tracking device and storage medium |
CN110751127B (en) * | 2019-10-30 | 2022-07-12 | 芜湖汽车前瞻技术研究院有限公司 | Distance determination method, device and storage medium |
CN110926425A (en) * | 2019-11-01 | 2020-03-27 | 宁波大学 | Navigation logistics transportation system of 3D structured light camera and control method thereof |
CN115331190B (en) * | 2022-09-30 | 2022-12-09 | 北京闪马智建科技有限公司 | Road hidden danger identification method and device based on radar vision fusion |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090259400A1 (en) * | 2008-04-15 | 2009-10-15 | Caterpillar Inc. | Vehicle collision avoidance system |
CN104386063A (en) * | 2014-09-19 | 2015-03-04 | 奇瑞汽车股份有限公司 | Driving assistance system based on artificial intelligence |
CN104637059A (en) * | 2015-02-09 | 2015-05-20 | 吉林大学 | Night preceding vehicle detection method based on millimeter-wave radar and machine vision |
CN106950952A (en) * | 2017-03-10 | 2017-07-14 | 无锡卡尔曼导航技术有限公司 | For the unpiloted farm environment cognitive method of agricultural machinery |
CN108528442A (en) * | 2017-03-06 | 2018-09-14 | 通用汽车环球科技运作有限责任公司 | Use the vehicle collision prediction algorithm of radar sensor and UPA sensors |
CN109143221A (en) * | 2018-07-23 | 2019-01-04 | 奇瑞汽车股份有限公司 | Method for tracking target and device |
CN109613528A (en) * | 2018-12-11 | 2019-04-12 | 南京慧尔视防务科技有限公司 | A kind of high-resolution multi-target tracking radar and detection method |
CN110077402A (en) * | 2019-05-13 | 2019-08-02 | 奇瑞汽车股份有限公司 | Method for tracing, device and the storage medium of target object |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7474255B2 (en) * | 2006-12-05 | 2009-01-06 | Chung Shan Institute Of Science And Technology, Armaments Bureau, M.N.D. | Target tracking method of radar with frequency modulated continuous wave |
US8565481B1 (en) * | 2011-05-26 | 2013-10-22 | Google Inc. | System and method for tracking objects |
CN107609522B (en) * | 2017-09-19 | 2021-04-13 | 东华大学 | Information fusion vehicle detection system based on laser radar and machine vision |
CN109085570A (en) * | 2018-06-10 | 2018-12-25 | 南京理工大学 | Automobile detecting following algorithm based on data fusion |
2019
- 2019-05-13 CN CN201910394386.1A patent/CN110077402B/en active Active
- 2019-10-22 WO PCT/CN2019/112425 patent/WO2020228242A1/en active Application Filing
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112526503A (en) * | 2020-11-20 | 2021-03-19 | 广州极飞科技有限公司 | Method for detecting object distance and related device |
CN112526503B (en) * | 2020-11-20 | 2024-06-07 | 广州极飞科技股份有限公司 | Method for detecting object distance and related device |
CN112595338A (en) * | 2020-12-24 | 2021-04-02 | 中国联合网络通信集团有限公司 | Navigation method and navigation system |
CN112764035A (en) * | 2020-12-28 | 2021-05-07 | 南京市德赛西威汽车电子有限公司 | False target detection optimization method based on left-right communication of BSD radar |
CN112764035B (en) * | 2020-12-28 | 2024-01-30 | 南京市德赛西威汽车电子有限公司 | False target detection optimization method based on BSD radar left-right communication |
CN114038191A (en) * | 2021-11-05 | 2022-02-11 | 青岛海信网络科技股份有限公司 | Method and device for collecting traffic data |
CN117214966A (en) * | 2023-08-01 | 2023-12-12 | 珠海微度芯创科技有限责任公司 | Image mapping method, device, equipment and medium of millimeter wave security inspection imaging equipment |
CN117214966B (en) * | 2023-08-01 | 2024-04-05 | 珠海微度芯创科技有限责任公司 | Image mapping method, device, equipment and medium of millimeter wave security inspection imaging equipment |
CN117111049A (en) * | 2023-10-23 | 2023-11-24 | 成都瑞达物联科技有限公司 | ETC channel vehicle presence detection method and system |
CN117111049B (en) * | 2023-10-23 | 2024-01-30 | 成都瑞达物联科技有限公司 | ETC channel vehicle presence detection method and system |
Also Published As
Publication number | Publication date |
---|---|
CN110077402A (en) | 2019-08-02 |
CN110077402B (en) | 2021-09-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020228242A1 (en) | Method and apparatus for tracking target object, and storage medium | |
WO2021128777A1 (en) | Method, apparatus, device, and storage medium for detecting travelable region | |
CN110865388B (en) | Combined calibration method and device for camera and laser radar and storage medium | |
CN110979318B (en) | Lane information acquisition method and device, automatic driving vehicle and storage medium | |
EP3806066B1 (en) | Method and apparatus for controlling automated guided vehicles, and storage medium | |
CN109532845B (en) | Control method and device of intelligent automobile and storage medium | |
CN110097025B (en) | Lane line detection method, device and storage medium | |
CN110956847B (en) | Parking space identification method and device and storage medium | |
CN110751127B (en) | Distance determination method, device and storage medium | |
CN111016888A (en) | Parking control method and device for automobile and storage medium | |
CN109409301B (en) | Information acquisition method and device of traffic signal lamp and storage medium | |
CN109581358B (en) | Obstacle recognition method, obstacle recognition device and storage medium | |
WO2020258602A1 (en) | Intelligent vehicle control method and apparatus, and storage medium | |
CN111361550B (en) | Parking space identification method and device and storage medium | |
CN114299468A (en) | Method, device, terminal, storage medium and product for detecting convergence of lane | |
CN110775056B (en) | Vehicle driving method, device, terminal and medium based on radar detection | |
CN109484480B (en) | Automobile control method and device and storage medium | |
CN111538009B (en) | Radar point marking method and device | |
CN110390252B (en) | Obstacle detection method and device based on prior map information and storage medium | |
CN115797401B (en) | Verification method and device for alignment parameters, storage medium and electronic equipment | |
CN111619556B (en) | Obstacle avoidance control method and device for automobile and storage medium | |
CN114789734A (en) | Perception information compensation method, device, vehicle, storage medium, and program | |
CN114623836A (en) | Vehicle pose determining method and device and vehicle | |
CN112379363A (en) | Measuring method, device and electronic equipment | |
CN113734199B (en) | Vehicle control method, device, terminal and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19929097 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19929097 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 24/05/2023) |
|