CN114092898A - Target object sensing method and device - Google Patents

Target object sensing method and device

Info

Publication number
CN114092898A
Authority
CN
China
Prior art keywords
endpoint
uncertainty
point
feature points
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010755668.2A
Other languages
Chinese (zh)
Inventor
曹彤彤
李向旭
刘冰冰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202010755668.2A priority Critical patent/CN114092898A/en
Priority to PCT/CN2021/106261 priority patent/WO2022022284A1/en
Publication of CN114092898A publication Critical patent/CN114092898A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/762 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle


Abstract

The application provides a target object sensing method and device for improving the accuracy of calculating the position or velocity of a target object. The scheme relates to the field of automatic driving. The method comprises: acquiring a plurality of feature points of a point cloud cluster, where the point cloud cluster represents the target object; determining an uncertainty of each of the plurality of feature points, the uncertainty indicating the error generated when the position of each feature point in the point cloud cluster is acquired by an acquisition device; obtaining, based on the state of each of the plurality of feature points, a first state of the target object corresponding to that feature point; and determining a second state of the target object based on the first state corresponding to each of the plurality of feature points and the uncertainty corresponding to each of the plurality of feature points, where a state includes a velocity and/or a position.

Description

Target object sensing method and device
Technical Field
The present application relates to the field of automatic driving, and more particularly, to a method and apparatus for sensing a target object.
Background
The set of points on the contour of a target object obtained by an acquisition device is referred to as a point cloud. A common type of point cloud is the laser point cloud: when a laser beam irradiates the surface of a target object, the reflected laser light carries information such as the direction and distance of the target object. When the laser beam is scanned along a certain track, the reflected laser point information is recorded during scanning; because the scanning is fine, a large number of laser points can be obtained, forming a laser point cloud. Laser point clouds are currently widely used in the field of autonomous or unmanned driving to sense target objects.
In the existing target object sensing scheme, point cloud data including a target object is processed to obtain a point cloud cluster representing the target object, a geometric center or a gravity center of the point cloud cluster is determined, and then the position and the speed of the target object are calculated based on the position and the speed of the geometric center or the gravity center of the point cloud cluster to sense the target object.
However, in the above solution for calculating the position and the velocity of the target object based on the position and the velocity of the geometric center or the center of gravity of the point cloud cluster, if the geometric center or the center of gravity of the point cloud cluster is blocked, the accuracy of the calculated position and velocity of the target object may be reduced.
Disclosure of Invention
The application provides a sensing method and a sensing device for a target object, which are used for improving the accuracy of calculating the position or the speed of the target object.
In a first aspect, a target object sensing method is provided, including: acquiring a plurality of feature points of a point cloud cluster, where the point cloud cluster represents the target object; determining an uncertainty of each of the plurality of feature points, the uncertainty indicating the error generated when the position of each feature point in the point cloud cluster is acquired by an acquisition device; obtaining, based on the state of each of the plurality of feature points, a first state of the target object corresponding to that feature point, where the state of each feature point includes the position and/or the velocity of that feature point and the first state includes a first velocity and/or a first position of the target object; and determining a second state of the target object based on the first state of the target object corresponding to each of the plurality of feature points and the uncertainty corresponding to each of the plurality of feature points, the second state including a second velocity and/or a second position of the target object.
In the embodiment of the application, the first state of the target object corresponding to each of the plurality of feature points is calculated based on the state of that feature point in the point cloud cluster, and the second state of the target object is determined based on the first state corresponding to each feature point and the uncertainty corresponding to each feature point, which improves the accuracy of determining the second state of the target object. This avoids the situation in the prior art where the state of the target object is determined only from the state of the geometric center or center of gravity of the point cloud cluster, so that the accuracy of the determined state decreases when the geometric center or center of gravity is occluded.
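As an illustration of how the steps of the first aspect fit together, the following Python sketch combines per-feature-point candidate states into a single target state. It is only a sketch under assumptions: the inverse-uncertainty weighting is a placeholder and is not the confidence formula defined later in this disclosure, and the state layout [x, y, vx, vy] is chosen only for the example.

    import numpy as np

    def fuse_target_state(first_states, uncertainties):
        """Fuse per-feature-point candidate states into one target state.

        first_states:  (n, m) array, one candidate state (e.g. [x, y, vx, vy]) per feature point
        uncertainties: (n,) array, uncertainty of each feature point (larger = less reliable)

        The inverse-uncertainty weighting below is an illustrative assumption,
        not the confidence formula defined in this disclosure.
        """
        first_states = np.asarray(first_states, dtype=float)
        d = np.asarray(uncertainties, dtype=float)
        weights = 1.0 / (d + 1e-6)      # placeholder: lower uncertainty -> higher weight
        weights /= weights.sum()
        return weights @ first_states   # weighted combination serves as the "second state"

    # Example: three feature points, each giving a candidate state [x, y, vx, vy]
    states = [[10.2, 3.1, 5.0, 0.1], [10.4, 3.0, 5.2, 0.0], [9.9, 3.3, 4.8, 0.2]]
    print(fuse_target_state(states, uncertainties=[0.1, 0.05, 0.4]))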
In one possible implementation, the plurality of feature points includes a plurality of endpoints of the point cloud cluster, where an endpoint is also referred to as a "point of interest" and generally refers to an intersection of two adjacent edges in the point cloud cluster.
In the embodiment of the application, a plurality of end points of the point cloud cluster are used as the plurality of feature points, which is beneficial to simplifying the process of determining the feature points.
In one possible implementation, the determining the uncertainty of each of the plurality of feature points includes: determining the type of each edge connected to each of the plurality of endpoints, where the type of an edge includes a visible edge that can be directly acquired by the acquisition device and an invisible edge that cannot be directly acquired by the acquisition device; and determining the uncertainty of each of the plurality of endpoints based on the types of the two edges connected to that endpoint.
In the embodiment of the application, the uncertainty of each endpoint in the plurality of endpoints is determined based on the types of the two edges connected with each endpoint in the plurality of endpoints, which is beneficial to improving the accuracy of determining the uncertainty of the endpoint.
In one possible implementation, the plurality of end points includes a first end point, a type of a first edge connected to the first end point is a visible edge, and a type of a second edge connected to the first end point is an invisible edge, and then the uncertainty of the first end point is determined based on a component of a detection uncertainty of an acquisition device in an orientation direction of the target object.
In the embodiment of the application, the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the acquisition device in the orientation of the target object, which is beneficial to improving the accuracy of determining the uncertainty of the first endpoint.
In one possible implementation, the uncertainty d1 of the first endpoint is determined by the formula d1 = R1 × C0 × |cos(θ1 − φ)|, where R1 represents the measured distance between the acquisition device and the first endpoint; C0 is a preset value, inversely related to the acquisition accuracy of the acquisition device, in radians; θ1 represents the coordinate azimuth at which the acquisition device acquires the first endpoint; and φ represents the azimuth of the orientation of the target object.
In the embodiment of the application, calculating the uncertainty of the first endpoint by the above formula helps improve the accuracy of determining the uncertainty of the first endpoint.
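A small numerical sketch of this projection is given below. It assumes the reconstructed form d1 = R1 × C0 × |cos(θ1 − φ)|; the function name and the example values are illustrative only.

    import math

    def endpoint_uncertainty_one_visible_edge(r1, c0, theta1, phi):
        """Uncertainty of a first-type endpoint (one visible edge, one invisible edge).

        r1:     measured distance between the acquisition device and the endpoint
        c0:     preset value in radians, inversely related to acquisition accuracy
        theta1: coordinate azimuth of the endpoint as seen by the acquisition device
        phi:    azimuth of the target object's orientation

        Assumes the component form d1 = r1 * c0 * |cos(theta1 - phi)|.
        """
        return r1 * c0 * abs(math.cos(theta1 - phi))

    print(endpoint_uncertainty_one_visible_edge(r1=20.0, c0=0.01,
                                                theta1=math.radians(30), phi=math.radians(90)))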
In a possible implementation manner, the plurality of endpoints includes a second endpoint, and the two edges connected to the second endpoint are both visible edges, so that the measurement distance between the second endpoint and the acquisition device is positively correlated with the uncertainty of the second endpoint.
In the embodiment of the application, the uncertainty of the second endpoint is determined based on the measurement distance between the second endpoint and the acquisition device, which is beneficial to improving the accuracy of determining the uncertainty of the second endpoint.
In one possible implementation, the uncertainty d2 of the second endpoint is determined by the formula d2 = R2 × C1, where R2 represents the measured distance between the acquisition device and the second endpoint, and C1 represents a preset uncertainty in radians.
In the embodiment of the application, determining the uncertainty of the second endpoint by the formula d2 = R2 × C1 helps improve the accuracy of determining the uncertainty of the second endpoint.
In one possible implementation, the uncertainty d2x of the second endpoint on the x-axis of the coordinate system is determined by the formula d2x = Dx0 + R2 cos(θ2) × C1, where R2 represents the measured distance between the acquisition device and the second endpoint; C1 represents the detection uncertainty of the acquisition device, in radians (rad); θ2 represents the coordinate azimuth at which the acquisition device acquires the second endpoint; and Dx0 represents a preset initial uncertainty of the second endpoint in the x-axis direction.
In one possible implementation, the uncertainty d2y of the second endpoint on the y-axis of the coordinate system is determined by the formula d2y = Dy0 + R2 sin(θ2) × C1, where R2 represents the measured distance between the acquisition device and the second endpoint; C1 represents the detection uncertainty of the acquisition device, in radians (rad); θ2 represents the coordinate azimuth at which the acquisition device acquires the second endpoint; and Dy0 represents a preset initial uncertainty of the second endpoint in the y-axis direction.
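The formulas d2 = R2 × C1, d2x and d2y above can be computed directly; the sketch below does so, with the default initial uncertainties Dx0 = Dy0 = 0 chosen only for the example.

    import math

    def endpoint_uncertainty_two_visible_edges(r2, c1, theta2, dx0=0.0, dy0=0.0):
        """Uncertainty of a second-type endpoint (both connected edges visible).

        Computes d2 = R2 * C1 and the axis components
        d2x = Dx0 + R2*cos(theta2)*C1 and d2y = Dy0 + R2*sin(theta2)*C1 as stated in the text.
        """
        d2 = r2 * c1
        d2x = dx0 + r2 * math.cos(theta2) * c1
        d2y = dy0 + r2 * math.sin(theta2) * c1
        return d2, d2x, d2y

    print(endpoint_uncertainty_two_visible_edges(r2=15.0, c1=0.02, theta2=math.radians(45)))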
In one possible implementation, the determining the uncertainty of each of the plurality of endpoints based on types of two edges connected to each of the plurality of endpoints includes: and if the first reference point is not shielded by other objects, determining the uncertainty of each end point of the plurality of end points based on the types of two edges connected with each end point of the plurality of end points, wherein the first reference point is a point which is away from the first end point by a preset distance in the extension line direction of the first edge, and the other objects are objects except the target object and the acquisition equipment in the image where the point cloud cluster is located.
In the embodiment of the application, when the first reference point is not occluded by other objects, the uncertainty of each endpoint in the plurality of endpoints can be determined based on the types of the two edges connected with each endpoint in the plurality of endpoints, which is beneficial to improving the accuracy of the uncertainty of the determined endpoint.
In one possible implementation, the method further includes: and if the first reference point is blocked by other objects, determining the uncertainty of the first endpoint based on the degree of change of the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint.
In the embodiment of the application, if the first reference point is blocked by other objects, the uncertainty of the first endpoint is determined based on the horizontal field angle corresponding to the first reference point and the variation degree of the horizontal field angle corresponding to the first endpoint, which is beneficial to improving the accuracy of the uncertainty of the first endpoint.
In a possible implementation manner, if the first reference point is occluded by the other object, determining the uncertainty of the first endpoint based on the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint includes: if the first reference point is occluded by the other object, determining the uncertainty d3 of the first endpoint, based on the difference δ between the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint, by the formula d3 = R1 × (C0 + δ) × |cos(θ1 − φ)|, where R1 represents the measured distance between the acquisition device and the first endpoint; C0 is a preset value, inversely related to the acquisition accuracy of the acquisition device, in radians; θ1 represents the coordinate azimuth at which the acquisition device acquires the first endpoint; and φ represents the azimuth of the orientation of the target object.
In this embodiment of the application, if the first reference point is blocked by another object, the uncertainty of the first endpoint is determined based on the horizontal field angle corresponding to the first reference point and the change degree of the horizontal field angle corresponding to the first endpoint, and the component of the detection uncertainty of the acquisition device in the direction of the target object, which is favorable for improving the accuracy of the uncertainty of the first endpoint.
In one possible implementation, the method further includes: and if the first reference point is shielded by other objects, determining the uncertainty of the second endpoint based on the degree of change of the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint.
In the embodiment of the application, if the first reference point is blocked by other objects, the uncertainty of the second endpoint is determined based on the horizontal field angle corresponding to the first reference point and the change degree of the horizontal field angle corresponding to the first endpoint, which is beneficial to improving the accuracy of the uncertainty of the second endpoint.
In a possible implementation manner, if the first reference point is occluded by the other object, determining the uncertainty of the second endpoint based on the degree of change between the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint includes: if the first reference point is occluded by the other object, determining the uncertainty d4 of the second endpoint, based on the difference δ between the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint, by the formula d4 = R2 × (C1 + δ), where R2 represents the measured distance between the acquisition device and the second endpoint, and C1 represents a preset uncertainty in radians.
In this embodiment of the application, if the first reference point is blocked by another object, the uncertainty of the second endpoint is determined based on the horizontal field angle corresponding to the first reference point, the change degree of the horizontal field angle corresponding to the first endpoint, and the measurement distance between the acquisition device and the second endpoint, which is favorable for improving the accuracy of the uncertainty of the second endpoint.
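A sketch of the occlusion-adjusted case follows. The branch d4 = R2 × (C1 + δ) matches the formula stated above for the second endpoint; the corresponding form for the first endpoint, d3 = R1 × (C0 + δ) × |cos(θ1 − φ)|, is an assumption of this sketch.

    import math

    def occlusion_adjusted_uncertainty(r, c, delta, theta=None, phi=None):
        """Uncertainty when the reference point on the visible edge's extension is occluded.

        delta is the difference between the horizontal field angles of the reference
        point and of the endpoint. Without theta/phi the second-endpoint form
        d4 = R2 * (C1 + delta) is returned, as stated in the text; with theta/phi
        the projected first-endpoint form is returned, which is an assumed form.
        """
        if theta is None or phi is None:
            return r * (c + delta)                           # d4, as stated in the text
        return r * (c + delta) * abs(math.cos(theta - phi))  # d3, assumed form

    print(occlusion_adjusted_uncertainty(r=15.0, c=0.02, delta=0.005))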
In one possible implementation, the determining the second state of the object based on the first state of the object corresponding to each of the plurality of feature points and the uncertainty corresponding to each of the plurality of feature points includes: determining a confidence level corresponding to said each of said plurality of feature points based on an uncertainty corresponding to said each of said plurality of feature points; determining the second state of the object based on the first state of the object corresponding to the each of the plurality of feature points and the confidence corresponding to the each of the plurality of feature points.
In one possible implementation, the determining the confidence corresponding to each of the plurality of feature points based on the uncertainty corresponding to each of the plurality of feature points includes: determining, based on the uncertainty corresponding to each of the plurality of feature points, the confidence Mk corresponding to the k-th feature point of the plurality of feature points by a formula in terms of dk, δk, C3 and C4, where k denotes the k-th feature point among the plurality of feature points, k = 1, ..., n, and n is the total number of the plurality of feature points; dk represents the uncertainty of the k-th feature point; δk represents the change between the historical state of the k-th feature point and the first state; and C3 and C4 are preset values.
It should be noted that the above coordinate azimuth can be understood as the angle between the x-axis and the line connecting an endpoint of the target object and the acquisition device in the coordinate system. The azimuth of the orientation of the target object can be understood as the horizontal angle, measured clockwise, between the direction of the orientation of the target object and the x-axis.
In a second aspect, there is provided an apparatus for sensing a target object, the apparatus comprising means for performing the steps of the first aspect or any one of the possible implementations of the first aspect.
In a third aspect, a target object sensing apparatus is provided, the apparatus having the function of implementing the method designed in the first aspect. These functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more units corresponding to the above functions.
In a fourth aspect, a computing device is provided that includes an input-output interface, a processor, and a memory. The processor is configured to control the input/output interface to send and receive signals or information, the memory is configured to store a computer program, and the processor is configured to call and run the computer program from the memory, so that the computing device executes the method of the first aspect.
In a fifth aspect, there is provided a computer program product comprising: computer program code which, when run on a computer, causes the computer to perform the method of the above-mentioned aspects.
In a sixth aspect, a computer-readable medium is provided, which stores program code, which, when run on a computer, causes the computer to perform the method of the above-mentioned aspects.
In a seventh aspect, a chip system is provided, which comprises a processor for a computing device to implement the functions referred to in the above aspects, such as generating, receiving, sending, or processing data and/or information referred to in the above methods. In one possible design, the system-on-chip further includes a memory for storing program instructions and data necessary for the computing device. The chip system may be formed by a chip, or may include a chip and other discrete devices.
In an eighth aspect, a vehicle is provided that includes an input-output interface, a processor, and a memory. The processor is configured to control the input/output interface to send and receive signals or information, the memory is configured to store a computer program, and the processor is configured to call and run the computer program from the memory, so that the computing device executes the method of the first aspect.
Alternatively, the vehicle may have an automatic driving function.
Drawings
Fig. 1 is a functional block diagram of a vehicle 100 provided in an embodiment of the present application.
Fig. 2 is a schematic diagram of an applicable automatic driving system according to an embodiment of the present application.
Fig. 3 is a schematic flowchart of a target object sensing method according to an embodiment of the present application.
Fig. 4 is a schematic diagram of a point cloud cluster corresponding to a target object in an embodiment of the present application.
Fig. 5 is a schematic diagram of a positional relationship between the target object 400 and the acquisition apparatus 500 in the coordinate system according to the embodiment of the present application.
Fig. 6 is a schematic diagram of a positional relationship between the target object 400 and the acquisition apparatus 500 in a coordinate system according to another embodiment of the present application.
Fig. 7 is a schematic diagram of an environment map according to an embodiment of the present application.
Fig. 8 is a schematic view of a sensing device for a target object according to an embodiment of the present application.
FIG. 9 is a schematic block diagram of a computing device of another embodiment of the present application.
Detailed Description
The technical solution in the present application will be described below with reference to the accompanying drawings. For convenience of understanding, a scene of intelligent driving is taken as an example in conjunction with fig. 1, and a scene to which the embodiment of the present application is applied is described below.
Fig. 1 is a functional block diagram of a vehicle 100 provided in an embodiment of the present application. In one embodiment, the vehicle 100 is configured in a fully or partially autonomous driving mode. For example, the vehicle 100 may control itself while in the autonomous driving mode, and may determine a current state of the vehicle and its surroundings by human operation, determine a possible behavior of at least one other vehicle in the surroundings, and determine a confidence level corresponding to a likelihood that the other vehicle performs the possible behavior, controlling the vehicle 100 based on the determined information. While the vehicle 100 is in the autonomous driving mode, the vehicle 100 may be placed into operation without human interaction.
The vehicle 100 may include various subsystems such as a travel system 102, a sensor system 104, a control system 106, one or more peripherals 108, as well as a power supply 110, a computer system 112, and a user interface 116. Alternatively, vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. In addition, each of the sub-systems and elements of the vehicle 100 may be interconnected by wire or wirelessly.
The travel system 102 may include components that provide powered motion to the vehicle 100. In one embodiment, the travel system 102 may include an engine 118, an energy source 119, a transmission 120, and wheels/tires 121. The engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a hybrid engine of a gasoline engine and an electric motor, or a hybrid engine of an internal combustion engine and an air compression engine. The engine 118 converts the energy source 119 into mechanical energy.
Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source 119 may also provide energy to other systems of the vehicle 100.
The transmission 120 may transmit mechanical power from the engine 118 to the wheels 121. The transmission 120 may include a gearbox, a differential, and a drive shaft. In one embodiment, the transmission 120 may also include other devices, such as a clutch. Wherein the drive shaft may comprise one or more shafts that may be coupled to one or more wheels 121.
The sensor system 104 (also referred to as a "collection device") may include a number of sensors that sense information about the environment surrounding the vehicle 100. For example, the sensor system 104 may include a positioning system 122 (which may be a Global Positioning System (GPS) system, a Beidou system, or other positioning system), an Inertial Measurement Unit (IMU) 124, a radar 126, a laser rangefinder 128, and a camera 130. The sensor system 104 may also include sensors of internal systems of the monitored vehicle 100 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors may be used to detect the object and its corresponding characteristics (position, shape, orientation, velocity, etc.). Such detection and identification is a critical function of the safe operation of the autonomous vehicle 100.
The positioning system 122 may be used to estimate the geographic location of the vehicle 100. The IMU 124 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration. In one embodiment, IMU 124 may be a combination of an accelerometer and a gyroscope.
The radar 126 may utilize radio signals to sense objects within the surrounding environment of the vehicle 100. In some embodiments, in addition to sensing the target object, the radar 126 may be used to sense one or more of a speed, a position, and a heading of the target object.
The laser rangefinder 128 may utilize laser light to sense objects in the environment in which the vehicle 100 is located. In some embodiments, the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
The camera 130 may be used to capture multiple images of the surrounding environment of the vehicle 100. The camera 130 may be a still camera or a video camera.
The control system 106 is for controlling the operation of the vehicle 100 and its components. Control system 106 may include various elements including a steering system 132, a throttle 134, a braking unit 136, a computer vision system 140, a route control system 142, and an obstacle avoidance system 144.
The steering system 132 is operable to adjust the heading of the vehicle 100. For example, in one embodiment, it may be a steering wheel system.
The throttle 134 is used to control the operating speed of the engine 118 and thus the speed of the vehicle 100.
The brake unit 136 is used to control the deceleration of the vehicle 100. The brake unit 136 may use friction to slow the wheel 121. In other embodiments, the brake unit 136 may convert the kinetic energy of the wheel 121 into an electric current. The brake unit 136 may take other forms to slow the rotational speed of the wheels 121 to control the speed of the vehicle 100.
The computer vision system 140 may be operable to process and analyze images captured by the camera 130 to identify objects and/or features in the environment surrounding the vehicle 100. The objects and/or features may include traffic signals, road boundaries, and obstacles. The computer vision system 140 may use object recognition algorithms, Structure From Motion (SFM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 140 may be used to map an environment, track objects, estimate the speed of objects, and so forth.
The route control system 142 is used to determine a travel route of the vehicle 100. In some embodiments, the route control system 142 may combine data from the sensors, the GPS 122, and one or more predetermined maps to determine a travel route for the vehicle 100.
Obstacle avoidance system 144 is used to identify, assess, and avoid or otherwise negotiate potential obstacles in the environment of vehicle 100.
Of course, in one example, the control system 106 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
Vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through peripherals 108. The peripheral devices 108 may include a wireless communication system 146, an in-vehicle computer 148, a microphone 150, and/or speakers 152.
In some embodiments, the peripheral devices 108 provide a means for a user of the vehicle 100 to interact with the user interface 116. For example, the onboard computer 148 may provide information to a user of the vehicle 100. The user interface 116 may also operate the in-vehicle computer 148 to receive user input. The in-vehicle computer 148 may be operated via a touch screen. In other cases, the peripheral devices 108 may provide a means for the vehicle 100 to communicate with other devices located within the vehicle. For example, the microphone 150 may receive audio (e.g., voice commands or other audio input) from a user of the vehicle 100. Similarly, the speaker 152 may output audio to a user of the vehicle 100.
The wireless communication system 146 may communicate wirelessly with one or more devices, either directly or via a communication network. For example, the wireless communication system 146 may use 3G cellular communication such as Code Division Multiple Access (CDMA) or Global System for Mobile Communications (GSM)/GPRS, fourth generation (4G) communication such as LTE, or fifth generation (5G) communication. The wireless communication system 146 may communicate with a Wireless Local Area Network (WLAN) using WiFi. In some embodiments, the wireless communication system 146 may communicate directly with a device using an infrared link, Bluetooth, or ZigBee. Other wireless protocols may also be used, such as various vehicle communication systems; for example, the wireless communication system 146 may include one or more Dedicated Short Range Communications (DSRC) devices, which may include public and/or private data communications between vehicles and/or roadside stations.
The power supply 110 may provide power to various components of the vehicle 100. In one embodiment, power source 110 may be a rechargeable lithium ion or lead acid battery. One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 100. In some embodiments, the power source 110 and the energy source 119 may be implemented together, such as in some all-electric vehicles.
Some or all of the functionality of the vehicle 100 is controlled by the computer system 112. The computer system 112 may include at least one processor 113, the processor 113 executing instructions 115 stored in a non-transitory computer readable medium, such as data storage 114. The computer system 112 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
The processor 113 may be any conventional processor, such as a commercially available Central Processing Unit (CPU). Alternatively, the processor may be a dedicated device such as an Application Specific Integrated Circuit (ASIC) or other hardware-based processor. Although fig. 1 functionally illustrates processors, memories, and other elements of the computer 110 in the same blocks, those of ordinary skill in the art will appreciate that the processors, computers, or memories may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing. For example, the memory may be a hard disk drive or other storage medium located in a different housing than the computer 110. Thus, references to a processor or computer are to be understood as including references to a collection of processors or computers or memories which may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the retarding component, may each have their own processor that performs only computations related to the component-specific functions.
In various aspects described herein, the processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle and others are executed by a remote processor, including taking the steps necessary to perform a single maneuver.
In some embodiments, the memory 114 may include instructions 115 (e.g., program logic), and the instructions 115 may be executed by the processor 113 to perform various functions of the vehicle 100, including those described above. The memory 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 102, the sensor system 104, the control system 106, and the peripheral devices 108.
In addition to instructions 115, memory 114 may also store data such as road maps, route information, the location, direction, speed of the vehicle, and other such vehicle data, among other information. Such information may be used by the vehicle 100 and the computer system 112 during operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
In some embodiments, the processor 113 may further execute the planning scheme for vehicle longitudinal motion parameters according to the embodiment of the present application to help the vehicle plan the longitudinal motion parameters, where the specific longitudinal motion parameter planning method may refer to the description of fig. 3 below, and for brevity, details are not described herein again.
A user interface 116 for providing information to and receiving information from a user of the vehicle 100. Optionally, the user interface 116 may include one or more input/output devices within the collection of peripheral devices 108, such as a wireless communication system 146, an in-vehicle computer 148, a microphone 150, and a speaker 152.
The computer system 112 may control the functions of the vehicle 100 based on inputs received from various subsystems (e.g., the travel system 102, the sensor system 104, and the control system 106) and from the user interface 116. For example, the computer system 112 may utilize input from the control system 106 in order to control the steering unit 132 to avoid obstacles detected by the sensor system 104 and the obstacle avoidance system 144. In some embodiments, the computer system 112 is operable to provide control over many aspects of the vehicle 100 and its subsystems.
Alternatively, one or more of these components described above may be mounted or associated separately from the vehicle 100. For example, the memory 114 may exist partially or completely separate from the vehicle 100. The above components may be communicatively coupled together in a wired and/or wireless manner.
Optionally, the above components are only an example, in an actual application, components in the above modules may be added or deleted according to an actual need, and fig. 1 should not be construed as limiting the embodiment of the present invention.
An autonomous automobile traveling on a roadway, such as vehicle 100 above, may identify objects within its surrounding environment to determine an adjustment to the current speed. The object may be another vehicle, a traffic control device, or another type of object. In some examples, each identified object may be considered independently, and the respective characteristics of the object, such as its current speed, acceleration, and separation from the vehicle, may be used to determine the speed to which the autonomous vehicle is to be adjusted.
Optionally, the autonomous vehicle 100 or a computing device associated with the autonomous vehicle 100 (e.g., the computer system 112, the computer vision system 140, the memory 114 of fig. 1) may predict behavior of the identified objects based on characteristics of the identified objects and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.). Optionally, each identified object depends on the behavior of each other, so it is also possible to predict the behavior of a single identified object taking all identified objects together into account. The vehicle 100 is able to adjust its speed based on the predicted behaviour of said identified object. In other words, the autonomous vehicle is able to determine that the vehicle will need to adjust to a steady state (e.g., accelerate, decelerate, or stop) based on the predicted behavior of the object. In this process, other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 in the road on which it is traveling, the curvature of the road, the proximity of static and dynamic objects, and so forth.
In addition to providing instructions to adjust the speed of the autonomous vehicle, the computing device may also provide instructions to modify the steering angle of the vehicle 100 to cause the autonomous vehicle to follow a given trajectory and/or to maintain a safe lateral and longitudinal distance from objects in the vicinity of the autonomous vehicle (e.g., cars in adjacent lanes on the road).
The vehicle 100 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, an amusement car, a playground vehicle, construction equipment, a trolley, a golf cart, a train, a trolley, etc., and the embodiment of the present invention is not particularly limited.
A scenario in which the embodiment of the present application is applied is described above with reference to fig. 1, and an automatic driving system in which the embodiment of the present application is applied is described below with reference to fig. 2.
FIG. 2 is a schematic diagram of an autopilot system suitable for use with an embodiment of the application, where computer system 101 includes a processor 103, and where processor 103 is coupled to a system bus 105. Processor 103 may be one or more processors, each of which may include one or more processor cores. A display adapter (video adapter)107, which may drive a display 109, the display 109 coupled with system bus 105. System bus 105 is coupled via a bus bridge 111 to an input/output (I/O) bus 113. The I/O interface 115 is coupled to an I/O bus. The I/O interface 115 communicates with various I/O devices, such as an input device 117 (e.g., keyboard, mouse, touch screen, etc.), a multimedia disk (media tray)121 (e.g., CD-ROM, multimedia interface, etc.). A transceiver 123 (which can send and/or receive radio communication signals), a camera 155 (which can capture scenic and motion digital video images), and an external USB interface 125. Wherein, optionally, the interface connected with the I/O interface 115 may be a USB interface.
The processor 103 may be any conventional processor, including a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, or a combination thereof. Alternatively, the processor may be a dedicated device such as an application specific integrated circuit, ASIC. Alternatively, the processor 103 may be a neural network processor or a combination of a neural network processor and a conventional processor as described above.
Optionally, in various embodiments described herein, computer system 101 may be located remotely from the autonomous vehicle and may communicate wirelessly with the autonomous vehicle. In other aspects, some processes described herein are performed on a processor disposed within an autonomous vehicle, others being performed by a remote processor, including taking the actions required to perform a single maneuver.
Computer 101 may communicate with software deploying server 149 via network interface 129. The network interface 129 is a hardware network interface, such as a network card. The Network 127 may be an external Network, such as the internet, or an internal Network, such as an ethernet or Virtual Private Network (VPN). Optionally, the network 127 may also be a wireless network, such as a Wi-Fi network, a cellular network, or the like.
The hard drive interface is coupled to system bus 105. The hardware drive interface is connected with the hard disk drive. System memory 135 is coupled to system bus 105. Data running in system memory 135 may include the operating system 137 and application programs 143 of computer 101.
The operating system includes a shell 139 and a kernel 141. The shell 139 is an interface between the user and the kernel of the operating system. The shell 139 is the outermost layer of the operating system. The shell 139 manages the interaction between the user and the operating system: it waits for user input, interprets the user input to the operating system, and processes the output results of the operating system.
Kernel 141 is comprised of those portions of the operating system that are used to manage memory, files, peripherals, and system resources. Interacting directly with the hardware, the operating system kernel typically runs processes and provides inter-process communication, CPU slot management, interrupts, memory management, IO management, and the like.
The application programs 143 include programs related to controlling the automatic driving of a vehicle, such as programs for managing the interaction of an automatically driven vehicle with obstacles on the road, programs for controlling the route or speed of an automatically driven vehicle, and programs for controlling the interaction of an automatically driven vehicle with other automatically driven vehicles on the road. The application program 143 also exists on the system of the software deploying server (deploying server) 149. In one embodiment, computer system 101 may download application program 143 from software deploying server (deploying server)149 when application program 147 needs to be executed.
In some embodiments, the application programs may further include an application program corresponding to a sensing scheme for the target object provided in the embodiments of the present application, where the sensing scheme for the target object in the embodiments of the present application will be specifically described below, and is not described herein again for brevity.
Sensor 153 (also referred to as a "collection device") is associated with computer system 101. The sensor 153 is used to detect the environment surrounding the computer 101. For example, the sensor 153 may detect an object, such as an animal, a car, an obstacle, etc., and further the sensor may detect the surrounding environment of the object, such as: the environment surrounding the animal, other animals present around the animal, weather conditions, brightness of the surrounding environment, etc. Alternatively, if the computer 101 is located on an autonomous automobile, the sensor may be a laser radar, a camera, an infrared sensor, a chemical detector, a microphone, or the like.
In a conventional target object sensing scheme, point cloud data including a target object is processed to obtain a point cloud cluster representing the target object, a geometric center or a gravity center of the point cloud cluster is determined, and then the position and the speed of the target object are calculated based on the position and the speed of the geometric center or the gravity center of the point cloud cluster to sense the target object. However, in the above-mentioned solution for calculating the position and velocity of the target object based on the position and velocity of the geometric center or the center of gravity of the point cloud cluster, if the geometric center or the center of gravity of the point cloud cluster is blocked, the accuracy of the calculated position and velocity of the target object may be reduced.
In order to avoid the above problem, the present application provides a method for sensing an object, which calculates a first state (e.g., a first position and/or a first velocity) of the object corresponding to each feature point of a plurality of feature points based on states (e.g., positions and/or velocities) of the feature points of a point cloud cluster, and then fuses the first state of the object corresponding to each feature point based on uncertainty of the feature points to finally obtain a second state (e.g., a second position and/or a second velocity) of the object.
The method of the embodiments of the present application is described below in conjunction with fig. 3. It should be understood that the method shown in fig. 3 may be performed by the autonomous driving system shown in fig. 2, or may also be performed by the control system 106 in the vehicle 100, and optionally, the second state of the target may also be sent to the obstacle avoidance system 144 to plan a driving route of the vehicle 100, and so on.
Fig. 3 is a schematic flowchart of a target object sensing method according to an embodiment of the present application. The method shown in fig. 3 includes steps 310 through 340.
310: Obtain a plurality of feature points of a point cloud cluster, where the point cloud cluster represents the target object, or represents part or all of the contour or outline of the target object.
In general, in order to facilitate the acquisition of the feature points, the feature points may be contour points of a point cloud cluster, for example, the feature points are end points of the point cloud cluster. Of course, the plurality of feature points may further include a geometric center, a center of gravity, and the like of the point cloud cluster, which is not limited in the embodiment of the present application.
The above endpoints are also referred to as "points of interest" and generally refer to the intersection of two adjacent edges in a point cloud cluster. Currently, the above-mentioned multiple endpoints can be obtained by using the existing endpoint detection technology.
Optionally, the point cloud cluster may be obtained based on an existing point cloud cluster acquisition scheme. For example, when a lidar sensor is used as the acquisition device, the lidar sensor transmits and receives laser signals, the detection distance corresponding to each transmission angle is determined from the time difference between transmission and reception, and a three-dimensional point cloud of the spatial environment is obtained by multilayer scanning. The obtained point cloud is then converted by the lidar driver into the format required for target object sensing, and the point cloud data is continuously sent to the controller. Correspondingly, the controller can cluster the point cloud data and filter out clusters that do not match the target characteristics according to the number of points in each cluster, the cluster size, and so on; the remaining cluster is the point cloud cluster corresponding to the target object. Common clustering methods include density-based spatial clustering of applications with noise (DBSCAN), the k-nearest neighbors method (KNN), and the like.
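For illustration, the clustering and filtering step could be prototyped as below; scikit-learn's DBSCAN is used as a stand-in for whichever clustering implementation the controller runs, and the eps, min_samples and minimum-size values are assumptions, not parameters specified by this disclosure.

    import numpy as np
    from sklearn.cluster import DBSCAN

    def extract_target_clusters(points, eps=0.5, min_samples=10, min_points=30):
        """Cluster a 3D point cloud and keep clusters large enough to represent a target.

        points: (N, 3) array of lidar points. eps, min_samples and min_points are
        illustrative values only.
        """
        labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
        clusters = []
        for label in set(labels) - {-1}:        # label -1 marks noise points
            cluster = points[labels == label]
            if len(cluster) >= min_points:      # discard clusters too small to be a target
                clusters.append(cluster)
        return clusters

    rng = np.random.default_rng(0)
    frame = np.vstack([rng.normal(loc=(5.0, 2.0, 0.0), scale=0.1, size=(60, 3)),
                       rng.normal(loc=(0.0, 0.0, 0.0), scale=0.1, size=(40, 3))])
    print(len(extract_target_clusters(frame)))  # two toy clusters expected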
After the point cloud cluster of the target object is obtained, the controller may further extract the orientation and the shape of the point cloud cluster by using an L-Shape feature extraction algorithm or a trained neural network model. Of course, the orientation of the point cloud cluster may also be determined from the historical track or historical motion direction of the target object, and the shape then calculated by traversing the points in the point cloud cluster according to that orientation.
In general, the point cloud cluster may represent the target object by a fitted bounding box, i.e., the target object is framed by a fitted rectangle. Fig. 4 is a schematic diagram of a point cloud cluster corresponding to a target object in an embodiment of the present application. As can be seen from Fig. 4, the outline of the point cloud cluster corresponding to the target object 400 is a rectangle with length l and width w. The rectangle includes four endpoints, namely endpoint 0, endpoint 1, endpoint 2 and endpoint 3; the coordinates of its geometric center are (x, y) and the azimuth of its orientation is φ, where this azimuth is the angle between the orientation of the point cloud cluster and the x-axis of the coordinate system, and the origin of the coordinate system is the geometric center of the lidar. The coordinates of endpoint 0, endpoint 1, endpoint 2 and endpoint 3 are each determined from the center (x, y), the length l, the width w and the azimuth φ.
It should be noted that the length and width of the point cloud cluster may be obtained through statistics of multiple frame point cloud images, and of course, the length and width of the point cloud cluster may also be determined based on the acquisition position of the endpoint, which is not limited in this embodiment of the present application.
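The corner coordinates of such a fitted rectangle follow from the geometric center (x, y), the length l, the width w and the orientation azimuth φ. The sketch below computes the four corners; which corner corresponds to endpoint 0, 1, 2 or 3 of Fig. 4 depends on the scanning geometry and is left open here as an assumption.

    import math

    def bounding_box_corners(x, y, length, width, phi):
        """Return the four corners of an oriented bounding box centered at (x, y).

        phi is the azimuth of the box orientation. The order of the returned
        corners is not tied to the endpoint numbering of Fig. 4.
        """
        hl, hw = length / 2.0, width / 2.0
        cos_p, sin_p = math.cos(phi), math.sin(phi)
        corners = []
        for sl, sw in ((+1, +1), (+1, -1), (-1, -1), (-1, +1)):
            cx = x + sl * hl * cos_p - sw * hw * sin_p
            cy = y + sl * hl * sin_p + sw * hw * cos_p
            corners.append((cx, cy))
        return corners

    print(bounding_box_corners(x=10.0, y=3.0, length=4.5, width=1.8, phi=math.radians(30)))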
320: Determine the acquisition position of each of the plurality of feature points and the uncertainty of each feature point, where the uncertainty indicates the error generated when the position of each feature point in the point cloud cluster is acquired by the acquisition device.
Usually, when the acquisition device acquires the target object, the inherent error of the acquisition device itself, and whether the position of a feature point in the point cloud cluster can be directly observed by the acquisition device during acquisition, both affect the uncertainty of that feature point. Therefore, in this application, the uncertainty of an endpoint can be set based on whether the edges connected to the endpoint can be directly observed by the acquisition device.
That is, the step 320 includes: determining the type of an edge connected with each endpoint in a plurality of endpoints, wherein the type of the edge comprises a visible edge directly observed by the acquisition equipment and an invisible edge which cannot be directly observed by the acquisition equipment; an uncertainty is determined for each of the plurality of endpoints based on a type of the two edges connected to each of the plurality of endpoints.
The plurality of end points can be generally divided into three types, namely, end points of a first type, wherein two edges connected with the end points of the first type are a visible edge and an invisible edge, for example, the end points 0 and 2 shown in fig. 4; a second type of endpoint, both edges connected to the type of endpoint being visible edges, e.g., endpoint 1 shown in FIG. 4; and a third type of endpoint, where both edges connected to the type of endpoint are invisible edges, e.g., endpoint 3 shown in fig. 4. The following describes a method of calculating uncertainty based on three different types of endpoints, respectively. For ease of distinction, hereinafter a first endpoint belongs to a first type of endpoint, a second endpoint belongs to a second type of endpoint, and a third endpoint belongs to a third type of endpoint.
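The three endpoint types can be distinguished simply by counting visible edges, as in the sketch below (the integer type codes are illustrative labels, not identifiers used by this disclosure).

    def classify_endpoint(edge_a_visible, edge_b_visible):
        """Classify an endpoint by the visibility of its two connected edges.

        Returns 1 for one visible and one invisible edge (first type),
                2 for two visible edges (second type),
                3 for two invisible edges (third type).
        """
        visible_count = int(edge_a_visible) + int(edge_b_visible)
        return {2: 2, 1: 1, 0: 3}[visible_count]

    # Endpoints 0 and 2 of Fig. 4 have one visible edge each (type 1),
    # endpoint 1 has two visible edges (type 2), endpoint 3 has none (type 3).
    print(classify_endpoint(True, False), classify_endpoint(True, True), classify_endpoint(False, False))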
For a first endpoint, the type of the first edge connected to the first endpoint is a visible edge, the type of the second edge connected to the first endpoint is an invisible edge, and the actual position of the first endpoint is usually located on the extension of the visible edge (i.e., the first edge), and therefore the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the acquisition device in the orientation of the target object.
Determining the uncertainty of the first endpoint based on the component of the detection uncertainty of the acquisition device in the orientation of the target object may be understood as meaning that this component, together with other factors, influences the uncertainty of the first endpoint; it may also be understood as meaning that this component is used directly as the uncertainty of the first endpoint. This is not limited in the embodiments of the present application.
Alternatively, the inherent uncertainty of the acquisition device may be projected onto the orientation of the target object, and the uncertainty resulting from the projection may be used as the uncertainty of the first endpoint. That is, the uncertainty d1 of the first endpoint is determined as a function of R1, θ1, φ and C0, where R1 represents the measured distance between the acquisition device and the first endpoint; θ1 represents the coordinate azimuth angle when the acquisition device acquires the first endpoint; φ represents the azimuth angle of the orientation of the target object; and C0 is a preset value that is inversely related to the acquisition precision of the acquisition device, in units of radians (rad).
It should be noted that the above coordinate azimuth angle can be understood as the included angle between the x-axis and the line connecting the first endpoint of the target object with the acquisition device in the coordinate system. The azimuth angle of the orientation of the target object can be understood as the horizontal angle, measured in the clockwise direction, from the direction in which the target object is oriented to the x-axis.
Optionally, if the acquisition device is a laser radar, C0 indicates the uncertainty of the laser scanning; it can be set according to the scanning resolution of the laser and is proportional to that scanning resolution. For example, when the scanning resolution of the laser is 0.2°, C0 may be set to 0.01.
In general, since the position of the target object is subsequently represented by its position in the coordinate system, for convenience of subsequent calculation the uncertainty of the first endpoint can be projected onto the x-axis and the y-axis. Because d1 lies along the orientation φ of the target object, the uncertainty D1x of the first endpoint on the x-axis is D1x = Dx0 + d1·|cos φ|, and the uncertainty D1y of the first endpoint on the y-axis is D1y = Dy0 + d1·|sin φ|, where Dx0 represents the initial uncertainty in the x-axis direction and Dy0 represents the initial uncertainty in the y-axis direction. In addition, Dx0 and Dy0 are related to the preset value C0 and/or the scanning accuracy of the acquisition device.
The calculation scheme of the uncertainty of the first endpoint is described below with reference to fig. 5, taking endpoint 0 as the first endpoint. Fig. 5 is a schematic diagram of a positional relationship between the target object 400 and the acquisition apparatus 500 in the coordinate system according to the embodiment of the present application.
Assume that the coordinate system shown in fig. 5 takes the geometric center of the sensor as the origin of coordinates, and the positive directions of the x-axis and the y-axis are as shown in the figure. The coordinate azimuth angle at which the acquisition device 500 acquires endpoint 0 is θ1', the inherent uncertainty of the acquisition device 500 is C0' (rad), and the measured distance between the acquisition device 500 and endpoint 0 is R1'. Since the visible edge connected to endpoint 0 is parallel to the orientation of the target object 400, the azimuth angle φ' of the orientation of the target object 400 is equal to the angle between the visible edge and the x-axis, and the line of measured length R1' from the acquisition device 500 to endpoint 0 makes an angle of |θ1' − φ'| with the orientation of the target object. Projecting the inherent uncertainty C0' of the acquisition device 500 onto the orientation of the target object gives the uncertainty d1' of endpoint 0. The component of d1' on the x-axis is d1x' = Dx0' + d1'·|cos φ'|, and the component of d1' on the y-axis is d1y' = Dy0' + d1'·|sin φ'|, where Dx0' represents the initial uncertainty in the x-axis direction and Dy0' represents the initial uncertainty in the y-axis direction.
It should be noted that endpoint 2 in fig. 5 also belongs to the first type of endpoint, and its uncertainty may be calculated in the same manner as that of the first endpoint, which is not described herein again for brevity.
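Because the published expression for d1 is reproduced only as an image, the following sketch shows one plausible reading of the projection step: the lateral spread R1·C0 is carried onto the orientation of the target object and then decomposed along the x- and y-axes as described above. The 1/|sin(θ1 − φ)| form and the |cos φ| / |sin φ| decomposition are assumptions, not the patented formula.

```python
import math

def first_endpoint_uncertainty(r1, theta1, phi, c0, dx0=0.0, dy0=0.0, eps=1e-3):
    """Type-1 endpoint: one visible edge, one invisible edge.
    Assumed reading: the lateral spread r1 * c0 (range times angular
    uncertainty) is carried onto the orientation phi of the target object."""
    d1 = r1 * c0 / max(abs(math.sin(theta1 - phi)), eps)  # assumed projection
    d1x = dx0 + d1 * abs(math.cos(phi))  # assumed decomposition along x
    d1y = dy0 + d1 * abs(math.sin(phi))  # assumed decomposition along y
    return d1, d1x, d1y
```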
For the second endpoint, since the types of both edges connected to the second endpoint are visible edges, the factor affecting the uncertainty of the second endpoint is typically the measured distance between the second endpoint and the acquisition device, where that measured distance is positively correlated with the uncertainty of the second endpoint. Thus, the uncertainty of the second endpoint may be determined based on the measured distance between the second endpoint and the acquisition device in the coordinate system.
Optionally, the uncertainty d2 of the second endpoint is determined by the formula d2 = R2 × C1, where R2 represents the measured distance between the acquisition device and the second endpoint, and C1 represents a preset uncertainty in radians.
Alternatively, C1 may be set in proportion to the horizontal opening angle at which the target object is observed. For example, when the horizontal opening angle of the observed target object is 10°, C1 may be set to 0.17. Of course, this preset uncertainty may also be set to the same value as the preset value C0, which is not limited in this embodiment of the application.
The calculation scheme of the uncertainty of the second endpoint is described below with the endpoint 1 as the second endpoint in conjunction with fig. 6. Fig. 6 is a schematic diagram of a position relationship between the target object 400 and the acquisition device 500 in a coordinate system according to another embodiment of the present application.
Assume that the coordinate system shown in fig. 6 takes the geometric center of the sensor as the origin of coordinates, the positive directions of the x-axis and the y-axis are as shown in the figure, and the counterclockwise direction is the positive direction. The coordinate azimuth angle at which the acquisition device 500 acquires endpoint 1 is θ2', and the preset uncertainty of the acquisition device 500 is C1' (rad). The measured distance between the acquisition device 500 and endpoint 1 is R2', so the uncertainty d2' of endpoint 1 is d2' = R2' × C1'.
Accordingly, the component d2x' of the uncertainty d2' of endpoint 1 on the x-axis is d2x' = Dx0' + d2'·cos(|θ2'|), and the component d2y' of d2' on the y-axis is d2y' = Dy0' + d2'·sin(|θ2'|), where Dx0' represents the initial uncertainty in the x-axis direction and Dy0' represents the initial uncertainty in the y-axis direction.
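The formulas for the second endpoint are stated explicitly above, so they translate directly into a short sketch (variable names are illustrative only).

```python
import math

def second_endpoint_uncertainty(r2, theta2, c1, dx0=0.0, dy0=0.0):
    """Type-2 endpoint (both edges visible): d2 = R2 * C1, then
    d2x = Dx0 + d2*cos(|theta2|) and d2y = Dy0 + d2*sin(|theta2|)."""
    d2 = r2 * c1
    d2x = dx0 + d2 * math.cos(abs(theta2))
    d2y = dy0 + d2 * math.sin(abs(theta2))
    return d2, d2x, d2y

# e.g. endpoint 1 at 20 m, coordinate azimuth 15 degrees, C1 = 0.17 rad
print(second_endpoint_uncertainty(20.0, math.radians(15), 0.17))
```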
For the third endpoint, since the types of the two edges connected to the third endpoint are invisible edges, the uncertainty of the third endpoint may be set to be greater than the uncertainty of the first endpoint and the uncertainty of the second endpoint, and may be set to infinity, for example. Accordingly, the components of the uncertainty of the third end point in the x-direction and the y-direction of the coordinate system may be set to be also larger than the components of the uncertainty of the first end point in the x-direction and the y-direction of the coordinate system and the components of the uncertainty of the second end point in the x-direction and the y-direction of the coordinate system.
In some cases, the point cloud cluster of the target object may represent only a partial outline or contour of the target object, and an obtained endpoint may not be the actual endpoint of the target object because the position of the actual endpoint is blocked by other objects. In this case, in order to improve the accuracy of determining the uncertainty of the endpoint, the present application further provides a method for calculating the uncertainty of the endpoint, which is described below with reference to fig. 7. It should be understood that, in the case where the target object is occluded by another object, the calculation may also be performed directly according to the above-described calculation manners for the uncertainty of the first endpoint, the second endpoint, and the third endpoint, which is not limited in this embodiment of the application.
Whether the target object is shielded from the acquisition device by other objects can be judged by generating an environment map. Specifically, the surroundings are scanned by the acquisition device to obtain an environment map containing the target object, and the environment map is segmented according to a preset angle (for example, the azimuth angle of the acquisition device). The features of each segmented space include the azimuth angle corresponding to that space, the measured distance of the object closest to the acquisition device at that azimuth angle, and the number of that object. The measured distance and the number of the closest object corresponding to each azimuth angle can be obtained as follows: calculate the minimum circumscribed convex polygon of each object according to the point cloud cluster of that object in the environment map, and traverse the circumscribed convex polygons of all objects in the environment map to obtain, for each azimuth angle in the environment map, the measured distance of the object closest to the acquisition device and the number of that closest object.
Then, a reference point corresponding to each endpoint of the current target object is determined based on the historical data of the target object, the reference point corresponding to each endpoint is marked in the environment map, and whether the reference point corresponding to an endpoint of the target object is blocked is determined by combining, for each azimuth angle in the environment map, the measured distance of the object closest to the acquisition device and the number of that closest object. When the measured distance from the acquisition device to the reference point corresponding to a certain endpoint is equal to the measured distance of the object closest to the acquisition device at the azimuth angle corresponding to that endpoint, no other object shields that endpoint from the acquisition device. When the measured distance from the acquisition device to the reference point corresponding to a certain endpoint is greater than the measured distance of the object closest to the acquisition device at the azimuth angle corresponding to that endpoint, another object shields that endpoint from the acquisition device.
The historical data of the target object may be characteristics of the target object obtained in scans performed before the current point cloud cluster of the target object is acquired, such as parameters of the length and width of the target object and the coordinates of each endpoint of the target object.
For example, fig. 7 is a schematic diagram of an environment map according to an embodiment of the present application. In the environment map 700 shown in fig. 7, the acquisition device 500 scans the surroundings according to a preset azimuth angle to obtain the environment map containing the target object 400, and segments the environment map according to a preset angle (for example, the azimuth angle of the acquisition device); the features of each segmented space include the azimuth angle corresponding to that space, the measured distance of the object closest to the acquisition device at that azimuth angle, and the number of that object. The measured distance and the number of the closest object corresponding to each azimuth angle can be obtained as follows: calculate the minimum circumscribed convex polygons of the objects according to the point cloud clusters of the object 710 and the target object 400 in the environment map, and traverse the circumscribed convex polygons of all the objects in the environment map, namely the object 710 and the target object 400, to obtain, for each azimuth angle in the environment map, the measured distance of the object closest to the acquisition device and the number of that closest object.
Then, the position of reference point 1 corresponding to endpoint 0 of the current target object is determined based on the historical data of the target object, reference point 1 corresponding to endpoint 0 is marked in the environment map, the measured distance S between the acquisition device and reference point 1 is determined, and the measured distance Smin of the object closest to the acquisition device in the segmented space corresponding to azimuth angle 1 (the azimuth angle corresponding to endpoint 0), together with the number of that closest object, is determined. As can be seen from fig. 7, Smin is less than S, so the line between reference point 1 and the acquisition device is blocked by the object 710.
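A minimal sketch of the azimuth-segmented environment map and the occlusion test described above; it bins raw cluster points instead of computing the minimum circumscribed convex polygon (a simplification), and the object "number" is represented by a dictionary key.

```python
import math

def build_environment_map(objects, bin_width_rad):
    """objects: {object_number: [(x, y), ...]} point cloud clusters in the
    sensor frame. Returns {azimuth_bin: (min_range, object_number)}."""
    env = {}
    for number, points in objects.items():
        for x, y in points:
            rng = math.hypot(x, y)
            b = int(math.atan2(y, x) // bin_width_rad)
            if b not in env or rng < env[b][0]:
                env[b] = (rng, number)
    return env

def is_occluded(ref_point, target_number, env, bin_width_rad):
    """The reference point is occluded if another object is closer in the
    same azimuth bin (S_min < S in the text)."""
    s = math.hypot(*ref_point)
    b = int(math.atan2(ref_point[1], ref_point[0]) // bin_width_rad)
    if b not in env:
        return False
    s_min, blocker = env[b]
    return blocker != target_number and s_min < s
```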
After determining whether the end points of the target object are occluded by other objects, respectively, in the above manner, the uncertainty of the end points can be calculated, respectively, according to the types of the end points described above.
For the first type of endpoint, if the first reference point is occluded by another object, the uncertainty of the first endpoint is determined based on the degree of change between the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint, where the first reference point is a point at a preset distance from the first endpoint in the direction of the extension line of the first edge.
The horizontal field angle corresponding to the first endpoint can be understood as a horizontal field angle used by the acquisition device to acquire the whole target object when the first endpoint is taken as the endpoint of the target object.
The horizontal field angle corresponding to the first reference point may be understood as a horizontal field angle used by the acquisition device to acquire the entire target object when the acquisition device uses the first reference point as an end point of the target object.
Optionally, based on the difference δ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, the uncertainty d3 of the first endpoint is determined as a function of R1, C0, θ1, φ and δ, where R1 represents the measured distance between the acquisition device and the first endpoint; C0 is a preset value that is inversely related to the acquisition precision of the acquisition device, in radians; θ1 represents the coordinate azimuth angle when the acquisition device acquires the first endpoint; and φ represents the azimuth angle of the orientation of the target object.
Referring to fig. 7, the horizontal opening angle corresponding to endpoint 0 is δ0 and the horizontal opening angle corresponding to reference point 1 is δ1; the difference δ between the horizontal opening angle corresponding to reference point 1 and that corresponding to endpoint 0 is then δ = |δ1| − |δ0|.
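The horizontal opening angles δ0 and δ1 can be approximated as the angular span of the target object's boundary points as seen from the sensor; the sketch below assumes the sensor at the origin and ignores azimuth wrap-around, which is a simplification.

```python
import math

def horizontal_opening_angle(points):
    """Angular span of a set of boundary points as seen from the origin."""
    azimuths = [math.atan2(y, x) for x, y in points]
    return max(azimuths) - min(azimuths)

def opening_angle_change(other_points, endpoint, reference_point):
    """delta = |opening angle with the reference point as extreme corner|
             - |opening angle with the observed endpoint as extreme corner|."""
    d_end = horizontal_opening_angle(other_points + [endpoint])
    d_ref = horizontal_opening_angle(other_points + [reference_point])
    return abs(d_ref) - abs(d_end)
```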
As described above, to facilitate subsequent calculation, the uncertainty d3 of the first endpoint may be projected onto the x-axis and the y-axis: the uncertainty D3x of the first endpoint on the x-axis is the initial uncertainty Dx0 plus the component of d3 along the x-axis, and the uncertainty D3y of the first endpoint on the y-axis is the initial uncertainty Dy0 plus the component of d3 along the y-axis, where Dx0 represents the initial uncertainty in the x-axis direction and Dy0 represents the initial uncertainty in the y-axis direction.
Since the position of the first endpoint affects the position of the second endpoint, if the first reference point is blocked by another object, the determination of the position of the second endpoint is also affected to a certain extent. Therefore, the uncertainty of the second endpoint is positively correlated with the degree of change between the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint.
Optionally, if the first reference point is occluded by another object, the uncertainty d4 of the second endpoint is determined, based on the difference δ between the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint, by the formula d4 = D0 + R2 × (C1 + δ), where R2 represents the measured distance between the acquisition device and the second endpoint; C1 represents a preset uncertainty in radians (rad); and D0 represents a preset initial uncertainty of the second endpoint.
As described above, to facilitate subsequent calculation, the uncertainty d4 of the second endpoint may be projected onto the x-axis and the y-axis: the uncertainty D4x of the second endpoint on the x-axis is the initial uncertainty Dx0 plus the component of d4 along the x-axis, and the uncertainty D4y of the second endpoint on the y-axis is the initial uncertainty Dy0 plus the component of d4 along the y-axis, where Dx0 represents the initial uncertainty in the x-axis direction and Dy0 represents the initial uncertainty in the y-axis direction.
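The occluded-case expression for the second endpoint, d4 = D0 + R2 × (C1 + δ), maps directly to a one-line sketch; the default D0 = 0 is an assumption.

```python
def occluded_second_endpoint_uncertainty(r2, c1, delta, d0=0.0):
    """d4 = D0 + R2 * (C1 + delta): a larger change delta in the horizontal
    opening angle caused by the occlusion gives a larger uncertainty."""
    return d0 + r2 * (c1 + delta)

# e.g. R2 = 20 m, C1 = 0.17 rad, delta = 0.05 rad
print(occluded_second_endpoint_uncertainty(20.0, 0.17, 0.05))
```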
Step 330: calculating, based on the state of each feature point of the plurality of feature points, a first state of the target object corresponding to each feature point of the plurality of feature points, where the state of each feature point includes the position and/or the velocity of that feature point, and the first state includes a first velocity and/or a first position of the target object.
The first state of the object may be understood as a position or a velocity of a geometric center of the object.
For example, referring to fig. 7, assume that the position of endpoint 0 is [x_endpoint0, y_endpoint0], the azimuth angle of the orientation of the target object is φ, and the length and width are l and w respectively. The center position of the target object corresponding to endpoint 0 is then obtained by offsetting [x_endpoint0, y_endpoint0] by half the length, l/2, along the orientation φ and half the width, w/2, perpendicular to the orientation, toward the interior of the bounding box.
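A sketch of recovering the geometric center from endpoint 0: the corner is shifted by half the length along the orientation φ and half the width across it, toward the inside of the bounding box; the sign convention (which corner endpoint 0 actually is) is an assumption and must match fig. 7.

```python
import math

def center_from_corner(corner, phi, l, w, sign_l=-1, sign_w=-1):
    """Offset a corner by l/2 along the orientation phi and w/2 across it to
    reach the geometric center; sign_l / sign_w select the inward direction
    and depend on which corner of the box the endpoint is (assumed here)."""
    x0, y0 = corner
    ux, uy = math.cos(phi), math.sin(phi)
    vx, vy = -math.sin(phi), math.cos(phi)
    return (x0 + sign_l * l / 2 * ux + sign_w * w / 2 * vx,
            y0 + sign_l * l / 2 * uy + sign_w * w / 2 * vy)
```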
and 340, determining a second state of the object based on the first state of the object corresponding to each feature point in the plurality of feature points and the uncertainty corresponding to each feature point in the plurality of feature points, wherein the second state comprises a second speed and/or a second position of the object.
Optionally, step 340 includes: determining a confidence corresponding to each feature point in the plurality of feature points based on the uncertainty corresponding to each feature point in the plurality of feature points; and determining the second state of the target object based on the first state of the target object corresponding to each feature point in the plurality of feature points and the confidence degree corresponding to each feature point in the plurality of feature points.
Optionally, determining the confidence of each feature point of the plurality of feature points based on the uncertainty of each feature point of the plurality of feature points includes: determining, based on the uncertainty corresponding to each of the plurality of feature points, the confidence Mk corresponding to the k-th feature point of the plurality of feature points as a function of dk, Δk, C3 and C4, where k denotes the k-th feature point of the plurality of feature points, k = 1 … n, and n is the total number of the plurality of feature points; dk represents the uncertainty of the k-th feature point; Δk represents the change between the historical state and the first state of the k-th feature point; and C3 and C4 are preset values.
It will be appreciated that the values of C3 and C4 are typically set to adjust the relative weights of dk and Δk in the calculation of the confidence Mk. For example, both C3 and C4 may be set to 0.5.
As described above, to facilitate subsequent calculation, the confidence may be divided into a confidence in the x-axis direction and a confidence in the y-axis direction. That is, the confidence Mkx of the k-th feature point in the x-axis direction is obtained from dkx and Δkx, and the confidence Mky of the k-th feature point in the y-axis direction is obtained from dky and Δky, in the same manner as Mk, where dkx represents the uncertainty of the k-th feature point in the x-axis direction; Δkx represents the change between the historical state and the first state of the k-th feature point in the x-axis direction; dky represents the uncertainty of the k-th feature point in the y-axis direction; and Δky represents the change between the historical state and the first state of the k-th feature point in the y-axis direction.
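Since the confidence expression is reproduced only as an image, the snippet below shows one plausible inverse-weighted reading consistent with the description (a larger dk or Δk yields a lower confidence, with C3 and C4 balancing the two terms); it is an assumption, not the published formula.

```python
def confidence(d_k, delta_k, c3=0.5, c4=0.5, eps=1e-6):
    """Assumed form of the confidence of feature point k: the inverse of a
    weighted sum of its uncertainty d_k and its state change delta_k."""
    return 1.0 / (c3 * d_k + c4 * delta_k + eps)

# per-axis confidences use the per-axis quantities, e.g. confidence(d_kx, delta_kx)
```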
Alternatively, the above step 340 may be expressed by the following formulas. Assume that the position of the target object in the second state is [X, Y] and the velocity of the target object is [VX, VY]. Then [X, Y] and [VX, VY] are obtained as confidence-weighted averages of the first states corresponding to the n feature points: X = Σk(Mkx · x_center,k) / T, Y = Σk(Mky · y_center,k) / T, VX = Σk(Mkx · v_x,k) / T, and VY = Σk(Mky · v_y,k) / T, where k denotes the k-th feature point of the plurality of feature points, k = 1 … n, and n is the total number of the plurality of feature points; Mkx represents the confidence of the k-th feature point in the x-axis direction; Mky represents the confidence of the k-th feature point in the y-axis direction; T represents the sum of the confidences of the n feature points; x_center,k represents the x coordinate of the geometric center of the target object corresponding to the k-th feature point; y_center,k represents the y coordinate of the geometric center of the target object corresponding to the k-th feature point; v_x,k represents the velocity component on the x-axis of the geometric center of the target object corresponding to the k-th feature point; and v_y,k represents the velocity component on the y-axis of the geometric center of the target object corresponding to the k-th feature point.
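The fusion step is a confidence-weighted average of the per-feature-point first states; the sketch below normalises each axis by the sum of the confidences for that axis, which is an assumption (the text only states that T is the sum of the confidences of the n feature points).

```python
def fuse_states(estimates):
    """estimates: one dict per feature point, e.g.
    {"x": ..., "y": ..., "vx": ..., "vy": ..., "mx": ..., "my": ...}
    holding the corresponding center position / velocity and the per-axis
    confidences Mkx, Mky. Returns the fused second state (X, Y, VX, VY)."""
    tx = sum(e["mx"] for e in estimates)
    ty = sum(e["my"] for e in estimates)
    X = sum(e["mx"] * e["x"] for e in estimates) / tx
    Y = sum(e["my"] * e["y"] for e in estimates) / ty
    VX = sum(e["mx"] * e["vx"] for e in estimates) / tx
    VY = sum(e["my"] * e["vy"] for e in estimates) / ty
    return X, Y, VX, VY
```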
In general, in order to improve the accuracy of the states of the respective feature points, it is necessary to calculate the states of the respective feature points through a plurality of rounds and comprehensively consider the calculation results of the plurality of rounds. That is, after calculating the uncertainty of each feature point and the state of each feature point (i.e., after step 330), each feature point (also referred to as "observed feature point") in the current round may be associated with a feature point (also referred to as "tracked feature point") calculated in history, and the state of each feature point in the target object may be updated. Specifically, according to the information such as the position and the orientation of each feature point, data association is performed between each observation feature point and the tracked feature point, and after association, the state of each feature point is updated according to the state of the observation feature point and the state of the tracked feature point, so that the state of each updated feature point is obtained.
It should be understood that there are many ways to associate the observed feature points with the tracked feature points. For example, data association based on nearest-neighbor matching may be used, which associates each observed feature point with a tracked feature point according to the distance between them. A method based on orientation matching may also be employed, which associates each observed feature point with a tracked feature point according to the orientation angle of the feature point relative to the center of the target object. This is not particularly limited in the embodiments of the present application.
It should be further understood that there are many methods for updating the states of the feature points, for example, the states of the feature points may be updated based on kalman filtering, extended kalman filtering, and the like, or the states of the feature points may be updated based on bayesian inference to calculate the maximum posterior probability, which is not limited in this embodiment of the present application.
In the embodiment of the application, the states of the feature points of the target object are updated by associating the observation feature points with the tracked feature points, which is beneficial to improving the accuracy of the states of the feature points. Of course, the state of the observation feature point may be directly determined as the state of the target object in this round without updating the state of each feature point.
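As a stand-in for the association and update step (the text allows nearest-neighbor or orientation matching followed by Kalman-style filtering), the sketch below uses greedy nearest-neighbor association with a distance gate and a simple exponential blend; the gate and the blend factor are illustrative assumptions.

```python
import math

def associate_and_update(observed, tracked, gate=1.0, alpha=0.7):
    """Greedy nearest-neighbor association between observed feature points and
    tracked feature points, followed by a simple exponential position update
    (a stand-in for the Kalman / extended Kalman update mentioned above)."""
    updated, used = [], set()
    for ox, oy in observed:
        best, best_d = None, gate
        for j, (tx, ty) in enumerate(tracked):
            d = math.hypot(ox - tx, oy - ty)
            if j not in used and d < best_d:
                best, best_d = j, d
        if best is None:
            updated.append((ox, oy))  # unmatched observation: new feature point
        else:
            used.add(best)
            tx, ty = tracked[best]
            updated.append((alpha * ox + (1 - alpha) * tx,
                            alpha * oy + (1 - alpha) * ty))
    return updated
```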
The method for sensing the target object according to the embodiments of the present application has been described above with reference to fig. 1 to 7; the apparatus according to the embodiments of the present application is described below with reference to fig. 8 to 9. It should be understood that the apparatuses shown in fig. 8 to fig. 9 can implement the steps of the above-described method, which are not described herein again for brevity.
Fig. 8 is a schematic view of a sensing device for a target object according to an embodiment of the present application. The apparatus 800 shown in fig. 8 comprises: an acquisition unit 810 and a processing unit 820. Alternatively, the device 800 may be a device for operating the automatic driving system in fig. 1, and the device 800 may also be a device for operating the control system in fig. 2, which is not particularly limited in this embodiment of the present application.
The acquiring unit 810 is configured to acquire a plurality of feature points of a point cloud cluster, where the point cloud cluster represents a target object.
The processing unit 820 is configured to determine an uncertainty of each of the plurality of feature points, wherein the uncertainty indicates an error generated when the position of each feature point in the point cloud cluster is acquired by the acquisition apparatus.
The processing unit 820 is further configured to obtain a first state of the target object corresponding to each feature point based on a state of each feature point in the plurality of feature points, where the state of each feature point includes a position and/or a speed of each feature point, and the first state includes a first speed and/or a first position of the target object.
The processing unit 820 is further configured to determine a second state of the object based on the first state of the object corresponding to each feature point and the uncertainty corresponding to each feature point, where the second state includes a second speed and/or a second position of the object.
Optionally, the plurality of feature points comprises a plurality of end points of the point cloud cluster.
Optionally, as an embodiment, the processing unit 820 is further configured to: determining the type of an edge connected with each endpoint in a plurality of endpoints, wherein the type of the edge comprises a visible edge directly acquired by acquisition equipment and an invisible edge which cannot be directly acquired by the acquisition equipment; an uncertainty is determined for each of the plurality of endpoints based on a type of the two edges connected to each of the plurality of endpoints.
Optionally, as an embodiment, the plurality of end points includes a first end point, a type of a first edge connected to the first end point is a visible edge, and a type of a second edge connected to the first end point is an invisible edge, and the uncertainty of the first end point is determined based on a component of the detection uncertainty of the acquisition device in the direction of the orientation of the object.
Optionally, as an embodiment, the uncertainty d1 of the first endpoint is determined as a function of R1, C0, θ1 and φ, where R1 represents the measured distance between the acquisition device and the first endpoint; C0 is a preset value that is inversely related to the acquisition precision of the acquisition device, in radians; θ1 represents the coordinate azimuth angle when the acquisition device acquires the first endpoint; and φ represents the azimuth angle of the orientation of the target object.
Optionally, as an embodiment, the plurality of end points includes a second end point, and the two edges connected to the second end point are both visible edges, the measurement distance between the second end point and the acquisition device is positively correlated to the uncertainty of the second end point.
Optionally, as an embodiment, the uncertainty d2 of the second endpoint is determined by the formula d2 = R2 × C1, where R2 represents the measured distance between the acquisition device and the second endpoint, and C1 represents a preset uncertainty in radians.
Optionally, as an embodiment, the processing unit 820 is further configured to: and if the first reference point is not shielded by other objects, determining the uncertainty of each end point in the plurality of end points based on the types of two edges connected with each end point in the plurality of end points, wherein the first reference point is a point which is away from the first end point by a preset distance in the extension line direction of the first edge, and the other objects are objects except the target object and the acquisition equipment in the image where the point cloud cluster is located.
Optionally, as an embodiment, the processing unit 820 is further configured to: and if the first reference point is shielded by other objects, determining the uncertainty of the first end point based on the change degree of the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first end point.
Optionally, as an embodiment, the processing unit 820 is further configured to: if the first reference point is occluded by another object, determine the uncertainty d3 of the first endpoint, based on the difference δ between the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint, as a function of R1, C0, θ1, φ and δ, where R1 represents the measured distance between the acquisition device and the first endpoint; C0 is a preset value that is inversely related to the acquisition precision of the acquisition device, in radians; θ1 represents the coordinate azimuth angle when the acquisition device acquires the first endpoint; and φ represents the azimuth angle of the orientation of the target object.
Optionally, as an embodiment, the processing unit 820 is further configured to: and if the first reference point is shielded by other objects, determining the uncertainty of the second end point based on the change degree of the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first end point.
Optionally, as an embodiment, the processing unit 820 is further configured to: if the first reference point is occluded by another object, determine the uncertainty d4 of the second endpoint, based on the difference δ between the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint, by the formula d4 = R2 × (C1 + δ), where R2 represents the measured distance between the acquisition device and the second endpoint, and C1 represents a preset uncertainty in radians.
Optionally, as an embodiment, the processing unit 820 is further configured to: determining a confidence level corresponding to said each of said plurality of feature points based on an uncertainty corresponding to said each of said plurality of feature points; determining the second state of the object based on the first state of the object corresponding to the each of the plurality of feature points and the confidence level corresponding to the each of the plurality of feature points.
Optionally, as an embodiment, the processing unit 820 is further configured to: determine, based on the uncertainty corresponding to each of the plurality of feature points, the confidence Mk corresponding to the k-th feature point of the plurality of feature points as a function of dk, Δk, C3 and C4, where k denotes the k-th feature point of the plurality of feature points, k = 1 … n, and n is the total number of the plurality of feature points; dk represents the uncertainty of the k-th feature point; Δk represents the change between the historical state and the first state of the k-th feature point; and C3 and C4 are preset values.
In an alternative embodiment, the processing unit 820 may be a processor 920, the acquisition unit 810 may be a communication interface 930, and the communication device may further include a memory 910, as specifically shown in fig. 9.
FIG. 9 is a schematic block diagram of a computing device according to another embodiment of the present application. The computing device 900 shown in fig. 9 may include: a memory 910, a processor 920, and a communication interface 930. The memory 910, the processor 920, and the communication interface 930 are connected via an internal connection path; the memory 910 is configured to store instructions, and the processor 920 is configured to execute the instructions stored in the memory 910 to control the communication interface 930 to receive/transmit information or data. Optionally, the memory 910 may be coupled to the processor 920 via an interface, or may be integrated with the processor 920.
It should be noted that the communication interface 930 implements communication between the computing device 900 and other devices by using a transceiver apparatus, such as, but not limited to, an input/output interface (I/O interface).
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 920 or by instructions in the form of software. The method disclosed in the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in the processor. The software module may be located in a storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 910, and the processor 920 reads the information in the memory 910 and performs the steps of the above method in combination with its hardware. To avoid repetition, details are not described here again.
It should be understood that in the embodiments of the present application, the processor may be a Central Processing Unit (CPU), and the processor may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
It will also be appreciated that in embodiments of the present application, the memory may comprise both read-only memory and random access memory, and may provide instructions and data to the processor. A portion of the processor may also include non-volatile random access memory. For example, the processor may also store information of the device type.
It should be understood that the term "and/or" herein is merely one type of association relationship that describes an associated object, meaning that three relationships may exist, e.g., a and/or B may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (32)

1. A method for sensing an object, comprising:
acquiring a plurality of characteristic points of a point cloud cluster, wherein the point cloud cluster represents the target object;
determining an uncertainty for each of the plurality of feature points, the uncertainty indicating an error in acquiring a location of the each feature point in the point cloud cluster by an acquisition device;
acquiring a first state of the target object corresponding to each feature point in the plurality of feature points based on the state of each feature point in the plurality of feature points, wherein the state of each feature point comprises the position and/or the speed of each feature point, and the first state comprises a first speed and/or a first position of the target object;
determining a second state of the object based on the first state of the object corresponding to the each of the plurality of feature points and the uncertainty corresponding to the each of the plurality of feature points, the second state including a second velocity and/or a second position of the object.
2. The method of claim 1, in which the plurality of feature points comprises a plurality of end points of the point cloud cluster.
3. The method of claim 2, wherein said determining an uncertainty for each of said plurality of feature points comprises:
determining the type of an edge connected with each endpoint of the plurality of endpoints, wherein the type of the edge comprises a visible edge directly acquired by the acquisition equipment and an invisible edge which cannot be directly acquired by the acquisition equipment;
determining the uncertainty for each of the plurality of endpoints based on a type of two edges connecting to each of the plurality of endpoints.
4. The method of claim 3, wherein the plurality of endpoints includes a first endpoint, a type of a first edge connected to the first endpoint is a visible edge, and a type of a second edge connected to the first endpoint is an invisible edge, and the uncertainty of the first endpoint is determined based on a component of a detected uncertainty of an acquisition device in an orientation direction of the target object.
5. The method of claim 4, wherein the uncertainty d1 of the first endpoint is determined as a function of R1, C0, θ1 and φ, wherein R1 represents the measured distance between the acquisition device and the first endpoint; C0 is a preset value that is inversely related to the acquisition precision of the acquisition device, in radians; θ1 represents the coordinate azimuth angle when the acquisition device acquires the first endpoint; and φ represents the azimuth angle of the orientation of the target object.
6. The method of claim 4 or 5, wherein the plurality of endpoints includes a second endpoint, and wherein the type of both edges connected to the second endpoint are visible edges, then the measured distance between the second endpoint and the acquisition device positively correlates with the uncertainty of the second endpoint.
7. The method of claim 6, wherein the uncertainty d2 of the second endpoint is determined by the formula d2 = R2 × C1, wherein R2 represents the measured distance between the acquisition device and the second endpoint, and C1 represents a preset uncertainty in radians.
8. The method of any one of claims 4-7, wherein determining the uncertainty for each of the plurality of endpoints based on types of two edges connected to each of the plurality of endpoints comprises:
and if the first reference point is not shielded by other objects, determining the uncertainty of each end point of the plurality of end points based on the types of two edges connected with each end point of the plurality of end points, wherein the first reference point is a point which is away from the first end point by a preset distance in the extension line direction of the first edge, and the other objects are objects except the target object and the acquisition equipment in the image where the point cloud cluster is located.
9. The method of claim 8, wherein the method further comprises:
and if the first reference point is blocked by other objects, determining the uncertainty of the first endpoint based on the degree of change of the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint.
10. The method of claim 9, wherein the determining the uncertainty of the first endpoint based on the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint if the first reference point is occluded by the other object comprises:
if the first reference point is occluded by the other object, determining the uncertainty d3 of the first endpoint, based on the difference δ between the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint, as a function of R1, C0, θ1, φ and δ, wherein R1 represents the measured distance between the acquisition device and the first endpoint; C0 is a preset value that is inversely related to the acquisition precision of the acquisition device, in radians; θ1 represents the coordinate azimuth angle when the acquisition device acquires the first endpoint; and φ represents the azimuth angle of the orientation of the target object.
11. The method of any one of claims 8-10, further comprising:
and if the first reference point is shielded by other objects, determining the uncertainty of the second endpoint based on the degree of change of the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint.
12. The method of claim 11, wherein determining the uncertainty of the second endpoint based on a degree of change in a horizontal field angle corresponding to the first reference point and a horizontal field angle corresponding to the first endpoint if the first reference point is occluded by the other object comprises:
if the first reference point is occluded by the other object, determining the uncertainty d4 of the second endpoint, based on the difference δ between the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint, by the formula d4 = R2 × (C1 + δ), wherein R2 represents the measured distance between the acquisition device and the second endpoint, and C1 represents a preset uncertainty in radians.
13. The method of any one of claims 1-12, wherein determining the second state of the object based on the first state of the object corresponding to the each of the plurality of feature points and the uncertainty corresponding to the each of the plurality of feature points comprises:
determining a confidence level corresponding to said each of said plurality of feature points based on an uncertainty corresponding to said each of said plurality of feature points;
determining the second state of the object based on the first state of the object corresponding to the each of the plurality of feature points and the confidence level corresponding to the each of the plurality of feature points.
14. The method of claim 13, wherein said determining a confidence level for said each of said plurality of feature points based on an uncertainty for said each of said plurality of feature points comprises:
determining, based on the uncertainty corresponding to said each of the plurality of feature points, the confidence Mk corresponding to the k-th feature point of the plurality of feature points as a function of dk, Δk, C3 and C4, wherein k denotes the k-th feature point of the plurality of feature points, k = 1 … n, and n is the total number of the plurality of feature points; dk represents the uncertainty of the k-th feature point; Δk represents the change between the historical state and the first state of the k-th feature point; and C3 and C4 are preset values.
15. An apparatus for sensing an object, comprising:
an acquisition unit configured to acquire a plurality of feature points of a point cloud cluster representing the target object;
a processing unit for determining an uncertainty of each of the plurality of feature points, the uncertainty indicating an error generated when a position of the each feature point in the point cloud cluster is acquired by an acquisition apparatus;
the processing unit is further configured to obtain a first state of the target object corresponding to each feature point in the plurality of feature points based on a state of each feature point in the plurality of feature points, where the state of each feature point includes a position and/or a speed of each feature point, and the first state includes a first speed and/or a first position of the target object;
the processing unit is further configured to determine a second state of the object based on the first state of the object corresponding to each of the plurality of feature points and the uncertainty corresponding to each of the plurality of feature points, where the second state includes a second velocity and/or a second position of the object.
16. The apparatus of claim 15, in which the plurality of feature points are a plurality of end points of the point cloud cluster.
17. The apparatus as recited in claim 16, said processing unit to further:
determining the type of an edge connected with each endpoint of the plurality of endpoints, wherein the type of the edge comprises a visible edge directly acquired by the acquisition equipment and an invisible edge which cannot be directly acquired by the acquisition equipment;
determining the uncertainty for each of the plurality of endpoints based on a type of two edges connecting to each of the plurality of endpoints.
18. The apparatus of claim 17, wherein the plurality of endpoints includes a first endpoint, a type of a first edge connected to the first endpoint is a visible edge, and a type of a second edge connected to the first endpoint is an invisible edge, the uncertainty of the first endpoint is determined based on a component of a detected uncertainty of an acquisition device in an orientation direction of the object.
19. The apparatus of claim 18, wherein the uncertainty d1 of the first endpoint is determined as a function of R1, C0, θ1 and φ, wherein R1 represents the measured distance between the acquisition device and the first endpoint; C0 is a preset value that is inversely related to the acquisition precision of the acquisition device, in radians; θ1 represents the coordinate azimuth angle when the acquisition device acquires the first endpoint; and φ represents the azimuth angle of the orientation of the target object.
20. The apparatus of claim 18 or 19, wherein the plurality of endpoints includes a second endpoint, and wherein the type of both edges connected to the second endpoint are visible edges, then the measured distance between the second endpoint and the acquisition device positively correlates with the uncertainty of the second endpoint.
21. The apparatus of claim 20, wherein the uncertainty d2 of the second endpoint is determined by the formula d2 = R2 × C1, wherein R2 represents the measured distance between the acquisition device and the second endpoint, and C1 represents a preset uncertainty in radians.
22. The apparatus of any one of claims 17-21, wherein the processing unit is further to:
and if the first reference point is not shielded by other objects, determining the uncertainty of each end point of the plurality of end points based on the types of two edges connected with each end point of the plurality of end points, wherein the first reference point is a point which is away from the first end point by a preset distance in the extension line direction of the first edge, and the other objects are objects except the target object and the acquisition equipment in the image where the point cloud cluster is located.
23. The apparatus as recited in claim 22, said processing unit to further:
and if the first reference point is blocked by other objects, determining the uncertainty of the first endpoint based on the degree of change of the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint.
24. The apparatus as recited in claim 23, said processing unit to further:
if the first reference point is occluded by another object, determining the uncertainty d3 of the first endpoint, based on the difference δ between the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint, as a function of R1, C0, θ1, φ and δ, wherein R1 represents the measured distance between the acquisition device and the first endpoint; C0 is a preset value that is inversely related to the acquisition precision of the acquisition device, in radians; θ1 represents the coordinate azimuth angle when the acquisition device acquires the first endpoint; and φ represents the azimuth angle of the orientation of the target object.
25. The apparatus as recited in claim 24, said processing unit to further:
and if the first reference point is shielded by other objects, determining the uncertainty of the second endpoint based on the degree of change of the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint.
26. The apparatus as recited in claim 25, said processing unit to further:
if the first reference point is occluded by the other object, determining the uncertainty d4 of the second endpoint, based on the difference δ between the horizontal field angle corresponding to the first reference point and the horizontal field angle corresponding to the first endpoint, by the formula d4 = R2 × (C1 + δ), wherein R2 represents the measured distance between the acquisition device and the second endpoint, and C1 represents a preset uncertainty in radians.
27. The apparatus of any one of claims 15-26, wherein the processing unit is further to:
determining a confidence level corresponding to said each of said plurality of feature points based on an uncertainty corresponding to said each of said plurality of feature points;
determining the second state of the object based on the first state of the object corresponding to the each of the plurality of feature points and the confidence level corresponding to the each of the plurality of feature points.
28. The apparatus as recited in claim 27, said processing unit to further:
determining, based on the uncertainty corresponding to each of the plurality of feature points, the confidence Mk corresponding to the k-th feature point of the plurality of feature points as a function of dk, Δk, C3 and C4, wherein k denotes the k-th feature point of the plurality of feature points, k = 1 … n, and n is the total number of the plurality of feature points; dk represents the uncertainty of the k-th feature point; Δk represents the change between the historical state and the first state of the k-th feature point; and C3 and C4 are preset values.
29. A computing device, comprising: at least one processor and a memory, wherein the at least one processor is coupled to the memory and is configured to read and execute instructions in the memory to perform the method according to any one of claims 1-14.
30. A computer-readable medium, characterized in that the computer-readable medium stores program code which, when run on a computer, causes the computer to perform the method according to any one of claims 1-14.
31. A chip, comprising: at least one processor and a memory, wherein the at least one processor is coupled to the memory and is configured to read and execute instructions in the memory to perform the method according to any one of claims 1-14.
32. An autonomous vehicle, comprising: at least one processor and a memory, wherein the at least one processor is coupled to the memory and is configured to read and execute instructions in the memory to perform the method according to any one of claims 1-14.
CN202010755668.2A 2020-07-31 2020-07-31 Target object sensing method and device Pending CN114092898A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010755668.2A CN114092898A (en) 2020-07-31 2020-07-31 Target object sensing method and device
PCT/CN2021/106261 WO2022022284A1 (en) 2020-07-31 2021-07-14 Target object sensing method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010755668.2A CN114092898A (en) 2020-07-31 2020-07-31 Target object sensing method and device

Publications (1)

Publication Number Publication Date
CN114092898A (en) 2022-02-25

Family

ID=80037525

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010755668.2A Pending CN114092898A (en) 2020-07-31 2020-07-31 Target object sensing method and device

Country Status (2)

Country Link
CN (1) CN114092898A (en)
WO (1) WO2022022284A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103810475B (en) * 2014-02-19 2017-04-05 百度在线网络技术(北京)有限公司 A kind of object recognition methods and device
DE102017111351A1 (en) * 2017-05-24 2018-11-29 Jena-Optronik Gmbh Method for detecting and autonomously tracking a target object by means of a LIDAR sensor
CN109831736B (en) * 2017-11-23 2022-01-18 腾讯科技(深圳)有限公司 Data processing method and device, server and client
CN111060024B (en) * 2018-09-05 2021-11-30 天目爱视(北京)科技有限公司 3D measuring and acquiring device with rotation center shaft intersected with image acquisition device
CN111199579B (en) * 2020-01-02 2023-01-24 腾讯科技(深圳)有限公司 Method, device, equipment and medium for building three-dimensional model of target object

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114577215A (en) * 2022-03-10 2022-06-03 山东新一代信息产业技术研究院有限公司 Method, device and medium for updating feature map of mobile robot
CN114577215B (en) * 2022-03-10 2023-10-27 山东新一代信息产业技术研究院有限公司 Method, equipment and medium for updating characteristic map of mobile robot

Also Published As

Publication number Publication date
WO2022022284A1 (en) 2022-02-03

Similar Documents

Publication Publication Date Title
CN113879295B (en) Track prediction method and device
EP4029750A1 (en) Data presentation method and terminal device
CN112512887B (en) Driving decision selection method and device
CN112534483B (en) Method and device for predicting vehicle exit
CN113498529B (en) Target tracking method and device
CN112543877B (en) Positioning method and positioning device
WO2022062825A1 (en) Vehicle control method, device, and vehicle
EP4170544A1 (en) Lane line detection method and apparatus
CN114693540A (en) Image processing method and device and intelligent automobile
CN112810603B (en) Positioning method and related product
CN114531913A (en) Lane line detection method, related device, and computer-readable storage medium
CN113859265B (en) Reminding method and device in driving process
WO2022022284A1 (en) Target object sensing method and apparatus
CN115398272A (en) Method and device for detecting passable area of vehicle
CN114167404A (en) Target tracking method and device
CN114782638B (en) Method and device for generating lane line, vehicle, storage medium and chip
WO2021159397A1 (en) Vehicle travelable region detection method and detection device
WO2021110166A1 (en) Road structure detection method and device
CN113128497A (en) Target shape estimation method and device
CN113799794A (en) Method and device for planning longitudinal motion parameters of vehicle
WO2022061725A1 (en) Traffic element observation method and apparatus
CN114556251B (en) Method and device for determining a passable space for a vehicle
CN115082886B (en) Target detection method, device, storage medium, chip and vehicle
WO2022041820A1 (en) Method and apparatus for planning lane-changing trajectory
WO2022001432A1 (en) Method for inferring lane, and method and apparatus for training lane inference model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination