WO2022022284A1 - Target object sensing method and apparatus - Google Patents


Publication number
WO2022022284A1
Authority
WO
WIPO (PCT)
Prior art keywords
endpoint
uncertainty
point
feature
feature points
Prior art date
Application number
PCT/CN2021/106261
Other languages
French (fr)
Chinese (zh)
Inventor
曹彤彤 (Cao Tongtong)
李向旭 (Li Xiangxu)
刘冰冰 (Liu Bingbing)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2022022284A1


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 - Lidar systems specially adapted for specific applications
    • G01S 17/93 - Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931 - Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/23 - Clustering techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/762 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10028 - Range image; Depth image; 3D point clouds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30248 - Vehicle exterior or interior
    • G06T 2207/30252 - Vehicle exterior; Vicinity of vehicle

Definitions

  • the present application relates to the field of autonomous driving, and more particularly, to a method and device for perceiving a target.
  • the set of point data on the shape of the target obtained by the acquisition device is also called a point cloud.
  • The most commonly used point clouds at present include laser point clouds: when a laser beam irradiates the surface of a target, the reflected laser light carries information such as the azimuth and distance of the target. If the laser beam is scanned along a certain trajectory, the reflected laser point information is recorded during scanning. Because the scanning is relatively fine, a large number of laser points can be obtained, forming a laser point cloud. At present, laser point clouds are widely used in the field of autonomous or unmanned driving to perceive objects.
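As an illustrative sketch of the relationship just described (the function and variable names are mine, not from the publication), the conversion from per-beam distance and azimuth measurements to point-cloud coordinates can be written as:

```python
import math

def scan_to_points(ranges, azimuths):
    """Convert per-beam range/azimuth measurements into 2-D Cartesian points.

    Each reflected laser return carries the distance and direction of the
    surface it hit; collecting many returns along the scan trajectory
    yields a laser point cloud.
    """
    return [(r * math.cos(a), r * math.sin(a)) for r, a in zip(ranges, azimuths)]

# Three returns at 10 m, swept over a quarter arc.
points = scan_to_points([10.0, 10.0, 10.0], [0.0, math.pi / 4, math.pi / 2])
```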
  • In the prior art, the point cloud data containing the target object is first processed to obtain a point cloud cluster representing the target object, and the geometric center or center of gravity of the point cloud cluster is determined; the position and velocity of the target are then calculated from the position and velocity of that geometric center or center of gravity, so as to perceive the target.
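The prior-art centroid approach can be sketched as follows; the helper names and the two-frame velocity estimate are illustrative assumptions of mine, not the exact prior-art procedure:

```python
def geometric_center(cluster):
    """Geometric center (centroid) of a 2-D point cloud cluster."""
    n = len(cluster)
    return (sum(p[0] for p in cluster) / n, sum(p[1] for p in cluster) / n)

def center_velocity(center_prev, center_curr, dt):
    """Velocity estimate from centroid displacement between two frames."""
    return ((center_curr[0] - center_prev[0]) / dt,
            (center_curr[1] - center_prev[1]) / dt)

# A rectangular cluster observed in two frames, 0.5 s apart.
c0 = geometric_center([(0, 0), (2, 0), (2, 1), (0, 1)])
c1 = geometric_center([(1, 0), (3, 0), (3, 1), (1, 1)])
v = center_velocity(c0, c1, dt=0.5)
```

When part of the cluster is occluded, the centroid shifts even though the object has not moved, which is exactly the weakness the present application addresses.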
  • the present application provides a method and device for sensing a target, so as to improve the accuracy of calculating the position or speed of the target.
  • In a first aspect, a method for perceiving a target object is provided, comprising: acquiring a plurality of feature points of a point cloud cluster, the point cloud cluster representing the target object; determining the uncertainty of each feature point in the plurality of feature points, the uncertainty indicating the error generated when the position of each feature point in the point cloud cluster is collected by the collection device; obtaining, based on the state of each feature point in the plurality of feature points, a first state of the target object corresponding to each feature point, where the state of each feature point includes the position and/or speed of that feature point, and the first state includes a first speed and/or a first position of the target; and determining, based on the first state of the target corresponding to each feature point and the uncertainty corresponding to each feature point, a second state of the target, the second state including a second velocity and/or a second position of the target.
  • In the above technical solution, the first state of the target object corresponding to the state of each of the multiple feature points is calculated, and the second state of the target object is determined based on the first state corresponding to each feature point together with the uncertainty corresponding to each feature point. This is beneficial to improving the accuracy of determining the second state of the target object, and avoids the drop in accuracy that occurs in the prior art, where the state of the target object is determined only from the state of the geometric center or center of gravity of the point cloud cluster and that center is occluded.
  • The plurality of feature points include a plurality of endpoints of the point cloud cluster, where an endpoint (also referred to as an "interest point") generally refers to the intersection of two adjacent edges of the point cloud cluster.
  • multiple endpoints of the point cloud cluster are used as the aforementioned multiple feature points, which is beneficial to simplify the process of determining the feature points.
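The publication does not specify how endpoints are extracted from the cluster. Purely as an illustration of the idea that an endpoint is the intersection of two adjacent edges (this is my own simplification, not the patent's procedure), the corners of the cluster's axis-aligned bounding box can serve as candidate endpoints:

```python
def bounding_box_endpoints(cluster):
    """Candidate endpoints as the corners of the cluster's axis-aligned
    bounding box; each corner is the intersection of two adjacent box edges.
    This is a simplified stand-in, not the publication's extraction method.
    """
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    x0, x1, y0, y1 = min(xs), max(xs), min(ys), max(ys)
    return [(x0, y0), (x1, y0), (x1, y1), (x0, y1)]

corners = bounding_box_endpoints([(0.1, 0.0), (1.9, 0.1), (2.0, 0.9), (0.0, 1.0)])
```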
  • The determining the uncertainty of each feature point in the plurality of feature points includes: determining the type of each edge connected to each endpoint in the plurality of endpoints, the edge types including visible edges that can be directly collected by the collection device and invisible edges that cannot be directly collected by the collection device; and determining the uncertainty of each endpoint based on the types of the two edges connected to it.
  • In the above technical solution, the uncertainty of each endpoint in the plurality of endpoints is determined based on the types of the two edges connected to that endpoint, which is beneficial to improving the accuracy of determining the uncertainty of the endpoint.
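The publication does not give an explicit procedure for classifying an edge as visible or invisible. One common geometric heuristic (entirely my own here, offered only as a sketch) treats an outline edge as a visible edge when it faces the sensor, i.e. when the sensor and the cluster interior lie on opposite sides of the edge line:

```python
def edge_is_visible(edge_start, edge_end, interior_point, sensor):
    """Heuristic sketch: an outline edge is treated as directly visible when
    the sensor lies on the opposite side of the edge line from the cluster
    interior (the edge faces the sensor). Names are illustrative.
    """
    def side(p):
        # Sign of the 2-D cross product: which side of the edge line p is on.
        return ((edge_end[0] - edge_start[0]) * (p[1] - edge_start[1])
                - (edge_end[1] - edge_start[1]) * (p[0] - edge_start[0]))
    return side(sensor) * side(interior_point) < 0

# Unit-square cluster with interior (0.5, 0.5); sensor below at (0.5, -5).
visible = edge_is_visible((0, 0), (1, 0), (0.5, 0.5), (0.5, -5.0))  # near edge
hidden = edge_is_visible((0, 1), (1, 1), (0.5, 0.5), (0.5, -5.0))   # far edge
```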
  • the plurality of endpoints include a first endpoint, a first edge connected to the first endpoint is a visible edge, and a second edge connected to the first endpoint is an invisible edge, and the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the acquisition device in the orientation direction of the target.
  • In the above technical solution, the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the acquisition device in the orientation direction of the target, which is beneficial to improving the accuracy of determining the uncertainty of the first endpoint.
  • The uncertainty d1 of the first endpoint is determined by a formula in terms of R1, C0, θ1 and the azimuth of the orientation of the target, where R1 represents the measurement distance between the acquisition device and the first endpoint; C0 is a preset value, negatively correlated with the acquisition accuracy of the acquisition device, in radians; and θ1 represents the coordinate azimuth when the collection device collects the first endpoint.
  • The multiple endpoints include a second endpoint, and the types of the two edges connected to the second endpoint are both visible edges; the measurement distance between the second endpoint and the collection device is then positively correlated with the uncertainty of the second endpoint.
  • the uncertainty of the second end point is determined based on the measurement distance between the second end point and the acquisition device, which is beneficial to improve the accuracy of determining the uncertainty of the second end point.
  • The uncertainty of the second endpoint is determined by a formula in terms of the measurement distance between the second endpoint and the acquisition device and a preset uncertainty C1, in radians.
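The positive correlation described above admits a simple reading: if C1 is a preset angular uncertainty in radians, the positional uncertainty at measurement distance R2 scales like the arc length R2·C1. The publication's exact formula appears only as an image, so the following is a hedged sketch under that arc-length assumption:

```python
def second_endpoint_uncertainty(r2, c1):
    """Sketch under an assumed arc-length model: uncertainty grows linearly
    with the measurement distance r2, with c1 a preset angular uncertainty
    in radians. Not necessarily the publication's exact formula.
    """
    return r2 * c1

d2_near = second_endpoint_uncertainty(10.0, 0.01)  # closer, smaller uncertainty
d2_far = second_endpoint_uncertainty(50.0, 0.01)   # farther, larger uncertainty
```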
  • The determining the uncertainty of each endpoint of the plurality of endpoints based on the types of the two edges connected to it includes: if the first reference point is not occluded by other objects, determining the uncertainty of each endpoint of the plurality of endpoints based on the types of the two edges connected to it, where the first reference point is a point at a preset distance from the first endpoint in the direction of the extension line of the first edge, and the other objects are objects in the image where the point cloud cluster is located other than the target object and the collection device.
  • In the above technical solution, the uncertainty of each of the multiple endpoints may be determined based on the types of the two edges connected to that endpoint, which is beneficial to improving the accuracy of determining the uncertainty of the endpoints.
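The first reference point is defined only verbally above. Under the natural geometric reading (a point a preset distance beyond the first endpoint, on the extension line of the first edge), it can be computed as follows; all names are illustrative assumptions of mine:

```python
import math

def first_reference_point(first_endpoint, other_endpoint_on_first_edge,
                          preset_distance):
    """Point at `preset_distance` beyond the first endpoint, on the extension
    line of the first edge. Names are illustrative; the publication describes
    this geometry only in words.
    """
    dx = first_endpoint[0] - other_endpoint_on_first_edge[0]
    dy = first_endpoint[1] - other_endpoint_on_first_edge[1]
    norm = math.hypot(dx, dy)
    # Step along the unit direction of the edge, past the first endpoint.
    return (first_endpoint[0] + preset_distance * dx / norm,
            first_endpoint[1] + preset_distance * dy / norm)

# Edge from (0, 0) to the first endpoint (2, 0); reference 0.5 beyond it.
ref = first_reference_point((2.0, 0.0), (0.0, 0.0), 0.5)
```

Checking whether this reference point is occluded by another object then decides which uncertainty formula applies.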
  • The method further includes: if the first reference point is blocked by the other objects, determining the uncertainty of the first endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
  • In the above technical solution, the uncertainty of the first endpoint is determined based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, which is beneficial to improving the accuracy of the uncertainty of the first endpoint.
  • Determining the uncertainty of the first endpoint includes: if the first reference point is blocked by the other objects, determining the uncertainty d3 of the first endpoint by a formula in terms of the difference Δθ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, R1, C0, θ1 and the azimuth of the orientation of the target, where R1 represents the measurement distance between the acquisition device and the first endpoint; C0 is a preset value, negatively correlated with the collection accuracy of the acquisition device, in radians; and θ1 represents the coordinate azimuth angle when the collection device collects the first endpoint.
  • In the above technical solution, when the first reference point is blocked by other objects, the uncertainty of the first endpoint is determined based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, together with the component of the detection uncertainty of the acquisition device in the orientation direction of the target object, which is beneficial to improving the accuracy of the uncertainty of the first endpoint.
  • The method further includes: if the first reference point is blocked by the other objects, determining the uncertainty of the second endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
  • the uncertainty of the second endpoint is determined based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint , which is beneficial to improve the accuracy of the uncertainty of the second endpoint.
  • Determining the uncertainty of the second endpoint includes: if the first reference point is not blocked by the other objects, determining the uncertainty of the second endpoint based on the measurement distance between the second endpoint and the acquisition device, which is beneficial to improving the accuracy of the uncertainty of the second endpoint.
  • Determining the second state of the target based on the first state of the target corresponding to each feature point and the uncertainty corresponding to each feature point includes: determining, based on the uncertainty corresponding to each of the plurality of feature points, a confidence level corresponding to each feature point; and determining the second state of the target based on the first state of the target corresponding to each feature point and the confidence level corresponding to each feature point.
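The fusion step just described can be sketched as follows. The conversion from uncertainty to confidence (here simply the inverse of the uncertainty) and all names are my own illustrative choices, since the publication does not fix a particular weighting:

```python
def fuse_states(states, uncertainties, eps=1e-9):
    """Sketch of the fusion step: each feature point's uncertainty becomes a
    confidence weight (inverse uncertainty, one plausible choice), and the
    per-point target states are combined as a weighted average.
    """
    weights = [1.0 / (u + eps) for u in uncertainties]
    total = sum(weights)
    dim = len(states[0])
    return tuple(sum(w * s[i] for w, s in zip(weights, states)) / total
                 for i in range(dim))

# Two per-endpoint position estimates; the low-uncertainty one dominates.
fused = fuse_states([(10.0, 0.0), (12.0, 0.0)], [0.1, 1.0])
```

The weighted average pulls the second state toward the feature points whose positions were collected most reliably, rather than trusting a single centroid.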
  • The coordinate azimuth above can be understood as the angle, in the coordinate system, between the x-axis and the line connecting the endpoint of the target object to the acquisition device.
  • The azimuth angle of the orientation of the target object above can be understood as the horizontal angle, measured clockwise, from the orientation of the target object to the x-axis.
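The coordinate azimuth can be made concrete with a standard quadrant-aware arctangent; the implementation details below are assumptions of mine, not the publication's:

```python
import math

def coordinate_azimuth(sensor, endpoint):
    """Angle between the x-axis and the sensor-to-endpoint line, matching the
    verbal definition of the coordinate azimuth given above.
    """
    return math.atan2(endpoint[1] - sensor[1], endpoint[0] - sensor[0])

# An endpoint diagonally up and to the right of the sensor: 45 degrees.
theta1 = coordinate_azimuth((0.0, 0.0), (1.0, 1.0))
```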
  • an apparatus for sensing a target object including each unit for implementing the first aspect or any possible implementation manner of the first aspect.
  • a device for sensing a target object has the function of implementing the device in the method design of the above-mentioned first aspect.
  • These functions can be implemented by hardware or by executing corresponding software by hardware.
  • the hardware or software includes one or more units corresponding to the above functions.
  • a computing device including an input-output interface, a processor, and a memory.
  • the processor is used to control the input and output interface to send and receive signals or information
  • the memory is used to store a computer program
  • the processor is used to call and run the computer program from the memory, so that the computing device executes the method in the first aspect.
  • a computer program product comprising: computer program code, when the computer program code is run on a computer, causing the computer to perform the methods in the above aspects.
  • a computer-readable medium stores program codes, which, when executed on a computer, cause the computer to execute the methods in the above-mentioned aspects.
  • a seventh aspect provides a system-on-chip
  • The system-on-chip includes a processor, configured to enable a computing device to implement the functions involved in the above aspects, for example, generating, receiving, sending, or processing the data and/or information involved in the above methods.
  • the chip system further includes a memory for storing necessary program instructions and data of the computing device.
  • the chip system may be composed of chips, or may include chips and other discrete devices.
  • a vehicle including an input-output interface, a processor and a memory.
  • the processor is used to control the input and output interface to send and receive signals or information
  • the memory is used to store a computer program
  • the processor is used to call and run the computer program from the memory, so that the computing device executes the method in the first aspect.
  • the above-mentioned vehicle may have an automatic driving function.
  • FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of an applicable automatic driving system according to an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of a method for sensing a target according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a point cloud cluster corresponding to a target in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of the positional relationship between the target object 400 and the acquisition device 500 in the coordinate system according to the embodiment of the present application.
  • FIG. 6 is a schematic diagram of a positional relationship between a target 400 and a collection device 500 in a coordinate system according to another embodiment of the present application.
  • FIG. 7 is a schematic diagram of an environment map according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a device for sensing a target according to an embodiment of the present application.
  • FIG. 9 is a schematic block diagram of a computing device according to another embodiment of the present application.
  • FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
  • the vehicle 100 is configured in a fully or partially autonomous driving mode.
  • The vehicle 100 can control itself while in an autonomous driving mode: it can determine the current state of the vehicle and its surroundings through human manipulation, determine the likely behavior of at least one other vehicle in the surrounding environment, determine a confidence level corresponding to the likelihood that the other vehicle performs that behavior, and control the vehicle 100 based on the determined information.
  • The vehicle 100 may be configured to operate without human interaction.
  • Vehicle 100 may include various subsystems, such as travel system 102 , sensor system 104 , control system 106 , one or more peripherals 108 and power supply 110 , computer system 112 , and user interface 116 .
  • vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. Additionally, each of the subsystems and elements of the vehicle 100 may be interconnected by wire or wirelessly.
  • the travel system 102 may include components that provide powered motion for the vehicle 100 .
  • travel system 102 may include engine 118 , energy source 119 , transmission 120 , and wheels/tires 121 .
  • the engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a gasoline engine and electric motor hybrid engine, an internal combustion engine and an air compression engine hybrid engine.
  • Engine 118 converts energy source 119 into mechanical energy.
  • Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity.
  • the energy source 119 may also provide energy to other systems of the vehicle 100 .
  • Transmission 120 may transmit mechanical power from engine 118 to wheels 121 .
  • Transmission 120 may include a gearbox, a differential, and a driveshaft.
  • transmission 120 may also include other devices, such as clutches.
  • the drive shaft may include one or more axles that may be coupled to one or more wheels 121 .
  • the sensor system 104 may include several sensors that sense information about the environment surrounding the vehicle 100 .
  • the sensor system 104 may include a positioning system 122 (the positioning system may be a global positioning system (GPS) system, a Beidou system or other positioning systems), an inertial measurement unit (IMU) 124, Radar 126 , laser rangefinder 128 and camera 130 .
  • The sensor system 104 may also include sensors that monitor internal systems of the vehicle 100 (eg, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, orientation, velocity, etc.). This detection and identification is a critical function for the safe operation of the autonomous vehicle 100.
  • the positioning system 122 may be used to estimate the geographic location of the vehicle 100 .
  • the IMU 124 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration.
  • IMU 124 may be a combination of an accelerometer and a gyroscope.
  • Radar 126 may utilize radio signals to sense objects within the surrounding environment of vehicle 100 .
  • the radar 126 may also be used to sense one or more of the target's speed, position, and heading.
  • the laser rangefinder 128 may utilize laser light to sense objects in the environment in which the vehicle 100 is located.
  • the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
  • Camera 130 may be used to capture multiple images of the surrounding environment of vehicle 100 .
  • Camera 130 may be a still camera or a video camera.
  • Control system 106 controls the operation of the vehicle 100 and its components.
  • Control system 106 may include various elements including steering system 132 , throttle 134 , braking unit 136 , computer vision system 140 , route control system 142 , and obstacle avoidance system 144 .
  • the steering system 132 is operable to adjust the heading of the vehicle 100 .
  • it may be a steering wheel system.
  • the throttle 134 is used to control the operating speed of the engine 118 and thus the speed of the vehicle 100 .
  • the braking unit 136 is used to control the deceleration of the vehicle 100 .
  • the braking unit 136 may use friction to slow the wheels 121 .
  • the braking unit 136 may convert the kinetic energy of the wheels 121 into electrical current.
  • the braking unit 136 may also take other forms to slow the wheels 121 to control the speed of the vehicle 100 .
  • Computer vision system 140 may be operable to process and analyze images captured by camera 130 in order to identify objects and/or features in the environment surrounding vehicle 100 .
  • the objects and/or features may include traffic signals, road boundaries and obstacles.
  • Computer vision system 140 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision techniques.
  • the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, and the like.
  • the route control system 142 is used to determine the travel route of the vehicle 100 .
  • route control system 142 may combine data from sensors, GPS 122, and one or more predetermined maps to determine a driving route for vehicle 100.
  • the obstacle avoidance system 144 is used to identify, evaluate and avoid or otherwise traverse potential obstacles in the environment of the vehicle 100 .
  • Control system 106 may additionally or alternatively include components other than those shown and described. Alternatively, some of the components shown above may be omitted.
  • Peripherals 108 may include a wireless communication system 146 , an onboard computer 148 , a microphone 150 and/or a speaker 152 .
  • peripherals 108 provide a means for a user of vehicle 100 to interact with user interface 116 .
  • the onboard computer 148 may provide information to the user of the vehicle 100 .
  • User interface 116 may also operate on-board computer 148 to receive user input.
  • the onboard computer 148 can be operated via a touch screen.
  • peripheral devices 108 may provide a means for vehicle 100 to communicate with other devices located within the vehicle.
  • microphone 150 may receive audio (eg, voice commands or other audio input) from a user of vehicle 100 .
  • speakers 152 may output audio to a user of vehicle 100 .
  • Wireless communication system 146 may wirelessly communicate with one or more devices, either directly or via a communication network.
  • Wireless communication system 146 may use third generation (3G) cellular communication, such as code division multiple access (CDMA) or Global System for Mobile Communications (GSM)/GPRS; fourth generation (4G) communication, such as LTE; or fifth generation (5G) communication.
  • the wireless communication system 146 may communicate with a wireless local area network (WLAN) using WiFi.
  • the wireless communication system 146 may communicate directly with the device using an infrared link, Bluetooth, or ZigBee.
  • Other wireless protocols may also be used, such as various vehicle communication systems; for example, wireless communication system 146 may include one or more dedicated short range communications (DSRC) devices, which may include public and/or private data communications between vehicles and/or roadside stations.
  • the power supply 110 may provide power to various components of the vehicle 100 .
  • the power source 110 may be a rechargeable lithium-ion or lead-acid battery.
  • One or more battery packs of such a battery may be configured as a power source to provide power to various components of the vehicle 100 .
  • power source 110 and energy source 119 may be implemented together, such as in some all-electric vehicles.
  • Computer system 112 may include at least one processor 113 that executes instructions 115 stored in a non-transitory computer readable medium such as data memory 114 .
  • Computer system 112 may also be multiple computing devices that control individual components or subsystems of vehicle 100 in a distributed fashion.
  • the processor 113 may be any conventional processor, such as a commercially available central processing unit (CPU). Alternatively, the processor may be a dedicated device such as an application specific integrated circuit (ASIC) or other hardware-based processor.
  • Although FIG. 1 functionally illustrates the processor, memory, and other elements of the computer 110 in the same block, one of ordinary skill in the art will understand that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing.
  • the memory may be a hard drive or other storage medium located within an enclosure other than computer 110 .
  • reference to a processor or computer will be understood to include reference to a collection of processors or computers or memories that may or may not operate in parallel.
  • Some components, such as the steering and deceleration components, may each have their own processor that only performs computations related to component-specific functions.
  • a processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are performed on a processor disposed within the vehicle while others are performed by a remote processor, including taking steps necessary to perform a single maneuver.
  • the memory 114 may contain instructions 115 (eg, program logic) executable by the processor 113 to perform various functions of the vehicle 100 , including those described above.
  • Memory 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 102, sensor system 104, control system 106, and peripherals 108.
  • memory 114 may store data such as road maps, route information, vehicle location, direction, speed, and other such vehicle data, among other information. Such information may be used by the vehicle 100 and the computer system 112 during operation of the vehicle 100 in autonomous, semi-autonomous and/or manual modes.
  • The above-mentioned processor 113 may also execute the planning scheme for the longitudinal motion parameters of the vehicle according to the embodiments of the present application, so as to help the vehicle plan its longitudinal motion parameters. For the specific longitudinal motion parameter planning method, reference may be made to the description of FIG. 3 below, which is not repeated here for brevity.
  • a user interface 116 for providing information to or receiving information from a user of the vehicle 100 .
  • user interface 116 may include one or more input/output devices within the set of peripheral devices 108 , such as wireless communication system 146 , onboard computer 148 , microphone 150 and speaker 152 .
  • Computer system 112 may control functions of vehicle 100 based on input received from various subsystems (eg, travel system 102 , sensor system 104 , and control system 106 ) and from user interface 116 .
  • computer system 112 may utilize input from control system 106 in order to control steering unit 132 to avoid obstacles detected by sensor system 104 and obstacle avoidance system 144 .
  • computer system 112 is operable to provide control of various aspects of vehicle 100 and its subsystems.
  • one or more of these components described above may be installed or associated with the vehicle 100 separately.
  • memory 114 may exist partially or completely separate from vehicle 100 .
  • the above-described components may be communicatively coupled together in a wired and/or wireless manner.
  • FIG. 1 should not be construed as a limitation on the embodiment of the present invention.
  • a self-driving car traveling on a road can recognize objects within its surroundings to determine adjustments to the current speed.
  • the objects may be other vehicles, traffic control equipment, or other types of objects.
  • Each identified object may be considered independently, and the object's respective characteristics, such as its current speed, acceleration, and distance from the vehicle, may be used to determine the speed to which the autonomous vehicle is to adjust.
  • The autonomous vehicle 100, or a computing device associated with the autonomous vehicle 100 (eg, the computer system 112, computer vision system 140, or memory 114 of FIG. 1), may predict the behavior of an identified object based on characteristics of the object and the state of the surrounding environment (for example, traffic, rain, ice on the road, etc.).
  • Each identified object depends on the behavior of the others, so the behavior of a single identified object may also be predicted by considering all identified objects together.
  • the vehicle 100 can adjust its speed based on the predicted behavior of the identified object.
  • the self-driving car can determine, based on the predicted behavior of the object, that the vehicle needs to adjust to a steady state (eg, accelerate, decelerate, or stop).
  • other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 in the road being traveled, the curvature of the road, the proximity of static and dynamic objects, and the like.
  • the computing device may also provide instructions to modify the steering angle of the vehicle 100 so that the self-driving car follows a given trajectory and/or maintains safe lateral and longitudinal distances from objects in the vicinity of the self-driving car (eg, cars in adjacent lanes on the road).
  • the above-mentioned vehicle 100 can be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, a playground vehicle, construction equipment, a tram, a golf cart, a train, a cart, etc.
  • the embodiments of the invention are not particularly limited.
  • FIG. 2 is a schematic diagram of a suitable automatic driving system according to an embodiment of the present application.
  • the computer system 101 includes a processor 103 , and the processor 103 is coupled to a system bus 105 .
  • the processor 103 may be one or more processors, each of which may include one or more processor cores.
  • a video adapter 107 which can drive a display 109, is coupled to the system bus 105.
  • the system bus 105 is coupled to an input/output (I/O) bus 113 through a bus bridge 111 .
  • I/O interface 115 is coupled to the I/O bus.
  • I/O interface 115 communicates with various I/O devices, such as input device 117 (eg, keyboard, mouse, touch screen, etc.) and media tray 121 (eg, CD-ROM, multimedia interface, etc.).
  • transceiver 123 , which can transmit and/or receive radio communication signals.
  • camera 155 , which can capture static scenes and dynamic digital video images.
  • external USB interface 125 .
  • the interface connected to the I/O interface 115 may be a USB interface.
  • the processor 103 may be any conventional processor, including a Reduced Instruction Set Computing (Reduced Instruction Set Computing, RISC) processor, a Complex Instruction Set Computing (Complex Instruction Set Computer, CISC) processor or a combination of the above.
  • the processor may be a special purpose device such as an application specific integrated circuit ASIC.
  • the processor 103 may be a neural network processor or a combination of a neural network processor and the above-mentioned conventional processors.
  • computer system 101 may be located remotely from the autonomous vehicle and may communicate wirelessly with the autonomous vehicle.
  • some of the processes described herein are performed on a processor disposed within the autonomous vehicle, others are performed by a remote processor, including taking actions required to perform a single maneuver.
  • Network interface 129 is a hardware network interface, such as a network card.
  • the network 127 may be an external network, such as the Internet, or an internal network, such as an Ethernet network or a virtual private network (Virtual Private Network, VPN).
  • the network 127 may also be a wireless network, such as a Wi-Fi network, a cellular network, and the like.
  • the hard disk drive interface is coupled to the system bus 105 .
  • the hard drive interface is connected to the hard drive.
  • System memory 135 is coupled to system bus 105 . Data running in system memory 135 may include operating system 137 and application programs 143 of computer 101 .
  • the operating system includes a shell 139 and a kernel 141 .
  • Shell 139 is an interface between the user and the kernel of the operating system.
  • Shell 139 is the outermost layer of the operating system.
  • Shell 139 manages the interaction between the user and the operating system: waiting for user input, interpreting user input to the operating system, and processing various operating system outputs.
  • Kernel 141 consists of those parts of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with hardware, the operating system kernel usually runs processes and provides inter-process communication, providing CPU time slice management, interrupts, memory management, IO management, and more.
  • Application 143 includes programs that control the autonomous driving of the car, for example, programs that manage the interaction of the autonomous car with obstacles on the road, programs that control the route or speed of the autonomous car, and programs that control the interaction of the autonomous car with other autonomous vehicles on the road.
  • Application 143 also exists on the system of software deploying server 149 .
  • computer system 101 may download application 143 from software deploying server 149 when application 147 needs to be executed.
  • the above-mentioned application program may further include an application program corresponding to the target object perception scheme provided by the embodiments of the present application. The target object perception scheme of the embodiments of the present application will be described in detail below, and for the sake of brevity, it will not be repeated here.
  • Sensors 153 are associated with computer system 101 .
  • the sensor 153 is used to detect the environment around the computer 101 .
  • the sensor 153 can detect objects such as animals, cars, and obstacles, and can further detect the surrounding environment of the above objects, for example: the environment around an animal, other animals around it, weather conditions, the brightness of the surrounding environment, etc.
  • the sensors may be lidars, cameras, infrared sensors, chemical detectors, microphones, and the like.
  • the point cloud data containing the object is first processed to obtain a point cloud cluster representing the object, the geometric center or center of gravity of the point cloud cluster is determined, and then the position and speed of the target are calculated based on the position and speed of that geometric center or center of gravity, so as to perceive the target.
  • however, if the geometric center or center of gravity of the point cloud cluster is occluded, the accuracy of the calculated position and speed of the target will be reduced.
  • the present application provides a method for perceiving a target object.
  • the first state of the target is, for example, the first position and/or the first speed.
  • the method of the embodiment of the present application is described below with reference to FIG. 3 . It should be understood that the method shown in FIG. 3 may be executed by the automatic driving system shown in FIG. 2 , or may also be executed by the control system 106 in the vehicle 100 . Optionally, the second state of the target may also be sent to the obstacle avoidance system 144 , which is used to plan the driving route of the vehicle 100 and the like.
  • FIG. 3 is a schematic flowchart of a method for sensing a target according to an embodiment of the present application.
  • the method shown in FIG. 3 includes steps 310 to 340 .
  • the point cloud cluster represents the target object, or in other words, the point cloud cluster is used to represent part or all of the contour or shape of the target object.
  • the above-mentioned multiple feature points may be contour points of a point cloud cluster, for example, multiple feature points are multiple endpoints of the point cloud cluster.
  • the above-mentioned multiple feature points may also include the geometric center, the center of gravity, etc. of the point cloud cluster, which is not limited in this embodiment of the present application.
  • the above-mentioned endpoints, also known as "interest points", usually refer to the intersections of two adjacent edges in a point cloud cluster.
  • the above-mentioned multiple endpoints can be acquired by using the existing endpoint detection technology.
  • the above-mentioned point cloud cluster can be obtained based on an existing point cloud cluster acquisition scheme.
  • the laser signal can be transmitted and received by the lidar sensor, and the time difference between transmission and reception can be used to determine the detection distance corresponding to an emission angle; the three-dimensional point cloud of the space environment can then be obtained through multi-layer scanning. The obtained point cloud is converted by the lidar driver into the format required by target perception, and the point cloud data is continuously sent to the controller.
  • the controller can cluster the point cloud data, and filter out the clusters that do not meet the target characteristics according to the number of clustering points and sizes of the clusters, and the rest are the point cloud clusters corresponding to the target.
  • the commonly used clustering methods include density-based spatial clustering of applications with noise (DBSCAN), K nearest neighbor (KNN) and so on.
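As one of the clustering methods named above, DBSCAN can be sketched in a few dozen lines of plain Python (the parameter values and toy points below are illustrative, not from the patent):

```python
import math

def dbscan(points, eps, min_pts):
    # Label each 2-D point with a cluster id; -1 marks noise.
    labels = [None] * len(points)

    def neighbors(i):
        return [j for j, q in enumerate(points)
                if math.dist(points[i], q) <= eps]

    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # too sparse: provisional noise
            continue
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:     # noise reached from a core point
                labels[j] = cluster # becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if len(jn) >= min_pts:  # j is itself a core point: expand
                seeds.extend(jn)
        cluster += 1
    return labels
```

Clusters whose point count or extent does not meet the target characteristics can then be filtered out, as described for the controller above.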
  • the controller can also use the L-Shape feature extraction algorithm or the trained neural network model to extract the orientation and shape of the point cloud cluster.
  • the orientation of the point cloud cluster is determined by the trajectory or the historical movement direction, and then the shape is calculated by traversing the points in the point cloud cluster according to the orientation.
  • FIG. 4 is a schematic diagram of a point cloud cluster corresponding to a target in an embodiment of the present application.
  • the outline of the point cloud cluster corresponding to the target 400 is a rectangle with a length of l and a width of w.
  • the rectangle includes four endpoints, namely endpoint 0, endpoint 1, endpoint 2, and endpoint 3. The coordinates of the geometric center of the rectangle are (x, y) and its coordinate azimuth is θ, where the coordinate azimuth is the angle between the orientation of the point cloud cluster and the x-axis of the coordinate system whose origin is the geometric center of the lidar. The coordinates of each of endpoints 0 to 3 can be calculated from the center coordinates (x, y), the azimuth θ, the length l, and the width w.
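The endpoint coordinates can be computed from the center (x, y), the azimuth θ, the length l, and the width w. The sketch below uses a counterclockwise corner ordering, which is an assumption and may differ from the patent's numbering of endpoints 0 to 3:

```python
import math

def rectangle_corners(x, y, theta, l, w):
    # Corner offsets in the box frame: half the length along the heading,
    # half the width perpendicular to it, rotated by the azimuth theta.
    c, s = math.cos(theta), math.sin(theta)
    corners = []
    for dl, dw in [(+1, +1), (+1, -1), (-1, -1), (-1, +1)]:
        dx = dl * l / 2 * c - dw * w / 2 * s
        dy = dl * l / 2 * s + dw * w / 2 * c
        corners.append((x + dx, y + dy))
    return corners
```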
  • the length and width of the above-mentioned point cloud clusters can be obtained from the statistics of multi-frame point cloud images.
  • the length and width of the above-mentioned point cloud clusters can also be determined based on the collection positions of the above-mentioned endpoints, which is not limited in this embodiment of the present application.
  • when the acquisition device collects the target, the inherent error of the acquisition device itself, and whether the position of a feature point in the point cloud cluster can be directly observed by the acquisition device during collection, both affect the accuracy of that feature point's position. Therefore, in this application, the uncertainty of an endpoint can be set based on whether the edges connected to the endpoint can be directly observed by the acquisition device.
  • the above-mentioned step 320 includes: determining the type of the edges connected to each of the multiple endpoints, wherein the type of an edge is either a visible edge directly observed by the collection device or an invisible edge that cannot be directly observed by the collection device; and determining the uncertainty of each of the multiple endpoints based on the types of the two edges connected to that endpoint.
  • the above-mentioned multiple endpoints can usually be divided into the following three types.
  • first type of endpoint: one of the two edges connected to an endpoint of this type is a visible edge, and the other is an invisible edge;
  • second type of endpoint: both edges connected to an endpoint of this type are visible edges, for example, endpoint 1 shown in Figure 4;
  • third type of endpoint: both edges connected to an endpoint of this type are invisible edges, for example, endpoint 3 shown in Figure 4.
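The three endpoint types above can be expressed as a small classification helper (a sketch; the numeric labels simply mirror the list above):

```python
def endpoint_type(edge_a_visible, edge_b_visible):
    # Classify an endpoint by the visibility of its two adjacent edges.
    if edge_a_visible and edge_b_visible:
        return 2   # second type: both edges visible
    if edge_a_visible or edge_b_visible:
        return 1   # first type: one visible, one invisible
    return 3       # third type: both edges invisible
```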
  • the methods for calculating uncertainty based on the three different types of endpoints are described below.
  • the first endpoint belongs to the first type of endpoint, the second endpoint belongs to the second type of endpoint, and the third endpoint belongs to the third type of endpoint.
  • when the type of the first edge connected to the first endpoint is a visible edge and the type of the second edge connected to the first endpoint is an invisible edge, the actual location of the first endpoint is usually located on the extension line of the visible edge (ie, the first edge).
  • the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the acquisition device in the orientation of the target.
  • that the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the acquisition device in the orientation of the target can be understood to mean that, in determining the uncertainty of the first endpoint, both the component of the acquisition device's detection uncertainty in the orientation of the target and the influence of other factors on the uncertainty of the first endpoint are considered. It can also be understood to mean that the component of the acquisition device's detection uncertainty in the orientation of the target is directly used as the uncertainty of the first endpoint; this is not limited in this embodiment of the present application.
  • the inherent uncertainty of the acquisition device may be projected onto the orientation of the target, and the uncertainty obtained after the projection may be used as the uncertainty d 1 of the first endpoint. The formula for d 1 involves R 1 , the measurement distance between the collection device and the first endpoint; θ 1 , the coordinate azimuth angle at which the collection device collects the first endpoint; the azimuth angle of the orientation of the target; and C 0 , a preset value in radians (rad) that is negatively correlated with the acquisition accuracy of the acquisition device.
  • the coordinate azimuth angle in the above can be understood as the angle between the connection line between the first end point of the target object and the acquisition device and the x-axis in the coordinate system.
  • the azimuth angle of the orientation of the target object can be understood as the horizontal angle, measured clockwise, between the orientation of the target object and the x-axis.
  • C 0 represents the uncertainty of laser scanning, which can be set according to the scanning resolution of the laser and is proportional to that resolution. For example, when the scanning resolution of the laser is 0.2°, C 0 can be set to 0.01.
  • the uncertainty d 1 of the first endpoint can be projected onto the x-axis and the y-axis to obtain the uncertainty D 1x of the first endpoint on the x-axis and the uncertainty D 1y of the first endpoint on the y-axis, where D x0 represents the initial uncertainty in the x-axis direction and D y0 represents the initial uncertainty in the y-axis direction.
  • D x0 and D y0 are related to the first uncertainty C 0 and/or the scanning accuracy of the acquisition device.
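The patent's exact formula for d 1 appears only in its figures, so the sketch below is an assumption consistent with the quantities named in the text (R 1 , θ 1 , the orientation azimuth, C 0 , D x0 , D y0 ): the angular error C 0 at range R 1 is taken to act perpendicular to the line of sight, projected onto the target orientation, and then projected onto the axes.

```python
import math

def first_endpoint_uncertainty(r1, theta1, theta_obj, c0,
                               dx0=0.0, dy0=0.0):
    # Angular error c0 (rad) at range r1 spans an arc of length r1 * c0,
    # oriented perpendicular to the line of sight (azimuth theta1). Its
    # component along the target orientation theta_obj is taken as d1
    # (this projection is an assumed form, not the patent's own formula).
    d1 = r1 * c0 * abs(math.sin(theta1 - theta_obj))
    # d1 lies along the target orientation; project it onto the axes and
    # add the preset initial uncertainties Dx0 and Dy0.
    d1x = dx0 + d1 * abs(math.cos(theta_obj))
    d1y = dy0 + d1 * abs(math.sin(theta_obj))
    return d1, d1x, d1y
```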
  • FIG. 5 is a schematic diagram of the positional relationship between the target object 400 and the acquisition device 500 in the coordinate system according to the embodiment of the present application.
  • the positive directions of the x-axis and the y-axis are as shown in the figure, with the counterclockwise direction taken as positive; the coordinate azimuth angle corresponding to the acquisition device 500 is θ 1 ′, and the inherent uncertainty of the acquisition device 500 is C 0 ′ (rad).
  • the measured distance between the acquisition device 500 and endpoint 0 is R 1 ′. Since the visible edge connected to endpoint 0 is parallel to the orientation of the target 400 , the azimuth of the orientation of the target 400 is equal to the angle between the visible edge and the x-axis.
  • the y-axis component d 1y ′ of the uncertainty d 1 ′ at endpoint 0 can be obtained accordingly, where D x0 ′ represents the initial uncertainty in the x-axis direction and D y0 ′ represents the initial uncertainty in the y-axis direction.
  • endpoint 2 in FIG. 5 also belongs to the above-mentioned first type of endpoint, and the above calculation method for the uncertainty of the first endpoint can be used for it; for brevity, this is not repeated here.
  • the factor affecting the uncertainty of the second endpoint is usually the measurement distance between the second endpoint and the acquisition device, where that measurement distance is positively correlated with the uncertainty of the second endpoint. Therefore, the uncertainty of the second endpoint can be determined based on the measured distance between the second endpoint and the acquisition device in the coordinate system.
  • the above C 1 may be set according to the horizontal opening angle at which the target object is observed, and is proportional to that horizontal opening angle. For example, if the horizontal opening angle at which the target is observed is 10°, C 1 can be set to 0.17.
  • the above-mentioned second uncertainty may also be the same as the first uncertainty, which is not limited in this embodiment of the present application.
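For a second-type endpoint, the text states only that the uncertainty grows with the measured distance; one minimal form consistent with that is linear in R 2 with the preset angular constant C 1 (an assumption, not the patent's formula):

```python
def second_endpoint_uncertainty(r2, c1):
    # Both adjacent edges are visible; model the uncertainty as the arc
    # length spanned by the preset angular uncertainty c1 (rad) at range r2,
    # so that it grows linearly with the measured distance.
    return r2 * c1
```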
  • FIG. 6 is a schematic diagram of the positional relationship between the target 400 and the acquisition device 500 in the coordinate system according to another embodiment of the present application.
  • the coordinate azimuth angle corresponding to the acquisition device 500 is θ 2 ′, and the inherent uncertainty of the acquisition device 500 is C 1 ′ (rad).
  • the uncertainty of the third endpoint can be set larger than the uncertainty of the first endpoint and the uncertainty of the second endpoint; for example, it can be set to infinity.
  • alternatively, the components of the uncertainty of the third endpoint in the x and y directions of the coordinate system can be set larger than the corresponding components of the uncertainty of the first endpoint and of the uncertainty of the second endpoint.
  • the point cloud cluster of the target object may represent only part of the shape or outline of the target object: the obtained endpoint may not be the actual endpoint of the target object, and the actual endpoint position of the target object may be occluded by other objects. In this case, in order to improve the accuracy of determining the uncertainty of the endpoint, the present application also provides a method for calculating the uncertainty of the endpoint, which is described below with reference to FIG. 7 . It should be understood that, in the case where the above-mentioned target object is blocked by other objects, the uncertainty can also be calculated directly according to the calculation methods for the first endpoint, the second endpoint, and the third endpoint introduced above, which is not limited in this application.
  • whether there are other objects occluding the space between the above-mentioned target object and the collection device can be determined by generating an environment map.
  • the surroundings are scanned by the acquisition device to obtain an environment map containing the target, and the environment map is segmented according to a preset angle (for example, the azimuth resolution of the acquisition device); the features of each segmented space include the corresponding azimuth angle, the measured distance of the object closest to the acquisition device at that azimuth angle, and the number of that object.
  • the measured distance of the nearest object corresponding to an azimuth angle and the number of that object can be obtained as follows: calculate the minimum circumscribed convex polygon of each object according to its point cloud cluster in the environment map, traverse the circumscribed convex polygons of all objects in the environment map, and obtain, for each azimuth angle in the environment map, the measured distance of the object closest to the acquisition device and the number of that closest object.
  • the reference point corresponding to each endpoint of the current object is determined based on the historical data of the object, the reference point corresponding to each endpoint is marked in the above environment map, and the measured distance of the object closest to the acquisition device at each azimuth in the environment map, together with the number of that closest object, is used to determine whether the reference point corresponding to an endpoint of the target object is blocked.
  • if the measurement distance from the acquisition device to the reference point corresponding to an endpoint is equal to the measured distance of the object closest to the acquisition device at the azimuth angle corresponding to that endpoint, then the space between the endpoint and the acquisition device is not blocked by other objects.
  • if the measurement distance from the acquisition device to the reference point corresponding to an endpoint is greater than the measured distance of the object closest to the acquisition device at the azimuth angle corresponding to that endpoint, then the space between the endpoint and the acquisition device is blocked by another object.
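The comparison between the reference-point distance and the per-azimuth nearest distance can be sketched with a simple azimuth-binned map (the bin width and tolerance below are illustrative choices, not values from the patent):

```python
import math

def build_nearest_map(obstacle_points, origin, bin_deg=1.0):
    # For each azimuth bin around the acquisition device, record the
    # distance of the nearest obstacle point (a stand-in for the convex
    # polygon traversal described in the text).
    nearest = {}
    ox, oy = origin
    for x, y in obstacle_points:
        az = math.degrees(math.atan2(y - oy, x - ox)) % 360.0
        b = int(az // bin_deg)
        d = math.hypot(x - ox, y - oy)
        if b not in nearest or d < nearest[b]:
            nearest[b] = d
    return nearest

def is_occluded(ref_point, origin, nearest, bin_deg=1.0, tol=1e-6):
    # A reference point is occluded when a nearer object lies in the same
    # azimuth bin, ie S_min < S.
    ox, oy = origin
    x, y = ref_point
    az = math.degrees(math.atan2(y - oy, x - ox)) % 360.0
    b = int(az // bin_deg)
    s = math.hypot(x - ox, y - oy)
    s_min = nearest.get(b, float("inf"))
    return s_min < s - tol
```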
  • the historical data of the target object may be characteristics of the target object obtained during the scanning process before the current point cloud cluster of the target object is obtained, for example, parameters such as the length and width of the target object, or the coordinates of each endpoint of the target object, etc.
  • FIG. 7 is a schematic diagram of an environment map according to an embodiment of the present application.
  • the collection device 500 scans the surroundings to acquire an environment map including the target 400 , and the environment map is segmented according to a preset angle (for example, the azimuth resolution of the collection device); the features of each segmented space include the azimuth angle corresponding to that space, the measured distance of the object closest to the collection device at that azimuth angle, and the number of that object.
  • the measured distance of the nearest object corresponding to an azimuth angle and the number of that object can be obtained as follows: calculate the minimum circumscribed convex polygons of the objects according to the point cloud clusters of the object 710 and the target object 400 in the environment map, traverse the minimum circumscribed convex polygons of all objects in the environment map (that is, of the object 710 and the target object 400 ), and obtain the measured distance of the object closest to the acquisition device at each azimuth angle in the environment map and the number of that closest object.
  • determine the position of reference point 1 corresponding to endpoint 0 of the current object based on the historical data of the target object, mark reference point 1 in the above environment map, determine the measured distance S between the acquisition device and reference point 1, and determine the measured distance S min of the object closest to the acquisition device in the segmented space corresponding to azimuth 1 (the azimuth corresponding to endpoint 0), together with the number of that closest object. Referring to FIG. 7 , S min < S, so the object 710 blocks the space between reference point 1 and the acquisition device.
  • the uncertainty of the endpoints can be calculated according to the types of endpoints above.
  • if the first endpoint is a first-type endpoint and the first reference point is occluded by another object, the uncertainty of the first endpoint is determined based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
  • the above-mentioned horizontal opening angle corresponding to the first end point can be understood as the horizontal opening angle used by the collecting device to collect the entire target object when the first end point is used as the end point of the target object.
  • the horizontal opening angle corresponding to the above-mentioned first reference point can be understood as the horizontal opening angle used by the collecting device to collect the entire target object when the first reference point is used as the end point of the target object.
  • the horizontal opening angle corresponding to endpoint 0 is β 0
  • the horizontal opening angle corresponding to reference point 1 is β 1
  • the uncertainty d 3 of the first endpoint can be projected onto the x-axis and the y-axis to obtain the uncertainty D 3x of the first endpoint on the x-axis and the uncertainty D 3y of the first endpoint on the y-axis, where D x0 represents the initial uncertainty in the x-axis direction and D y0 represents the initial uncertainty in the y-axis direction.
  • since the position of the first endpoint affects the position of the second endpoint, if the first reference point is blocked by another object, the determination of the position of the second endpoint is affected to a certain extent. Therefore, the uncertainty of the second endpoint is likewise positively correlated with the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
  • the uncertainty d 4 of the second endpoint is determined by the formula d 4 = D 0 + R 2 × (C 1 + Δβ), where R 2 represents the measurement distance between the acquisition device and the second endpoint; C 1 represents the preset uncertainty, in radians (rad); D 0 represents the preset initial uncertainty of the second endpoint; and Δβ represents the degree of change between the horizontal opening angle corresponding to the first reference point and that corresponding to the first endpoint.
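Reading the garbled formula as d 4 = D 0 + R 2 × (C 1 + Δβ), with Δβ the change in horizontal opening angle (a reconstruction from context, not a verbatim quotation of the patent), the computation is:

```python
def occluded_second_endpoint_uncertainty(r2, c1, d0, delta_beta):
    # d4 = D0 + R2 * (C1 + delta_beta); delta_beta is the change between
    # the horizontal opening angle of the first reference point and that
    # of the first endpoint (all angles in radians).
    return d0 + r2 * (c1 + delta_beta)
```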
  • the uncertainty d 4 of the second endpoint can be projected onto the x-axis and the y-axis to obtain the uncertainty D 4x of the second endpoint on the x-axis and the uncertainty D 4y of the second endpoint on the y-axis, where D x0 represents the initial uncertainty in the x-axis direction and D y0 represents the initial uncertainty in the y-axis direction.
  • based on the state of each feature point in the plurality of feature points, calculate the first state of the target object corresponding to each feature point, where the state of each feature point includes the position and/or velocity of that feature point, and the first state includes a first velocity and/or a first position of the target object.
  • the above-mentioned first state of the target can be understood as the position or velocity of the geometric center of the target.
  • the above step 340 includes: determining, based on the uncertainty corresponding to each feature point in the plurality of feature points, the confidence level corresponding to each feature point; and determining the second state of the target object based on the first state of the target object corresponding to each feature point and the confidence level corresponding to each feature point.
  • determining the confidence corresponding to each feature point based on the uncertainty corresponding to each feature point includes: calculating the confidence level M k of the k-th feature point based on its uncertainty d k and the change Δ k between its historical state and its first state.
  • the weights of d k and Δ k in calculating the confidence level M k are usually adjusted by setting the values of C 3 and C 4 .
  • the values of C 3 and C 4 can be set to 0.5.
  • the above confidence may be divided into a confidence in the x-axis direction and a confidence in the y-axis direction, that is, a confidence level M kx of the k-th feature point in the x-axis direction and a confidence level M ky of the k-th feature point in the y-axis direction, where d kx represents the uncertainty of the k-th feature point in the x-axis direction; Δ kx represents the change between the historical state and the first state of the k-th feature point in the x-axis direction; d ky represents the uncertainty of the k-th feature point in the y-axis direction; and Δ ky represents the change between the historical state and the first state of the k-th feature point in the y-axis direction.
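The text does not reproduce the formula for M k. One illustrative form in which the confidence falls as either the uncertainty d k or the state change Δ k grows, with C 3 and C 4 as the weights (purely an assumption, not the patent's formula):

```python
def confidence(d_k, delta_k, c3=0.5, c4=0.5):
    # Larger uncertainty d_k or larger state change delta_k gives a lower
    # confidence; c3 and c4 weight the two contributions.
    return 1.0 / (1.0 + c3 * d_k + c4 * delta_k)
```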
  • each feature point observed in the current round (also called an "observed feature point") can be associated with the historically calculated feature points (also called "tracked feature points") to update the state of each feature point of the target. Specifically, each observed feature point is associated with a tracked feature point according to its position, orientation, and other information, and the state of the tracked feature point is updated with the state of the observed feature point to obtain the updated state of each feature point.
  • the state of each feature point can be updated based on Kalman filtering, extended Kalman filtering, etc., or the state of each feature point can be updated with the maximum a posteriori probability calculated based on Bayesian reasoning, which is not limited in this embodiment of the present application.
  • alternatively, the state of each feature point may not be updated, and the state of the feature points observed in the current round may be directly determined as the state of the target.
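Step 340 can then be sketched as a confidence-weighted combination of the per-feature-point first states (the weighting rule below is an assumption; the patent states only that both the first states and the confidence levels enter the result):

```python
def fuse_states(first_states, confidences):
    # Weighted average of the per-feature-point estimates of the target's
    # 2-D position, weighted by each feature point's confidence level.
    total = sum(confidences)
    x = sum(c * s[0] for s, c in zip(first_states, confidences)) / total
    y = sum(c * s[1] for s, c in zip(first_states, confidences)) / total
    return x, y
```

Feature points with high uncertainty (hence low confidence) contribute little to the fused second state, which is the stated motivation for weighting by confidence.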
  • the method for sensing the target object of the embodiments of the present application is described above with reference to FIGS. 1 to 7 , and the apparatus of the embodiments of the present application is described below with reference to FIGS. 8 to 9 . It should be noted that the apparatuses shown in FIGS. 8 to 9 can implement each step in the above method; for brevity, this is not repeated here.
  • FIG. 8 is a schematic diagram of a device for sensing a target according to an embodiment of the present application.
  • the apparatus 800 shown in FIG. 8 includes: an obtaining unit 810 and a processing unit 820 .
  • the foregoing apparatus 800 may be the apparatus running the automatic driving system shown in FIG. 2
  • the foregoing apparatus 800 may also be the apparatus running the control system in the vehicle 100 shown in FIG. 1 , which is not specifically limited in this embodiment of the present application.
  • the above obtaining unit 810 is configured to obtain a plurality of feature points of a point cloud cluster, and the point cloud cluster represents a target object.
  • the above-mentioned processing unit 820 is configured to determine the uncertainty of each feature point in the plurality of feature points, wherein the uncertainty is used to indicate the error generated when the position of each feature point in the point cloud cluster is collected by the collection device.
  • the above-mentioned processing unit 820 is further configured to obtain, based on the state of each of the multiple feature points, the first state of the target corresponding to each feature point, where the state of each feature point includes the position and/or velocity of that feature point, and the first state includes a first velocity and/or a first position of the target.
  • the above processing unit 820 is further configured to determine the second state of the target based on the first state of the target corresponding to each feature point and the uncertainty corresponding to each feature point, where the second state includes a second speed and/or a second position of the target.
  • the plurality of feature points include a plurality of endpoints of the point cloud cluster.
  • the processing unit 820 is further configured to: determine the type of each edge connected to each of the multiple endpoints, where the edge types include visible edges directly collected by the collection device and invisible edges that cannot be directly collected by the collection device; and determine the uncertainty of each of the multiple endpoints based on the types of the two edges connected to that endpoint.
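The edge-type cases can be sketched as a small dispatch. This is a hedged illustration: the enum members and rule labels are hypothetical names, and the actual uncertainty formulas for each case are given later in the text.

```python
from enum import Enum

class EdgeType(Enum):
    VISIBLE = 1    # edge directly collected by the acquisition device
    INVISIBLE = 2  # edge that cannot be directly collected

def endpoint_rule(edge_a: EdgeType, edge_b: EdgeType) -> str:
    """Pick which uncertainty rule applies to an endpoint, from the types
    of its two connected edges (the first/second endpoint cases below)."""
    if edge_a is EdgeType.VISIBLE and edge_b is EdgeType.VISIBLE:
        return "second-endpoint rule"  # uncertainty grows with measured range
    if EdgeType.VISIBLE in (edge_a, edge_b):
        return "first-endpoint rule"   # component along the target's orientation
    return "both-invisible"            # case not detailed in this embodiment
```

The two labeled branches correspond to the "second endpoint" (both edges visible) and "first endpoint" (one visible, one invisible) cases described in the surrounding paragraphs.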
  • the plurality of endpoints include a first endpoint; if the type of the first edge connected to the first endpoint is a visible edge and the type of the second edge connected to the first endpoint is an invisible edge, the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the acquisition device in the orientation direction of the target.
  • the uncertainty d 1 of the first endpoint is determined by a formula in which R 1 represents the measured distance between the acquisition device and the first endpoint; C 0 is a preset value, negatively correlated with the acquisition accuracy of the acquisition device, in radians; θ 1 represents the coordinate azimuth at which the acquisition device collects the first endpoint; and the remaining quantity represents the azimuth of the orientation of the target.
  • the multiple endpoints include a second endpoint; if the types of the two edges connected to the second endpoint are both visible edges, the measured distance between the second endpoint and the collection device is positively correlated with the uncertainty of the second endpoint.
  • the processing unit 820 is further configured to: if a first reference point is not blocked by other objects, determine the uncertainty of each of the multiple endpoints based on the types of the two edges connected to that endpoint, where the first reference point is a point at a preset distance from the first endpoint in the direction of the extension line of the first edge, and the other objects are objects, other than the target and the acquisition device, in the image where the point cloud cluster is located.
  • the processing unit 820 is further configured to: if the first reference point is blocked by other objects, determine the uncertainty of the first endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
  • the processing unit 820 is further configured to: if the first reference point is blocked by other objects, determine the uncertainty d 3 of the first endpoint, based on the difference δ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, by a formula in which R 1 represents the measured distance between the acquisition device and the first endpoint; C 0 is a preset value, negatively correlated with the acquisition accuracy of the acquisition device, in radians; θ 1 represents the coordinate azimuth at which the acquisition device collects the first endpoint; and the remaining quantity represents the azimuth of the orientation of the target.
  • the processing unit 820 is further configured to: if the first reference point is blocked by other objects, determine the uncertainty of the second endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
  • the processing unit 820 is further configured to: determine the confidence level corresponding to each of the multiple feature points based on the uncertainty corresponding to each feature point; and determine the second state of the target based on the first state of the target corresponding to each of the multiple feature points and the confidence level corresponding to each of the multiple feature points.
  • in the confidence formula, k denotes the k-th feature point of the multiple feature points, k = 1...n, where n is the total number of feature points; d k represents the uncertainty of the k-th feature point; Δ k represents the change between the historical state of the k-th feature point and the first state; and C 3 and C 4 are preset values.
  • the processing unit 820 may be a processor 920, the obtaining unit 810 may be a communication interface 930, and the apparatus may further include a memory 910, as shown in FIG. 9.
  • FIG. 9 is a schematic block diagram of a computing device according to another embodiment of the present application.
  • the computing device 900 shown in FIG. 9 may include: a memory 910 , a processor 920 , and a communication interface 930 .
  • the memory 910, the processor 920, and the communication interface 930 are connected through an internal connection path; the memory 910 is used to store instructions, and the processor 920 is used to execute the instructions stored in the memory 910 to control the communication interface 930 to receive/send information or data.
  • the memory 910 may be coupled with the processor 920 through an interface, or may be integrated with the processor 920 .
  • the above-mentioned communication interface 930 uses a transceiver apparatus, such as but not limited to an input/output interface, to implement communication between the computing device 900 and other devices.
  • each step of the above-mentioned method may be completed by an integrated logic circuit of hardware in the processor 920 or an instruction in the form of software.
  • the methods disclosed in conjunction with the embodiments of the present application may be directly embodied as executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • the software modules may be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other storage media mature in the art.
  • the storage medium is located in the memory 910, and the processor 920 reads the information in the memory 910, and completes the steps of the above method in combination with its hardware. To avoid repetition, detailed description is omitted here.
  • the processor may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the memory may include a read-only memory and a random access memory, and provide instructions and data to the processor.
  • a portion of the processor may also include non-volatile random access memory.
  • the processor may also store device type information.
  • the size of the sequence numbers of the above-mentioned processes does not imply an order of execution; the execution order of each process should be determined by its functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
  • the disclosed system, apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the shown or discussed mutual coupling, direct coupling, or communication connection may be implemented through some interfaces; the indirect coupling or communication connection between apparatuses or units may be in electrical, mechanical, or other forms.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the functions, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium.
  • the technical solutions of the present application, in essence, or the part contributing to the prior art, or a part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.


Abstract

The present application provides a target object sensing method and an apparatus, so as to improve the accuracy of calculations of the position or speed of the target object. The present solution relates to the technical field of autonomous driving. The method comprises: obtaining a plurality of feature points of a point cloud cluster, the point cloud cluster representing the target object; determining the degree of uncertainty of each feature point among the plurality of feature points, the degree of uncertainty being used for indicating the error generated when the position of each feature point in the point cloud cluster is collected by means of a collecting device; on the basis of the state of each feature point among the plurality of feature points, obtaining a first state of the target object corresponding to each feature point among the plurality of feature points; on the basis of the first state of the target object corresponding to each feature point among the plurality of feature points and the corresponding degree of uncertainty of each feature point among the plurality of feature points, determining a second state of the target object, wherein the state comprises speed and/or position.

Description

Target object sensing method and apparatus
This application claims priority to Chinese Patent Application No. 202010755668.2, filed with the China Patent Office on July 31, 2020 and entitled "Target Object Sensing Method and Apparatus", which is incorporated herein by reference in its entirety.
Technical Field
The present application relates to the field of autonomous driving, and more particularly, to a method and an apparatus for sensing a target object.
Background
The set of point data describing the outer shape of a target obtained by an acquisition device is also called a point cloud. A commonly used type of point cloud is the laser point cloud: when a laser beam irradiates the surface of a target, the reflected laser light carries information such as the azimuth and distance of the target. If the laser beam is scanned along a certain trajectory, the reflected laser point information is recorded during scanning; because the scanning is relatively fine, a large number of laser points can be obtained, thereby forming a laser point cloud. At present, laser point clouds are often used in the field of autonomous or unmanned driving to perceive targets.
In existing target perception solutions, the point cloud data containing the target is first processed to obtain a point cloud cluster representing the target, the geometric center or center of gravity of the point cloud cluster is determined, and then the position and velocity of the target are calculated based on the position and velocity of that geometric center or center of gravity, so as to perceive the target.
However, in such a solution of calculating the position and velocity of the target based on the position and velocity of the geometric center or center of gravity of the point cloud cluster, if the geometric center or center of gravity is occluded, the accuracy of the calculated position and velocity of the target decreases.
Summary of the Invention
The present application provides a method and an apparatus for sensing a target, so as to improve the accuracy of calculating the position or speed of the target.
According to a first aspect, a method for sensing a target is provided, including: obtaining multiple feature points of a point cloud cluster, where the point cloud cluster represents the target; determining an uncertainty of each of the multiple feature points, where the uncertainty indicates the error generated when the position of each feature point in the point cloud cluster is collected by an acquisition device; obtaining, based on the state of each of the multiple feature points, a first state of the target corresponding to each feature point, where the state of each feature point includes the position and/or velocity of that feature point, and the first state includes a first velocity and/or a first position of the target; and determining a second state of the target based on the first state of the target corresponding to each feature point and the uncertainty corresponding to each feature point, where the second state includes a second velocity and/or a second position of the target.
In the embodiments of the present application, based on the state of each of the multiple feature points of the point cloud cluster, the first state of the target corresponding to each feature point is calculated, and the second state of the target is determined based on the first state corresponding to each feature point and the uncertainty corresponding to each feature point. This helps improve the accuracy of determining the second state of the target, and avoids the problem of the prior-art solution that determines the state of the target based only on the state of the geometric center or center of gravity of the point cloud cluster, in which the accuracy of the determined state drops when that geometric center or center of gravity is occluded.
In a possible implementation, the multiple feature points include multiple endpoints of the point cloud cluster, where an endpoint, also called a "point of interest", usually refers to the intersection of two adjacent edges of the point cloud cluster.
In the embodiments of the present application, using the multiple endpoints of the point cloud cluster as the multiple feature points helps simplify the process of determining the feature points.
In a possible implementation, determining the uncertainty of each of the multiple feature points includes: determining the type of each edge connected to each of the multiple endpoints, where the edge types include visible edges directly collected by the acquisition device and invisible edges that cannot be directly collected by the acquisition device; and determining the uncertainty of each of the multiple endpoints based on the types of the two edges connected to that endpoint.
In the embodiments of the present application, determining the uncertainty of each of the multiple endpoints based on the types of the two edges connected to that endpoint helps improve the accuracy of the determined endpoint uncertainty.
In a possible implementation, the multiple endpoints include a first endpoint; if the type of the first edge connected to the first endpoint is a visible edge and the type of the second edge connected to the first endpoint is an invisible edge, the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the acquisition device in the orientation direction of the target.
In the embodiments of the present application, determining the uncertainty of the first endpoint based on the component of the detection uncertainty of the acquisition device in the orientation direction of the target helps improve the accuracy of the determined uncertainty of the first endpoint.
In a possible implementation, the uncertainty d 1 of the first endpoint is determined by the formula
Figure PCTCN2021106261-appb-000001
where R 1 represents the measured distance between the acquisition device and the first endpoint; C 0 is a preset value, negatively correlated with the acquisition accuracy of the acquisition device, in radians; θ 1 represents the coordinate azimuth at which the acquisition device collects the first endpoint; and
Figure PCTCN2021106261-appb-000002
represents the azimuth of the orientation of the target.
In the embodiments of the present application, calculating the uncertainty of the first endpoint by the formula
Figure PCTCN2021106261-appb-000003
helps improve the accuracy of determining the uncertainty of the first endpoint.
In a possible implementation, the multiple endpoints include a second endpoint, and the types of the two edges connected to the second endpoint are both visible edges; in this case, the measured distance between the second endpoint and the acquisition device is positively correlated with the uncertainty of the second endpoint.
In the embodiments of the present application, determining the uncertainty of the second endpoint based on the measured distance between the second endpoint and the acquisition device helps improve the accuracy of the determined uncertainty of the second endpoint.
In a possible implementation, the uncertainty d 2 of the second endpoint is determined by the formula d 2 = R 2 × C 1 , where R 2 represents the measured distance between the acquisition device and the second endpoint, and C 1 represents a preset uncertainty, in radians.
In the embodiments of the present application, determining the uncertainty of the second endpoint by the formula d 2 = R 2 × C 1 helps improve the accuracy of determining the uncertainty of the second endpoint.
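The relation d 2 = R 2 × C 1 given above can be illustrated numerically; the range and angular-uncertainty values below are hypothetical, chosen only to show the scale of the result.

```python
import math

def endpoint_uncertainty_both_visible(r2_m, c1_rad):
    """d2 = R2 * C1: positional uncertainty of an endpoint whose two
    connected edges are both visible, computed as the arc length at
    range R2 subtended by the preset angular uncertainty C1."""
    return r2_m * c1_rad

# Hypothetical values: 20 m range, preset angular uncertainty of 0.2 degrees
d2 = endpoint_uncertainty_both_visible(20.0, math.radians(0.2))
```

Because d 2 is linear in R 2 , doubling the measured distance doubles the uncertainty, which matches the stated positive correlation between range and endpoint uncertainty.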
In a possible implementation, the uncertainty d 2x of the second endpoint on the x-axis of the coordinate system is determined by the formula d 2x = D x0 + R 2 cos(θ 2 ) × C 1 , where R 2 represents the distance between the acquisition device and the second endpoint; C 1 represents the detection uncertainty of the acquisition device, in radians (rad); θ 2 represents the coordinate azimuth at which the acquisition device collects the second endpoint; and D x0 represents a preset initial uncertainty of the second endpoint in the x-axis direction.
In a possible implementation, the uncertainty d 2y of the second endpoint on the y-axis of the coordinate system is determined by the formula d 2y = D y0 + R 2 sin(θ 2 ) × C 1 , where R 2 , C 1 , and θ 2 are as defined above, and D y0 represents a preset initial uncertainty of the second endpoint in the y-axis direction.
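The two per-axis formulas can be combined in one sketch. Note a hedge: the source text names the range and azimuth inconsistently between the x- and y-axis paragraphs, so this sketch assumes both formulas use the same R 2 and θ 2 of the second endpoint.

```python
import math

def endpoint_uncertainty_xy(r2_m, theta2_rad, c1_rad, d_x0=0.0, d_y0=0.0):
    """Per-axis uncertainty of the second endpoint:
    d2x = Dx0 + R2*cos(theta2)*C1 and d2y = Dy0 + R2*sin(theta2)*C1."""
    d2x = d_x0 + r2_m * math.cos(theta2_rad) * c1_rad
    d2y = d_y0 + r2_m * math.sin(theta2_rad) * c1_rad
    return d2x, d2y

# At azimuth 0 the whole angular spread falls on the x-axis, so only the
# preset initial uncertainty remains on the y-axis.
d2x, d2y = endpoint_uncertainty_xy(20.0, 0.0, 0.005, d_x0=0.01, d_y0=0.01)
```

The cos/sin factors simply project the range-scaled angular uncertainty R 2 × C 1 onto the two coordinate axes.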
In a possible implementation, determining the uncertainty of each of the multiple endpoints based on the types of the two edges connected to that endpoint includes: if a first reference point is not blocked by other objects, determining the uncertainty of each of the multiple endpoints based on the types of the two edges connected to that endpoint, where the first reference point is a point at a preset distance from the first endpoint in the direction of the extension line of the first edge, and the other objects are objects, other than the target and the acquisition device, in the image where the point cloud cluster is located.
In the embodiments of the present application, when the first reference point is not blocked by other objects, the uncertainty of each of the multiple endpoints can be determined based on the types of the two edges connected to that endpoint, which helps improve the accuracy of the determined endpoint uncertainty.
In a possible implementation, the method further includes: if the first reference point is blocked by the other objects, determining the uncertainty of the first endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
In the embodiments of the present application, if the first reference point is blocked by other objects, determining the uncertainty of the first endpoint based on the degree of change between the horizontal opening angles corresponding to the first reference point and the first endpoint helps improve the accuracy of the uncertainty of the first endpoint.
In a possible implementation, if the first reference point is blocked by the other objects, determining the uncertainty of the first endpoint based on the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint includes: based on the difference δ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, determining the uncertainty d 3 of the first endpoint by the formula
Figure PCTCN2021106261-appb-000004
where R 1 represents the measured distance between the acquisition device and the first endpoint; C 0 is a preset value, negatively correlated with the acquisition accuracy of the acquisition device, in radians; θ 1 represents the coordinate azimuth at which the acquisition device collects the first endpoint; and
Figure PCTCN2021106261-appb-000005
represents the azimuth of the orientation of the target.
In the embodiments of the present application, if the first reference point is blocked by other objects, determining the uncertainty of the first endpoint based on the degree of change between the horizontal opening angles corresponding to the first reference point and the first endpoint, together with the component of the detection uncertainty of the acquisition device in the orientation direction of the target, helps improve the accuracy of the uncertainty of the first endpoint.
In a possible implementation, the method further includes: if the first reference point is blocked by the other objects, determining the uncertainty of the second endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
In the embodiments of the present application, if the first reference point is blocked by other objects, determining the uncertainty of the second endpoint based on the degree of change between the horizontal opening angles corresponding to the first reference point and the first endpoint helps improve the accuracy of the uncertainty of the second endpoint.
In a possible implementation, if the first reference point is blocked by the other objects, determining the uncertainty of the second endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint includes: based on the difference δ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, determining the uncertainty d 4 of the second endpoint by the formula d 4 = R 2 × (C 1 + δ), where R 2 represents the measured distance between the acquisition device and the second endpoint, and C 1 represents a preset uncertainty, in radians.
In the embodiments of the present application, if the first reference point is blocked by other objects, determining the uncertainty of the second endpoint based on the degree of change between the horizontal opening angles corresponding to the first reference point and the first endpoint, together with the measured distance between the acquisition device and the second endpoint, helps improve the accuracy of the uncertainty of the second endpoint.
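The occlusion case d 4 = R 2 × (C 1 + δ) can be sketched directly from the formula; the numeric values below are hypothetical.

```python
def endpoint_uncertainty_occluded(r2_m, c1_rad, delta_rad):
    """d4 = R2 * (C1 + delta): the preset angular uncertainty C1 is widened
    by delta, the change in horizontal opening angle caused by occlusion."""
    return r2_m * (c1_rad + delta_rad)

# With delta = 0 this reduces to the unoccluded case d2 = R2 * C1
d4 = endpoint_uncertainty_occluded(20.0, 0.005, 0.002)
```

Widening the angular term by δ means an occluded endpoint is always assigned at least as much uncertainty as the unoccluded case at the same range.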
In a possible implementation, determining the second state of the target based on the first state of the target corresponding to each of the multiple feature points and the uncertainty corresponding to each feature point includes: determining the confidence level corresponding to each of the multiple feature points based on the uncertainty corresponding to each feature point; and determining the second state of the target based on the first state of the target corresponding to each feature point and the confidence level corresponding to each feature point.
在一种可能的实现方式中,所述基于所述多个特征点中所述每个特征点对应的不确定度,确定所述多个特征点中所述每个特征点对应的置信度,包括:基于所述多个特征点中所述每个特征点对应的不确定度,通过公式
Figure PCTCN2021106261-appb-000006
确定所述多个特征点中第k个特征点对应的置信度Mk,其中,k表示多个特征点中的第k个特征点,k=1……n,n为所述多个特征点的总数;dk表示所述第k个特征点的不确定度;Δk表示所述第k个特征点的历史状态与所述第一状态之间的变化;C3、C4为预设值。
In a possible implementation, determining the confidence level corresponding to each of the plurality of feature points based on the uncertainty corresponding to each of the plurality of feature points includes: based on the uncertainty corresponding to each of the plurality of feature points, using the formula
Figure PCTCN2021106261-appb-000006
determining the confidence level Mk corresponding to the k-th feature point among the plurality of feature points, where k denotes the k-th feature point among the plurality of feature points, k = 1…n, and n is the total number of the plurality of feature points; dk denotes the uncertainty of the k-th feature point; Δk denotes the change between the historical state of the k-th feature point and the first state; and C3 and C4 are preset values.
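The exact formula for Mk is given only as a figure (not reproduced in this text), so the sketch below substitutes an illustrative confidence function that merely captures the stated dependencies: confidence decreases as the uncertainty dk and the state change Δk grow, with preset constants C3 and C4. The fusion step then forms the second state as a confidence-weighted combination of the per-feature-point first states, as described in the implementation above. All names and the specific functional form are assumptions, not the patent's formula.

```python
def confidence(d_k: float, delta_k: float, c3: float = 1.0, c4: float = 1.0) -> float:
    # Illustrative stand-in for the patent's Mk formula (given only as a
    # figure): confidence drops as uncertainty d_k and state change delta_k grow.
    return 1.0 / (1.0 + c3 * d_k + c4 * delta_k)

def fuse_states(first_states, uncertainties, state_changes):
    # Second state of the target: confidence-weighted average of the first
    # states observed at the n feature points.
    weights = [confidence(d, dlt) for d, dlt in zip(uncertainties, state_changes)]
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, first_states)) / total
```

With equal confidences the fusion reduces to a plain average; a feature point with large uncertainty or a large jump from its historical state contributes correspondingly less to the fused second state.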
需要说明的是,上文中的坐标方位角可以理解为坐标系中目标物的端点与采集设备之间的连线与x轴之间的夹角。上文中目标物的朝向的方位角可以理解为是以目标物的朝向的方向为起点,依顺时针方向到x轴之间的水平夹角。It should be noted that the coordinate azimuth above can be understood as the angle between the x-axis and the line connecting the endpoint of the target and the acquisition device in the coordinate system. The azimuth of the target's heading above can be understood as the horizontal angle measured clockwise from the target's heading direction to the x-axis.
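The two angle definitions above can be illustrated with a minimal sketch; the function names and the wrapping of the heading azimuth into [0, 2π) are assumptions for illustration, not part of the patent.

```python
import math

def coordinate_azimuth(endpoint, device):
    # Angle between the x-axis and the line connecting the target's endpoint
    # to the acquisition device in the coordinate system.
    dx, dy = endpoint[0] - device[0], endpoint[1] - device[1]
    return math.atan2(dy, dx)

def heading_azimuth(heading_ccw_from_x: float) -> float:
    # Horizontal angle measured clockwise from the target's heading direction
    # to the x-axis; a heading of theta (counter-clockwise from x) yields theta.
    return heading_ccw_from_x % (2 * math.pi)
```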
第二方面,提供了一种目标物的感知装置,所述装置包括用于执行第一方面或第一方面中任一种可能实现方式中的各个单元。In a second aspect, an apparatus for sensing a target is provided, the apparatus including units configured to perform the method in the first aspect or any possible implementation of the first aspect.
第三方面,提供了一种目标物的感知装置,所述装置具有实现上述第一方面的方法设计中的装置的功能。这些功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。所述硬件或软件包括一个或多个与上述功能相对应的单元。In a third aspect, a device for sensing a target object is provided, and the device has the function of implementing the device in the method design of the above-mentioned first aspect. These functions can be implemented by hardware or by executing corresponding software by hardware. The hardware or software includes one or more units corresponding to the above functions.
第四方面,提供了一种计算设备,包括输入输出接口、处理器和存储器。该处理器用于控制输入输出接口收发信号或信息,该存储器用于存储计算机程序,该处理器用于从存储器中调用并运行该计算机程序,使得该计算设备执行上述第一方面中的方法。In a fourth aspect, a computing device is provided, including an input-output interface, a processor, and a memory. The processor is used to control the input and output interface to send and receive signals or information, the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that the computing device executes the method in the first aspect.
第五方面,提供了一种计算机程序产品,所述计算机程序产品包括:计算机程序代码,当所述计算机程序代码在计算机上运行时,使得计算机执行上述各方面中的方法。In a fifth aspect, a computer program product is provided, the computer program product comprising: computer program code, when the computer program code is run on a computer, causing the computer to perform the methods in the above aspects.
第六方面,提供了一种计算机可读介质,所述计算机可读介质存储有程序代码,当所述计算机程序代码在计算机上运行时,使得计算机执行上述各方面中的方法。In a sixth aspect, a computer-readable medium is provided, and the computer-readable medium stores program codes, which, when executed on a computer, cause the computer to execute the methods in the above-mentioned aspects.
第七方面,提供了一种芯片系统,该芯片系统包括处理器,用于计算设备实现上述方面中所涉及的功能,例如,生成,接收,发送,或处理上述方法中所涉及的数据和/或信息。在一种可能的设计中,所述芯片系统还包括存储器,所述存储器,用于保存计算设备必要的程序指令和数据。该芯片系统,可以由芯片构成,也可以包括芯片和其他分立器件。In a seventh aspect, a chip system is provided. The chip system includes a processor, configured to enable a computing device to implement the functions involved in the above aspects, for example, generating, receiving, sending, or processing the data and/or information involved in the above methods. In a possible design, the chip system further includes a memory, configured to store the program instructions and data necessary for the computing device. The chip system may consist of chips, or may include chips and other discrete devices.
第八方面,提供了一种车辆,包括输入输出接口、处理器和存储器。该处理器用于控制输入输出接口收发信号或信息,该存储器用于存储计算机程序,该处理器用于从存储器中调用并运行该计算机程序,使得该计算设备执行上述第一方面中的方法。In an eighth aspect, a vehicle is provided, including an input-output interface, a processor and a memory. The processor is used to control the input and output interface to send and receive signals or information, the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that the computing device executes the method in the first aspect.
可选地,上述车辆可以具有自动驾驶功能。Optionally, the above-mentioned vehicle may have an automatic driving function.
附图说明Description of drawings
图1是本申请实施例提供的车辆100的功能框图。FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
图2是本申请实施例的适用的自动驾驶系统的示意图。FIG. 2 is a schematic diagram of an applicable automatic driving system according to an embodiment of the present application.
图3是本申请实施例的目标物的感知方法的示意性流程图。FIG. 3 is a schematic flowchart of a method for sensing a target according to an embodiment of the present application.
图4是本申请实施例中目标物对应的点云簇的示意图。FIG. 4 is a schematic diagram of a point cloud cluster corresponding to a target in an embodiment of the present application.
图5是本申请实施例的坐标系中目标物400与采集设备500之间位置关系的示意图。FIG. 5 is a schematic diagram of the positional relationship between the target object 400 and the acquisition device 500 in the coordinate system according to the embodiment of the present application.
图6是本申请另一实施例的坐标系中目标物400与采集设备500之间位置关系的示意图。FIG. 6 is a schematic diagram of the positional relationship between a target 400 and a collection device 500 in a coordinate system according to another embodiment of the present application.
图7是本申请实施例的环境地图的示意图。FIG. 7 is a schematic diagram of an environment map according to an embodiment of the present application.
图8是本申请实施例的目标物的感知装置的示意图。FIG. 8 is a schematic diagram of a device for sensing a target according to an embodiment of the present application.
图9是本申请另一实施例的计算设备的示意性框图。FIG. 9 is a schematic block diagram of a computing device according to another embodiment of the present application.
具体实施方式detailed description
下面将结合附图,对本申请中的技术方案进行描述。为了便于理解,下文结合图1,以智能驾驶的场景为例,介绍本申请实施例适用的场景。The technical solutions in the present application will be described below with reference to the accompanying drawings. For ease of understanding, the following describes a scenario to which the embodiments of the present application are applicable by taking a scenario of intelligent driving as an example with reference to FIG. 1 .
图1是本申请实施例提供的车辆100的功能框图。在一个实施例中,将车辆100配置为完全或部分地自动驾驶模式。例如,车辆100可以在处于自动驾驶模式中的同时控制自身,并且可通过人为操作来确定车辆及其周边环境的当前状态,确定周边环境中的至少一个其他车辆的可能行为,并确定该其他车辆执行可能行为的可能性相对应的置信水平,基于所确定的信息来控制车辆100。在车辆100处于自动驾驶模式中时,可以将车辆100置为在没有和人交互的情况下操作。FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application. In one embodiment, the vehicle 100 is configured in a fully or partially autonomous driving mode. For example, the vehicle 100 can control itself while in the autonomous driving mode, determine the current state of the vehicle and its surrounding environment through human operation, determine the likely behavior of at least one other vehicle in the surrounding environment, determine a confidence level corresponding to the likelihood that the other vehicle performs the possible behavior, and control the vehicle 100 based on the determined information. When the vehicle 100 is in the autonomous driving mode, the vehicle 100 may be set to operate without human interaction.
车辆100可包括各种子系统,例如行进系统102、传感器系统104、控制系统106、一个或多个外围设备108以及电源110、计算机系统112和用户接口116。可选地,车辆100可包括更多或更少的子系统,并且每个子系统可包括多个元件。另外,车辆100的每个子系统和元件可以通过有线或者无线互连。 Vehicle 100 may include various subsystems, such as travel system 102 , sensor system 104 , control system 106 , one or more peripherals 108 and power supply 110 , computer system 112 , and user interface 116 . Alternatively, vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. Additionally, each of the subsystems and elements of the vehicle 100 may be interconnected by wire or wirelessly.
行进系统102可包括为车辆100提供动力运动的组件。在一个实施例中,行进系统102可包括引擎118、能量源119、传动装置120和车轮/轮胎121。引擎118可以是内燃引擎、电动机、空气压缩引擎或其他类型的引擎组合,例如汽油发动机和电动机组成的混动引擎,内燃引擎和空气压缩引擎组成的混动引擎。引擎118将能量源119转换成机械能量。The travel system 102 may include components that provide powered motion for the vehicle 100 . In one embodiment, travel system 102 may include engine 118 , energy source 119 , transmission 120 , and wheels/tires 121 . The engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a gasoline engine and electric motor hybrid engine, an internal combustion engine and an air compression engine hybrid engine. Engine 118 converts energy source 119 into mechanical energy.
能量源119的示例包括汽油、柴油、其他基于石油的燃料、丙烷、其他基于压缩气体的燃料、乙醇、太阳能电池板、电池和其他电力来源。能量源119也可以为车辆100的其他系统提供能量。Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity. The energy source 119 may also provide energy to other systems of the vehicle 100.
传动装置120可以将来自引擎118的机械动力传送到车轮121。传动装置120可包括变速箱、差速器和驱动轴。在一个实施例中,传动装置120还可以包括其他器件,比如离合器。其中,驱动轴可包括可耦合到一个或多个车轮121的一个或多个轴。Transmission 120 may transmit mechanical power from engine 118 to wheels 121 . Transmission 120 may include a gearbox, a differential, and a driveshaft. In one embodiment, transmission 120 may also include other devices, such as clutches. Among other things, the drive shaft may include one or more axles that may be coupled to one or more wheels 121 .
传感器系统104(又称“采集设备”)可包括感知关于车辆100周边的环境的信息的若干个传感器。例如,传感器系统104可包括定位系统122(定位系统可以是全球定位系统(global positioning system,GPS)系统,也可以是北斗系统或者其他定位系统)、惯性测量单元(inertial measurement unit,IMU)124、雷达126、激光测距仪128以及相机130。传感器系统104还可包括被监视车辆100的内部系统的传感器(例如,车内空气质量监测器、燃油量表、机油温度表等)。来自这些传感器中的一个或多个的传感器数据可用于检测对象及其相应特性(位置、形状、方向、速度等)。这种检测和识别是自主车辆100的安全操作的关键功能。The sensor system 104 (also known as "collection device") may include several sensors that sense information about the environment surrounding the vehicle 100 . For example, the sensor system 104 may include a positioning system 122 (the positioning system may be a global positioning system (GPS) system, a Beidou system or other positioning systems), an inertial measurement unit (IMU) 124, Radar 126 , laser rangefinder 128 and camera 130 . The sensor system 104 may also include sensors of the internal systems of the vehicle 100 being monitored (eg, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, orientation, velocity, etc.). This detection and identification is a critical function for the safe operation of the autonomous vehicle 100 .
定位系统122可用于估计车辆100的地理位置。IMU 124用于基于惯性加速度来感测车辆100的位置和朝向变化。在一个实施例中,IMU 124可以是加速度计和陀螺仪的组合。The positioning system 122 may be used to estimate the geographic location of the vehicle 100 . The IMU 124 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration. In one embodiment, IMU 124 may be a combination of an accelerometer and a gyroscope.
雷达126可利用无线电信号来感测车辆100的周边环境内的物体。在一些实施例中,除了感知目标物以外,雷达126还可用于感知目标物的速度、位置、前进方向中的一种或多种状态。Radar 126 may utilize radio signals to sense objects within the surrounding environment of vehicle 100 . In some embodiments, in addition to sensing the target, the radar 126 may also be used to sense one or more of the target's speed, position, and heading.
激光测距仪128可利用激光来感测车辆100所位于的环境中的物体。在一些实施例中,激光测距仪128可包括一个或多个激光源、激光扫描器以及一个或多个检测器,以及其他系统组件。The laser rangefinder 128 may utilize laser light to sense objects in the environment in which the vehicle 100 is located. In some embodiments, the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
相机130可用于捕捉车辆100的周边环境的多个图像。相机130可以是静态相机或视频相机。Camera 130 may be used to capture multiple images of the surrounding environment of vehicle 100 . Camera 130 may be a still camera or a video camera.
控制系统106为控制车辆100及其组件的操作。控制系统106可包括各种元件,其中包括转向系统132、油门134、制动单元136、计算机视觉系统140、路线控制系统142以及障碍规避系统144。The control system 106 controls the operation of the vehicle 100 and its components. Control system 106 may include various elements including steering system 132 , throttle 134 , braking unit 136 , computer vision system 140 , route control system 142 , and obstacle avoidance system 144 .
转向系统132可操作来调整车辆100的前进方向。例如在一个实施例中可以为方向盘系统。The steering system 132 is operable to adjust the heading of the vehicle 100 . For example, in one embodiment it may be a steering wheel system.
油门134用于控制引擎118的操作速度并进而控制车辆100的速度。The throttle 134 is used to control the operating speed of the engine 118 and thus the speed of the vehicle 100 .
制动单元136用于控制车辆100减速。制动单元136可使用摩擦力来减慢车轮121。在其他实施例中,制动单元136可将车轮121的动能转换为电流。制动单元136也可采取其他形式来减慢车轮121转速从而控制车辆100的速度。The braking unit 136 is used to control the deceleration of the vehicle 100 . The braking unit 136 may use friction to slow the wheels 121 . In other embodiments, the braking unit 136 may convert the kinetic energy of the wheels 121 into electrical current. The braking unit 136 may also take other forms to slow the wheels 121 to control the speed of the vehicle 100 .
计算机视觉系统140可以操作来处理和分析由相机130捕捉的图像以便识别车辆100周边环境中的物体和/或特征。所述物体和/或特征可包括交通信号、道路边界和障碍物。计算机视觉系统140可使用物体识别算法、运动中恢复结构(structure from motion,SFM)算法、视频跟踪和其他计算机视觉技术。在一些实施例中,计算机视觉系统140可以用于为环境绘制地图、跟踪物体、估计物体的速度等等。Computer vision system 140 may be operable to process and analyze images captured by camera 130 in order to identify objects and/or features in the environment surrounding vehicle 100 . The objects and/or features may include traffic signals, road boundaries and obstacles. Computer vision system 140 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, and the like.
路线控制系统142用于确定车辆100的行驶路线。在一些实施例中,路线控制系统142可结合来自传感器、GPS 122和一个或多个预定地图的数据以为车辆100确定行驶路线。The route control system 142 is used to determine the travel route of the vehicle 100 . In some embodiments, route control system 142 may combine data from sensors, GPS 122, and one or more predetermined maps to determine a driving route for vehicle 100.
障碍规避系统144用于识别、评估和避免或者以其他方式越过车辆100的环境中的潜在障碍物。The obstacle avoidance system 144 is used to identify, evaluate and avoid or otherwise traverse potential obstacles in the environment of the vehicle 100 .
当然,在一个实例中,控制系统106可以增加或替换地包括除了所示出和描述的那些以外的组件。或者也可以减少一部分上述示出的组件。Of course, in one example, the control system 106 may additionally or alternatively include components other than those shown and described. Alternatively, some of the components shown above may be reduced.
车辆100通过外围设备108与外部传感器、其他车辆、其他计算机系统或用户之间进行交互。外围设备108可包括无线通信系统146、车载电脑148、麦克风150和/或扬声器152。 Vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through peripheral devices 108 . Peripherals 108 may include a wireless communication system 146 , an onboard computer 148 , a microphone 150 and/or a speaker 152 .
在一些实施例中,外围设备108提供车辆100的用户与用户接口116交互手段。例如,车载电脑148可向车辆100的用户提供信息。用户接口116还可操作车载电脑148来接收用户的输入。车载电脑148可以通过触摸屏进行操作。在其他情况中,外围设备108可提供用于车辆100与位于车内的其它设备通信的手段。例如,麦克风150可从车辆100的用户接收音频(例如,语音命令或其他音频输入)。类似地,扬声器152可向车辆100的用户输出音频。In some embodiments, peripherals 108 provide a means for a user of vehicle 100 to interact with user interface 116 . For example, the onboard computer 148 may provide information to the user of the vehicle 100 . User interface 116 may also operate on-board computer 148 to receive user input. The onboard computer 148 can be operated via a touch screen. In other cases, peripheral devices 108 may provide a means for vehicle 100 to communicate with other devices located within the vehicle. For example, microphone 150 may receive audio (eg, voice commands or other audio input) from a user of vehicle 100 . Similarly, speakers 152 may output audio to a user of vehicle 100 .
无线通信系统146可以直接地或者经由通信网络来与一个或多个设备无线通信。例如,无线通信系统146可使用3G蜂窝通信,例如码分多址(code division multiple access,CDMA)、全球移动通信系统(Global System for Mobile Communications,GSM)/GPRS,或者第四代(fourth generation,4G)通信,例如LTE。或者第五代(5th-Generation,5G)通信。无线通信系统146可利用WiFi与无线局域网(wireless local area network,WLAN)通信。在一些实施例中,无线通信系统146可利用红外链路、蓝牙或紫蜂(ZigBee)与设备直接通信。其他无线协议,例如各种车辆通信系统,例如,无线通信系统146可包括一个或多个专用短程通信(dedicated short range communications,DSRC)设备,这些设备可包括车辆和/或路边台站之间的公共和/或私有数据通信。The wireless communication system 146 may wirelessly communicate with one or more devices, either directly or via a communication network. For example, the wireless communication system 146 may use 3G cellular communications such as code division multiple access (CDMA), Global System for Mobile Communications (GSM)/GPRS, fourth generation (4G) communications such as LTE, or fifth generation (5G) communications. The wireless communication system 146 may communicate with a wireless local area network (WLAN) using WiFi. In some embodiments, the wireless communication system 146 may communicate directly with devices using an infrared link, Bluetooth, or ZigBee. Other wireless protocols may also be used, such as various vehicle communication systems; for example, the wireless communication system 146 may include one or more dedicated short range communications (DSRC) devices, which may include public and/or private data communications between vehicles and/or roadside stations.
电源110可向车辆100的各种组件提供电力。在一个实施例中,电源110可以为可再充电锂离子或铅酸电池。这种电池的一个或多个电池组可被配置为电源为车辆100的各种组件提供电力。在一些实施例中,电源110和能量源119可一起实现,例如一些全电动车中那样。The power supply 110 may provide power to various components of the vehicle 100 . In one embodiment, the power source 110 may be a rechargeable lithium-ion or lead-acid battery. One or more battery packs of such a battery may be configured as a power source to provide power to various components of the vehicle 100 . In some embodiments, power source 110 and energy source 119 may be implemented together, such as in some all-electric vehicles.
车辆100的部分或所有功能受计算机系统112控制。计算机系统112可包括至少一个处理器113,处理器113执行存储在例如数据存储器114这样的非暂态计算机可读介质中的指令115。计算机系统112还可以是采用分布式方式控制车辆100的个体组件或子系统的多个计算设备。Some or all of the functions of the vehicle 100 are controlled by the computer system 112 . Computer system 112 may include at least one processor 113 that executes instructions 115 stored in a non-transitory computer readable medium such as data memory 114 . Computer system 112 may also be multiple computing devices that control individual components or subsystems of vehicle 100 in a distributed fashion.
处理器113可以是任何常规的处理器,诸如商业可获得的中央处理器(central processing unit,CPU)。替选地,该处理器可以是诸如专用集成电路(application specific integrated circuit,ASIC)或其它基于硬件的处理器的专用设备。尽管图1功能性地图示了处理器、存储器、和在相同块中的计算机110的其它元件,但是本领域的普通技术人员应该理解该处理器、计算机、或存储器实际上可以包括可以或者可以不存储在相同的物理外壳内的多个处理器、计算机、或存储器。例如,存储器可以是硬盘驱动器或位于不同于计算机110的外壳内的其它存储介质。因此,对处理器或计算机的引用将被理解为包括对可以或者可以不并行操作的处理器或计算机或存储器的集合的引用。不同于使用单一的处理器来执行此处所描述的步骤,诸如转向组件和减速组件的一些组件每个都可以具有其自己的处理器,所述处理器只执行与特定于组件的功能相关的计算。The processor 113 may be any conventional processor, such as a commercially available central processing unit (CPU). Alternatively, the processor may be a dedicated device such as an application-specific integrated circuit (ASIC) or another hardware-based processor. Although FIG. 1 functionally illustrates the processor, the memory, and other elements of the computer 110 in the same block, those of ordinary skill in the art will understand that the processor, computer, or memory may actually include multiple processors, computers, or memories that may or may not be stored within the same physical enclosure. For example, the memory may be a hard disk drive or another storage medium located in an enclosure different from that of the computer 110. Thus, a reference to a processor or computer is to be understood to include a reference to a collection of processors, computers, or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the deceleration component, may each have their own processor that only performs computations related to the component-specific function.
在此处所描述的各个方面中,处理器可以位于远离该车辆并且与该车辆进行无线通信。在其它方面中,此处所描述的过程中的一些在布置于车辆内的处理器上执行而其它则由远程处理器执行,包括采取执行单一操纵的必要步骤。In various aspects described herein, a processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are performed on a processor disposed within the vehicle while others are performed by a remote processor, including taking steps necessary to perform a single maneuver.
在一些实施例中,存储器114可包含指令115(例如,程序逻辑),指令115可被处理器113执行来执行车辆100的各种功能,包括以上描述的那些功能。存储器114也可包含额外的指令,包括向行进系统102、传感器系统104、控制系统106和外围设备108中的一个或多个发送数据、从其接收数据、与其交互和/或对其进行控制的指令。In some embodiments, the memory 114 may contain instructions 115 (eg, program logic) executable by the processor 113 to perform various functions of the vehicle 100 , including those described above. Memory 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of travel system 102 , sensor system 104 , control system 106 , and peripherals 108 . instruction.
除了指令115以外,存储器114还可存储数据,例如道路地图、路线信息,车辆的位置、方向、速度以及其它这样的车辆数据,以及其他信息。这种信息可在车辆100在自主、半自主和/或手动模式中操作期间被车辆100和计算机系统112使用。In addition to instructions 115, memory 114 may store data such as road maps, route information, vehicle location, direction, speed, and other such vehicle data, among other information. Such information may be used by the vehicle 100 and the computer system 112 during operation of the vehicle 100 in autonomous, semi-autonomous and/or manual modes.
在一些实施例中,上述处理器113还可以执行本申请实施例的车辆纵向运动参数的规划方案,以帮助车辆规划纵向运动参数,其中具体的纵向运动参数规划方法可以参照下文中图3的介绍,为了简洁,在此不再赘述。In some embodiments, the above processor 113 may also execute the planning scheme for the longitudinal motion parameters of the vehicle according to the embodiments of the present application, to help the vehicle plan its longitudinal motion parameters. For the specific longitudinal motion parameter planning method, reference may be made to the description of FIG. 3 below; for brevity, details are not repeated here.
用户接口116,用于向车辆100的用户提供信息或从其接收信息。可选地,用户接口116可包括在外围设备108的集合内的一个或多个输入/输出设备,例如无线通信系统146、车载电脑148、麦克风150和扬声器152。A user interface 116 for providing information to or receiving information from a user of the vehicle 100 . Optionally, user interface 116 may include one or more input/output devices within the set of peripheral devices 108 , such as wireless communication system 146 , onboard computer 148 , microphone 150 and speaker 152 .
计算机系统112可基于从各种子系统(例如,行进系统102、传感器系统104和控制系统106)以及从用户接口116接收的输入来控制车辆100的功能。例如,计算机系统112可利用来自控制系统106的输入以便控制转向单元132来避免由传感器系统104和障碍规避系统144检测到的障碍物。在一些实施例中,计算机系统112可操作来对车辆100及其子系统的许多方面提供控制。Computer system 112 may control functions of vehicle 100 based on input received from various subsystems (eg, travel system 102 , sensor system 104 , and control system 106 ) and from user interface 116 . For example, computer system 112 may utilize input from control system 106 in order to control steering unit 132 to avoid obstacles detected by sensor system 104 and obstacle avoidance system 144 . In some embodiments, computer system 112 is operable to provide control of various aspects of vehicle 100 and its subsystems.
可选地,上述这些组件中的一个或多个可与车辆100分开安装或关联。例如,存储器114可以部分或完全地与车辆100分开存在。上述组件可以按有线和/或无线方式来通信地耦合在一起。Alternatively, one or more of these components described above may be installed or associated with the vehicle 100 separately. For example, memory 114 may exist partially or completely separate from vehicle 100 . The above-described components may be communicatively coupled together in a wired and/or wireless manner.
可选地,上述组件只是一个示例,实际应用中,上述各个模块中的组件有可能根据实际需要增添或者删除,图1不应理解为对本发明实施例的限制。Optionally, the above component is just an example. In practical applications, components in each of the above modules may be added or deleted according to actual needs, and FIG. 1 should not be construed as a limitation on the embodiment of the present invention.
在道路行进的自动驾驶汽车,如上面的车辆100,可以识别其周围环境内的物体以确定对当前速度的调整。所述物体可以是其它车辆、交通控制设备、或者其它类型的物体。在一些示例中,可以独立地考虑每个识别的物体,并且基于物体的各自的特性,诸如它的当前速度、加速度、与车辆的间距等,可以用来确定自动驾驶汽车所要调整的速度。A self-driving car traveling on a road, such as vehicle 100 above, can recognize objects within its surroundings to determine adjustments to the current speed. The objects may be other vehicles, traffic control equipment, or other types of objects. In some examples, each identified object may be considered independently, and based on the object's respective characteristics, such as its current speed, acceleration, distance from the vehicle, etc., may be used to determine the speed at which the autonomous vehicle is to adjust.
可选地,自动驾驶车辆100或者与自动驾驶车辆100相关联的计算设备(如图1的计算机系统112、计算机视觉系统140、存储器114)可以基于所识别的物体的特性和周围环境的状态(例如,交通、雨、道路上的冰等等)来预测所述识别的物体的行为。可选地,每一个所识别的物体都依赖于彼此的行为,因此还可以将所识别的所有物体全部一起考虑来预测单个识别的物体的行为。车辆100能够基于预测的所述识别的物体的行为来调整它的速度。换句话说,自动驾驶汽车能够基于所预测的物体的行为来确定车辆将需要调整到稳定状态(例如,加速、减速、或者停止)。在这个过程中,也可以考虑其它因素来确定车辆100的速度,诸如,车辆100在行驶的道路中的横向位置、道路的曲率、静态和动态物体的接近度等等。Optionally, the autonomous vehicle 100 or a computing device associated with the autonomous vehicle 100 (such as the computer system 112, the computer vision system 140, or the memory 114 of FIG. 1) may predict the behavior of the identified object based on the characteristics of the identified object and the state of the surrounding environment (for example, traffic, rain, ice on the road, etc.). Optionally, the identified objects depend on each other's behavior, so all the identified objects may also be considered together to predict the behavior of a single identified object. The vehicle 100 can adjust its speed based on the predicted behavior of the identified object. In other words, the autonomous vehicle can determine, based on the predicted behavior of the object, what steady state the vehicle will need to adjust to (for example, accelerate, decelerate, or stop). In this process, other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 on the road being traveled, the curvature of the road, and the proximity of static and dynamic objects.
除了提供调整自动驾驶汽车的速度的指令之外,计算设备还可以提供修改车辆100的转向角的指令,以使得自动驾驶汽车遵循给定的轨迹和/或维持与自动驾驶汽车附近的物体(例如,道路上的相邻车道中的轿车)的安全横向和纵向距离。In addition to providing instructions to adjust the speed of the autonomous vehicle, the computing device may also provide instructions to modify the steering angle of the vehicle 100, so that the autonomous vehicle follows a given trajectory and/or maintains safe lateral and longitudinal distances from objects near the autonomous vehicle (for example, cars in adjacent lanes on the road).
上述车辆100可以为轿车、卡车、摩托车、公共汽车、船、飞机、直升飞机、割草机、娱乐车、游乐场车辆、施工设备、电车、高尔夫球车、火车、和手推车等,本发明实施例不做特别的限定。The above-mentioned vehicle 100 can be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, a playground vehicle, construction equipment, a tram, a golf cart, a train, a cart, etc. The embodiments of the invention are not particularly limited.
上文结合图1介绍了本申请实施例适用的场景,下文结合图2介绍执行本申请实施例的适用的自动驾驶系统。The applicable scene of the embodiment of the present application is described above with reference to FIG. 1 , and the applicable automatic driving system for executing the embodiment of the present application is described below with reference to FIG. 2 .
图2是本申请实施例的适用的自动驾驶系统的示意图,计算机系统101包括处理器103,处理器103和系统总线105耦合。处理器103可以是一个或者多个处理器,其中每个处理器都可以包括一个或多个处理器核。显示适配器(video adapter)107,显示适配器可以驱动显示器109,显示器109和系统总线105耦合。系统总线105通过总线桥111和输入/输出(input/output,I/O)总线113耦合。I/O接口115和I/O总线耦合。I/O接口115和多种I/O设备进行通信,比如输入设备117(如:键盘,鼠标,触摸屏等),多媒体盘(media tray)121,(例如,CD-ROM,多媒体接口等)。收发器123(可以发送和/或接受无线电通信信号),摄像头155(可以捕捉静态和动态数字视频图像)和外部USB接口125。其中,可选地,和I/O接口115相连接的接口可以是USB接口。FIG. 2 is a schematic diagram of an applicable automatic driving system according to an embodiment of the present application. The computer system 101 includes a processor 103, and the processor 103 is coupled to a system bus 105. The processor 103 may be one or more processors, each of which may include one or more processor cores. A video adapter 107, which can drive a display 109, is coupled to the system bus 105. The system bus 105 is coupled to an input/output (I/O) bus 113 through a bus bridge 111. An I/O interface 115 is coupled to the I/O bus. The I/O interface 115 communicates with various I/O devices, such as an input device 117 (e.g., keyboard, mouse, touch screen), a media tray 121 (e.g., CD-ROM, multimedia interface), a transceiver 123 (which can send and/or receive radio communication signals), a camera 155 (which can capture static and dynamic digital video images), and an external USB port 125. Optionally, the interface connected to the I/O interface 115 may be a USB interface.
其中,处理器103可以是任何传统处理器,包括精简指令集计算(Reduced Instruction Set Computing,RISC)处理器、复杂指令集计算(Complex Instruction Set Computer,CISC)处理器或上述的组合。可选地,处理器可以是诸如专用集成电路ASIC的专用装置。可选地,处理器103可以是神经网络处理器或者是神经网络处理器和上述传统处理器的组合。The processor 103 may be any conventional processor, including a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, or a combination of the above. Alternatively, the processor may be a dedicated device such as an application-specific integrated circuit (ASIC). Optionally, the processor 103 may be a neural network processor, or a combination of a neural network processor and the above conventional processors.
Optionally, in the various embodiments described herein, the computer system 101 may be located remotely from the autonomous vehicle and may communicate with the autonomous vehicle wirelessly. In other aspects, some of the processes described herein are executed on a processor disposed within the autonomous vehicle, while others are executed by a remote processor, including taking the actions required to perform a single maneuver.

The computer 101 may communicate with a software deployment server 149 through a network interface 129. The network interface 129 is a hardware network interface, for example, a network card. The network 127 may be an external network, such as the Internet, or an internal network, such as an Ethernet network or a virtual private network (Virtual Private Network, VPN). Optionally, the network 127 may also be a wireless network, such as a Wi-Fi network or a cellular network.

A hard disk drive interface is coupled to the system bus 105. The hard disk drive interface is connected to a hard disk drive. A system memory 135 is coupled to the system bus 105. Data running in the system memory 135 may include an operating system 137 and application programs 143 of the computer 101.

The operating system includes a shell 139 and a kernel 141. The shell 139 is an interface between the user and the kernel of the operating system. The shell 139 is the outermost layer of the operating system. The shell 139 manages the interaction between the user and the operating system: it waits for the user's input, interprets the user's input to the operating system, and handles the various outputs of the operating system.

The kernel 141 consists of those parts of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with the hardware, the operating system kernel usually runs processes, provides inter-process communication, and provides CPU time-slice management, interrupt handling, memory management, I/O management, and the like.

The application programs 143 include programs related to controlling the automated driving of a vehicle, for example, a program that manages the interaction between the autonomous vehicle and obstacles on the road, a program that controls the route or speed of the autonomous vehicle, and a program that controls the interaction between the autonomous vehicle and other autonomous vehicles on the road. The application programs 143 also exist on a system of a software deployment server (deploying server) 149. In one embodiment, the computer system 101 may download the application program 143 from the software deployment server 149 when the application program 147 needs to be executed.

In some embodiments, the above-mentioned application programs may further include an application program corresponding to the target object sensing solution provided by the embodiments of this application; the target object sensing solution of the embodiments of this application is described in detail below and is not repeated here for brevity.

A sensor 153 (also referred to as a "collection device") is associated with the computer system 101. The sensor 153 is configured to detect the environment around the computer 101. For example, the sensor 153 can detect a target object, such as an animal, a vehicle, or an obstacle, and can further detect the environment around the target object, such as the environment around an animal, other animals appearing around the animal, the weather conditions, and the brightness of the surrounding environment. Optionally, if the computer 101 is located on an autonomous vehicle, the sensor may be a lidar, a camera, an infrared sensor, a chemical detector, a microphone, or the like.
In a conventional target object sensing solution, point cloud data containing a target object is first processed to obtain a point cloud cluster representing the target object, the geometric center or the center of gravity of the point cloud cluster is determined, and then the position and velocity of the target object are calculated based on the position and velocity of the geometric center or the center of gravity of the point cloud cluster, so as to sense the target object. However, in such a solution, if the geometric center or the center of gravity of the point cloud cluster is occluded, the accuracy of the calculated position and velocity of the target object is reduced.

To avoid the foregoing problem, this application provides a target object sensing method: based on the states (for example, positions and/or velocities) of a plurality of feature points of a point cloud cluster, a first state (for example, a first position and/or a first velocity) of the target object corresponding to each of the plurality of feature points is calculated; then the first states of the target object corresponding to the feature points are fused based on the uncertainties of the plurality of feature points, to finally obtain a second state (for example, a second position and/or a second velocity) of the target object.
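The fusion step described above can be illustrated with a small sketch. The text at this point does not fix a particular fusion rule, so the example uses inverse-variance weighting, a common way to combine estimates whose uncertainties are known; `fuse_estimates` and its arguments are illustrative names, not from the source.

```python
def fuse_estimates(estimates, uncertainties):
    """Fuse per-feature-point estimates of one state component
    (e.g. the x position of the target) into a single value.

    estimates     : list of first-state values, one per feature point
    uncertainties : matching list of uncertainty values (> 0); a point
                    whose position is poorly observed gets a large value
    """
    # Inverse-variance style weighting: the smaller the uncertainty of
    # a feature point, the larger its contribution to the fused state.
    weights = [1.0 / (u * u) for u in uncertainties]
    total = sum(weights)
    return sum(w * e for w, e in zip(weights, estimates)) / total

# Three feature points propose an x position for the target; the third
# point is heavily occluded, so its large uncertainty suppresses it.
x_fused = fuse_estimates([10.0, 10.2, 14.0], [0.1, 0.1, 100.0])
```

With these numbers the fused position stays close to the two well-observed points (about 10.1) rather than being pulled toward the occluded one.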
The method of the embodiments of this application is described below with reference to FIG. 3. It should be understood that the method shown in FIG. 3 may be executed by the automated driving system shown in FIG. 2, or may alternatively be executed by the control system 106 in the vehicle 100. Optionally, the second state of the target object may further be sent to the obstacle avoidance system 144 to plan the driving route of the vehicle 100, and the like.

FIG. 3 is a schematic flowchart of a target object sensing method according to an embodiment of this application. The method shown in FIG. 3 includes step 310 to step 340.
310: Acquire a plurality of feature points of a point cloud cluster, where the point cloud cluster represents a target object; in other words, the point cloud cluster is used to represent part or all of the contour or shape of the target object.

Generally, to facilitate acquisition of the feature points, the plurality of feature points may be contour points of the point cloud cluster; for example, the plurality of feature points are a plurality of endpoints of the point cloud cluster. Certainly, the plurality of feature points may further include the geometric center, the center of gravity, and the like of the point cloud cluster, which is not limited in the embodiments of this application.

An endpoint, also referred to as a "point of interest", usually refers to the intersection of two adjacent edges of the point cloud cluster. At present, the plurality of endpoints may be acquired by using an existing endpoint detection technique.

Optionally, the point cloud cluster may be obtained by using an existing point cloud cluster acquisition solution. For example, when a lidar sensor is used as the collection device, the lidar sensor transmits and receives laser signals, the detection distance corresponding to an emission angle is determined from the time difference between transmission and reception, and a three-dimensional point cloud of the spatial environment is obtained through multi-layer scanning. The obtained point cloud is then converted, by the lidar driver, into the format required for target object sensing, and the point cloud data is continuously sent to a controller. Correspondingly, the controller may cluster the point cloud data and filter out, according to the number of points, the size, and other attributes of each cluster, the clusters that do not match the target characteristics; the remaining clusters are the point cloud clusters corresponding to target objects. Commonly used clustering methods include density-based spatial clustering of applications with noise (DBSCAN), the K-nearest-neighbor method (K near neighbor, KNN), and the like.
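The cluster-then-filter step can be sketched as follows. This is a minimal single-linkage grouping in plain Python rather than a full DBSCAN implementation; `eps` and `min_cluster_size` are illustrative parameters standing in for the distance threshold and the point-count filter mentioned above.

```python
from collections import deque

def cluster_points(points, eps=1.0, min_cluster_size=5):
    """Group 2-D points into clusters by connecting points closer than
    `eps`, then drop clusters smaller than `min_cluster_size` (a crude
    stand-in for filtering clusters that do not match the target)."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = deque([seed]), [seed]
        while queue:
            i = queue.popleft()
            for j in list(unvisited):
                dx = points[i][0] - points[j][0]
                dy = points[i][1] - points[j][1]
                if dx * dx + dy * dy <= eps * eps:
                    unvisited.remove(j)
                    queue.append(j)
                    cluster.append(j)
        if len(cluster) >= min_cluster_size:
            clusters.append([points[i] for i in cluster])
    return clusters

# Two well-separated groups plus one isolated noise point.
pts = [(0, 0), (0.5, 0), (1, 0), (0.5, 0.5), (0, 0.5),
       (10, 10), (10.5, 10), (10, 10.5), (10.5, 10.5), (11, 10),
       (50, 50)]
groups = cluster_points(pts, eps=1.0, min_cluster_size=3)
```

Here the isolated point at (50, 50) forms a cluster of size 1 and is filtered out, leaving the two dense groups.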
After acquiring the point cloud cluster of the target object, the controller may further extract the orientation and shape of the point cloud cluster by using an L-shape (L-Shape) feature extraction algorithm or a trained neural network model. Certainly, the orientation of the point cloud cluster may also be determined according to the historical trajectory or historical movement direction of the target object, and the shape may then be calculated by traversing the points in the point cloud cluster according to the orientation.
Generally, the point cloud cluster may represent the target object by fitting a bounding box, that is, by enclosing the target object in a fitted rectangle. FIG. 4 is a schematic diagram of a point cloud cluster corresponding to a target object in an embodiment of this application. As can be seen from FIG. 4, the contour of the point cloud cluster corresponding to the target object 400 is a rectangle of length l and width w. The rectangle includes four endpoints, namely endpoint 0, endpoint 1, endpoint 2, and endpoint 3; the coordinates of the geometric center of the rectangle are (x, y), and the coordinate azimuth is φ, where the coordinate azimuth φ is the angle between the orientation of the point cloud cluster and the x-axis of the coordinate system whose origin is the geometric center of the lidar. Each endpoint is then obtained by offsetting the geometric center by ±l/2 along the orientation and by ±w/2 perpendicular to it; that is, the coordinates of endpoints 0 to 3 are

(x ± (l/2)cos φ ∓ (w/2)sin φ, y ± (l/2)sin φ ± (w/2)cos φ),

with the sign combination of each endpoint determined by its position in FIG. 4.

It should be noted that the length and width of the point cloud cluster may be obtained from statistics over multiple frames of point cloud images; certainly, the length and width of the point cloud cluster may also be determined based on the collection positions of the endpoints, which is not limited in the embodiments of this application.
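The corner coordinates of such an oriented bounding box can be computed as in the sketch below. The mapping of sign combinations to endpoint numbers is an assumption (the original equation images are not reproduced here), so only the set of four corners, not their numbering, should be relied on.

```python
import math

def bounding_box_corners(x, y, l, w, phi):
    """Corners of a rectangle centered at (x, y), with length l along
    the heading phi (radians) and width w perpendicular to it."""
    # Unit vectors along the heading and perpendicular to it.
    ux, uy = math.cos(phi), math.sin(phi)
    vx, vy = -math.sin(phi), math.cos(phi)
    corners = []
    # Sign pairs (±l/2, ±w/2); the order endpoint 0..3 is assumed.
    for sl, sw in [(1, -1), (1, 1), (-1, 1), (-1, -1)]:
        corners.append((x + sl * l / 2 * ux + sw * w / 2 * vx,
                        y + sl * l / 2 * uy + sw * w / 2 * vy))
    return corners

# With heading 0 the corners reduce to (x ± l/2, y ± w/2).
c = bounding_box_corners(0.0, 0.0, 4.0, 2.0, 0.0)
```

For a nonzero heading the same offsets are simply rotated by phi, which is what the cos/sin terms in the formulas above express.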
320: Determine the collection position of each of the plurality of feature points and the uncertainty of each feature point, where the uncertainty is used to indicate the error produced when the collection device collects the position of the feature point in the point cloud cluster.

Generally, when the collection device collects the target object, the collection device itself has a certain inherent error; in addition, whether the position of a feature point in the point cloud cluster can be directly observed by the collection device also affects the uncertainty corresponding to that feature point. Therefore, in this application, the uncertainty of an endpoint may be set based on whether the edges connected to the endpoint can be directly observed by the collection device.

That is, step 320 includes: determining the type of the edges connected to each of the plurality of endpoints, where the edge types include visible edges, which are directly observed by the collection device, and invisible edges, which cannot be directly observed by the collection device; and determining the uncertainty of each of the plurality of endpoints based on the types of the two edges connected to that endpoint.

The plurality of endpoints can usually be divided into the following three types. For an endpoint of the first type, of the two edges connected to the endpoint, one is a visible edge and one is an invisible edge, for example, endpoint 0 and endpoint 2 shown in FIG. 4. For an endpoint of the second type, both edges connected to the endpoint are visible edges, for example, endpoint 1 shown in FIG. 4. For an endpoint of the third type, both edges connected to the endpoint are invisible edges, for example, endpoint 3 shown in FIG. 4. The methods for calculating the uncertainty are described below for the three types of endpoints separately. For ease of distinction, in the following, the first endpoint belongs to the first type, the second endpoint belongs to the second type, and the third endpoint belongs to the third type.

For the first endpoint, the type of the first edge connected to the first endpoint is a visible edge, the type of the second edge connected to the first endpoint is an invisible edge, and the actual position of the first endpoint usually lies on the extension line of the visible edge (that is, the first edge). Therefore, the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the collection device in the orientation of the target object.
That the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the collection device in the orientation of the target object can be understood as meaning that, in the process of determining the uncertainty of the first endpoint, the component of the detection uncertainty of the collection device in the orientation of the target object is considered together with the influence of other factors on the uncertainty of the first endpoint. It can also be understood as meaning that the component of the detection uncertainty of the collection device in the orientation of the target object is directly used as the uncertainty of the first endpoint; this is not limited in the embodiments of this application.

Optionally, the inherent uncertainty of the collection device may be projected onto the orientation of the target object, and the uncertainty obtained after the projection is used as the uncertainty of the first endpoint. That is, the uncertainty d1 of the first endpoint is determined by the formula

d1 = R1 × C0 × |sin(θ1 − φ)|,

where R1 denotes the measured distance between the collection device and the first endpoint; θ1 denotes the coordinate azimuth at which the collection device collects the first endpoint; φ denotes the azimuth of the orientation of the target object; and C0 is a preset value that is negatively correlated with the collection accuracy of the collection device, in radians (rad).

It should be noted that the coordinate azimuth above can be understood as the angle between the x-axis and the line connecting the first endpoint of the target object and the collection device in the coordinate system. The azimuth of the orientation of the target object can be understood as the horizontal angle measured clockwise from the direction of the target object's orientation to the x-axis.

Optionally, if the collection device is a lidar, C0 denotes the uncertainty of the laser scanning and may be set according to the scanning resolution of the laser, in proportion to that resolution. For example, when the scanning resolution of the laser is 0.2°, C0 may be set to 0.01.
Generally, since the position of the target object is subsequently expressed by its position in the coordinate system, to facilitate subsequent calculation, the uncertainty of the first endpoint may be projected onto the x-axis and the y-axis. That is, the uncertainty D1x of the first endpoint on the x-axis is

D1x = Dx0 + d1 cos(φ),

and the uncertainty D1y of the first endpoint on the y-axis is

D1y = Dy0 + d1 sin(|φ|),

where Dx0 denotes the initial uncertainty in the x-axis direction and Dy0 denotes the initial uncertainty in the y-axis direction.

It should be noted that Dx0 and Dy0 are related to the first uncertainty C0 and/or to the scanning accuracy of the collection device.
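A numerical sketch of the first-endpoint computation follows. Because the equation images of the original publication are not reproduced here, the formulas in this sketch (`d1 = R1*C0*|sin(theta1 - phi)|` and its axis components) are a reconstruction from the surrounding text, and the function name and default initial uncertainties are illustrative.

```python
import math

def first_endpoint_uncertainty(R1, theta1, phi, C0, Dx0=0.0, Dy0=0.0):
    """Uncertainty of a first-type endpoint (one visible edge, one
    invisible edge) and its x/y components.

    R1     : measured distance from the collection device to the endpoint
    theta1 : coordinate azimuth of the endpoint as seen by the device (rad)
    phi    : azimuth of the target's orientation (rad)
    C0     : preset angular uncertainty of the device (rad)
    Dx0/Dy0: initial uncertainties along the axes
    """
    # Angular uncertainty C0 at range R1, projected onto the heading.
    d1 = R1 * C0 * abs(math.sin(theta1 - phi))
    D1x = Dx0 + d1 * math.cos(phi)
    D1y = Dy0 + d1 * math.sin(abs(phi))
    return d1, D1x, D1y

# Endpoint 20 m away, line of sight perpendicular to the heading (phi = 0):
d1, D1x, D1y = first_endpoint_uncertainty(R1=20.0, theta1=math.pi / 2,
                                          phi=0.0, C0=0.01)
```

With the line of sight perpendicular to the heading, the full angular span R1·C0 lies along the visible edge, so d1 equals 20 × 0.01 = 0.2 and falls entirely on the x-axis component.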
The calculation of the uncertainty of the first endpoint is described below with reference to FIG. 5, taking endpoint 0 as the first endpoint. FIG. 5 is a schematic diagram of the positional relationship between the target object 400 and the collection device 500 in the coordinate system according to an embodiment of this application.

Assume that the coordinate system shown in FIG. 5 takes the geometric center of the sensor as the coordinate origin, the positive directions of the x-axis and the y-axis are as shown in the figure, and the counterclockwise direction is the positive direction. Then the azimuth of the direction from the collection device 500 toward endpoint 0 is θ1', and the inherent uncertainty of the collection device 500 is C0' (rad). The measured distance between the collection device 500 and endpoint 0 is R1'. Since the visible edge connected to endpoint 0 is parallel to the orientation of the target object 400, the azimuth of the orientation of the target object 400 is equal to the angle φ' between the visible edge and the x-axis, so that the angle between the orientation of the target object and the straight line along which the measured distance R1' between the collection device 500 and endpoint 0 is taken is θ1' − φ'. Projecting the inherent uncertainty C0' of the collection device 500 onto the orientation of the target object yields the uncertainty d1' of endpoint 0:

d1' = R1' × C0' × |sin(θ1' − φ')|.

The component d1x' of the uncertainty d1' of endpoint 0 on the x-axis is

d1x' = Dx0' + d1' cos(φ'),

and the component d1y' of the uncertainty d1' of endpoint 0 on the y-axis is

d1y' = Dy0' + d1' sin(|φ'|),

where Dx0' denotes the initial uncertainty in the x-axis direction and Dy0' denotes the initial uncertainty in the y-axis direction.

It should be noted that endpoint 2 in FIG. 5 also belongs to the first type of endpoint, and the calculation method for the uncertainty of the first endpoint described above can likewise be applied to it; for brevity, the details are not repeated here.
For the second endpoint, since both edges connected to the second endpoint are visible edges, the factor affecting the uncertainty of the second endpoint is usually the measured distance between the second endpoint and the collection device, where that distance is positively correlated with the uncertainty of the second endpoint. Therefore, the uncertainty of the second endpoint may be determined based on the measured distance between the second endpoint and the collection device in the coordinate system.

Optionally, the uncertainty d2 of the second endpoint is determined by the formula d2 = R2 × C1, where R2 denotes the measured distance between the collection device and the second endpoint, and C1 denotes a preset uncertainty, in radians.

Optionally, C1 may be set according to the horizontal opening angle subtended by the observed target object, in proportion to that angle. For example, when the horizontal opening angle of the observed target object is 10°, C1 may be set to 0.17. Certainly, the second uncertainty may also be the same as the first uncertainty, which is not limited in the embodiments of this application.
The calculation of the uncertainty of the second endpoint is described below with reference to FIG. 6, taking endpoint 1 as the second endpoint. FIG. 6 is a schematic diagram of the positional relationship between the target object 400 and the collection device 500 in the coordinate system according to another embodiment of this application.

Assume that the coordinate system shown in FIG. 6 takes the geometric center of the sensor as the coordinate origin, the positive directions of the x-axis and the y-axis are as shown in the figure, and the counterclockwise direction is the positive direction. Then the azimuth of the direction from the collection device 500 toward endpoint 1 is θ2', and the inherent uncertainty of the collection device 500 is C1' (rad). The measured distance between the collection device 500 and endpoint 1 is R2', so the uncertainty d2' of endpoint 1 is d2' = R2' × C1'.

Correspondingly, the component d2x' of the uncertainty d2' of endpoint 1 on the x-axis is d2x' = Dx0' + d2' cos(θ2'), and the component d2y' of the uncertainty d2' of endpoint 1 on the y-axis is d2y' = Dy0' + d2' sin(|θ2'|), where Dx0' denotes the initial uncertainty in the x-axis direction and Dy0' denotes the initial uncertainty in the y-axis direction.
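The second-endpoint formulas above translate directly into code; a minimal sketch follows (the function name and the default initial uncertainties are illustrative):

```python
import math

def second_endpoint_uncertainty(R2, theta2, C1, Dx0=0.0, Dy0=0.0):
    """Uncertainty of a second-type endpoint (both edges visible) and
    its x/y components, following d2 = R2 * C1 and the component
    formulas d2x = Dx0 + d2*cos(theta2), d2y = Dy0 + d2*sin(|theta2|)."""
    d2 = R2 * C1
    d2x = Dx0 + d2 * math.cos(theta2)
    d2y = Dy0 + d2 * math.sin(abs(theta2))
    return d2, d2x, d2y

# Endpoint 10 m away at an azimuth of 60 degrees, with C1 = 0.17 rad:
d2, d2x, d2y = second_endpoint_uncertainty(10.0, math.radians(60), 0.17)
```

Because d2 grows linearly with R2, a second-type endpoint observed at twice the range carries twice the uncertainty, matching the positive correlation stated above.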
For the third endpoint, since both edges connected to the third endpoint are invisible edges, the uncertainty of the third endpoint may be set to be greater than the uncertainty of the first endpoint and the uncertainty of the second endpoint; for example, it may be set to infinity. Correspondingly, the components of the uncertainty of the third endpoint in the x and y directions of the coordinate system may also be set to be greater than the components, in the x and y directions of the coordinate system, of the uncertainty of the first endpoint and of the uncertainty of the second endpoint.
In some cases, the point cloud cluster of the target object can only represent part of the shape or contour of the target object; a detected endpoint may not be an actual endpoint of the target object, the position of the actual endpoint being occluded by another object. In this case, to improve the accuracy of the determined endpoint uncertainty, this application further provides a method for calculating the uncertainty of an endpoint, described below with reference to FIG. 7. It should be understood that, in the case where the target object is occluded by other objects, the calculation may also be performed directly according to the methods for the first, second, and third endpoints described above; this is not limited in the embodiments of this application.

Whether another object occludes the space between the target object and the collection device can be judged by generating an environment map. The collection device scans the surroundings to obtain an environment map containing the target object, and the environment map is partitioned according to a preset angle (for example, the azimuth resolution of the collection device); the features of each partitioned sector include the azimuth corresponding to that sector, the measured distance of the object closest to the collection device at that azimuth, and the number of that object. The measured distance of the closest object at each azimuth and the object's number may be obtained as follows: compute the minimum enclosing convex polygon of each object from its point cloud cluster in the environment map, and traverse the enclosing convex polygons of all objects in the environment map to obtain, for each azimuth, the measured distance of the object closest to the collection device and the number of that closest object.

Then, a reference point corresponding to each endpoint of the current target object is determined based on the historical data of the target object, and the reference point corresponding to each endpoint is marked in the environment map. Combining this with the per-azimuth measured distance of the object closest to the collection device and that object's number, it is determined whether the reference point corresponding to each endpoint of the target object is occluded. When the measured distance from the collection device to the reference point corresponding to an endpoint is equal to the measured distance, at the azimuth corresponding to that endpoint, of the object closest to the collection device, the space between that endpoint and the collection device is not occluded by another object. When the measured distance from the collection device to the reference point corresponding to an endpoint is greater than the measured distance, at the azimuth corresponding to that endpoint, of the object closest to the collection device, the space between that endpoint and the collection device is occluded by another object.
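The azimuth-partitioned occlusion test can be sketched as below. This is a simplified point-based stand-in: the bin width, function names, and the device being at the origin are assumptions, and the patent describes traversing minimum enclosing convex polygons rather than raw object points.

```python
import math

def build_min_range_map(objects, bin_width_deg=1.0):
    """For each azimuth bin, record the smallest measured distance over
    all object points falling in that bin."""
    n_bins = int(360 / bin_width_deg)
    min_range = [math.inf] * n_bins
    for points in objects:
        for (px, py) in points:
            az = math.degrees(math.atan2(py, px)) % 360
            b = int(az / bin_width_deg) % n_bins
            min_range[b] = min(min_range[b], math.hypot(px, py))
    return min_range

def is_occluded(ref_point, min_range, bin_width_deg=1.0, tol=1e-6):
    """A reference point is occluded if something in its azimuth bin is
    strictly closer to the collection device (at the origin)."""
    px, py = ref_point
    az = math.degrees(math.atan2(py, px)) % 360
    b = int(az / bin_width_deg) % len(min_range)
    return math.hypot(px, py) > min_range[b] + tol

# An occluder at ~5 m on the same bearing as a reference point at ~10 m.
occluder = [(5.0, 0.0)]
rng = build_min_range_map([occluder])
blocked = is_occluded((10.0, 0.0), rng)  # something closer in this bin
clear = is_occluded((0.0, 10.0), rng)    # nothing in that bin
```

This mirrors the S vs. Smin comparison: the reference point is flagged as occluded exactly when its measured distance exceeds the closest distance recorded for its azimuth sector.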
It should be noted that the historical data of the target object may be features of the target object acquired during scans performed before the point cloud cluster of the target object is acquired, for example, parameters such as the length and width of the target object, or the coordinates of each endpoint of the target object.

For example, FIG. 7 is a schematic diagram of an environment map according to an embodiment of this application. In the environment map 700 shown in FIG. 7, the collection device 500 scans the surroundings according to preset azimuths to obtain an environment map containing the target object 400, and partitions the environment map according to a preset angle (for example, the azimuth resolution of the collection device); the features of each partitioned sector include the azimuth corresponding to that sector, the measured distance of the object closest to the collection device at that azimuth, and the number of that object. These may be obtained as follows: compute the minimum enclosing convex polygons from the point cloud clusters of the object 710 and the target object 400 in the environment map, and traverse the enclosing convex polygons of all objects in the environment map, namely the object 710 and the target object 400, to obtain, for each azimuth, the measured distance of the object closest to the collection device and the number of that closest object.

Then, the position of reference point 1 corresponding to endpoint 0 of the current target object is determined based on the historical data of the target object, reference point 1 is marked in the environment map, the measured distance S between the collection device and reference point 1 is determined, and, in the partitioned sector corresponding to azimuth 1 of endpoint 0, the measured distance Smin of the object closest to the collection device and the number of that closest object are determined. Referring to FIG. 7, since Smin < S, the space between reference point 1 and the collection device is occluded by the object 710.

After determining, in the foregoing manner, whether each endpoint of the target object is occluded by another object, the uncertainty of each endpoint can be calculated according to its endpoint type as described above.
对于第一类端点而言,若第一参考点被其他物体遮挡,则基于第一参考点对应的水平张角与第一端点对应的水平张角的变化程度,确定第一端点的不确定度。For the first type of endpoint, if the first reference point is occluded by other objects, based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, determine whether the first endpoint is not certainty.
The horizontal opening angle corresponding to the first endpoint can be understood as the horizontal opening angle used by the collection device to capture the entire target when the first endpoint is taken as the endpoint of the target.
The horizontal opening angle corresponding to the first reference point can be understood as the horizontal opening angle used by the collection device to capture the entire target when the first reference point is taken as the endpoint of the target.
Optionally, based on the difference δ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, the uncertainty d3 of the first endpoint is determined by the formula
Figure PCTCN2021106261-appb-000021
where R1 denotes the measured distance between the collection device and the first endpoint; C0 is a preset value, negatively correlated with the collection accuracy of the collection device, in radians; θ1 denotes the coordinate azimuth angle at which the collection device captures the first endpoint; and
Figure PCTCN2021106261-appb-000022
denotes the azimuth angle of the heading of the target.
Referring to FIG. 7, the horizontal opening angle corresponding to endpoint 0 is δ0 and the horizontal opening angle corresponding to reference point 1 is δ1, so the difference δ between the two is δ = |δ1| − |δ0|.
As described above, to facilitate subsequent calculation, the uncertainty d3 of the first endpoint can be projected onto the x-axis and the y-axis; that is, the uncertainty D3x of the first endpoint on the x-axis is
Figure PCTCN2021106261-appb-000023
and the uncertainty D3y of the first endpoint on the y-axis is
Figure PCTCN2021106261-appb-000024
where Dx0 denotes the initial uncertainty in the x-axis direction, and Dy0 denotes the initial uncertainty in the y-axis direction.
Since the position of the first endpoint affects the position of the second endpoint, occlusion of the first reference point by another object affects, to some extent, the determination of the position of the second endpoint. Therefore, the uncertainty of the second endpoint is positively correlated with the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
Optionally, if the first reference point is occluded by another object, the uncertainty d4 of the second endpoint is determined, based on the difference δ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, by the formula d4 = D0 + R2 × (C1 + δ), where R2 denotes the measured distance between the collection device and the second endpoint, C1 denotes a preset uncertainty in radians (rad), and D0 denotes a preset initial uncertainty of the second endpoint.
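The inline formulas can be collected into one helper. The occluded case d4 = D0 + R2 × (C1 + δ) is stated here; the unoccluded fallback d2 = R2 × C1 is the formula given elsewhere in the text for a doubly visible endpoint, and treating it as the default when no occlusion term applies is an assumption of this sketch.

```python
def second_endpoint_uncertainty(r2, c1, delta=0.0, d0=0.0, occluded=True):
    """Uncertainty of the second endpoint.

    occluded case:   d4 = D0 + R2 * (C1 + delta)
    unoccluded case: d2 = R2 * C1   (assumed fallback)
    C1 and delta are angles in radians; r2 is the measured distance.
    """
    if occluded:
        return d0 + r2 * (c1 + delta)
    return r2 * c1
```

For example, with R2 = 10 m, C1 = 0.01 rad, δ = 0.02 rad and D0 = 0.5 m, the occluded uncertainty is 0.5 + 10 × 0.03 = 0.8 m.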
As described above, to facilitate subsequent calculation, the uncertainty d4 of the second endpoint can be projected onto the x-axis and the y-axis; that is, the uncertainty D4x of the second endpoint on the x-axis is
Figure PCTCN2021106261-appb-000025
and the uncertainty D4y of the second endpoint on the y-axis is
Figure PCTCN2021106261-appb-000026
where Dx0 denotes the initial uncertainty in the x-axis direction, and Dy0 denotes the initial uncertainty in the y-axis direction.
330: Based on the state of each of the plurality of feature points, calculate the first state of the target corresponding to each of the plurality of feature points, where the state of each feature point includes the position and/or velocity of that feature point, and the first state includes a first velocity and/or a first position of the target.
The first state of the target can be understood as the position or velocity of the geometric center of the target.
For example, referring to FIG. 7, assume that the position of endpoint 0 is [x_endpoint0, y_endpoint0], the azimuth angle of the heading of the target is
Figure PCTCN2021106261-appb-000027
and the length and width of the target are l and w, respectively. The center position of the target corresponding to endpoint 0 is then:
Figure PCTCN2021106261-appb-000028
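The center formula itself is only available as an image; as a hedged geometric sketch, assuming endpoint 0 is a corner of the target's bounding rectangle, the center is reached by moving l/2 along the heading and w/2 perpendicular to it (the perpendicular sign depends on which corner endpoint 0 is, so it is a parameter here):

```python
import math

def center_from_corner(x_ep, y_ep, heading, length, width, sign=1.0):
    """Illustrative reconstruction: from a corner endpoint, move l/2
    along the heading and w/2 perpendicular to it to reach the
    geometric center. `sign` selects the perpendicular direction."""
    cx = x_ep + 0.5 * length * math.cos(heading) - sign * 0.5 * width * math.sin(heading)
    cy = y_ep + 0.5 * length * math.sin(heading) + sign * 0.5 * width * math.cos(heading)
    return cx, cy
```

For a target of length 4 and width 2 heading along the x-axis with its corner at the origin, this places the center at (2.0, 1.0), which matches the expected rectangle geometry.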
340: Based on the first state of the target corresponding to each of the plurality of feature points and the uncertainty corresponding to each of the plurality of feature points, determine a second state of the target, where the second state includes a second velocity and/or a second position of the target.
Optionally, step 340 includes: determining, based on the uncertainty corresponding to each of the plurality of feature points, a confidence corresponding to each of the plurality of feature points; and determining the second state of the target based on the first state of the target corresponding to each of the plurality of feature points and the confidence corresponding to each of the plurality of feature points.
Optionally, determining the confidence corresponding to each of the plurality of feature points based on the uncertainty corresponding to each of the plurality of feature points includes: based on the uncertainty corresponding to each of the plurality of feature points, determining, by the formula
Figure PCTCN2021106261-appb-000029
the confidence Mk corresponding to the k-th feature point of the plurality of feature points, where k denotes the k-th of the plurality of feature points, k = 1, …, n, and n is the total number of the feature points; dk denotes the uncertainty of the k-th feature point; Δk denotes the change between the historical state of the k-th feature point and the first state; and C3 and C4 are preset values.
It should be understood that the relative weights of dk and Δk in computing the confidence Mk are usually adjusted by setting the values of C3 and C4. For example, both C3 and C4 can be set to 0.5.
As described above, to facilitate subsequent calculation, the above confidence can be split into a confidence in the x-axis direction and a confidence in the y-axis direction. That is, the confidence Mkx of the k-th feature point in the x-axis direction is
Figure PCTCN2021106261-appb-000030
and the confidence Mky of the k-th feature point in the y-axis direction is
Figure PCTCN2021106261-appb-000031
where dkx denotes the uncertainty of the k-th feature point in the x-axis direction; Δkx denotes the change between the historical state and the first state of the k-th feature point in the x-axis direction; dky denotes the uncertainty of the k-th feature point in the y-axis direction; and Δky denotes the change between the historical state and the first state of the k-th feature point in the y-axis direction.
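The confidence formula is available only as an image, so its exact functional form cannot be reproduced here. Purely as an illustrative stand-in consistent with the surrounding description (confidence should decrease as the uncertainty dk and the state change Δk grow, with C3 and C4 weighting the two terms), one might write:

```python
def confidence(d_k, delta_k, c3=0.5, c4=0.5, eps=1e-6):
    """Stand-in for the image-only formula: confidence falls as the
    uncertainty d_k and the history-to-first-state change delta_k grow;
    c3/c4 weight the two terms. The functional form is an assumption,
    not the patent's formula; eps avoids division by zero."""
    return 1.0 / (c3 * d_k + c4 * delta_k + eps)
```

Whatever the exact form, the qualitative behavior is the point: a feature point with smaller uncertainty and a smaller jump from its historical state receives a larger confidence.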
Optionally, step 340 can be expressed by the following formulas. Assume that in the second state of the target, the position of the target is [X, Y] and the velocity of the target is [VX, VY]; then
Figure PCTCN2021106261-appb-000032
Figure PCTCN2021106261-appb-000033
where k denotes the k-th of the plurality of feature points, k = 1, …, n, and n is the total number of the feature points; Mkx denotes the confidence of the k-th feature point in the x-axis direction; Mky denotes the confidence of the k-th feature point in the y-axis direction; T denotes the sum of the confidences of the n feature points; x_center,k denotes the x-coordinate of the geometric center of the target corresponding to the k-th feature point; y_center,k denotes the y-coordinate of the geometric center of the target corresponding to the k-th feature point; v_x,k denotes the velocity component on the x-axis of the geometric center of the target corresponding to the k-th feature point; and v_y,k denotes the velocity component on the y-axis of the geometric center of the target corresponding to the k-th feature point.
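The fused-state formulas are image-only, but the surrounding definitions (T as the sum of confidences, per-axis confidences Mkx/Mky weighting per-point centers and velocities) point to a confidence-weighted average. A sketch under that assumption, normalizing each axis by its own confidence sum:

```python
def fuse_states(centers, velocities, conf_x, conf_y):
    """Confidence-weighted fusion of per-feature-point first states into
    the target's second state [X, Y], [VX, VY]. Per-axis normalizers
    are an assumption; the text defines a single confidence sum T."""
    tx, ty = sum(conf_x), sum(conf_y)
    X = sum(m * c[0] for m, c in zip(conf_x, centers)) / tx
    Y = sum(m * c[1] for m, c in zip(conf_y, centers)) / ty
    VX = sum(m * v[0] for m, v in zip(conf_x, velocities)) / tx
    VY = sum(m * v[1] for m, v in zip(conf_y, velocities)) / ty
    return (X, Y), (VX, VY)

pos, vel = fuse_states([(0.0, 0.0), (2.0, 2.0)],
                       [(1.0, 0.0), (3.0, 0.0)],
                       conf_x=[1.0, 1.0], conf_y=[1.0, 3.0])
```

In the example, the y-confidence of the second point is three times that of the first, so the fused Y is pulled toward that point's center while X remains the plain average.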
Generally, to improve the accuracy of the state of each feature point, the state of each feature point needs to be calculated over multiple rounds, and the results of the multiple rounds considered together. That is, after the uncertainty and the state of each feature point have been calculated (that is, after step 330), the feature points of the current round (also called "observed feature points") can be associated with the historically calculated feature points (also called "tracked feature points"), and the state of each feature point of the target updated. Specifically, based on information such as the position and heading of each feature point, each observed feature point is associated with the tracked feature points; after association, the state of each feature point is updated based on the state of the observed feature point and the state of the tracked feature point, yielding the updated state of each feature point.
It should be understood that there are many methods for associating the observed feature points with the tracked feature points. For example, data association based on nearest-neighbor matching can be used, associating points by computing the distance between each observed feature point and each tracked feature point. A method based on heading matching can also be used, associating points by comparing the heading of each observed feature point with the heading angle of each tracked feature point relative to the center of the target. This is not specifically limited in the embodiments of the present application.
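The nearest-neighbor option can be sketched as a greedy distance match. The gate threshold and the greedy matching order are assumptions of this sketch; the text does not fix an association algorithm.

```python
import math

def associate_nearest(observed, tracked, gate=2.0):
    """Greedy nearest-neighbour association between observed feature
    points and tracked feature points; pairs farther apart than `gate`
    are left unmatched. Returns (observed_index, tracked_index) pairs."""
    pairs, used = [], set()
    for i, (ox, oy) in enumerate(observed):
        best, best_d = None, gate
        for j, (tx, ty) in enumerate(tracked):
            if j in used:
                continue
            d = math.hypot(ox - tx, oy - ty)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            pairs.append((i, best))
    return pairs
```

A global assignment (for example, the Hungarian algorithm) would avoid the order-dependence of the greedy pass; the greedy version is kept here for brevity.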
It should also be understood that there are many methods for updating the state of each feature point. For example, the state of each feature point can be updated based on Kalman filtering or extended Kalman filtering, or by computing the maximum a posteriori probability based on Bayesian inference. This is not limited in the embodiments of the present application.
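As one of the named options, a single scalar Kalman update step can be sketched as follows; the predicted (tracked) variance `p_pred` and the measurement variance `r` are assumed inputs, for instance derived from the uncertainties computed above.

```python
def kalman_update(x_pred, p_pred, z, r):
    """One scalar Kalman update: blend the tracked (predicted) state
    with the new observation according to their variances."""
    k = p_pred / (p_pred + r)        # Kalman gain
    x = x_pred + k * (z - x_pred)    # updated state estimate
    p = (1.0 - k) * p_pred           # updated variance
    return x, p

# Equal trust in prediction (0.0, var 1.0) and observation (2.0, var 1.0):
x_new, p_new = kalman_update(0.0, 1.0, z=2.0, r=1.0)
```

With equal variances the gain is 0.5, so the updated state lands at 1.0 and the variance halves to 0.5; a feature point with a large measured uncertainty would receive a smaller gain and move less toward the observation.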
In the embodiments of the present application, associating the observed feature points with the tracked feature points to update the states of the feature points of the target helps improve the accuracy of the state of each feature point. Of course, the states of the feature points may also be left un-updated, with the states of the feature points observed in the current round directly determined as the state of the target.
The target sensing method of the embodiments of the present application has been described above with reference to FIG. 1 to FIG. 7; the apparatuses of the embodiments of the present application are described below with reference to FIG. 8 and FIG. 9. It should be noted that the apparatuses shown in FIG. 8 and FIG. 9 can implement each step of the above method; for brevity, details are not repeated here.
FIG. 8 is a schematic diagram of a target sensing apparatus according to an embodiment of the present application. The apparatus 800 shown in FIG. 8 includes an obtaining unit 810 and a processing unit 820. Optionally, the apparatus 800 may be the apparatus running the autonomous driving system in FIG. 1, or the apparatus running the control system shown in FIG. 2; this is not specifically limited in the embodiments of the present application.
The obtaining unit 810 is configured to obtain a plurality of feature points of a point cloud cluster, the point cloud cluster representing a target.
The processing unit 820 is configured to determine the uncertainty of each of the plurality of feature points, where the uncertainty indicates the error generated when the position of each feature point in the point cloud cluster is captured by a collection device.
The processing unit 820 is further configured to obtain, based on the state of each of the plurality of feature points, the first state of the target corresponding to each feature point, where the state of each feature point includes the position and/or velocity of that feature point, and the first state includes a first velocity and/or a first position of the target.
The processing unit 820 is further configured to determine, based on the first state of the target corresponding to each feature point and the uncertainty corresponding to each feature point, a second state of the target, where the second state includes a second velocity and/or a second position of the target.
Optionally, the plurality of feature points include a plurality of endpoints of the point cloud cluster.
Optionally, as an embodiment, the processing unit 820 is further configured to: determine the type of each edge connected to each of the plurality of endpoints, where the edge types include visible edges directly captured by the collection device and invisible edges that the collection device cannot directly capture; and determine the uncertainty of each of the plurality of endpoints based on the types of the two edges connected to that endpoint.
Optionally, as an embodiment, the plurality of endpoints include a first endpoint, the type of a first edge connected to the first endpoint is a visible edge, and the type of a second edge connected to the first endpoint is an invisible edge; the uncertainty of the first endpoint is then determined based on the component of the detection uncertainty of the collection device in the heading direction of the target.
Optionally, as an embodiment, the uncertainty d1 of the first endpoint is determined by the formula
Figure PCTCN2021106261-appb-000034
where R1 denotes the measured distance between the collection device and the first endpoint; C0 is a preset value, negatively correlated with the collection accuracy of the collection device, in radians; θ1 denotes the coordinate azimuth angle at which the collection device captures the first endpoint; and
Figure PCTCN2021106261-appb-000035
denotes the azimuth angle of the heading of the target.
Optionally, as an embodiment, the plurality of endpoints include a second endpoint, and the types of the two edges connected to the second endpoint are both visible edges; the measured distance between the second endpoint and the collection device is then positively correlated with the uncertainty of the second endpoint.
Optionally, as an embodiment, the uncertainty d2 of the second endpoint is determined by the formula d2 = R2 × C1, where R2 denotes the measured distance between the collection device and the second endpoint, and C1 denotes a preset uncertainty in radians.
Optionally, as an embodiment, the processing unit 820 is further configured to: if the first reference point is not occluded by another object, determine the uncertainty of each of the plurality of endpoints based on the types of the two edges connected to that endpoint, where the first reference point is a point at a preset distance from the first endpoint in the direction of the extension line of the first edge, and the other objects are objects, other than the target and the collection device, in the image in which the point cloud cluster is located.
Optionally, as an embodiment, the processing unit 820 is further configured to: if the first reference point is occluded by another object, determine the uncertainty of the first endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
Optionally, as an embodiment, the processing unit 820 is further configured to: if the first reference point is occluded by another object, determine, based on the difference δ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, the uncertainty d3 of the first endpoint by the formula
Figure PCTCN2021106261-appb-000036
where R1 denotes the measured distance between the collection device and the first endpoint; C0 is a preset value, negatively correlated with the collection accuracy of the collection device, in radians; θ1 denotes the coordinate azimuth angle at which the collection device captures the first endpoint; and
Figure PCTCN2021106261-appb-000037
denotes the azimuth angle of the heading of the target.
Optionally, as an embodiment, the processing unit 820 is further configured to: if the first reference point is occluded by another object, determine the uncertainty of the second endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
Optionally, as an embodiment, the processing unit 820 is further configured to: if the first reference point is occluded by another object, determine, based on the difference δ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, the uncertainty d4 of the second endpoint by the formula d4 = R2 × (C1 + δ), where R2 denotes the measured distance between the collection device and the second endpoint, and C1 denotes a preset uncertainty in radians.
Optionally, as an embodiment, the processing unit 820 is further configured to: determine, based on the uncertainty corresponding to each of the plurality of feature points, a confidence corresponding to each of the plurality of feature points; and determine the second state of the target based on the first state of the target corresponding to each of the plurality of feature points and the confidence corresponding to each of the plurality of feature points.
Optionally, as an embodiment, the processing unit 820 is further configured to: determine, based on the uncertainty corresponding to each of the plurality of feature points, by the formula
Figure PCTCN2021106261-appb-000038
the confidence Mk corresponding to the k-th feature point of the plurality of feature points, where k denotes the k-th of the plurality of feature points, k = 1, …, n, and n is the total number of the feature points; dk denotes the uncertainty of the k-th feature point; Δk denotes the change between the historical state of the k-th feature point and the first state; and C3 and C4 are preset values.
In an optional embodiment, the processing unit 820 may be a processor 920, the obtaining unit 810 may be a communication interface 930, and the communication device may further include a memory 910, as shown in FIG. 9.
FIG. 9 is a schematic block diagram of a computing device according to another embodiment of the present application. The computing device 900 shown in FIG. 9 may include a memory 910, a processor 920, and a communication interface 930. The memory 910, the processor 920, and the communication interface 930 are connected through an internal connection path; the memory 910 is configured to store instructions, and the processor 920 is configured to execute the instructions stored in the memory 910 to control the communication interface 930 to receive/send information or data. Optionally, the memory 910 may be coupled to the processor 920 through an interface, or may be integrated with the processor 920.
It should be noted that the communication interface 930 uses a transceiver apparatus such as, but not limited to, an input/output interface to implement communication between the computing device 900 and other devices.
In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 920 or by instructions in the form of software. The method disclosed with reference to the embodiments of the present application may be directly embodied as being executed by a hardware processor, or executed by a combination of hardware and software modules in the processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 910; the processor 920 reads the information in the memory 910 and completes the steps of the above method in combination with its hardware. To avoid repetition, details are not described here again.
It should be understood that in the embodiments of the present application, the processor may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
It should also be understood that in the embodiments of the present application, the memory may include a read-only memory and a random access memory, and provide instructions and data to the processor. A part of the processor may also include a non-volatile random access memory. For example, the processor may also store information about the device type.
It should be understood that the term "and/or" in this document describes only an association relationship between associated objects and indicates that three relationships may exist. For example, A and/or B may indicate the following three cases: only A exists, both A and B exist, and only B exists. In addition, the character "/" in this document generally indicates an "or" relationship between the associated objects.
It should be understood that in the various embodiments of the present application, the sequence numbers of the above processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation processes of the embodiments of the present application.
A person of ordinary skill in the art may be aware that the units and algorithm steps of the examples described with reference to the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed by hardware or software depends on the particular application and design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each particular application, but such implementation should not be considered beyond the scope of the present application.
A person skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the system, apparatus, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not described here again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative. For example, the division of the units is merely a logical function division; in actual implementation, there may be other division manners. For example, a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented through some interfaces; the indirect couplings or communication connections between apparatuses or units may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each of the units may exist alone physically, or two or more units may be integrated into one unit.
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。The functions, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application can be embodied in the form of a software product in essence, or the part that contributes to the prior art or the part of the technical solution, and the computer software product is stored in a storage medium, including Several instructions are used to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application. The aforementioned storage medium includes: U disk, mobile hard disk, read-only memory (ROM), random access memory (RAM), magnetic disk or optical disk and other media that can store program codes .
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。The above are only specific embodiments of the present application, but the protection scope of the present application is not limited to this. should be covered within the scope of protection of this application. Therefore, the protection scope of the present application should be subject to the protection scope of the claims.

Claims (32)

  1. A method for sensing a target object, comprising:
    obtaining a plurality of feature points of a point cloud cluster, wherein the point cloud cluster represents the target object;
    determining an uncertainty of each of the plurality of feature points, wherein the uncertainty indicates an error generated when a collection device collects a position of the feature point in the point cloud cluster;
    obtaining, based on a state of each of the plurality of feature points, a first state of the target object corresponding to the feature point, wherein the state of each feature point comprises a position and/or a velocity of the feature point, and the first state comprises a first velocity and/or a first position of the target object; and
    determining a second state of the target object based on the first state of the target object corresponding to each of the plurality of feature points and the uncertainty corresponding to each of the plurality of feature points, wherein the second state comprises a second velocity and/or a second position of the target object.
  2. The method according to claim 1, wherein the plurality of feature points comprise a plurality of endpoints of the point cloud cluster.
  3. The method according to claim 2, wherein the determining an uncertainty of each of the plurality of feature points comprises:
    determining a type of each edge connected to each of the plurality of endpoints, wherein the type of the edge comprises a visible edge directly collected by the collection device and an invisible edge that cannot be directly collected by the collection device; and
    determining the uncertainty of each of the plurality of endpoints based on the types of the two edges connected to the endpoint.
  4. The method according to claim 3, wherein the plurality of endpoints comprise a first endpoint, a type of a first edge connected to the first endpoint is the visible edge, and a type of a second edge connected to the first endpoint is the invisible edge; and the uncertainty of the first endpoint is determined based on a component, in an orientation direction of the target object, of a detection uncertainty of the collection device.
  5. The method according to claim 4, wherein the uncertainty d₁ of the first endpoint is determined by the formula
    [formula rendered as an image in the original: PCTCN2021106261-appb-100001]
    wherein R₁ represents a measurement distance between the collection device and the first endpoint; C₀ is a preset value, negatively correlated with a collection accuracy of the collection device, in radians; θ₁ represents a coordinate azimuth at which the collection device collects the first endpoint; and
    [symbol rendered as an image in the original: PCTCN2021106261-appb-100002]
    represents an azimuth of the orientation of the target object.
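Claim 5 names the quantities that enter the d₁ formula, but the formula itself is rendered as an image in the original. The sketch below is only one geometric reading consistent with those quantities, and is an assumption rather than the claimed formula: an angular uncertainty C₀ at range R₁ subtends an arc of length R₁ × C₀ tangential to the line of sight, and the helper projects that arc onto the target's orientation azimuth.

```python
import math

def first_endpoint_uncertainty(r1: float, c0: float,
                               theta1: float, phi: float) -> float:
    """Hypothetical reading of claim 5 (the actual formula is an image):
    the arc length r1 * c0 of the angular detection uncertainty, projected
    onto the target orientation azimuth phi. theta1 is the coordinate
    azimuth of the first endpoint as seen from the collection device."""
    # The arc is tangential (perpendicular) to the line of sight, so its
    # component along the orientation phi depends on (theta1 - phi).
    return r1 * c0 * abs(math.sin(theta1 - phi))

# Endpoint seen at azimuth 30 deg with the target also oriented at 30 deg:
# the tangential uncertainty has no component along the orientation.
print(first_endpoint_uncertainty(50.0, 0.003,
                                 math.radians(30), math.radians(30)))
```

Under this reading, d₁ vanishes when the line of sight is aligned with the target orientation and is largest when they are perpendicular; whether the patented formula uses this projection cannot be confirmed from the text.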
  6. The method according to claim 4 or 5, wherein the plurality of endpoints comprise a second endpoint, the types of the two edges connected to the second endpoint are both the visible edge, and a measurement distance between the second endpoint and the collection device is positively correlated with the uncertainty of the second endpoint.
  7. The method according to claim 6, wherein the uncertainty d₂ of the second endpoint is determined by the formula d₂ = R₂ × C₁, wherein R₂ represents the measurement distance between the collection device and the second endpoint, and C₁ represents a preset uncertainty in radians.
  8. The method according to any one of claims 4-7, wherein the determining the uncertainty of each of the plurality of endpoints based on the types of the two edges connected to the endpoint comprises:
    if a first reference point is not occluded by other objects, determining the uncertainty of each of the plurality of endpoints based on the types of the two edges connected to the endpoint, wherein the first reference point is a point at a preset distance from the first endpoint in an extension direction of the first edge, and the other objects are objects, in the image in which the point cloud cluster is located, other than the target object and the collection device.
  9. The method according to claim 8, further comprising:
    if the first reference point is occluded by the other objects, determining the uncertainty of the first endpoint based on a degree of change between a horizontal opening angle corresponding to the first reference point and a horizontal opening angle corresponding to the first endpoint.
  10. The method according to claim 9, wherein the determining the uncertainty of the first endpoint if the first reference point is occluded by the other objects comprises:
    if the first reference point is occluded by the other objects, determining, based on a difference δ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, the uncertainty d₃ of the first endpoint by the formula
    [formula rendered as an image in the original: PCTCN2021106261-appb-100003]
    wherein R₁ represents the measurement distance between the collection device and the first endpoint; C₀ is a preset value, negatively correlated with the collection accuracy of the collection device, in radians; θ₁ represents the coordinate azimuth at which the collection device collects the first endpoint; and
    [symbol rendered as an image in the original: PCTCN2021106261-appb-100004]
    represents the azimuth of the orientation of the target object.
  11. The method according to any one of claims 8-10, further comprising:
    if the first reference point is occluded by the other objects, determining the uncertainty of the second endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
  12. The method according to claim 11, wherein the determining the uncertainty of the second endpoint comprises:
    if the first reference point is occluded by the other objects, determining, based on the difference δ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, the uncertainty d₄ of the second endpoint by the formula d₄ = R₂ × (C₁ + δ), wherein R₂ represents the measurement distance between the collection device and the second endpoint, and C₁ represents the preset uncertainty in radians.
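Claims 7 and 12 state their formulas explicitly, so they can be sketched directly. The function names and the numeric example are illustrative, not part of the claims:

```python
import math

def endpoint_uncertainty_visible(r2: float, c1: float) -> float:
    """Claim 7: uncertainty of a second endpoint whose two connected edges
    are both visible edges; d2 = R2 * C1, where C1 is a preset angular
    uncertainty in radians and R2 the measured range to the endpoint."""
    return r2 * c1

def endpoint_uncertainty_occluded(r2: float, c1: float, delta: float) -> float:
    """Claim 12: when the first reference point is occluded, the preset
    angular uncertainty is enlarged by the change delta of the horizontal
    opening angle; d4 = R2 * (C1 + delta)."""
    return r2 * (c1 + delta)

# Illustrative values (not from the claims): a scanner with a 0.2 degree
# preset angular uncertainty observing an endpoint 30 m away.
c1 = math.radians(0.2)
print(endpoint_uncertainty_visible(30.0, c1))                      # ≈ 0.105 m
print(endpoint_uncertainty_occluded(30.0, c1, math.radians(0.1)))  # ≈ 0.157 m
```

Both expressions scale linearly with range, which matches the positive correlation between measurement distance and uncertainty stated in claim 6.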
  13. The method according to any one of claims 1-12, wherein the determining a second state of the target object comprises:
    determining, based on the uncertainty corresponding to each of the plurality of feature points, a confidence corresponding to the feature point; and
    determining the second state of the target object based on the first state of the target object corresponding to each of the plurality of feature points and the confidence corresponding to each of the plurality of feature points.
  14. The method according to claim 13, wherein the determining a confidence corresponding to each feature point comprises:
    determining, based on the uncertainty corresponding to each of the plurality of feature points, the confidence Mₖ corresponding to the k-th feature point by the formula
    [formula rendered as an image in the original: PCTCN2021106261-appb-100005]
    wherein k represents the k-th feature point among the plurality of feature points, k = 1…n, and n is the total number of the plurality of feature points; dₖ represents the uncertainty of the k-th feature point; Δₖ represents a change between a historical state of the k-th feature point and the first state; and C₃ and C₄ are preset values.
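The exact expression for Mₖ in claim 14 is rendered as an image in the original, so the decreasing form used below is a stand-in assumption; only the inputs (dₖ, Δₖ, presets C₃ and C₄) and the confidence-weighted combination of claim 13 come from the claims:

```python
from typing import Sequence

def confidence(d_k: float, delta_k: float, c3: float, c4: float) -> float:
    """Assumed form: confidence decreases as the uncertainty d_k and the
    state change delta_k grow. The patent's exact expression for M_k is
    an image (appb-100005); this reciprocal form is only a stand-in."""
    return 1.0 / (1.0 + c3 * d_k + c4 * delta_k)

def fuse_states(first_states: Sequence[float],
                confidences: Sequence[float]) -> float:
    """Claim 13: determine the second state from the per-feature-point
    first states, here combined as a confidence-weighted average."""
    total = sum(confidences)
    return sum(s * m for s, m in zip(first_states, confidences)) / total

# Two feature points report target speeds of 9.0 and 11.0 m/s; the less
# uncertain point dominates the fused (second) speed.
ms = [confidence(0.1, 0.0, 1.0, 1.0), confidence(0.5, 0.0, 1.0, 1.0)]
print(fuse_states([9.0, 11.0], ms))
```

Whatever the patented formula is, the qualitative behavior shown here follows from the claims: feature points with lower uncertainty contribute more to the second state.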
  15. An apparatus for sensing a target object, comprising:
    an obtaining unit, configured to obtain a plurality of feature points of a point cloud cluster, wherein the point cloud cluster represents the target object; and
    a processing unit, configured to determine an uncertainty of each of the plurality of feature points, wherein the uncertainty indicates an error generated when a collection device collects a position of the feature point in the point cloud cluster;
    wherein the processing unit is further configured to obtain, based on a state of each of the plurality of feature points, a first state of the target object corresponding to the feature point, the state of each feature point comprises a position and/or a velocity of the feature point, and the first state comprises a first velocity and/or a first position of the target object; and
    the processing unit is further configured to determine a second state of the target object based on the first state of the target object corresponding to each of the plurality of feature points and the uncertainty corresponding to each of the plurality of feature points, wherein the second state comprises a second velocity and/or a second position of the target object.
  16. The apparatus according to claim 15, wherein the plurality of feature points are a plurality of endpoints of the point cloud cluster.
  17. The apparatus according to claim 16, wherein the processing unit is further configured to:
    determine a type of each edge connected to each of the plurality of endpoints, wherein the type of the edge comprises a visible edge directly collected by the collection device and an invisible edge that cannot be directly collected by the collection device; and
    determine the uncertainty of each of the plurality of endpoints based on the types of the two edges connected to the endpoint.
  18. The apparatus according to claim 17, wherein the plurality of endpoints comprise a first endpoint, a type of a first edge connected to the first endpoint is the visible edge, and a type of a second edge connected to the first endpoint is the invisible edge; and the uncertainty of the first endpoint is determined based on a component, in an orientation direction of the target object, of a detection uncertainty of the collection device.
  19. The apparatus according to claim 18, wherein the uncertainty d₁ of the first endpoint is determined by the formula
    [formula rendered as an image in the original: PCTCN2021106261-appb-100006]
    wherein R₁ represents a measurement distance between the collection device and the first endpoint; C₀ is a preset value, negatively correlated with a collection accuracy of the collection device, in radians; θ₁ represents a coordinate azimuth at which the collection device collects the first endpoint; and
    [symbol rendered as an image in the original: PCTCN2021106261-appb-100007]
    represents an azimuth of the orientation of the target object.
  20. The apparatus according to claim 18 or 19, wherein the plurality of endpoints comprise a second endpoint, the types of the two edges connected to the second endpoint are both the visible edge, and a measurement distance between the second endpoint and the collection device is positively correlated with the uncertainty of the second endpoint.
  21. The apparatus according to claim 20, wherein the uncertainty d₂ of the second endpoint is determined by the formula d₂ = R₂ × C₁, wherein R₂ represents the measurement distance between the collection device and the second endpoint, and C₁ represents a preset uncertainty in radians.
  22. The apparatus according to any one of claims 17-21, wherein the processing unit is further configured to:
    if a first reference point is not occluded by other objects, determine the uncertainty of each of the plurality of endpoints based on the types of the two edges connected to the endpoint, wherein the first reference point is a point at a preset distance from the first endpoint in an extension direction of the first edge, and the other objects are objects, in the image in which the point cloud cluster is located, other than the target object and the collection device.
  23. The apparatus according to claim 22, wherein the processing unit is further configured to:
    if the first reference point is occluded by the other objects, determine the uncertainty of the first endpoint based on a degree of change between a horizontal opening angle corresponding to the first reference point and a horizontal opening angle corresponding to the first endpoint.
  24. The apparatus according to claim 23, wherein the processing unit is further configured to:
    if the first reference point is occluded by the other objects, determine, based on a difference δ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, the uncertainty d₃ of the first endpoint by the formula
    [formula rendered as an image in the original: PCTCN2021106261-appb-100008]
    wherein R₁ represents the measurement distance between the collection device and the first endpoint; C₀ is a preset value, negatively correlated with the collection accuracy of the collection device, in radians; θ₁ represents the coordinate azimuth at which the collection device collects the first endpoint; and
    [symbol rendered as an image in the original: PCTCN2021106261-appb-100009]
    represents the azimuth of the orientation of the target object.
  25. The apparatus according to claim 24, wherein the processing unit is further configured to:
    if the first reference point is occluded by the other objects, determine the uncertainty of the second endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
  26. The apparatus according to claim 25, wherein the processing unit is further configured to:
    if the first reference point is occluded by the other objects, determine, based on the difference δ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, the uncertainty d₄ of the second endpoint by the formula d₄ = R₂ × (C₁ + δ), wherein R₂ represents the measurement distance between the collection device and the second endpoint, and C₁ represents the preset uncertainty in radians.
  27. The apparatus according to any one of claims 15-26, wherein the processing unit is further configured to:
    determine, based on the uncertainty corresponding to each of the plurality of feature points, a confidence corresponding to the feature point; and
    determine the second state of the target object based on the first state of the target object corresponding to each of the plurality of feature points and the confidence corresponding to each of the plurality of feature points.
  28. The apparatus according to claim 27, wherein the processing unit is further configured to:
    determine, based on the uncertainty corresponding to each of the plurality of feature points, the confidence Mₖ corresponding to the k-th feature point by the formula
    [formula rendered as an image in the original: PCTCN2021106261-appb-100010]
    wherein k represents the k-th feature point among the plurality of feature points, k = 1…n, and n is the total number of the plurality of feature points; dₖ represents the uncertainty of the k-th feature point; Δₖ represents a change between a historical state of the k-th feature point and the first state; and C₃ and C₄ are preset values.
  29. A computing device, comprising at least one processor and a memory, wherein the at least one processor is coupled to the memory and is configured to read and execute instructions in the memory to perform the method according to any one of claims 1-14.
  30. A computer-readable medium, wherein the computer-readable medium stores program code, and when the program code is run on a computer, the computer is caused to perform the method according to any one of claims 1-14.
  31. A chip, comprising at least one processor and a memory, wherein the at least one processor is coupled to the memory and is configured to read and execute instructions in the memory to perform the method according to any one of claims 1-14.
  32. An autonomous driving vehicle, comprising at least one processor and a memory, wherein the at least one processor is coupled to the memory and is configured to read and execute instructions in the memory to perform the method according to any one of claims 1-14.
PCT/CN2021/106261 2020-07-31 2021-07-14 Target object sensing method and apparatus WO2022022284A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010755668.2 2020-07-31
CN202010755668.2A CN114092898A (en) 2020-07-31 2020-07-31 Target object sensing method and device

Publications (1)

Publication Number Publication Date
WO2022022284A1 true WO2022022284A1 (en) 2022-02-03

Family

ID=80037525

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/106261 WO2022022284A1 (en) 2020-07-31 2021-07-14 Target object sensing method and apparatus

Country Status (2)

Country Link
CN (1) CN114092898A (en)
WO (1) WO2022022284A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114577215B (en) * 2022-03-10 2023-10-27 山东新一代信息产业技术研究院有限公司 Method, equipment and medium for updating characteristic map of mobile robot

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103810475A (en) * 2014-02-19 2014-05-21 百度在线网络技术(北京)有限公司 Target object recognition method and apparatus
US20180341021A1 (en) * 2017-05-24 2018-11-29 Jena-Optronik Gmbh Method For Detecting And Autonomously Tracking A Target Object Using A LIDAR Sensor
CN109831736A (en) * 2017-11-23 2019-05-31 腾讯科技(深圳)有限公司 A kind of data processing method, device, server and client
CN111060024A (en) * 2018-09-05 2020-04-24 天目爱视(北京)科技有限公司 3D measuring and acquiring device with rotation center shaft intersected with image acquisition device
CN111199579A (en) * 2020-01-02 2020-05-26 腾讯科技(深圳)有限公司 Method, device, equipment and medium for building three-dimensional model of target object


Also Published As

Publication number Publication date
CN114092898A (en) 2022-02-25

Similar Documents

Publication Publication Date Title
WO2022001773A1 (en) Trajectory prediction method and apparatus
CN112639883B (en) Relative attitude calibration method and related device
CN112543877B (en) Positioning method and positioning device
CN113792566A (en) Laser point cloud processing method and related equipment
CN113498529B (en) Target tracking method and device
WO2022001366A1 (en) Lane line detection method and apparatus
WO2022062825A1 (en) Vehicle control method, device, and vehicle
CN112512887A (en) Driving decision selection method and device
WO2022156309A1 (en) Trajectory prediction method and apparatus, and map
WO2022051951A1 (en) Lane line detection method, related device, and computer readable storage medium
CN114693540A (en) Image processing method and device and intelligent automobile
WO2021163846A1 (en) Target tracking method and target tracking apparatus
CN112810603B (en) Positioning method and related product
WO2022089577A1 (en) Pose determination method and related device thereof
WO2022052881A1 (en) Map construction method and computing device
WO2022052765A1 (en) Target tracking method and device
WO2022022284A1 (en) Target object sensing method and apparatus
CN115546781A (en) Point cloud data clustering method and device
WO2021000787A1 (en) Method and device for road geometry recognition
WO2021217646A1 (en) Method and device for detecting free space for vehicle
US20220309806A1 (en) Road structure detection method and apparatus
WO2021159397A1 (en) Vehicle travelable region detection method and detection device
CN115508841A (en) Road edge detection method and device
CN113128497A (en) Target shape estimation method and device
WO2022061725A1 (en) Traffic element observation method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21850793

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21850793

Country of ref document: EP

Kind code of ref document: A1