CN116659376A - Method and device for determining appearance size of dynamic target


Info

Publication number: CN116659376A
Application number: CN202310471451.2A
Authority: CN (China)
Prior art keywords: determining, point cloud, speed, compensation, dynamic target
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 何仕文
Current Assignee: Suteng Innovation Technology Co Ltd
Original Assignee: Suteng Innovation Technology Co Ltd
Application filed by Suteng Innovation Technology Co Ltd
Priority to CN202310471451.2A
Publication of CN116659376A


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B 11/03 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring coordinates of points
    • G01B 11/04 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness specially adapted for measuring length or width of objects while moving
    • G01B 11/043 Measuring arrangements characterised by the use of optical techniques specially adapted for measuring length of objects while moving
    • G01B 11/046 Measuring arrangements characterised by the use of optical techniques specially adapted for measuring width of objects while moving
    • G01B 11/06 Measuring arrangements characterised by the use of optical techniques for measuring thickness, e.g. of sheet material
    • G01B 11/0608 Height gauges
    • G01B 11/0691 Measuring arrangements characterised by the use of optical techniques for measuring thickness of objects while moving

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The disclosure provides a method and a device for determining the appearance size of a dynamic target, and relates to the technical field of intelligent traffic. The method comprises the following steps: determining a detection area, and acquiring multi-frame point cloud data and a plurality of measurement speeds of a dynamic target in the detection area; determining motion compensation according to the plurality of measurement speeds based on the time points at which the measurement speeds are acquired and the time points at which the point cloud data are acquired; registering the multi-frame point cloud data based on the motion compensation to obtain a reconstructed point cloud of the dynamic target; and determining the measured appearance size of the dynamic target according to the reconstructed point cloud and the motion direction of the dynamic target. The technical scheme provides a way to determine the appearance size of a dynamic target with high accuracy.

Description

Method and device for determining appearance size of dynamic target
The present application is a divisional application of Chinese application No. 202111169245.3, the content of which is incorporated herein by reference.
Technical Field
The disclosure relates to the technical field of intelligent traffic, and in particular to a method and a device for determining the appearance size of a dynamic target, a computer-readable storage medium, and an electronic device.
Background
Current techniques for object appearance detection generally require that the object to be detected be in a stationary state; that is, the appearance size of the object is measured while the object is stationary. However, the related art lacks a solution for determining the appearance size of a dynamic target.
For example, on a road with a height-limit requirement, determining the appearance size of a vehicle (a dynamic target) travelling on the road helps to quickly identify non-compliant vehicles, and timely early warning effectively improves traffic safety.
Therefore, a method for determining the appearance size of a dynamic target can effectively serve such application scenarios and, in combination with a back-end database, can reduce law-enforcement cost and improve road transportation safety.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a method for determining the appearance size of a dynamic target, a device for determining the appearance size of a dynamic target, a computer-readable storage medium, and an electronic device, so as to overcome, at least to some extent, the lack in the related art of a scheme for determining the appearance size of a dynamic target.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to one aspect of the present disclosure, there is provided a method of determining the appearance size of a dynamic target, the method comprising: determining a detection area, and acquiring multi-frame point cloud data and a plurality of measurement speeds of a dynamic target in the detection area; determining motion compensation according to the plurality of measurement speeds based on the time points at which the plurality of measurement speeds are acquired and the time points at which the multi-frame point cloud data are acquired; registering the multi-frame point cloud data based on the motion compensation to obtain a reconstructed point cloud of the dynamic target; and determining the measured appearance size of the dynamic target according to the reconstructed point cloud and the motion direction of the dynamic target.
According to another aspect of the present disclosure, there is provided a dynamic target apparent size determining apparatus, the apparatus including: the device comprises an acquisition module, a compensation determination module, a registration module and a determination module.
The acquisition module is used for determining a detection area and acquiring multi-frame point cloud data and a plurality of measurement speeds of a dynamic target in the detection area; the compensation determining module is configured to determine motion compensation according to the plurality of measurement speeds based on a time point when the plurality of measurement speeds are acquired and a time point when the multi-frame point cloud data are acquired; the registration module is used for carrying out registration processing on the multi-frame point cloud data based on the motion compensation to obtain a reconstruction point cloud of the moving target; and the determining module is used for determining the measurement appearance size of the dynamic target according to the reconstruction point cloud and the motion direction of the dynamic target.
Example 1. A method of determining a dynamic target apparent size, the method comprising: determining a detection area, and acquiring multi-frame point cloud data and a plurality of measurement speeds of a dynamic target in the detection area; determining motion compensation according to the plurality of measurement speeds based on a time point at which the plurality of measurement speeds are acquired and a time point at which the multi-frame point cloud data are acquired; registering the multi-frame point cloud data based on the motion compensation to obtain a reconstructed point cloud of the moving target; and determining the measurement appearance size of the dynamic target according to the reconstruction point cloud and the motion direction of the dynamic target.
Example 2. The method of example 1, wherein the acquiring multi-frame point cloud data and a plurality of measurement speeds of the dynamic target in the detection area includes: acquiring point cloud data C of the dynamic target through a laser radar, wherein C = {(T_C0, C_0), (T_C1, C_1), …, (T_Ci, C_i), …, (T_CN, C_N)}, T_Ci represents the time point of acquiring the i-th frame point cloud data C_i, N is a positive integer, and i is a positive integer not greater than N; and in response to the laser radar determining that the dynamic target enters the detection area, triggering a speed measuring radar to acquire a measurement speed V of the dynamic target, wherein V = {(T_V0, V_0), (T_V1, V_1), …, (T_Vk, V_k), …, (T_VM, V_M)}, T_Vk represents the time point of acquiring the measurement speed V_k, M is a positive integer, and k is a positive integer not greater than M.
Example 3. The method of example 2, wherein determining motion compensation according to the plurality of measurement speeds based on the time points of acquiring the plurality of measurement speeds and the time points of acquiring the multi-frame point cloud data includes: determining a compensation mode according to the magnitude relation between T_Ci and T_Vk, determining a compensation speed according to the measurement speed V_k and an estimated speed, and determining the compensation mode and the compensation speed as the motion compensation.
Example 4. The method of example 3, wherein determining the compensation mode according to the magnitude relation between T_Ci and T_Vk, and determining the compensation speed according to the measurement speed V_k and the estimated speed, includes: if T_Ci < T_V0 and T_V0 - T_Ci < T0, performing motion compensation according to a compensation mode of uniform linear motion, and determining the compensation speed as the measurement speed V_0 acquired at time point T_V0, where T0 is a first preset duration; if T_Ci > T_Vl and T_Ci < T_Vp, performing motion compensation according to a compensation mode of uniform linear motion, and interpolating between V_l and V_p to obtain the compensation speed, where l and p are positive integers less than M, l < p, and V_l and V_p are the measurement speeds acquired at time points T_Vl and T_Vp respectively; and if T_Ci > T_Vk, performing motion compensation according to a compensation mode of uniform linear motion, wherein the compensation speed corresponding to a first stage is the measurement speed V_k acquired at time point T_Vk, and the compensation speed corresponding to a second stage is an estimated speed determined according to V_k and a point cloud registration speed.
Example 5. The method of any one of examples 1 to 4, wherein determining the measured appearance size of the dynamic target according to the reconstructed point cloud and the motion direction of the dynamic target includes: determining a minimum circumscribed rectangle in the reconstructed point cloud based on the motion direction and the direction perpendicular to the motion direction; determining the extent of the minimum circumscribed rectangle in the direction perpendicular to the motion direction as width information W of the dynamic target, and determining the extent of the minimum circumscribed rectangle in the motion direction as length information L of the dynamic target; and determining the maximum height value in the reconstructed point cloud as height information H of the dynamic target.
Example 6. The method of any one of examples 1 to 4, further comprising: acquiring background height information of the detection area through the laser radar; rasterizing the detection area to obtain X × Y grids, wherein X and Y are positive integers; and acquiring background height information B(x, y) of each grid, wherein B(x, y) represents the height of the ground points of grid (x, y), x takes the values 1, 2, …, X, and y takes the values 1, 2, …, Y. Determining the measured appearance size of the dynamic target according to the reconstructed point cloud and the motion direction of the dynamic target includes: acquiring measured height information G(x, y) of each grid according to the reconstructed point cloud; calculating the height difference between the measured height information G(x, y) and the background height information B(x, y), and binarizing the reconstructed point cloud according to the height difference; determining a minimum circumscribed rectangle in the binarized reconstructed point cloud based on the motion direction and the direction perpendicular to the motion direction; determining the extent of the minimum circumscribed rectangle in the direction perpendicular to the motion direction as width information W of the dynamic target, and determining the extent of the minimum circumscribed rectangle in the motion direction as length information L of the dynamic target; and determining the maximum value of the height difference as height information H of the dynamic target.
Example 7. The method of any one of examples 1 to 4, further comprising, after obtaining the reconstructed point cloud of the dynamic target: calculating the motion speed of the dynamic target from adjacent registered point clouds to obtain a point cloud registration speed; and correcting the speed of the speed measuring radar through the point cloud registration speed.
Example 8. The method of example 6, further comprising: obtaining the actual appearance size of the dynamic target, including width information W', length information L', and height information H'; and comparing the measured appearance size of the dynamic target with the corresponding actual appearance size, wherein, in a case where the comparison result includes one or more of W > W', L > L', and H > H', the dynamic target is determined as a target to be processed.
Example 9. The method of example 8, wherein obtaining the actual appearance size of the dynamic target includes: in response to the laser radar determining that the dynamic target enters the detection area, triggering an image pickup component to acquire an image of the dynamic target; performing image pixel compensation according to the measurement speed to remove motion blur from the image; and performing identity recognition of the dynamic target according to the processed image, and acquiring the actual appearance size of the dynamic target according to the recognized identity.
Example 10. An apparatus for determining a dynamic target apparent size, the apparatus comprising: the acquisition module is used for determining a detection area and acquiring multi-frame point cloud data and a plurality of measurement speeds of a dynamic target in the detection area; the compensation determining module is used for determining motion compensation according to the plurality of measuring speeds based on the time points of acquiring the plurality of measuring speeds and the time points of acquiring the multi-frame point cloud data; the registration module is used for carrying out registration processing on the multi-frame point cloud data based on the motion compensation to obtain a reconstruction point cloud of the moving target; and the determining module is used for determining the measurement appearance size of the dynamic target according to the reconstruction point cloud and the movement direction of the dynamic target.
According to yet another aspect of the present disclosure, there is provided an electronic device including a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the method of determining a dynamic target apparent size as in the above embodiments when executing the computer program.
According to yet another aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program, characterized in that the computer program, when executed by a processor, implements a method of determining a dynamic target apparent size as in the above-described embodiments.
The method for determining the appearance size of the dynamic target, the device for determining the appearance size of the dynamic target, the computer-readable storage medium and the electronic device provided by the embodiment of the disclosure have the following technical effects:
after a detection area is determined, multi-frame point cloud data and a plurality of measurement speeds of a dynamic target in the detection area are acquired, and motion compensation is determined according to the measurement speeds. The point cloud data are registered based on the motion compensation to obtain a reconstructed point cloud containing motion information. Further, the appearance size of the dynamic target can be determined based on the reconstructed point cloud and the motion direction. The technical scheme thus provides a way to determine the appearance size of a dynamic target with high accuracy.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
Fig. 1 illustrates a schematic view of a scenario of a determination scheme of a dynamic target apparent size in an exemplary embodiment of the present disclosure.
FIG. 2 is a flow chart illustrating a method of determining a dynamic target apparent size in an exemplary embodiment of the present disclosure.
Fig. 3 is a flow chart illustrating a method of determining a dynamic target apparent size in another exemplary embodiment of the present disclosure.
Fig. 4 shows a flow chart of determining a reconstructed point cloud for the same moving object from point cloud data C and velocity data V for the moving object according to an exemplary embodiment of the present disclosure.
Fig. 5 shows a flow diagram of a method of determining motion compensation in an exemplary embodiment of the present disclosure.
Fig. 6 is a flow chart illustrating a method of determining a dynamic target apparent size in yet another exemplary embodiment of the present disclosure.
Fig. 7 shows a schematic diagram of a grid reflecting background height information in an exemplary embodiment according to the present disclosure.
Fig. 8 shows a schematic diagram of an inter-measurement device compensation enhancement scheme provided in accordance with an exemplary embodiment of the present disclosure.
Fig. 9 is a schematic diagram showing a configuration of a dynamic target apparent size determining apparatus to which an embodiment of the present disclosure can be applied.
Fig. 10 schematically illustrates a structural diagram of a dynamic target apparent size determining apparatus according to another embodiment of the present disclosure.
Fig. 11 shows a schematic diagram of a computer system suitable for use in implementing embodiments of the present disclosure.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present disclosure more apparent, the embodiments of the present disclosure will be described in further detail below with reference to the accompanying drawings.
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the disclosure as detailed in the accompanying claims.
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
The following describes in detail an embodiment of a method for determining an appearance size of a dynamic target provided by the present disclosure:
fig. 1 is a schematic view of a scenario of the scheme for determining the appearance size of a dynamic target in an exemplary embodiment of the disclosure. Referring to fig. 1, after a moving target 110 (e.g., a moving vehicle) enters a monitoring area 120, point cloud data and a measurement speed of the moving target are acquired through measurement devices 130 (e.g., a laser radar, a speed measuring radar, an image pickup component, etc.); the computing device 150 then processes the point cloud data and the measurement speed to determine the appearance size of the moving target 110. By way of example, the appearance sizes of the respective moving targets 110 may also be displayed by a display device 160.
Fig. 2 is a flow chart of a method for determining the appearance size of a dynamic target in an exemplary embodiment of the disclosure. Referring to fig. 2, the method includes the following steps:
s210, determining a detection area, and acquiring multi-frame point cloud data and a plurality of measurement speeds of a dynamic target in the detection area.
In an exemplary embodiment, referring to fig. 1, the monitoring area 120 may be a road with a height-limit requirement, and the extent of the area to be measured is related to the point cloud density acquired by the laser radar and the size of the moving target. Specifically, the point cloud density of the laser radar's scanning result over the area to be measured 120 needs to reach a preset value, and the area to be measured 120 generally needs to be larger than the target object.
In an exemplary embodiment, the point cloud data of the dynamic target 110 is acquired by a lidar. Specifically, the lidar acquires point cloud data of the dynamic target 110 by scanning, for example, measuring point data of attributes such as length, width, height, etc. of the scanned vehicle. However, since the laser radar measurement points are discrete points and the measured object may have a large volume, the scene including the appearance of the whole measured object cannot be scanned at one time, so that the laser radar in this embodiment needs to continuously scan the moving object to obtain multi-frame point cloud data, so as to reduce measurement errors and ensure the integrity of the measured object.
In an exemplary embodiment, the speed of the dynamic target 110 is obtained by a speed measuring radar (referred to herein as the "measurement speed" to distinguish it from the subsequent "actual speed"). Specifically, when the target moves within the measurement area 120, the speed measuring radar acquires the motion speed of the vehicle to be measured; to improve the measurement accuracy of the system, the speed measuring radar also needs to continuously acquire the motion speed of the object to be measured.
S220, determining motion compensation according to the plurality of measurement speeds based on the time points of acquiring the plurality of measurement speeds and the time points of acquiring the multi-frame point cloud data; and S230, carrying out registration processing on the multi-frame point cloud data based on the motion compensation to obtain a reconstruction point cloud of the moving target.
In the registration of the point cloud data, the embodiments of the disclosure take motion compensation into account so that the obtained reconstructed point cloud better fits the appearance of the moving target, thereby improving the accuracy of determining the appearance information. Specifically, the motion compensation is determined according to the plurality of measurement speeds based on the time points at which the measurement speeds are acquired and the time points at which the point cloud data are acquired. Because the speed information at the time each point cloud frame is acquired is considered, the technical scheme is applicable not only to a target moving at a uniform speed but also to a target moving at a variable speed, which improves the accuracy of the motion compensation and broadens the application range of the scheme.
S240, determining the measurement appearance size of the dynamic target according to the reconstruction point cloud and the movement direction of the dynamic target.
In the solution provided in the embodiment shown in fig. 2, after the detection area is determined, multi-frame point cloud data and a plurality of measurement speeds of the dynamic target in the detection area are obtained, and motion compensation is determined according to the measurement speeds. The point cloud data are registered based on the motion compensation to obtain a reconstructed point cloud containing motion information. Further, the appearance size of the dynamic target can be determined based on the reconstructed point cloud and the motion direction. The technical scheme thus determines the appearance size of a dynamic target with high efficiency and high accuracy.
In an exemplary embodiment, fig. 3 shows a flow chart of a method for determining an apparent size of a dynamic target in another exemplary embodiment of the present disclosure. The following describes in detail the implementation of each step in the embodiment shown in fig. 3:
referring to fig. 3, in S310, point cloud data C of the moving object is acquired by a lidar.
Specifically, C = {(T_C0, C_0), (T_C1, C_1), …, (T_Ci, C_i), …, (T_CN, C_N)}, where T_Ci represents the time point of acquiring the i-th frame point cloud data C_i, N is a positive integer, and i is a positive integer not greater than N.
The measuring device 130 shown with reference to fig. 1 includes a laser radar, a speed measuring radar, an image pickup section, and the like. Only the laser radar is in an on state at the initial time point, and the speed measuring radar, the camera shooting component and the like are in a standby state. In the case where the laser radar has a change in depth information, a change in height information, and a change in color information in the detection area 120, it can be determined that a target object (i.e., a moving target) has appeared in the detection area.
In this embodiment, in order to reduce system power consumption, the speed measuring radar and the image pickup component are triggered to enter the working state only in response to the laser radar determining that the dynamic target has entered the detection area. Alternatively, only the image pickup device is in an on state at the initial time point, with the speed measuring radar, the laser radar, and the like in a standby state; in response to the image pickup device determining that the dynamic target has entered the detection area, the speed measuring radar and the laser radar are triggered to enter the working state, likewise reducing the power consumption of the whole measurement system to a minimum.
Here, S320-S350 are used to determine the measured appearance size of the moving target, and S320'-S340' are used to determine the actual appearance size of the moving target.
Specifically, in determining the measurement appearance size of the moving object, motion compensation is determined (S330), then, point cloud data registration is performed based on the motion compensation to obtain a reconstructed point cloud of the moving object (S340), and further, the measurement appearance size of the moving object is determined according to the reconstructed point cloud and the movement direction of the moving object (S350).
Specific embodiments of S320-S350 are described below:
S320: in response to the laser radar determining that the dynamic target has entered the detection area, a speed measuring radar is triggered to acquire the measurement speed V of the moving target.
Specifically, V = {(T_V0, V_0), (T_V1, V_1), …, (T_Vk, V_k), …, (T_VM, V_M)}, where T_Vk represents the time point of acquiring the measurement speed V_k, M is a positive integer, and k is a positive integer not greater than M. By acquiring speed data corresponding to a plurality of time points, comprehensive speed information of the moving target in the detection area can be obtained, so that the scheme is applicable both to a target moving at a uniform speed and to a target moving at a variable speed. Continuously acquiring the motion speed of the object to be measured also improves the measurement accuracy of the system.
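By way of illustration only, the two timestamped streams can be represented as follows. This is a minimal Python sketch; the container and field names are assumptions introduced for the later sketches rather than anything fixed by the disclosure.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class PointCloudFrame:
        t: float            # acquisition time point T_Ci, in seconds
        points: np.ndarray  # (n, 3) array of x, y, z coordinates: frame C_i

    @dataclass
    class SpeedSample:
        t: float            # acquisition time point T_Vk, in seconds
        v: np.ndarray       # planar measurement speed (vx, vy) in m/s: V_k

    # C = [PointCloudFrame, ...] ordered by t (i = 0 .. N)
    # V = [SpeedSample, ...]     ordered by t (k = 0 .. M)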
Referring to fig. 4, a reconstructed point cloud of the same moving target is determined from the point cloud data C and the speed data V of the moving target (see in particular the embodiment corresponding to S340). The reconstructed point cloud is a reconstruction of the appearance of the measured object. For example, the iterative closest point (ICP) method is employed to register point cloud data C_0 and point cloud data C_1. In order to improve the efficiency and the reconstruction accuracy of the point cloud matching, motion compensation needs to be performed according to information such as the motion speed of the vehicle; that is, the motion compensation is considered during registration, yielding the point cloud C_P1. The motion compensation is determined according to the magnitude relation between T_Ci and T_Vk as well as the speed data V (see in particular the embodiment corresponding to S330).
With continued reference to fig. 3, in S330, the compensation modes corresponding to different cases are determined according to the magnitude relation between T_Ci and T_Vk, and the compensation speeds corresponding to those cases are determined according to the measurement speed V_k and the estimated speed, thereby determining the motion compensation corresponding to each case.
In an exemplary embodiment, fig. 5 shows a flow diagram of a method for determining motion compensation in an exemplary embodiment of the disclosure. Fig. 5 may be regarded as a specific embodiment of S330 and, referring to fig. 5, includes S331-S337.
In S331: is T_V0 - T0 < T_Ci? That is, it is determined whether T_Ci is greater than T_V0 - T0, where T0 is a first preset duration.

In this embodiment, it is determined whether the acquisition time point T_Ci corresponding to the i-th frame point cloud data C_i is greater than T_V0 - T0, where T_V0 is the time point corresponding to the first measurement speed V_0 obtained by the speed measuring radar.

If S331 does not hold, the measurement time T_Ci of the current frame point cloud data is less than T_V0 and the measurement time difference from the first measurement speed V_0 is too large (greater than T0); that is, the current frame point cloud data is not suitable for motion compensation, so S337 is performed: the point cloud data is discarded, which ensures calculation accuracy and reduces the calculation amount.

If S331 holds, the measurement time T_Ci of the current frame point cloud data C_i does not differ much from the measurement time T_V0 of the first measurement speed V_0 (the difference is less than T0), and S332 is executed. In S332: is T_V0 - T0 < T_Ci < T_V0? That is, it is determined whether the measurement time T_Ci of the current frame point cloud data C_i is less than the measurement time T_V0 of the first measurement speed V_0.

If S332 holds, the measurement time T_Ci of the current frame point cloud data C_i is less than, but close to, the measurement time T_V0 of V_0; that is, the current frame point cloud data C_i was scanned by the laser radar before the speed measuring radar started working normally, but because T_Ci is close to T_V0, S333 is performed: motion compensation is performed according to the compensation mode of uniform linear motion, and the compensation speed is determined as the measurement speed V_0 acquired at time point T_V0.

If S332 does not hold, the measurement time T_Ci of the current frame point cloud data C_i is greater than the measurement time T_V0 of V_0; that is, the current frame point cloud data C_i was scanned by the laser radar after the speed measuring radar started working normally. S334 is then performed. In S334: is T_Vl < T_Ci < T_Vp? That is, it is determined whether the measurement time T_Ci of the current frame point cloud data C_i lies between T_Vl and T_Vp.
If S334 holds, the current frame point cloud data C_i was scanned by the laser radar after the speed measuring radar started working normally, and its measurement time lies between the measurement time T_Vl of the known measurement speed V_l and the measurement time T_Vp of the known measurement speed V_p. S335 is then performed: the compensation mode is determined according to the difference between the measurement speed V_l and the measurement speed V_p, and, in the case where the compensation mode is uniform linear motion, the compensation speed is obtained by interpolating between V_l and V_p (as in formula (1)), where l and p are positive integers less than M, l < p, and V_l and V_p are the measurement speeds acquired at time points T_Vl and T_Vp respectively. The compensation speed V″ in this case is calculated as follows:

V″ = (T_Vp - T_Ci) / (T_Vp - T_Vl) * V_l + (T_Ci - T_Vl) / (T_Vp - T_Vl) * V_p   (1)
For example, when the measurement speed V_l and the measurement speed V_p do not differ much, the compensation mode is uniform linear motion. When the difference between the measurement speed V_l and the measurement speed V_p is large (greater than a preset value), a uniform-acceleration motion model may be used for motion modelling, or a higher-order interpolation method may be used to obtain the speed interpolation, so as to improve the accuracy of the motion compensation.
If S334 does not hold, then, since the measurement time T_Ci of the current frame point cloud data C_i is greater than the measurement time T_V0 of V_0, the current frame point cloud data C_i was scanned by the laser radar after the speed measuring radar finished its measurement work, i.e., the case T_Vk < T_Ci.

S336 is executed: motion compensation is performed according to the compensation mode of uniform linear motion, where the compensation speed corresponding to the first stage is the measurement speed V_k acquired at time point T_Vk, and the compensation speed corresponding to the second stage is an estimated speed (formula (2)) determined according to the measurement speed V_k and a point cloud registration speed.

For example, in the case of T_Vk < T_Ci, in order to further improve the motion compensation accuracy, the compensation speed is determined in two stages. The first stage is a relatively short period after T_Vk, during which the compensation speed can be determined as the measurement speed V_k acquired at time point T_Vk. The second stage is a longer period after T_Vk; if the compensation speed during this period were again determined as the measurement speed V_k acquired at time point T_Vk, the accuracy of the compensation speed would likely be affected, so the compensation speed corresponding to the second stage is estimated according to the measurement speed V_k and the point cloud registration speed V_cti.
The compensation speed V″ (i.e., the estimated speed) corresponding to the second stage is calculated as follows:

V″ = k * V_k + (1 - k) * V_cti   (2)

where k = (T_Ci - T_Vk) / T1, T1 is a second preset duration, and the point cloud registration speed V_cti is determined as follows: after two frames of point cloud data are matched, the distance change of a given point is obtained, and the speed of that point, i.e., the point cloud registration speed, is determined from the time difference between the two frames of point cloud data.
For example, in order to reduce the motion estimation error of the point cloud matching, motion-estimation speed smoothing may be performed on the estimated speed using Kalman filtering.
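The case analysis of S331 to S337 can be summarised in one routine. The following is a minimal sketch under the assumptions above (an ordered SpeedSample list, planar speeds); the function layout is illustrative, and clipping k to [0, 1] is a defensive assumption that the text does not state.

    import numpy as np

    def compensation_speed(t_ci, samples, v_reg, T0, T1):
        """Compensation speed for a frame acquired at t_ci, or None if the
        frame should be discarded (S337). samples: SpeedSample list ordered
        by time (T_V0 .. T_VM); v_reg: point cloud registration speed V_cti."""
        t_v0, t_vm = samples[0].t, samples[-1].t
        if t_ci < t_v0 - T0:                 # S331 fails: frame far too early
            return None                      # S337: discard the frame
        if t_ci < t_v0:                      # S332 holds: just before radar start
            return samples[0].v              # S333: compensation speed is V_0
        if t_ci <= t_vm:                     # S334 holds: T_Vl < T_Ci < T_Vp
            for lo, hi in zip(samples, samples[1:]):
                if lo.t <= t_ci <= hi.t:     # S335, formula (1): interpolate
                    w = (t_ci - lo.t) / (hi.t - lo.t)
                    return (1.0 - w) * lo.v + w * hi.v
        # S336: T_Ci after the last sample; second-stage estimate, formula (2)
        k = min((t_ci - t_vm) / T1, 1.0)     # k = (T_Ci - T_Vk) / T1
        return k * samples[-1].v + (1.0 - k) * v_reg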
With the embodiment shown in fig. 5, the compensation modes corresponding to different cases are determined according to the magnitude relation between T_Ci and T_Vk, and the compensation speeds corresponding to those cases are determined according to the measurement speed V_k and the estimated speed, thereby determining the motion compensation for each case. The motion compensation is then used in the point cloud matching process (as in fig. 4), i.e., S340 is performed: the multi-frame point cloud data are registered based on the motion compensation (comprising the compensation mode and the compensation speed) for the different cases, to obtain the reconstructed point cloud of the moving target.
For example, referring to fig. 4, during the registration of point cloud C_0 and point cloud C_1, it is determined which case of the embodiment shown in fig. 5 the measurement time T_C1 of point cloud C_1 falls into, and motion compensation is performed according to the compensation mode and compensation speed corresponding to that case, thereby determining the registered point cloud C_P1 of point clouds C_0 and C_1.

Further, during the registration of point cloud C_P1 and point cloud C_2, it is determined which case of the embodiment shown in fig. 5 the measurement time T_C2 of point cloud C_2 falls into, and motion compensation is performed according to the corresponding compensation mode and compensation speed, thereby determining the registered point cloud C_P2 of point clouds C_P1 and C_2.

Finally, during the registration of point cloud C_P(N-1) and point cloud C_N, it is determined which case of the embodiment shown in fig. 5 the measurement time T_CN of point cloud C_N falls into, and motion compensation is performed according to the corresponding compensation mode and compensation speed, thereby determining the registered point cloud C_PN of point clouds C_P(N-1) and C_N. In this way, the registration of all the point cloud data of the moving target is completed, and a reconstructed point cloud that accurately reflects the appearance of the moving target is obtained, as sketched below.
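By way of illustration, the chained registration could be sketched as follows. Open3D is chosen here only as a convenient ICP implementation; the function layout, the max_dist value, and the sign convention of the initial guess (a static sensor frame with the target moving at speed +v) are assumptions rather than part of the disclosure. The initial guess encodes the translation-only motion model of formula (3) below.

    import numpy as np
    import open3d as o3d

    def reconstruct(frames, comp_speed, max_dist=0.5):
        """Fold frames C_0 .. C_N into one reconstructed cloud C_PN.
        frames: PointCloudFrame list ordered by time;
        comp_speed: callable t -> (vx, vy) compensation speed, or None."""
        merged = o3d.geometry.PointCloud()
        merged.points = o3d.utility.Vector3dVector(frames[0].points)
        offset = np.zeros(2)                  # accumulated target displacement
        t_prev = frames[0].t
        for frame in frames[1:]:
            v = comp_speed(frame.t)
            if v is None:                     # S337: frame discarded
                continue
            offset += np.asarray(v) * (frame.t - t_prev)
            t_prev = frame.t
            init = np.eye(4)                  # translation-only initial guess
            init[0, 3], init[1, 3] = -offset  # map frame back onto C_0
            src = o3d.geometry.PointCloud()
            src.points = o3d.utility.Vector3dVector(frame.points)
            reg = o3d.pipelines.registration.registration_icp(
                src, merged, max_dist, init)  # ICP refines the guess
            merged += src.transform(reg.transformation)
        return merged                         # reconstructed point cloud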
With continued reference to fig. 4, during the registration process, the motion estimators V_P1, V_P2, …, V_PN are calculated as follows:
M(x, y, z, roll, pitch, yaw) = [Vx * t, Vy * t, 0, 0, 0, 0] (3)
where Vx and Vy are the x component and y component of the measurement speed V, and t is the time interval between two point cloud scans; x, y, and z are the translation amounts of the motion estimation; and roll, pitch, and yaw are the rotation amounts of the motion estimation.
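As a small worked sketch, formula (3) can be written as a homogeneous transform with identity rotation and planar translation; the helper name is illustrative.

    import numpy as np

    def motion_estimate(vx, vy, t):
        """Formula (3): M(x, y, z, roll, pitch, yaw) = [Vx*t, Vy*t, 0, 0, 0, 0]."""
        M = np.eye(4)        # rotation part stays identity (roll = pitch = yaw = 0)
        M[0, 3] = vx * t     # x translation: Vx * t
        M[1, 3] = vy * t     # y translation: Vy * t
        return M             # z translation is zero

    # Applying M to an (n, 3) cloud: points @ M[:3, :3].T + M[:3, 3]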
With continued reference to fig. 3, in S350: the reconstructed point cloud is projected onto a target plane parallel to the motion direction, and a minimum circumscribed rectangle is determined in the target plane based on the motion direction and the direction perpendicular to the motion direction. The extent of the minimum circumscribed rectangle in the direction perpendicular to the motion direction is determined as the width information W of the moving target, and the extent of the minimum circumscribed rectangle in the motion direction is determined as the length information L of the moving target; the maximum value of the Z coordinates in the reconstructed point cloud is determined as the height information H of the moving target.
Illustratively, because matching errors may exist, the reconstructed point cloud (denoted C0) is denoised, for example by removing isolated points, to reduce the matching noise introduced during point cloud matching; the denoised point cloud is denoted C1.
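As one possible realisation of the isolated-point removal, a statistical outlier filter can be applied to the reconstructed cloud; the use of Open3D and the parameter values are assumptions, since the text does not fix a particular denoising algorithm.

    import open3d as o3d

    def denoise(cloud, nb_neighbors=20, std_ratio=2.0):
        """C0 -> C1: drop points whose mean neighbour distance is anomalous."""
        filtered, _ = cloud.remove_statistical_outlier(
            nb_neighbors=nb_neighbors, std_ratio=std_ratio)
        return filtered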
In the embodiment of S350, if the moving target is a travelling vehicle, the target plane (the xy plane) may be regarded as a cross-section of the vehicle. The point cloud C1 is projected onto the xy plane, and the minimum circumscribed rectangle is determined with the motion direction of the vehicle in the xy plane as the first side direction and the direction perpendicular to the motion direction in the xy plane as the second side direction.
The side length of the first side may be taken as the length L of the vehicle, and the side length of the second side as the width W of the vehicle; the maximum value of the Z coordinates in the reconstructed point cloud is determined as the height information H of the moving target.
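A minimal sketch of this measurement step, assuming the cloud is an (n, 3) array and the motion direction is a unit vector in the xy plane: rotating the coordinates so that the motion direction aligns with the x axis turns the minimum circumscribed rectangle into axis-aligned extents.

    import numpy as np

    def measure_size(points, motion_dir):
        """points: (n, 3) denoised cloud C1; motion_dir: unit (dx, dy)."""
        dx, dy = motion_dir
        rot = np.array([[dx, dy], [-dy, dx]])   # rotates (dx, dy) onto +x
        xy = points[:, :2] @ rot.T              # project onto xy and align
        L = xy[:, 0].max() - xy[:, 0].min()     # extent along motion: length
        W = xy[:, 1].max() - xy[:, 1].min()     # extent across motion: width
        H = points[:, 2].max()                  # maximum Z coordinate: height
        return L, W, H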
In an exemplary embodiment, the present embodiment also considers the background height in order to further enhance the accuracy of determination of the movement target height information. That is, the Z-axis coordinate in the point cloud is the measured height, and the difference between the measured height and the background height is taken as the height of the moving object.
Specifically, fig. 6 is a flow chart of a method for determining the appearance size of a dynamic target in still another exemplary embodiment of the disclosure, and may be regarded as a specific embodiment of S350. Specifically:
S351, acquiring background height information of the detection area through the laser radar, and rasterizing the detection area to obtain X × Y grids, where X and Y are positive integers. S352, acquiring the background height information B(x, y) of each grid, where B(x, y) represents the height of the ground points of grid (x, y).
Illustratively, the measurement area 120 is background-modelled: as shown in fig. 7, the background height information of the measurement area is acquired by the laser radar, where the grey level of each grid in fig. 7 is determined by its background height information B(x, y); for example, the larger the value of B(x, y), the larger the grey value of the grid. Meanwhile, to facilitate data processing, the measurement area is rasterized, and the rasterized background modelling information is denoted B(x, y), where x takes the values 1, 2, …, X, y takes the values 1, 2, …, Y, and B(x, y) is the height of the ground points of grid (x, y).
S353, the measured height information G (x, y) of each grid is acquired from the reconstructed point cloud.
Illustratively, the measured height information G (x, y) of each grid is determined from the Z-axis coordinates in the point cloud C1 after the above described denoising process.
S354, calculating the height difference between the measured height information G (x, y) and the background height information B (x, y), and carrying out binarization processing on the reconstruction point cloud according to the height difference.
Illustratively, the height difference is: H(x, y) = G(x, y) - B(x, y).
H(x, y) is compared with a preset height threshold T2, thereby obtaining the foreground grids occupied by the dynamic target. The foreground-grid decision formula (4) is as follows:

F(x, y) = 1 if H(x, y) > T2; F(x, y) = 0 otherwise   (4)
S355, determining the minimum circumscribed rectangle in the binarized reconstructed point cloud based on the motion direction and the direction perpendicular to the motion direction. S356, determining the extent of the minimum circumscribed rectangle in the direction perpendicular to the motion direction as the width information W of the moving target, determining the extent of the minimum circumscribed rectangle in the motion direction as the length information L of the moving target, and determining the maximum value of the height difference H(x, y) as the height information H of the moving target.
By the embodiment shown in fig. 6, the measured appearance size S = {L, W, H} of the dynamic target is obtained.
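A sketch of S351 to S356, under the assumptions of a numpy background model B of shape (X, Y), a uniform grid cell size, and point coordinates expressed relative to the detection-area origin; the highest return per cell stands in for G(x, y), and cells with no return keep the background height.

    import numpy as np

    def foreground_grid(points, B, cell, T2):
        """points: (n, 3) reconstructed cloud; B: (X, Y) background heights;
        cell: grid cell edge length in metres; T2: height threshold."""
        X, Y = B.shape
        ix = np.clip((points[:, 0] / cell).astype(int), 0, X - 1)
        iy = np.clip((points[:, 1] / cell).astype(int), 0, Y - 1)
        G = B.copy()                              # empty cells: G(x, y) = B(x, y)
        np.maximum.at(G, (ix, iy), points[:, 2])  # highest return per cell
        H_diff = G - B                            # height difference H(x, y)
        F = (H_diff > T2).astype(np.uint8)        # foreground grid, formula (4)
        return F, H_diff.max()                    # foreground mask, height H

The length L and width W then follow from the minimum circumscribed rectangle of the foreground cells, as in S355 and S356.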
The measured appearance size S = {L, W, H} of the moving target is thus determined through S320-S350; how the actual appearance size D = {L', W', H'} of the moving target is determined through S320'-S340' is described below. Specifically:
Referring to fig. 3, in S320', in response to the laser radar determining that the dynamic target has entered the detection area, an image pickup component is triggered to acquire an image of the dynamic target; in S330', image pixel compensation is performed according to the measurement speed to remove motion blur from the image; and in S340', identity recognition of the moving target is performed according to the processed image, and the actual appearance size of the moving target is acquired according to the recognized identity.
For example, when the laser radar detects, based on background subtraction, that an object has entered the measurement area, the image pickup component and the speed measuring radar are triggered to work. In the case where the moving target is a travelling vehicle, an image of the vehicle in the detection area is acquired by the image pickup component. Further, the license plate in the image is recognized through an image recognition technique; for example, the recognized license plate information is denoted P.
Referring to fig. 1, license plate recognition may be performed at the measurement device 130 or at the computing device 150. If the image is recognized at the measurement device 130, the amount of data uploaded to the computing device 150 is small, but the processing-capacity requirements on the measurement device 130 increase.
For example, when the speed of the moving target is high, motion blur removal is required in order to obtain an image quality that satisfies the measurement requirements. Specifically, image pixel compensation is performed according to the vehicle speed information acquired by the speed measuring radar, so as to remove motion blur from the image generated by the image pickup component.
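As one hedged illustration of the pixel compensation, a purely horizontal linear blur of extent speed × exposure × pixel scale can be assumed and removed by deconvolution; neither the blur model nor the use of scikit-image is mandated by the text.

    import numpy as np
    from skimage import restoration

    def deblur(img, speed_mps, exposure_s, px_per_m):
        """img: grayscale float image; blur assumed horizontal and uniform."""
        extent = max(int(round(speed_mps * exposure_s * px_per_m)), 1)
        psf = np.full((1, extent), 1.0 / extent)   # linear-motion blur kernel
        return restoration.richardson_lucy(img, psf, 30)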
For example, in order to improve recognition accuracy, this embodiment identifies the moving target (such as a travelling vehicle) from the image after the motion-blur removal processing. Further, the actual appearance size D = {L', W', H'} of the moving target is obtained according to the recognized identity (e.g., the license plate information). For example, the acquired license plate information P is sent to a traffic law-enforcement database to query the actual appearance size of the vehicle, so that illegally modified vehicles can be detected according to the thresholds in the following embodiments, thereby effectively utilizing the database information of the relevant departments.
With continued reference to fig. 3, after the measured appearance size S = {L, W, H} of the moving target is determined through S320-S350 and the actual appearance size D = {L', W', H'} of the moving target is determined through S320'-S340', S360 is performed: the measured appearance size of the dynamic target is compared with the corresponding actual appearance size; and S370 is performed: in the case where the comparison result includes one or more of W > W', L > L', and H > H', the dynamic target is determined as the target to be processed.
Illustratively, the target to be processed is determined according to the measured appearance size S and the actual appearance size D, as in formula (5) and formula (6).
D(L, W, H) = S - D = {|L' - L|, |W' - W|, |H' - H|}   (5)

where |x| denotes taking the absolute value.
Here, in the case where the moving target is a travelling vehicle, the target to be processed is an illegally modified vehicle. In formula (6), "1" indicates that the moving target is a modified vehicle and "0" indicates that it is not: the result is "1" when one or more components of D(L, W, H) exceed the corresponding thresholds in the preset threshold T3 = {TL, TW, TH}, where TL, TW, and TH respectively represent the length-change threshold, the width-change threshold, and the height-change threshold of the moving target.
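Formulas (5) and (6) reduce to a component-wise comparison; a minimal sketch with illustrative names:

    def is_modified(measured, actual, T3):
        """measured: (L, W, H); actual: (L', W', H'); T3: (TL, TW, TH)."""
        diffs = [abs(a - m) for m, a in zip(measured, actual)]   # formula (5)
        return int(any(d > t for d, t in zip(diffs, T3)))        # formula (6)

For example, is_modified((10.2, 2.6, 3.4), (9.0, 2.5, 3.2), (0.5, 0.3, 0.3)) returns 1, flagging the target as one to be processed.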
The technical scheme provides a method for measuring the appearance of an object in real time by combining a plurality of sensors with a traffic law-enforcement database. On the basis of fully utilizing existing intelligent traffic infrastructure and data, illegal vehicle modification can be detected quickly, improving law-enforcement efficiency, and dangerous transportation behaviour can be effectively avoided. For example, combined with the height-limit information of a road, it can be judged whether a vehicle can safely travel on a road section, reducing traffic accidents caused by factors such as height limits. In other application scenarios, such as closed expressway sections, the appearance can be measured once at a distance, so that deformation unnoticed by the driver can be detected, further reducing the accident risk.
In an exemplary embodiment, in order to improve the stability and robustness of the system as a whole, fig. 8 shows a schematic diagram of an inter-device compensation enhancement scheme according to an exemplary embodiment of the disclosure. Specifically, fig. 8 illustrates the logic by which the data of the various sensors compensate and correct one another.
(a) Compensation 1: in the point cloud registration process, the measurement speed obtained by the speed-measuring radar provides motion compensation. This improves the matching accuracy of the point clouds and helps obtain a more accurate reconstructed point cloud.

(b) Compensation 2: image pixel compensation is performed on the image from the camera component according to the measurement speed obtained by the speed-measuring radar, so as to remove motion blur from the image. The moving target can then be identified more accurately from the deblurred image.

(c) Compensation 3: since the laser radar localizes high-reflectivity objects well (such as the license plate of a vehicle in motion), the camera component's localization of shooting targets (such as license plates) can be improved based on the laser radar.

(d) Compensation 4: after point cloud matching is completed, the motion speed can be estimated from the matched point clouds to obtain a point cloud registration speed; correcting the speed of the speed-measuring radar with this registration speed improves the accuracy and robustness of the radar measurement.
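As a minimal sketch of compensation 1, assuming each lidar return carries its own timestamp and the target moves at the radar-measured speed along a known unit direction during the sweep (the function and parameter names are illustrative):

import numpy as np

def motion_compensate(points, t_points, v, t_ref, direction):
    """Shift each lidar return back to a common reference time under a
    uniform-linear-motion assumption, so the returns align on one rigid body.

    points:    (N, 3) xyz returns belonging to one target
    t_points:  (N,) per-point timestamps in seconds
    v:         radar-measured speed in m/s
    direction: (3,) unit motion-direction vector
    """
    points = np.asarray(points, dtype=float)
    dt = (np.asarray(t_points, dtype=float) - t_ref)[:, None]   # (N, 1) offsets
    return points - v * dt * np.asarray(direction)[None, :]     # undo displacement

Compensations 2 and 4 are sketched further below, where image deblurring and radar-speed correction are discussed in detail.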
It is noted that the above figures are only schematic illustrations of the processing involved in methods according to exemplary embodiments of the present disclosure and are not intended to be limiting. It will be readily appreciated that these figures do not indicate or limit the temporal order of the processing; for example, the processing may be executed synchronously or asynchronously across multiple modules.
The following are device embodiments of the present disclosure that may be used to perform method embodiments of the present disclosure. For details not disclosed in the embodiments of the apparatus of the present disclosure, please refer to the embodiments of the method of the present disclosure.
Fig. 9 shows a schematic structural diagram of an apparatus for determining the apparent size of a dynamic target to which embodiments of the present disclosure can be applied. Referring to fig. 9, the apparatus may be implemented by software, hardware, or a combination of the two as all or part of an electronic device, and may also be integrated in the electronic device or on a server as a separate module.
The apparatus 900 for determining the apparent size of a dynamic target in an embodiment of the present disclosure includes: an acquisition module 910, a compensation determination module 920, a registration module 930, and a size determination module 940, wherein:

The acquisition module 910 is configured to determine a detection area and acquire multi-frame point cloud data and a plurality of measurement speeds of a dynamic target in the detection area; the compensation determination module 920 is configured to determine motion compensation according to the plurality of measurement speeds based on the time points at which the plurality of measurement speeds are acquired and the time points at which the multi-frame point cloud data are acquired; the registration module 930 is configured to perform registration processing on the multi-frame point cloud data based on the motion compensation to obtain a reconstructed point cloud of the moving target; and the size determination module 940 is configured to determine the measured apparent size of the dynamic target according to the reconstructed point cloud and the motion direction of the dynamic target.
In an exemplary embodiment, fig. 10 schematically illustrates a block diagram of a determination apparatus of an apparent size of a dynamic target in another exemplary embodiment according to the present disclosure. Please refer to fig. 10:
In an exemplary embodiment, based on the foregoing scheme, the acquisition module 910 is specifically configured to: acquire point cloud data C of the moving target through a laser radar, where C = {(T_C0, C_0), (T_C1, C_1), …, (T_Ci, C_i), …, (T_CN, C_N)}, T_Ci denotes the time point at which the i-th frame of point cloud data C_i is acquired, N is a positive integer, and i is a positive integer not greater than N; and, in response to the laser radar determining that the dynamic target has entered the detection area, trigger a speed-measuring radar to acquire measurement speeds V of the moving target, where V = {(T_V0, V_0), (T_V1, V_1), …, (T_Vk, V_k), …, (T_VM, V_M)}, T_Vk denotes the time point at which measurement speed V_k is acquired, M is a positive integer, and k is a positive integer not greater than M.
In an exemplary embodiment, based on the foregoing scheme, the compensation determination module 920 is specifically configured to: determine a compensation mode according to the magnitude relation between T_Ci and T_Vk, determine a compensation speed according to the measurement speed V_k and an estimated speed, and determine the compensation mode and the compensation speed as the motion compensation.
In an exemplary embodiment, based on the foregoing scheme, the compensation determination module 920 is specifically configured to: if T_Ci < T_V0 and T_V0 − T_Ci < T0, perform motion compensation in a uniform-linear-motion mode and determine the compensation speed as the measurement speed V_0 acquired at time point T_V0, where T0 is a first preset duration; if T_Vl < T_Ci < T_Vp, perform motion compensation in a uniform-linear-motion mode and obtain the compensation speed by interpolating between V_l and V_p, where l and p are positive integers less than M, l < p, and V_l and V_p are the measurement speeds acquired at time points T_Vl and T_Vp, respectively; and if T_Ci > T_Vk, perform motion compensation in a uniform-linear-motion mode, where the compensation speed for the first stage is the measurement speed V_k acquired at time point T_Vk, and the compensation speed for the second stage is an estimated speed determined from V_k and the point cloud registration speed.
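The timestamp cases above can be sketched as follows, assuming the radar samples are sorted by time; the function name is illustrative, and the second-stage blending with the point cloud registration speed is omitted:

import bisect

def compensation_speed(t_frame, speed_times, speeds, t0):
    """Select or interpolate the compensation speed for a point cloud frame
    timestamped t_frame (sketch of the three cases described above).

    speed_times: sorted radar timestamps T_V0..T_VM
    speeds:      corresponding measurement speeds V_0..V_M
    t0:          the first preset duration T0
    """
    if t_frame < speed_times[0]:
        # Frame precedes the first radar sample: use V_0 if close enough.
        if speed_times[0] - t_frame < t0:
            return speeds[0]
        raise ValueError("frame precedes the first speed sample by more than T0")
    p = bisect.bisect_right(speed_times, t_frame)
    if p == len(speed_times):
        # Frame at or after the last sample: first stage uses the last
        # measured speed; the second stage would blend in the estimated
        # speed derived from point cloud registration (not shown).
        return speeds[-1]
    l = p - 1  # T_Vl <= t_frame < T_Vp: linear interpolation
    w = (t_frame - speed_times[l]) / (speed_times[p] - speed_times[l])
    return (1 - w) * speeds[l] + w * speeds[p]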
In an exemplary embodiment, based on the foregoing scheme, the size determination module 940 is specifically configured to: determine a minimum circumscribed rectangle of the reconstructed point cloud based on the motion direction and the direction perpendicular to the motion direction; determine the extent of the minimum circumscribed rectangle in the direction perpendicular to the motion direction as the width information W of the moving target, and its extent along the motion direction as the length information L of the moving target; and determine the maximum height value in the reconstructed point cloud as the height information H of the moving target.
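A minimal sketch of this size computation, assuming the reconstructed cloud is an (N, 3) array in a frame whose z axis points up, heights are measured from the road surface, and the motion direction is a 2D unit vector in the ground plane (names are illustrative):

import numpy as np

def measured_size(cloud, direction):
    """Measured apparent size {L, W, H} from the reconstructed point cloud:
    L/W are the extents of the motion-aligned bounding rectangle, H is the
    maximum height in the cloud."""
    cloud = np.asarray(cloud, dtype=float)
    u = np.asarray(direction, dtype=float)
    u = u / np.linalg.norm(u)          # along-motion axis
    n = np.array([-u[1], u[0]])        # perpendicular (across-motion) axis
    along = cloud[:, :2] @ u
    across = cloud[:, :2] @ n
    L = along.max() - along.min()
    W = across.max() - across.min()
    H = cloud[:, 2].max()
    return L, W, H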
In an exemplary embodiment, based on the foregoing scheme, the apparatus further includes a background information determination module 950.

The background information determination module 950 is configured to: acquire background height information of the detection area through the laser radar; rasterize the detection area into X × Y grids, where X and Y are positive integers; and acquire background height information B(x, y) of each grid, where B(x, y) represents the ground-point height of grid (x, y), x takes the values 1, 2, …, X, and y takes the values 1, 2, …, Y.
The size determination module 940 is specifically configured to: acquire measurement height information G(x, y) of each grid from the reconstructed point cloud; calculate the height difference between the measurement height information G(x, y) and the background height information B(x, y), and binarize the reconstructed point cloud according to the height difference; determine a minimum circumscribed rectangle of the binarized reconstructed point cloud based on the motion direction and the direction perpendicular to the motion direction; determine the rectangle's extent perpendicular to the motion direction as the width information W of the moving target and its extent along the motion direction as the length information L of the moving target; and determine the maximum value of the height difference as the height information H of the moving target.
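A sketch of the grid-based variant, assuming a background height map B rasterized at a fixed cell size; the minimum height difference used for binarization is an illustrative placeholder:

import numpy as np

def height_over_background(cloud, B, origin, cell, thresh=0.1):
    """Binarize the reconstructed cloud against the background height map.

    B:      (X, Y) array of ground-point heights B(x, y)
    origin: (x0, y0) world coordinates of grid cell (0, 0)
    cell:   grid cell size in meters
    thresh: illustrative minimum height difference in meters
    Returns the occupancy mask and the height H = max(G - B).
    """
    cloud = np.asarray(cloud, dtype=float)
    ix = ((cloud[:, 0] - origin[0]) // cell).astype(int)
    iy = ((cloud[:, 1] - origin[1]) // cell).astype(int)
    ok = (ix >= 0) & (ix < B.shape[0]) & (iy >= 0) & (iy < B.shape[1])
    G = np.full(B.shape, -np.inf)                       # per-cell max height
    np.maximum.at(G, (ix[ok], iy[ok]), cloud[ok, 2])
    diff = G - B
    mask = diff > thresh                                # binarized footprint
    H = float(diff[mask].max()) if mask.any() else 0.0
    return mask, H

The minimum circumscribed rectangle of the mask cells, taken along and across the motion direction as above, then yields L and W.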
In an exemplary embodiment, based on the foregoing, the apparatus further includes: a correction module 960.
The correction module 960 is configured to: after the reconstructed point cloud of the moving target is obtained, calculate the motion speed of the moving target from adjacent registered point clouds to obtain a point cloud registration speed; and correct the speed of the speed-measuring radar with the point cloud registration speed.
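A minimal sketch of this correction (compensation 4), assuming the registration step yields the rigid translation between two adjacent frames; the fusion weight alpha is an illustrative placeholder, and the disclosure separately mentions Kalman-filter smoothing of the estimated speed:

import numpy as np

def corrected_radar_speed(v_radar, translation, dt, alpha=0.7):
    """Fuse the radar speed with the point cloud registration speed.

    translation: (3,) rigid translation between two registered frames
    dt:          time difference between the two frames in seconds
    alpha:       illustrative weight given to the radar measurement
    """
    v_reg = np.linalg.norm(translation) / dt   # point cloud registration speed
    return alpha * v_radar + (1 - alpha) * v_reg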
In an exemplary embodiment, based on the foregoing scheme, the size determination module 940 is further configured to obtain the actual apparent size of the moving target, including the width information W', length information L', and height information H' of the moving target.
The apparatus further includes a comparison module 970 and a target determination module 980.

The comparison module 970 is configured to compare the measured apparent size of the dynamic target with the corresponding actual apparent size of the dynamic target. The target determination module 980 is configured to determine the dynamic target as a target to be processed when the comparison result includes one or more of W > W', L > L', and H > H'.
In an exemplary embodiment, based on the foregoing scheme, the size determination module 940 is further specifically configured to: in response to the laser radar determining that the dynamic target has entered the detection area, trigger a camera component to acquire an image of the moving target; perform image pixel compensation according to the measurement speed to remove motion blur from the image; and identify the moving target from the deblurred image, acquiring the actual apparent size of the moving target according to the identified identity.
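A sketch of the pixel compensation (compensation 2), assuming horizontal linear motion blur whose length in pixels follows from the radar speed, the exposure time, and the ground sampling distance; the Wiener-style regularization and the circular-convolution assumption are simplifications, not the method prescribed by the disclosure:

import numpy as np

def deblur_motion(img, v, exposure, m_per_px, snr=100.0):
    """Remove linear motion blur predicted from the measured speed.

    img:       2D grayscale image array
    v:         radar-measured speed in m/s
    exposure:  camera exposure time in seconds
    m_per_px:  ground sampling distance in meters per pixel
    snr:       illustrative Wiener regularization term
    """
    n = max(1, int(round(v * exposure / m_per_px)))  # blur length in pixels
    psf = np.zeros(img.shape, dtype=float)
    psf[0, :n] = 1.0 / n                             # horizontal box PSF
    Hf = np.fft.fft2(psf)
    F = np.fft.fft2(img.astype(float))
    G = np.conj(Hf) / (np.abs(Hf) ** 2 + 1.0 / snr)  # Wiener filter
    return np.real(np.fft.ifft2(F * G))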
It should be noted that when the apparatus for determining the apparent size of a dynamic target provided in the above embodiments performs the determination method, the division into the above functional modules is merely an example; in practical applications, the functions may be allocated to different functional modules as required, i.e., the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus embodiments provided above and the method embodiments belong to the same concept, so for details not disclosed in the apparatus embodiments of the present disclosure, please refer to the method embodiments described above; details are not repeated here.
The foregoing embodiment numbers of the present disclosure are merely for description and do not represent advantages or disadvantages of the embodiments.
The disclosed embodiments also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of any of the methods of the previous embodiments. The computer readable storage medium may include any type of disk, including floppy disks, optical disks, DVDs, CD-ROMs, micro-drives, and magneto-optical disks, as well as ROM, RAM, EPROM, EEPROM, DRAM, VRAM, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any other type of media or device suitable for storing instructions and/or data.
The disclosed embodiments also provide an electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the steps of any of the methods of the above embodiments when executing the program.
Fig. 11 schematically illustrates a block diagram of an electronic device in an exemplary embodiment according to the present disclosure. Referring to fig. 11, an electronic device 1100 includes: a processor 1101 and a memory 1102.
In the embodiments of the disclosure, the processor 1101 is the control center of the computer system and may be a processor of a physical machine or of a virtual machine. The processor 1101 may include one or more processing cores, such as a 4-core or 8-core processor, and may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor 1101 may also include a main processor and a coprocessor: the main processor, also called a CPU (Central Processing Unit), processes data in the awake state, while the coprocessor is a low-power processor for processing data in the standby state.
In the embodiment of the present disclosure, the processor 1101 is specifically configured to:
determining a detection area, and acquiring multi-frame point cloud data and a plurality of measurement speeds of a dynamic target in the detection area; determining motion compensation according to the plurality of measurement speeds based on the time points at which the plurality of measurement speeds are acquired and the time points at which the multi-frame point cloud data are acquired; performing registration processing on the multi-frame point cloud data based on the motion compensation to obtain a reconstructed point cloud of the moving target; and determining the measured apparent size of the dynamic target according to the reconstructed point cloud and the motion direction of the dynamic target.
Further, the acquiring of the multi-frame point cloud data and the plurality of measurement speeds of the dynamic target in the detection area includes: acquiring point cloud data C of the moving target through a laser radar, where C = {(T_C0, C_0), (T_C1, C_1), …, (T_Ci, C_i), …, (T_CN, C_N)}, T_Ci denotes the time point at which the i-th frame of point cloud data C_i is acquired, N is a positive integer, and i is a positive integer not greater than N; and, in response to the laser radar determining that the dynamic target has entered the detection area, triggering a speed-measuring radar to acquire measurement speeds V of the moving target, where V = {(T_V0, V_0), (T_V1, V_1), …, (T_Vk, V_k), …, (T_VM, V_M)}, T_Vk denotes the time point at which measurement speed V_k is acquired, M is a positive integer, and k is a positive integer not greater than M.
Further, the determining of motion compensation according to the plurality of measurement speeds, based on the time points at which the plurality of measurement speeds are acquired and the time points at which the multi-frame point cloud data are acquired, includes: determining a compensation mode according to the magnitude relation between T_Ci and T_Vk, determining a compensation speed according to the measurement speed V_k and an estimated speed, and determining the compensation mode and the compensation speed as the motion compensation.
Further, the determining of the compensation mode according to the magnitude relation between T_Ci and T_Vk, and of the compensation speed according to the measurement speed V_k and the estimated speed, includes: if T_Ci < T_V0 and T_V0 − T_Ci < T0, performing motion compensation in a uniform-linear-motion mode and determining the compensation speed as the measurement speed V_0 acquired at time point T_V0, where T0 is a first preset duration; if T_Vl < T_Ci < T_Vp, performing motion compensation in a uniform-linear-motion mode and obtaining the compensation speed by interpolating between V_l and V_p, where l and p are positive integers less than M, l < p, and V_l and V_p are the measurement speeds acquired at time points T_Vl and T_Vp, respectively; and, if T_Ci > T_Vk, performing motion compensation in a uniform-linear-motion mode, where the compensation speed for the first stage is the measurement speed V_k acquired at time point T_Vk, and the compensation speed for the second stage is an estimated speed determined from V_k and the point cloud registration speed.
Further, the determining of the measured apparent size of the dynamic target according to the reconstructed point cloud and the motion direction of the dynamic target includes: determining a minimum circumscribed rectangle of the reconstructed point cloud based on the motion direction and the direction perpendicular to the motion direction; determining the extent of the minimum circumscribed rectangle in the direction perpendicular to the motion direction as the width information W of the moving target, and its extent along the motion direction as the length information L of the moving target; and determining the maximum height value in the reconstructed point cloud as the height information H of the moving target.
Further, the method further includes: acquiring background height information of the detection area through the laser radar; rasterizing the detection area into X × Y grids, where X and Y are positive integers; and acquiring background height information B(x, y) of each grid, where B(x, y) represents the ground-point height of grid (x, y), x takes the values 1, 2, …, X, and y takes the values 1, 2, …, Y.
The determining of the measured apparent size of the dynamic target according to the reconstructed point cloud and the motion direction of the dynamic target then includes: acquiring measurement height information G(x, y) of each grid from the reconstructed point cloud; calculating the height difference between the measurement height information G(x, y) and the background height information B(x, y), and binarizing the reconstructed point cloud according to the height difference; determining a minimum circumscribed rectangle of the binarized reconstructed point cloud based on the motion direction and the direction perpendicular to the motion direction; determining the rectangle's extent perpendicular to the motion direction as the width information W of the moving target and its extent along the motion direction as the length information L of the moving target; and determining the maximum value of the height difference as the height information H of the moving target.
Further, after the reconstructed point cloud of the moving target is obtained, the method further includes: calculating the motion speed of the moving target from adjacent registered point clouds to obtain a point cloud registration speed; and correcting the speed of the speed-measuring radar with the point cloud registration speed.
Further, the method includes: obtaining the actual apparent size of the moving target, including the width information W', length information L', and height information H' of the moving target; comparing the measured apparent size of the dynamic target with the corresponding actual apparent size of the dynamic target; and determining the dynamic target as a target to be processed when the comparison result includes one or more of W > W', L > L', and H > H'.
Further, the acquiring of the actual apparent size of the moving target includes: in response to the laser radar determining that the dynamic target has entered the detection area, triggering a camera component to acquire an image of the moving target; performing image pixel compensation according to the measurement speed to remove motion blur from the image; and identifying the moving target from the deblurred image, and acquiring the actual apparent size of the moving target according to the identified identity.
Memory 1102 may include one or more computer-readable storage media, which may be non-transitory. Memory 1102 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments of the present disclosure, a non-transitory computer readable storage medium in memory 1102 is used to store at least one instruction for execution by processor 1101 to implement the methods in embodiments of the present disclosure.
In some embodiments, the electronic device 1100 further includes: a peripheral interface 1103 and at least one peripheral. The processor 1101, memory 1102, and peripheral interface 1103 may be connected by a bus or signal lines. The individual peripheral devices may be connected to the peripheral device interface 1103 by buses, signal lines or circuit boards. Specifically, the peripheral device includes: at least one of a display 1104, a camera 1105, and audio circuitry 1106.
The peripheral interface 1103 may be used to connect at least one I/O (Input/Output)-related peripheral device to the processor 1101 and the memory 1102. In some embodiments of the present disclosure, the processor 1101, the memory 1102, and the peripheral interface 1103 are integrated on the same chip or circuit board; in some other embodiments of the present disclosure, any one or two of the processor 1101, the memory 1102, and the peripheral interface 1103 may be implemented on a separate chip or circuit board. The embodiments of the present disclosure are not particularly limited in this respect.
The display 1104 is used to display a UI (User Interface), which may include graphics, text, icons, video, and any combination thereof. When the display 1104 is a touch display, it can also capture touch signals at or above its surface, which may be input to the processor 1101 as control signals for processing. In that case, the display 1104 may also provide virtual buttons and/or a virtual keyboard, also called soft buttons and/or a soft keyboard. In some embodiments of the present disclosure, there may be one display 1104, providing the front panel of the electronic device 1100; in other embodiments, there may be at least two displays 1104, disposed on different surfaces of the electronic device 1100 or in a folded design; in still other embodiments, the display 1104 may be a flexible display disposed on a curved or folded surface of the electronic device 1100. The display 1104 may even be arranged in a non-rectangular irregular pattern, i.e., an irregularly-shaped screen. The display 1104 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera 1105 is used to capture images or video. Optionally, the cameras 1105 include a front camera and a rear camera. In general, the front camera is disposed on the front panel of the electronic device and the rear camera on its rear surface. In some embodiments, there are at least two rear cameras, each being one of a main camera, a depth camera, a wide-angle camera, and a telephoto camera, so that the main camera can be fused with the depth camera for background blurring, or with the wide-angle camera for panoramic and VR (Virtual Reality) shooting or other fused shooting functions. In some embodiments of the present disclosure, the camera 1105 may also include a flash, which may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash combines a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuitry 1106 may include a microphone and a speaker. The microphone collects sound waves from users and the environment and converts them into electrical signals that are input to the processor 1101 for processing. For stereo acquisition or noise reduction, there may be multiple microphones disposed at different locations of the electronic device 1100. The microphone may also be an array microphone or an omnidirectional pickup microphone.
A power supply 1107 is used to power the various components in the electronic device 1100. The power supply 1107 may use alternating current, direct current, disposable batteries, or rechargeable batteries. When the power supply 1107 includes a rechargeable battery, it may be a wired rechargeable battery charged through a wired line or a wireless rechargeable battery charged through a wireless coil; the rechargeable battery may also support fast-charge technology.
The block diagram of the electronic device structure shown in the embodiments of the present disclosure does not constitute a limitation on the electronic device 1100, which may include more or fewer components than illustrated, combine certain components, or adopt a different arrangement of components.
In the description of the present disclosure, it is to be understood that the terms "first", "second", and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The specific meaning of such terms in this disclosure will be understood by those of ordinary skill in the art in the specific context. Furthermore, in the description of the present disclosure, unless otherwise indicated, "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
The foregoing is merely specific embodiments of the disclosure, but the protection scope of the disclosure is not limited thereto. Any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the disclosure, and these should be covered by the protection scope of the disclosure. Accordingly, equivalent variations made according to the claims of the present disclosure remain within the scope of this disclosure.

Claims (10)

1. A device for determining the apparent size of a dynamic target, the device comprising:
an acquisition module, configured to determine a detection area and acquire multi-frame point cloud data and a plurality of measurement speeds of a dynamic target in the detection area;

a compensation determination module, configured to determine motion compensation according to the plurality of measurement speeds based on the time points at which the plurality of measurement speeds are acquired and the time points at which the multi-frame point cloud data are acquired;

a registration module, configured to perform registration processing on the multi-frame point cloud data based on the motion compensation to obtain a reconstructed point cloud of the moving target;

and a determination module, configured to determine the measured apparent size of the dynamic target according to the reconstructed point cloud and the motion direction of the dynamic target.
2. The determination device according to claim 1, wherein the compensation determination module is specifically configured to determine a compensation mode according to the magnitude relation between T_Ci and T_Vk, determine a compensation speed according to the measurement speed V_k and an estimated speed, and determine the compensation mode and the compensation speed as the motion compensation;

wherein T_Ci denotes the time point at which the i-th frame of point cloud data C_i is acquired, N is a positive integer, i is a positive integer not greater than N, T_Vk denotes the time point at which measurement speed V_k is acquired, M is a positive integer, and k is a positive integer not greater than M.
3. The determination device according to claim 2, wherein the compensation determination module is specifically configured to,

when T_Ci < T_V0 and T_V0 − T_Ci < T0, perform motion compensation in a uniform-linear-motion mode and determine the compensation speed as the measurement speed V_0 acquired at time point T_V0, where T0 is a first preset duration;

when T_Vl < T_Ci < T_Vp, perform motion compensation in a uniform-linear-motion mode and obtain the compensation speed by interpolating between V_l and V_p, where l and p are positive integers less than M, l < p, and V_l and V_p are the measurement speeds acquired at time points T_Vl and T_Vp, respectively;

when T_Ci > T_Vk, perform motion compensation in a uniform-linear-motion mode, where the compensation speed for the first stage is the measurement speed V_k acquired at time point T_Vk, and the compensation speed for the second stage is an estimated speed determined from V_k and the point cloud registration speed.
4. A determining apparatus according to claim 2 or 3, wherein the compensation mode is uniform linear motion or uniform acceleration motion.
5. The determination device according to claim 2 or 3, wherein the compensation determination module is further configured to smooth the estimated speed using a Kalman filter.
6. A determining device according to claim 3, wherein the point cloud registration speed is determined from a time difference of two frames of point cloud data.
7. The determining apparatus according to claim 1, wherein,
the acquisition module is further configured to acquire the actual apparent size of the moving target, including the width information W', length information L', and height information H' of the moving target;

the determination module is configured to compare the measured apparent size of the dynamic target with the corresponding actual apparent size of the dynamic target, and to determine the dynamic target as a target to be processed when the comparison result includes one or more of W > W', L > L', and H > H'.
8. The determination device according to claim 7, wherein the acquiring of the actual apparent size of the moving target includes: in response to the laser radar determining that the dynamic target has entered the detection area, triggering a camera component to acquire an image of the moving target; performing image pixel compensation according to the measurement speed to remove motion blur from the image; and identifying the moving target from the deblurred image, and acquiring the actual apparent size of the moving target according to the identified identity.
9. The apparatus according to claim 8, wherein the acquisition module is further configured to identify a license plate in the image of the moving object.
10. A method for determining an apparent size of a dynamic target, the method comprising:
determining a detection area, and acquiring multi-frame point cloud data and a plurality of measurement speeds of a dynamic target in the detection area;
determining motion compensation according to the plurality of measurement speeds based on the time points at which the plurality of measurement speeds are acquired and the time points at which the multi-frame point cloud data are acquired;

performing registration processing on the multi-frame point cloud data based on the motion compensation to obtain a reconstructed point cloud of the moving target;

determining the measured apparent size of the dynamic target according to the reconstructed point cloud and the motion direction of the dynamic target;

and comparing the measured apparent size of the dynamic target with the corresponding actual apparent size of the dynamic target.
CN202310471451.2A 2021-09-30 2021-09-30 Method and device for determining appearance size of dynamic target Pending CN116659376A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310471451.2A CN116659376A (en) 2021-09-30 2021-09-30 Method and device for determining appearance size of dynamic target

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202310471451.2A CN116659376A (en) 2021-09-30 2021-09-30 Method and device for determining appearance size of dynamic target
CN202111169245.3A CN114111568B (en) 2021-09-30 2021-09-30 Method and device for determining appearance size of dynamic target, medium and electronic equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202111169245.3A Division CN114111568B (en) 2021-09-30 2021-09-30 Method and device for determining appearance size of dynamic target, medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN116659376A true CN116659376A (en) 2023-08-29

Family

ID=80441363

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310471451.2A Pending CN116659376A (en) 2021-09-30 2021-09-30 Method and device for determining appearance size of dynamic target
CN202111169245.3A Active CN114111568B (en) 2021-09-30 2021-09-30 Method and device for determining appearance size of dynamic target, medium and electronic equipment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202111169245.3A Active CN114111568B (en) 2021-09-30 2021-09-30 Method and device for determining appearance size of dynamic target, medium and electronic equipment

Country Status (1)

Country Link
CN (2) CN116659376A (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114863695B (en) * 2022-05-30 2023-04-18 中邮建技术有限公司 Overproof vehicle detection system and method based on vehicle-mounted laser and camera
CN114820392B (en) * 2022-06-28 2022-10-18 新石器慧通(北京)科技有限公司 Laser radar detection moving target distortion compensation method, device and storage medium
CN116229040A (en) * 2022-07-15 2023-06-06 深圳市速腾聚创科技有限公司 Target area positioning method and target area positioning device
CN115205284B (en) * 2022-09-09 2023-02-14 深圳市速腾聚创科技有限公司 Target object detection method and device, medium and electronic equipment
CN115876098B (en) * 2022-12-12 2023-10-24 苏州思卡信息系统有限公司 Vehicle size measurement method of multi-beam laser radar

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107817503B (en) * 2016-09-14 2018-12-21 北京百度网讯科技有限公司 Motion compensation process and device applied to laser point cloud data
CN111699410B (en) * 2019-05-29 2024-06-07 深圳市卓驭科技有限公司 Processing method, equipment and computer readable storage medium of point cloud
CN111260683B (en) * 2020-01-09 2023-08-08 合肥工业大学 Target detection and tracking method and device for three-dimensional point cloud data
CN112731450B (en) * 2020-08-19 2023-06-30 深圳市速腾聚创科技有限公司 Point cloud motion compensation method, device and system
CN111932943B (en) * 2020-10-15 2021-05-14 深圳市速腾聚创科技有限公司 Dynamic target detection method and device, storage medium and roadbed monitoring equipment
CN112733641B (en) * 2020-12-29 2024-07-05 深圳依时货拉拉科技有限公司 Object size measuring method, device, equipment and storage medium
CN112462348B (en) * 2021-02-01 2021-04-27 知行汽车科技(苏州)有限公司 Method and device for amplifying laser point cloud data and storage medium

Also Published As

Publication number Publication date
CN114111568B (en) 2023-05-23
CN114111568A (en) 2022-03-01

Similar Documents

Publication Publication Date Title
CN114111568B (en) Method and device for determining appearance size of dynamic target, medium and electronic equipment
US10580164B2 (en) Automatic camera calibration
CN106462996B (en) Method and device for displaying vehicle surrounding environment without distortion
CN110826499A (en) Object space parameter detection method and device, electronic equipment and storage medium
JP2013232091A (en) Approaching object detection device, approaching object detection method and approaching object detection computer program
CN112368756A (en) Method for calculating collision time of object and vehicle, calculating device and vehicle
JP7050763B2 (en) Detection of objects from camera images
CN114913506A (en) 3D target detection method and device based on multi-view fusion
CN112172797B (en) Parking control method, device, equipment and storage medium
CN106168988A (en) Rule is sheltered and for the method and apparatus sheltering the image information of video camera for producing
CN109664854A (en) A kind of automobile method for early warning, device and electronic equipment
JP2013148356A (en) Vehicle position calculation device
US10134182B1 (en) Large scale dense mapping
CN114545434A (en) Road side visual angle speed measurement method and system, electronic equipment and storage medium
Petrovai et al. Obstacle detection using stereovision for Android-based mobile devices
CN113160322A (en) Camera calibration method and device, medium and electronic device
JPWO2019123582A1 (en) Object information generator and object information generator
Peng et al. A real-time fisheye video correction method based on Android smart-phone GPU
CN114973208B (en) Vehicle blind area monitoring and early warning method and related equipment
JP2021148730A (en) Position estimation method, position estimation device, and program
JP2015041187A (en) Traffic measurement device, and traffic measurement method
CN112215033B (en) Method, device and system for generating panoramic looking-around image of vehicle and storage medium
CN114170267A (en) Target tracking method, device, equipment and computer readable storage medium
JP2021051348A (en) Object distance estimation apparatus and object distance estimation method
CN115082662B (en) Target area positioning method and target area positioning device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination