CN110673123A - Target object ranging method and device - Google Patents

Target object ranging method and device

Info

Publication number
CN110673123A
Authority
CN
China
Prior art keywords
target object
distance
relative distance
relative
width
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911015688.XA
Other languages
Chinese (zh)
Other versions
CN110673123B (en)
Inventor
潘铭星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Horizon Robotics Technology Research and Development Co Ltd
Original Assignee
Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Horizon Robotics Technology Research and Development Co Ltd filed Critical Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority to CN201911015688.XA priority Critical patent/CN110673123B/en
Publication of CN110673123A publication Critical patent/CN110673123A/en
Application granted granted Critical
Publication of CN110673123B publication Critical patent/CN110673123B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The invention discloses a ranging method and device for a target object. A first actual width of the target object is determined from an image captured by an image acquisition device; relevant state parameters of the target object at the previous time are then acquired, a first relative distance is obtained from those parameters, and a second actual width of the target object is obtained from the first relative distance. A relative difference is then computed from the first and second actual widths, and the first relative distance is adjusted to a second relative distance according to that difference. Because the actual width of the target object is obtained in two different ways, the comparison of the two widths can be used to constrain and correct the relative distance between the two movable devices, making that distance accurate, improving ranging precision, and safeguarding driving safety.

Description

Target object ranging method and device
Technical Field
The application relates to the technical field of automatic driving, in particular to a distance measuring method and device for a target object.
Background
With the continuous development of science and technology, automatic driving has also advanced rapidly. An autonomous vehicle requires no driver on board; the whole driving process is controlled automatically by a computer.
One of the major concerns of autonomous-driving research is ranging, i.e., measuring the distance between a preceding autonomous device and the current autonomous device. The accuracy of this measurement directly affects driving safety and driving efficiency. For example, if the distance between a leading and a trailing vehicle is measured inaccurately, the trailing vehicle may collide with the leading one, causing a traffic accident and seriously compromising driving safety. Likewise, inaccurate distance measurement between leading and trailing unmanned aerial vehicles may lead to accidents such as collisions and crashes.
Therefore, how to improve the ranging accuracy is a problem that needs to be solved urgently at present.
Disclosure of Invention
The present application is proposed to solve the above-mentioned technical problems.
According to an aspect of the present application, there is provided a ranging method of a target object, the method including: determining a first actual width of the target object according to an image acquired by an image acquisition device;
acquiring relevant state parameters of the target object at the previous moment; obtaining a first relative distance according to the relevant state parameter, wherein the first relative distance is the distance between the image acquisition device and the target object at the current moment; obtaining a second actual width of the target object according to the first relative distance; obtaining a relative difference value according to the first actual width and the second actual width; and adjusting the first relative distance to be a second relative distance according to the relative difference.
According to another aspect of the present application, there is provided a ranging apparatus for a target object, including: the first determining module is used for determining a first actual width of the target object according to the image acquired by the image acquisition device; the acquisition module is used for acquiring the relevant state parameters of the target object at the previous moment; the first processing module is used for obtaining a first relative distance according to the relevant state parameter, wherein the first relative distance is the distance between the image acquisition device and the target object at the current moment; the second processing module is used for obtaining a second actual width of the target object according to the first relative distance; the comparison module is used for obtaining a relative difference value according to the first actual width and the second actual width; and the adjusting module is used for adjusting the first relative distance into a second relative distance according to the relative difference.
According to still another aspect of the present application, there is provided an electronic apparatus including: a processor; and a memory having stored therein computer program instructions which, when executed by the processor, cause the processor to perform the method as described above.
According to yet another aspect of the application, there is provided a computer readable medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the method as described above.
Compared with the prior art, in the present application a first actual width of the target object is determined from an image captured by an image acquisition device, and relevant state parameters of the target object at the previous time are then acquired. Because these parameters truly reflect the actual state of the target object during driving, the first relative distance obtained from them gives the distance between the image acquisition device and the target object at the current time. A second actual width of the target object is further obtained from the first relative distance. A relative difference is then computed from the first and second actual widths, and the first relative distance is adjusted to a second relative distance accordingly. Since the actual width of the target object is obtained in two different ways, the comparison of the two widths constrains and corrects the relative distance between the two movable devices, making that distance accurate, improving ranging precision, and safeguarding driving safety.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in more detail embodiments of the present application with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1 is a flowchart illustrating a method for measuring a distance to a target object according to an exemplary embodiment of the present disclosure.
Fig. 2 is a schematic flowchart of a process before determining a first actual width of a target object according to another exemplary embodiment of the present application.
FIG. 3A is a projection relationship diagram provided by an exemplary embodiment of the present application;
fig. 3B is a schematic illustration of an unmanned vehicle provided by an exemplary embodiment of the present application.
Fig. 4 is a schematic flow chart of obtaining the first relative distance according to an exemplary embodiment of the present application.
FIG. 5 is a schematic illustration of a second actual width provided by an exemplary embodiment of the present application.
Fig. 6 is a flowchart of a method for adjusting a first relative distance to a second relative distance according to a relative difference according to an exemplary embodiment of the present application.
Fig. 7 is a schematic diagram of a ranging apparatus for a target object according to an exemplary embodiment of the present application.
Fig. 8 is another schematic diagram of a ranging apparatus for a target object according to an exemplary embodiment of the present disclosure.
Fig. 9 is an exemplary block diagram of the first processing module 730 according to an exemplary embodiment of the present application.
Fig. 10 is an exemplary block diagram of an adjustment module 760 provided in an exemplary embodiment of the present application.
Fig. 11 is a block diagram of an electronic device provided in an exemplary embodiment of the present application.
Detailed Description
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be understood that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and that the present application is not limited by the example embodiments described herein.
Summary of the application
Existing ranging approaches are generally classified, by the number of image acquisition devices used, into monocular ranging, binocular ranging, trinocular ranging, and so on. In monocular ranging, for example, a single image acquisition device (mounted on a trailing movable apparatus) measures the distance between the trailing movable apparatus and a preceding movable apparatus (also referred to herein as the "target object").
In monocular ranging, the accuracy of the measured distance directly affects driving safety and driving efficiency; an inaccurate measurement seriously compromises driving safety.
In view of the above problems, the present application aims to improve the accuracy of monocular ranging. To that end, it provides a ranging method for a target object: a first actual width of the target object is determined from an image captured by an image acquisition device, and relevant state parameters of the target object at the previous time are then acquired. Because these parameters truly reflect the actual state of the target object during driving, the first relative distance obtained from them gives the distance between the image acquisition device and the target object at the current time. A second actual width of the target object is further obtained from the first relative distance. A relative difference is then computed from the first and second actual widths, and the first relative distance is adjusted to a second relative distance accordingly. Since the actual width of the target object is obtained in two different ways, the comparison of the two widths constrains and corrects the relative distance between the two movable devices, making that distance accurate, improving ranging precision, and safeguarding driving safety.
Exemplary method
Fig. 1 is a flowchart illustrating a ranging method for a target object according to an exemplary embodiment of the present disclosure. The embodiment can be applied to movable equipment capable of autonomous movement, including unmanned vehicles, unmanned aerial vehicles, robotic arms, mobile robots, and the like.
This embodiment applies to monocular ranging: while two movable devices drive one behind the other, the trailing movable device measures the distance to the preceding movable device (the target object). Note that the trailing device performs ranging using images captured by its image acquisition device, so the distance between the image acquisition device and the target object is equivalent to the distance between the trailing device and the target object.
A ranging method of a target object described in one or more embodiments of the present application is shown in fig. 1, and includes the following steps:
step 101, determining a first actual width of the target object according to the image acquired by the image acquisition device.
The image capturing device of this embodiment may be a camera, an infrared camera, or other devices, and of course, the specific type is not limited, and any device having an image capturing function should be included in the scope of this embodiment.
The image obtained in this example contains the following information: the width of the target object in the image (which may be considered as a virtual width), the morphology of the target object (shape, state, appearance, etc.). In addition, the image acquisition device has a focal length, which is set in advance.
The first actual width is used to characterize the true width of the target object, which is processed from one or more of the above-mentioned information contained in the image. The specific implementation process will be described in detail later, and will not be described herein again.
Taking the unmanned vehicle as an example, the unmanned vehicle has a virtual vehicle width in the captured image, and the first actual width refers to a real vehicle width of the unmanned vehicle.
And 102, acquiring relevant state parameters of the target object at the previous moment.
In particular, the relevant state parameters reflect the actual state the target object presents during driving. Since these parameters may differ from moment to moment, the relevant state parameters of the target object at each time include a correlated distance parameter, a correlated velocity parameter, and process noise; the process noise in turn comprises velocity noise and distance noise. The phrase "relevant state parameters at the previous time" in this embodiment refers to the parameters of the target object at the time immediately before the current time. By the time the target object reaches the current time, its relevant state parameters at the previous time are already available and can be obtained through the specific implementations described in detail later; they are not repeated here.
The correlated velocity parameter is the driving speed of the target object, and its value at each time depends on the relevant state parameters at the previous time: the correlated velocity at one moment affects the correlated velocity at the next. Taking the current time as an example, the correlated velocity parameter at the current time depends on the correlated velocity parameter at the previous time, the velocity noise at the current time, and so on.
The correlated distance parameter is the distance between the image acquisition device and the target object, and likewise depends on the relevant state parameters at the previous time. Taking the current time as an example, the correlated distance parameter at the current time depends on the correlated distance parameter at the previous time, the correlated velocity parameter at the previous time, the time difference between the previous and current times, the distance noise, and so on.
The velocity noise at each time regulates the velocity estimate at that time; the distance noise at each time regulates the distance estimate at that time.
Further, during the driving of the target object, the relevant state parameters presented at each time may differ, and those at the previous time affect those at the current time. The relevant state parameters at the previous time are therefore obtained as the basic input for computing the first relative distance. The specific implementation is described later and not repeated here.
Step 103, obtaining a first relative distance according to the relevant state parameter.
Specifically, the first relative distance is a distance between the image acquisition device and the target object at the present time.
The obtained related state parameters can reflect the actual state of the target object in the driving process, so that the first relative distance obtained by the related state parameters can truly and accurately reflect the relative distance between the image acquisition device and the target object.
And 104, acquiring a second actual width of the target object according to the first relative distance.
The second actual width likewise characterises the real width of the target object, but is obtained from the first relative distance. The first actual width is the true width of the target object as derived from the information contained in the image; the second actual width is the true width as derived from the real state parameters of the target object. The two actual widths thus come from different sources.
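The embodiment does not spell out how the second actual width follows from the first relative distance; under the usual pinhole-camera model it would follow from the width of the target object in the image and the focal length mentioned in step 101. A minimal sketch under that assumption (function and variable names are illustrative, not from the patent):

```python
def second_actual_width(image_width_px: float,
                        focal_length_px: float,
                        first_relative_distance_m: float) -> float:
    """Pinhole-camera estimate: actual width = image width * distance / focal length.

    image_width_px and focal_length_px are both in pixels, so the result
    carries the unit of first_relative_distance_m (metres).
    """
    return image_width_px * first_relative_distance_m / focal_length_px

# Example: a vehicle spanning 200 px, with a 1000 px focal length,
# at a first relative distance of 9 m, gives an estimated width of 1.8 m.
```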
And 105, obtaining a relative difference value according to the first actual width and the second actual width.
Comparing the two, the relative difference can be obtained. If the relative difference is small (e.g., smaller than a predetermined threshold), it indicates that both actual widths are relatively accurate. If the relative difference is large (e.g., above a predetermined threshold), it reflects a deviation in the measured first relative distance, and further adjustment is required.
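The threshold comparison described above can be sketched as follows; treating the relative difference as an absolute difference, and the concrete threshold value, are assumptions for illustration:

```python
def needs_adjustment(first_actual_width: float,
                     second_actual_width: float,
                     threshold: float = 0.1) -> bool:
    """Return True when the relative difference between the two width
    estimates exceeds the preset threshold, i.e. the first relative
    distance is considered deviated and must be adjusted."""
    relative_difference = abs(first_actual_width - second_actual_width)
    return relative_difference > threshold
```

With a 0.1 m threshold, widths of 1.8 m and 2.0 m would trigger an adjustment, while 1.8 m and 1.85 m would not.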
And 106, adjusting the first relative distance to a second relative distance according to the relative difference.
The adjustment strategy differs according to the magnitude of the relative difference. By adopting the relative difference as the adjustment criterion, this embodiment further refines the relative distance to the target object.
Through the above analysis, in this embodiment a first actual width of the target object is determined from the image captured by the image acquisition device, and relevant state parameters of the target object at the previous time are then acquired. Because these parameters truly reflect the actual state of the target object during driving, the first relative distance obtained from them gives the distance between the image acquisition device and the target object at the current time. A second actual width of the target object is further obtained from the first relative distance. A relative difference is then computed from the first and second actual widths, and the first relative distance is adjusted to a second relative distance accordingly. Since the actual width of the target object is obtained in two different ways, the comparison of the two widths constrains and corrects the relative distance between the two movable devices, making that distance accurate, improving ranging precision, and safeguarding driving safety.
On the basis of the above-mentioned embodiment shown in fig. 1, as an alternative implementation manner of this embodiment, before determining the first actual width of the target object in step 101, referring to fig. 2, the following steps need to be performed:
step 201, the width of the target object in the image is obtained.
Here, the width of the target object in the image is a virtual width, so called to distinguish it from the actual widths in the present application. Once an image is captured, it contains the target object, so the virtual width of the target object can be obtained by recognising the image. Taking an unmanned vehicle as an example (see fig. 3B), the trailing vehicle captures an image of the rear end of the leading vehicle, and the virtual vehicle width of the leading vehicle can be obtained by recognising that image. Notably, the virtual vehicle width of the leading vehicle can be obtained from the image even under extreme conditions (e.g., the leading vehicle is turning, or the leading vehicle is driving straight in a lane).
Step 202, determining the initial width of the target object according to the width of the target object in the image and the projection relation.
The projection relation comprises a mapping relation between a preset reference plane and the image, together with a magnification ratio, which may be set in advance. Referring to fig. 3A, the image can be enlarged onto the preset reference plane according to the magnification ratio; the width of the target object in the image is correspondingly enlarged on the preset reference plane, and the enlarged width is taken as the initial width. In practice, the preset reference plane may be set to the ground.
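Step 202 then reduces to scaling the in-image width by the preset magnification ratio. A minimal sketch, assuming the magnification ratio is a single scalar (names and numbers are illustrative):

```python
def initial_width(image_object_width: float, magnification: float) -> float:
    """Project the virtual width measured in the image onto the preset
    reference plane (e.g. the ground) using the preset magnification
    ratio; the enlarged width serves as the initial width."""
    return image_object_width * magnification

# Example: a 0.02 m in-image width with a magnification ratio of 90
# yields an initial width of 1.8 m.
```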
Step 203, determining a first actual width according to the initial width.
In particular, the initial width of the target object is also used to characterize the true width (first actual width) of the target object. Therefore, in a specific implementation process, the initial width can be directly used as the first actual width.
However, if the image is captured at an unfavourable angle under extreme conditions (for example, while the leading vehicle is turning), the initial width of the target object may deviate from its true width.
In view of this, after obtaining the initial width of the target object, the following operations are performed: inputting the image into a preset model, and determining the type of a target object; and adjusting the initial width of the target object according to the type of the target object to obtain a first actual width.
The target object in the image presents its own form, such as its shape, state, and appearance. Different target objects may have different appearances, distinctive shapes, or their own brand logos; therefore, after the image is input into the preset model, the model can determine the type of the target object from its form.
To determine the type of the target object, a base model (such as a convolutional neural network (CNN), an RNN, or the like) is trained in advance on the sample forms and sample types of a large number of related objects, yielding the preset model. The image is then input into the preset model, which processes the form of the target object in the image and outputs the type of the target object.
The type of the target object can in turn determine its actual width. Taking an unmanned vehicle as an example, the type refers to the vehicle model: if the vehicle in fig. 3B is model A of a certain brand, its vehicle width is fixed. The actual width registered for that model can then serve as the standard for adjusting the initial width into the first actual width, correcting any deviation of the initial width. During adjustment, the width difference between the real width registered for the model and the initial width is determined and compared with a preset threshold. If the width difference is smaller than the preset threshold, either the registered real width or the initial width may be used; if it is larger, the registered real width replaces the initial width. Of course, other adjustment methods, such as combining (adding or subtracting) the width difference with the initial width, also fall within the scope of the present invention.
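The type-based adjustment described above can be sketched as follows; the width catalogue, the threshold value, and all names are assumptions for illustration only:

```python
# Hypothetical catalogue: recognised vehicle model -> known real width (m).
KNOWN_WIDTHS = {"brand_a_model_a": 1.80}

def corrected_first_actual_width(initial_width: float,
                                 vehicle_type: str,
                                 threshold: float = 0.15) -> float:
    """Adjust the initial width using the real width registered for the
    recognised vehicle type, following the thresholded rule above."""
    real_width = KNOWN_WIDTHS.get(vehicle_type)
    if real_width is None:
        return initial_width          # unknown type: keep the estimate
    if abs(real_width - initial_width) < threshold:
        return initial_width          # small deviation: either value serves
    return real_width                 # large deviation: use the registered width
```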
On the basis of the embodiment shown in fig. 1, as an optional implementation manner of this embodiment, in the process of step 102, a relevant distance parameter of the target object at a previous time, a relevant speed parameter at a previous time, and a noise error at a previous time are obtained.
This is done because, during driving, the relevant state parameters of the target object at the previous time influence those at the current time. Taking an unmanned vehicle as an example, with the trailing vehicle's driving unchanged, if the leading vehicle decelerated at the previous time, the distance between the two vehicles at the current time decreases. Taking this actual influence of the previous time's state parameters into account, and using it as the basis for obtaining the first relative distance at the current time, improves the accuracy of the first relative distance.
On the basis of obtaining the relevant state parameter, as an optional implementation manner of this embodiment, in the process of step 103, the following operations are implemented: and obtaining a first relative distance according to the relevant distance parameter at the previous moment and the relevant speed parameter at the previous moment.
More specifically, referring to fig. 4, the above implementation process includes the following specific operation steps:
step 401, obtaining a time difference between a current time and a previous time.
The time difference between the previous time and the current time may be expressed in milliseconds and may take any value, for example 2 ms or 5 ms.
And step 402, obtaining the relative movement distance of the current moment according to the time difference and the related speed parameter of the previous moment.
The process noise can be divided into: distance noise and velocity noise. The distance noise at each moment can be used for regulating and controlling the distance precision at each moment, and the speed noise at each moment can be used for regulating and controlling the speed precision at each moment.
In addition, the relevant state parameters at each time influence those at the immediately following time. Therefore, computing the correlated velocity parameter at the previous time requires the velocity noise at the previous time and the correlated velocity parameter at the time immediately before it. Specifically, the correlated velocity parameter at the time immediately before the previous time and the velocity noise at the previous time may be summed to obtain the correlated velocity parameter at the previous time. For ease of understanding, this can be written as V_{k-1} = V_{k-2} + W_{v,k-1}, where k-1 denotes the previous time, V_{k-1} the correlated velocity parameter at the previous time, W_{v,k-1} the velocity noise at the previous time, k-2 the time immediately before the previous time (also called the previous time of the previous time), and V_{k-2} the correlated velocity parameter at that time.
To obtain the relative movement distance at the current time from the time difference and the correlated velocity parameter at the previous time, the product of the two can be taken as the relative movement distance. Following the example above, the relative movement distance at the current time is V_{k-1} * Δt, where Δt denotes the time difference between the current time and the previous time.
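The two relations above, the velocity update V_{k-1} = V_{k-2} + W_{v,k-1} and the relative movement distance V_{k-1} * Δt, can be sketched directly; the function names and the concrete numbers are illustrative:

```python
def velocity_at_previous_time(v_before_previous: float,
                              velocity_noise_previous: float) -> float:
    """V_{k-1} = V_{k-2} + W_{v,k-1}: the correlated velocity at the
    previous time is the velocity one step earlier plus the velocity
    noise at the previous time."""
    return v_before_previous + velocity_noise_previous

def relative_movement_distance(v_previous: float, dt: float) -> float:
    """Distance the target object moves relative to the image acquisition
    device over the time difference dt: V_{k-1} * dt (step 402)."""
    return v_previous * dt

# Example: 10.0 m/s plus 0.2 m/s of velocity noise gives 10.2 m/s;
# over a 5 ms time difference that is a 0.051 m relative movement.
```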
In the above operation, the relative movement distance is obtained by combining the relevant state parameters at the previous time (the relevant speed parameter, the time difference, and the like at the previous time), so that the influence of the change of the relevant state parameters at the previous time on the relative movement distance can be comprehensively considered, and the accuracy of the relative movement distance at the current time can be improved.
In step 403, a first relative distance is obtained based on the relative distance parameter at the previous time, the relative movement distance at the current time, and the distance noise at the current time.
Specifically, the relative distance parameter at the previous time is used to represent the distance between the target object and the image acquisition device at the previous time. The relative movement distance at the current time is used to represent the distance the target object has moved relative to the image acquisition device within the time difference. Therefore, the first relative distance can be obtained by summing the relative distance parameter at the previous time, the relative movement distance at the current time, and the distance noise at the current time.
Further, the distance noise at the current time is obtained as follows: the distance noise at the previous time is obtained, and the distance noise at the current time is then obtained from the distance noise at the previous time and the time difference. Specifically, the distance noise at the current time can be obtained by summing the distance noise at the previous time and the time difference.
For ease of understanding, in conjunction with the notation given above, this embodiment obtains the first relative distance as S_k = S_{k-1} + V_{k-1}·Δt + W_{s,k,1}, where k denotes the current time, S_k denotes the first relative distance at the current time, S_{k-1} denotes the relative distance parameter at the previous time, V_{k-1}·Δt denotes the relative movement distance at the current time, and W_{s,k,1} denotes the distance noise at the current time.
As can be seen, in the above operation, the first relative distance is obtained by combining the relevant state parameters at the previous time (the relative distance parameter, the relative movement distance, and the like) with the distance noise at the current time, so that the influence of changes in the relevant state parameters at the previous time on the first relative distance can be comprehensively considered, and the accuracy of the first relative distance at the current time can be improved.
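The prediction of the first relative distance described above can be sketched as a single sum. Again, the names and sample values below are illustrative assumptions, not taken from this application.

```python
def first_relative_distance(s_prev: float, v_prev: float,
                            dt: float, w_s: float) -> float:
    # S_k = S_{k-1} + V_{k-1}*Δt + W_{s,k,1}: the relative distance
    # parameter at the previous time, plus the relative movement over
    # the time difference, plus the distance noise at the current time.
    return s_prev + v_prev * dt + w_s


# Example: 50 m away at the previous time, closing at 2 m/s over a
# 0.1 s interval, with 0.05 m of distance noise.
s_k = first_relative_distance(50.0, -2.0, 0.1, 0.05)  # about 49.85 m
```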
From the above analysis, the relevant state parameters at the previous time and the various parameters at the current time are comprehensively considered in obtaining the first relative distance. Since these parameters comprehensively reflect the real-time driving state of the target object, the first relative distance obtained on this basis can accurately reflect the relative positional relationship between the target object and the image acquisition device, thereby achieving accurate distance measurement and further ensuring driving safety.
On the basis of the embodiment shown in fig. 1, as an optional implementation manner of this embodiment, the step 104 specifically includes the following operations: obtaining the focal length of the image acquisition device; and processing the focal length, the width of the target object in the image and the first relative distance to obtain a second actual width.
Wherein, the focal length of the image acquisition device is set in advance.
The second actual width is obtained according to the pinhole imaging principle. Specifically, the ratio of the focal length to the first relative distance is equal to the ratio of the width of the target object in the image to the second actual width, which is obtained from this relationship.
Referring to fig. 5, the width of the target object in the image captured by the image acquisition device is p, the first relative distance between the target object and the image acquisition device is S_k, and the focal length of the image acquisition device is f, where f is set in advance. The imaging scale can be calculated from the distance and the focal length as f/S_k. Scaling the width p of the target object in the image back by this ratio then yields the actual width D of the target object.
Namely:

f / S_k = p / D

which can be rearranged as:

D = p · S_k / f
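The pinhole relationship above (f/S_k = p/D) reduces to a one-line computation. The sample values below are illustrative only.

```python
def second_actual_width(p: float, s_k: float, f: float) -> float:
    # From f / S_k = p / D, the actual width is D = p * S_k / f.
    return p * s_k / f


# Example: a 2 mm image-plane width at a 50 m relative distance with a
# 25 mm focal length (all in metres) implies an actual width of 4 m.
d = second_actual_width(0.002, 50.0, 0.025)  # about 4.0 m
```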
referring to fig. 6, on the basis of the embodiment shown in fig. 1, as an alternative implementation manner of this embodiment, the step 106 specifically includes the following operations:
step 601, judging whether the relative difference is smaller than a preset threshold value.
Specifically, the specific value of the preset threshold needs to be adjusted according to experience and actual conditions, and this embodiment is not limited herein.
After the determination is performed, the result is one of the following two cases:

First, the relative difference is smaller than the preset threshold. This indicates that the first actual width and the second actual width are relatively close, and step 602 may be performed.

Second, the relative difference is greater than or equal to the preset threshold. This indicates that there may be a deviation in the first relative distance, and step 603 is executed.
Step 602, if yes, determining the first relative distance as the second relative distance.
Wherein the second relative distance is used to characterize the distance between the target object and the image acquisition device at the current time. The second relative distance is the basic data for performing subsequent driving operations, so the requirement on its accuracy is high: the higher the accuracy of the second relative distance, the better driving safety and driving efficiency can be ensured. If the relative difference is smaller than the preset threshold, the obtained first relative distance is sufficiently accurate, and the first relative distance can therefore be directly determined as the second relative distance.
In this way, the first relative distance between the target object and the image acquisition device is constrained by the relative difference between the first actual width and the second actual width together with the preset threshold to obtain the second relative distance. The second relative distance thus obtained represents the relative positional relationship between the target object and the image acquisition device more accurately, so that accurate distance measurement can be achieved and driving safety further ensured.
Step 603, if not, adjusting the distance noise according to the relative difference, and obtaining a second relative distance according to the adjusted distance noise and the relevant state parameters.
And the difference value between the third actual width and the first actual width obtained by the second relative distance is smaller than a preset threshold value.
Specifically, the relative difference and the distance noise have a mapping relationship. The corresponding distance noise is acquired from the mapping relationship according to the obtained relative difference, and the distance noise at the current time is then adjusted accordingly. For example, the distance noise W_{s,k,1} at the current time is adjusted to the adjusted distance noise W_{s,k,2}.
After the adjustment, the second relative distance may be obtained in the same way as the first relative distance. Specifically, the second relative distance can be obtained by summing the relative distance parameter at the previous time, the relative movement distance at the current time, and the adjusted distance noise. Following the above formula, the second relative distance is S_k' = S_{k-1} + V_{k-1}·Δt + W_{s,k,2}, where S_k' denotes the second relative distance and W_{s,k,2} denotes the distance noise after adjustment.
Further, the focal length of the image acquisition device is obtained, and the focal length, the width of the target object in the image, and the second relative distance are processed to obtain a third actual width. This embodiment requires that the difference between the third actual width and the first actual width be smaller than the preset threshold. That is, this embodiment uses the condition that the difference between the third actual width and the first actual width is smaller than the preset threshold as a constraint on the second relative distance, so that the obtained second relative distance can more accurately represent the relative positional relationship between the target object and the image acquisition device, thereby achieving accurate distance measurement and further ensuring driving safety.
It should be noted that, to obtain the second relative distance more accurately, the above steps may be performed multiple times, adjusting the distance noise each time, with the condition that the difference between the third actual width and the first actual width is smaller than the preset threshold as the constraint, until the constraint is satisfied.
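The adjustment loop in steps 601 to 603 can be sketched as below. The application does not specify the mapping between the relative difference and the distance noise, so the proportional correction used here is purely an assumption for illustration, and all names and sample values are hypothetical.

```python
def refine_relative_distance(s_prev: float, v_prev: float, dt: float,
                             w_s: float, d_first: float, p: float, f: float,
                             threshold: float, max_iter: int = 20) -> float:
    # Repeat: predict a candidate distance, derive the width it implies
    # (the "third actual width"), and adjust the distance noise until the
    # implied width is within `threshold` of the first actual width.
    for _ in range(max_iter):
        s_candidate = s_prev + v_prev * dt + w_s
        d_implied = p * s_candidate / f  # width implied by the candidate
        diff = d_implied - d_first       # relative difference
        if abs(diff) < threshold:
            return s_candidate           # second relative distance
        # Assumed correction: shift the noise so the implied width moves
        # toward the first actual width (the real mapping is unspecified).
        w_s -= diff * f / p
    return s_prev + v_prev * dt + w_s


# Example: the predicted 50 m distance implies a 4.0 m width, but the
# first actual width is 3.9 m; the loop settles on about 48.75 m.
s2 = refine_relative_distance(48.0, 10.0, 0.1, 1.0, 3.9,
                              0.002, 0.025, 0.05)
```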
Exemplary devices
Fig. 7 illustrates a block diagram of a ranging apparatus 700 of a target object according to an embodiment of the present application.
As shown in fig. 7, a ranging apparatus 700 for a target object according to an embodiment of the present application includes: a first determining module 710, configured to determine a first actual width of the target object according to the image acquired by the image acquisition apparatus; an obtaining module 720, configured to obtain a relevant state parameter of the target object at a previous time; the first processing module 730, configured to obtain a first relative distance according to the relevant state parameter, where the first relative distance is a distance between the image acquisition device and the target object at the current time; a second processing module 740, configured to obtain a second actual width of the target object according to the first relative distance; a comparing module 750, configured to obtain a relative difference according to the first actual width and the second actual width; an adjusting module 760 for adjusting the first relative distance to the second relative distance according to the relative difference.
FIG. 8 illustrates modules prior to the first determination module 710 according to an embodiment of the application. Specifically, the device further comprises: a first obtaining module 810, configured to obtain a width of the target object in the image;
a second determining module 820, configured to determine an initial width of the target object according to the width of the target object in the image and the projection relationship; the projection relation comprises a mapping relation between a preset reference surface and the image.
A third determining module 830, configured to determine the first actual width according to the initial width.
In one example, the first determining module 710 is specifically configured to input an image into a preset model, and determine a type of the target object; and adjusting the initial width of the target object according to the type of the target object to obtain the first actual width.
In an example, the obtaining module 720 is specifically configured to obtain a distance parameter and a speed parameter of the target object at a previous time; the first processing module 730 is specifically configured to obtain the first relative distance according to the relevant distance parameter at the previous time and the relevant speed parameter at the previous time.
Fig. 9 illustrates an example block diagram of a first processing module 730 according to an embodiment of this application. As shown in fig. 9, in an example, the first processing module 730 specifically includes: a second obtaining module 910, configured to obtain a time difference between the current time and the previous time; a third obtaining module 920, configured to obtain the relative movement distance of the current time according to the time difference and the related speed parameter of the previous time; a fourth obtaining module 930, configured to obtain the first relative distance based on the relative distance parameter at the previous time, the relative movement distance at the current time, and the distance noise.
In one example, the second processing module 740 includes: a fifth obtaining module, configured to obtain a focal length of the image acquisition device; a sixth obtaining module, configured to process the focal length, the width of the target object in the image, and the first relative distance to obtain the second actual width.
Fig. 10 illustrates an example block diagram of an adjustment module 760 in accordance with an embodiment of the present application. As shown in fig. 10, in one example, the adjustment module 760 includes: a determining module 1010, configured to determine whether the relative difference is smaller than a preset threshold; a first adjusting submodule 1020, configured to, if yes, determine the first relative distance as the second relative distance; a second adjusting submodule 1030, configured to, if not, adjust the distance noise according to the relative difference and obtain the second relative distance according to the adjusted distance noise and the relevant state parameter; wherein the difference between the third actual width obtained from the second relative distance and the first actual width is smaller than the preset threshold.
Exemplary Mobile device
FIG. 11 illustrates a block diagram of a mobile device according to an embodiment of the present application.

As shown in fig. 11, the mobile device (electronic device 10) includes one or more processors 11 and a memory 12.

The processor 11 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the mobile device to perform desired functions.
The memory 12 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. Volatile memory may include, for example, Random Access Memory (RAM), cache memory, or the like. Non-volatile memory may include, for example, Read-Only Memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 11 to implement the target object ranging method of the various embodiments of the present application described above and/or other desired functions. Various contents such as an input signal, a signal component, and a noise component may also be stored in the computer-readable storage medium.
In one example, the mobile device may further include an input device 13 and an output device 14, which are interconnected by a bus system and/or another form of connection mechanism (not shown).

For example, when the mobile device is a first device or a second device, the input device 13 may be a microphone or a microphone array as described above, for capturing an input signal of a sound source. When the electronic device is a stand-alone device, the input device 13 may be a communication network connector for receiving the acquired input signals from the first device and the second device.
The input device 13 may also include, for example, a keyboard, a mouse, and the like.
The output device 14 may output various information including determined distance information, direction information, and the like to the outside. The output devices 14 may include, for example, a display, speakers, a printer, and a communication network and its connected remote output devices, among others.
Of course, for simplicity, only some of the components of the mobile device relevant to the present application are shown in fig. 11; components such as buses and input/output interfaces are omitted. In addition, the mobile device may include any other suitable components, depending on the particular application.
Exemplary computer program product and computer-readable storage Medium
In addition to the above-described methods and apparatuses, embodiments of the present application may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps in the target object ranging method according to various embodiments of the present application described in the above-mentioned "exemplary methods" section of this specification.
The computer program product may include program code for carrying out operations of embodiments of the present application, written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform the steps in the target object ranging method according to various embodiments of the present application described in the "exemplary methods" section above in this specification.
A computer-readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing describes the general principles of the present application in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in the present application are merely examples and are not limiting, and they should not be considered essential to the various embodiments of the present application. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the foregoing disclosure is not intended to be exhaustive or to limit the disclosure to the precise details disclosed.
The block diagrams of devices, apparatuses, and systems referred to in this application are given only as illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, these devices, apparatuses, and systems may be connected, arranged, and configured in any manner. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and may be used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, the word "and/or," unless the context clearly indicates otherwise. The phrase "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to".
It should also be noted that in the devices, apparatuses, and methods of the present application, the components or steps may be decomposed and/or recombined. These decompositions and/or recombinations are to be considered as equivalents of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (10)

1. A method of ranging a target object, the method comprising:
determining a first actual width of the target object according to an image acquired by an image acquisition device;
acquiring relevant state parameters of the target object at the previous moment;
obtaining a first relative distance according to the relevant state parameter, wherein the first relative distance is the distance between the image acquisition device and the target object at the current moment;
obtaining a second actual width of the target object according to the first relative distance;
obtaining a relative difference value according to the first actual width and the second actual width;
and adjusting the first relative distance to be a second relative distance according to the relative difference.
2. The method of claim 1, wherein said determining a first actual width of said target object comprises:
obtaining the width of the target object in the image;
determining the initial width of the target object according to the width and the projection relation of the target object in the image; the projection relation comprises a mapping relation between a preset reference surface and the image;
and determining the first actual width according to the initial width.
3. The method of claim 2, wherein said determining the first actual width from the initial width comprises:
inputting the image into a preset model, and determining the type of the target object;
and adjusting the initial width of the target object according to the type of the target object to obtain the first actual width.
4. The method of claim 1, wherein,
the acquiring of the relevant state parameter of the target object at the previous moment includes: acquiring a relevant distance parameter and a relevant speed parameter of the target object at a previous moment;
the obtaining a first relative distance according to the relevant state parameter includes:
and obtaining the first relative distance according to the relevant distance parameter at the previous moment and the relevant speed parameter at the previous moment.
5. The method of claim 4, wherein the obtaining the first relative distance according to the distance parameter associated with the previous time and the speed parameter associated with the previous time comprises:
obtaining a time difference between the current time and the previous time;
obtaining the relative movement distance of the current moment according to the time difference and the related speed parameter of the previous moment;
and obtaining the first relative distance based on the relative distance parameter of the previous moment, the relative movement distance of the current moment and the distance noise.
6. The method of claim 2, wherein said obtaining a second actual width of the target object from the first relative distance comprises:
obtaining a focal length of the image acquisition device;
and processing the focal length, the width of the target object in the image and the first relative distance to obtain the second actual width.
7. The method of claim 1, wherein said adjusting the first relative distance to be the second relative distance according to the relative difference comprises:
judging whether the relative difference value is smaller than a preset threshold value or not;
if so, determining the first relative distance as the second relative distance;
if not, adjusting the distance noise according to the relative difference value; obtaining the second relative distance according to the adjusted distance noise and the relevant state parameter; and the difference value between the third actual width obtained by the second relative distance and the first actual width is smaller than the preset threshold value.
8. A ranging apparatus for a target object, comprising:
the first determining module is used for determining a first actual width of the target object according to the image acquired by the image acquisition device;
the acquisition module is used for acquiring the relevant state parameters of the target object at the previous moment;
the first processing module is used for obtaining a first relative distance according to the relevant state parameter, wherein the first relative distance is the distance between the image acquisition device and the target object at the current moment;
the second processing module is used for obtaining a second actual width of the target object according to the first relative distance;
the comparison module is used for obtaining a relative difference value according to the first actual width and the second actual width;
and the adjusting module is used for adjusting the first relative distance into a second relative distance according to the relative difference.
9. An electronic device, comprising:
a processor; and
a memory having stored therein computer program instructions which, when executed by the processor, cause the processor to perform the method of any of claims 1-7.
10. A computer-readable storage medium, the storage medium storing a computer program for performing the method of any of the preceding claims 1-7.
CN201911015688.XA 2019-10-24 2019-10-24 Target object ranging method and device Active CN110673123B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911015688.XA CN110673123B (en) 2019-10-24 2019-10-24 Target object ranging method and device


Publications (2)

Publication Number Publication Date
CN110673123A true CN110673123A (en) 2020-01-10
CN110673123B CN110673123B (en) 2022-07-29

Family

ID=69083797

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911015688.XA Active CN110673123B (en) 2019-10-24 2019-10-24 Target object ranging method and device

Country Status (1)

Country Link
CN (1) CN110673123B (en)


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102508246A (en) * 2011-10-13 2012-06-20 吉林大学 Method for detecting and tracking obstacles in front of vehicle
US20120200707A1 (en) * 2006-01-04 2012-08-09 Mobileye Technologies Ltd. Estimating distance to an object using a sequence of images recorded by a monocular camera
US8825259B1 (en) * 2013-06-21 2014-09-02 Google Inc. Detecting lane closures and lane shifts by an autonomous vehicle
CN105740804A (en) * 2016-01-27 2016-07-06 大连楼兰科技股份有限公司 Automatic vehicle tracking and driving method based on image processing
CN106019300A (en) * 2016-08-05 2016-10-12 上海思岚科技有限公司 Laser ranging device and laser ranging method thereof
WO2017122641A1 (en) * 2016-01-15 2017-07-20 富士フイルム株式会社 Measurement assistance device and measurement assistance method
CN107966700A (en) * 2017-11-20 2018-04-27 天津大学 A kind of front obstacle detecting system and method for pilotless automobile
CN108872975A (en) * 2017-05-15 2018-11-23 蔚来汽车有限公司 Vehicle-mounted millimeter wave radar filtering estimation method, device and storage medium for target following
CN109203902A (en) * 2017-07-04 2019-01-15 上海汽车集团股份有限公司 A kind of vehicle safety method, system, running crane control device and storage medium
CN109343076A (en) * 2018-10-30 2019-02-15 合肥泰禾光电科技股份有限公司 A kind of distance calibration method and range unit
CN109655823A (en) * 2018-12-30 2019-04-19 北京经纬恒润科技有限公司 The tracking and device of target
CN110361003A (en) * 2018-04-09 2019-10-22 中南大学 Information fusion method, device, computer equipment and computer readable storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ANSELM HASELHOFF et al.: "Radar-Vision Fusion with an Application to Car-Following using an Improved AdaBoost Detection Algorithm", 2007 IEEE Intelligent Transportation Systems Conference *
PANG Cheng: "Front vehicle detection system based on data fusion of ranging radar and machine vision", China Master's Theses Full-text Database, Information Science and Technology (Monthly) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112528974A (en) * 2021-02-08 2021-03-19 成都睿沿科技有限公司 Distance measuring method and device, electronic equipment and readable storage medium
CN112528974B (en) * 2021-02-08 2021-05-14 成都睿沿科技有限公司 Distance measuring method and device, electronic equipment and readable storage medium

Also Published As

Publication number Publication date
CN110673123B (en) 2022-07-29

Similar Documents

Publication Publication Date Title
JP6272566B2 (en) Route prediction device
CN113022580B (en) Trajectory prediction method, trajectory prediction device, storage medium and electronic equipment
EP3627181A1 (en) Multi-sensor calibration method, multi-sensor calibration device, computer device, medium and vehicle
US20200116867A1 (en) Automatic lidar calibration based on pre-collected static reflection map for autonomous driving
US9182477B2 (en) Vehicle radar alignment method and system
EP2533226A1 (en) Vehicle surroundings monitoring device
KR102610001B1 (en) System for sensor synchronization data analysis in autonomous vehicles
CN109544630B (en) Pose information determination method and device and visual point cloud construction method and device
US20170358048A1 (en) Information processing apparatus, information processing method, and program
EP3608687A1 (en) Vehicle advanced assisted driving calibration device
WO2019135281A1 (en) Line-of-sight direction calibration device, line-of-sight direction calibration method, and line-of-sight direction calibration program
CN110673123B (en) Target object ranging method and device
CN112446917A (en) Attitude determination method and device
CN111337010B (en) Positioning method and positioning device of movable equipment and electronic equipment
WO2024188074A1 (en) Cursor position adjustment method and apparatus, and device
WO2024199293A1 (en) Object detection method and apparatus, electronic device, and storage medium
CN110672074A (en) Method and device for measuring distance of target object
CN112304293B (en) Road height detection method and device, readable storage medium and electronic equipment
CN112115739A (en) Vehicle state quantity information acquisition method and device
CN108961337B (en) Vehicle-mounted camera course angle calibration method and device, electronic equipment and vehicle
JP2012098776A (en) Driving support device
CN111212239B (en) Exposure time length adjusting method and device, electronic equipment and storage medium
CN113014899A (en) Binocular image parallax determination method, device and system
CN116931539A (en) System and method for testing a vehicle system
CN117270709B (en) Mouse pointer control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant