CN117630957A - Object distance detection method and device, storage medium and electronic device - Google Patents

Publication number: CN117630957A
Application number: CN202210983482.1A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 吕聪奕, 邬芮璠
Original assignee: Dreame Innovation Technology Suzhou Co Ltd
Legal status: Pending
Classification: Length Measuring Devices By Optical Means

Abstract

The invention discloses an object distance detection method and apparatus, a storage medium, and an electronic apparatus. The method includes: acquiring an image captured by an image acquisition element in a designated device; extracting pixel rows from the image along a preset direction; taking, from a preset number of pixel rows adjacent to a specified pixel row, at least one pixel row whose target pixel point has already been determined as a reference pixel row, where the target pixel point corresponds to the laser information; determining the target pixel point of the specified pixel row based on the target pixel point of the reference pixel row; and determining the distance between the object and the designated device according to the position information of the target pixel point in the image. This solves the technical problem that interference from ambient light makes the measured distance between a robot and an obstacle inaccurate.

Description

Object distance detection method and device, storage medium and electronic device
Technical Field
The present invention relates to the field of data processing, and in particular, to an object distance detection method and apparatus, a storage medium, and an electronic apparatus.
Background
Currently, in order to avoid obstacles during operation, a robot can measure the distance between itself and an obstacle using a laser.
In the related art, measuring the distance between the robot and an obstacle with a laser generally requires extracting the laser information reflected by the obstacle from the collected information. However, the light projected onto the robot's acquisition element includes ambient light in addition to the reflected laser light, which hinders accurate extraction of the laser information and thus accurate measurement of the distance between the robot and the obstacle.
No effective solution to this problem has been proposed in the related art.
Disclosure of Invention
Embodiments of the invention provide an object distance detection method and apparatus, a storage medium, and an electronic apparatus, which can at least improve the accuracy of distance measurement between a robot and an obstacle.
According to an embodiment of the present invention, an object distance detection method is provided, including: acquiring an image captured by an image acquisition element in a designated device, wherein the image contains at least the laser information reflected after laser light emitted by a line laser light source in the designated device is projected onto an object; extracting pixel rows from the image along a preset direction, wherein the angle between the preset direction and the laser-array arrangement direction of the line laser light source is greater than zero; taking, from a preset number of pixel rows adjacent to a specified pixel row, at least one pixel row whose target pixel point has already been determined as a reference pixel row, wherein the target pixel point corresponds to the laser information; determining the target pixel point of the specified pixel row based on the target pixel point of the reference pixel row; and determining the distance between the object and the designated device according to the position information of the target pixel point in the image.
In an exemplary embodiment, after extracting the pixel rows from the image, the method further includes: extracting extremum pixel points in each pixel row, where an extremum pixel point represents a peak of pixel-value fluctuation in the pixel row. Correspondingly, based on the target pixel point of the reference pixel row, the target pixel point of the specified pixel row is extracted from the extremum pixel points of the specified pixel row.
In an exemplary embodiment, the preset direction is perpendicular to the laser-array arrangement direction of the line laser light source.
In an exemplary embodiment, the extracted pixel rows are projected into a planar coordinate system to obtain a waveform diagram reflecting the pixel-value fluctuation in each pixel row, where the first coordinate of the planar coordinate system corresponds to the position of each pixel point in its pixel row and the second coordinate corresponds to its pixel value. The waveforms in the diagram are arranged in the order of the relative positions of their pixel rows in the image, and the target pixel point in each pixel row is determined using the waveform diagram.
In an exemplary embodiment, determining the target pixel point of the specified pixel row using the waveform diagram includes: comparing the first coordinate of each extremum pixel point in the specified pixel row with the first coordinate of the target pixel point of the reference pixel row, and taking the extremum pixel point with the smallest first-coordinate difference as the target pixel point of the specified pixel row.
In an exemplary embodiment, when the specified pixel row contains two or more extremum pixel points, at least one pixel row whose target pixel point has already been determined, among a preset number of pixel rows adjacent to the specified pixel row, is taken as the reference pixel row.
In an exemplary embodiment, when the projection direction of some laser points of the line laser light source intersects the ground, each pixel row is taken in turn as the specified pixel row, with the vertically upward direction as the detection direction.
According to another embodiment of the present invention, an object distance detection apparatus is also provided, including: an acquisition module, configured to acquire an image captured by an image acquisition element in a designated device, where the image contains at least the laser information reflected after laser light emitted by a line laser light source in the designated device is projected onto an object; a first extraction module, configured to extract pixel rows from the image along a preset direction, where the angle between the preset direction and the laser-array arrangement direction of the line laser light source is greater than zero; a second extraction module, configured to take, from a preset number of pixel rows adjacent to a specified pixel row, at least one pixel row whose target pixel point has already been determined as a reference pixel row, where the target pixel point corresponds to the laser information; a first determining module, configured to determine the target pixel point of the specified pixel row based on the target pixel point of the reference pixel row; and a second determining module, configured to determine the distance between the object and the designated device according to the position information of the target pixel point in the image.
According to a further aspect of embodiments of the present invention, there is also provided a computer-readable storage medium having a computer program stored therein, wherein the computer program is configured to perform the above-described object distance detection method when run.
According to still another aspect of the embodiments of the present invention, there is further provided an electronic device including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor executes the object distance detection method described above through the computer program.
In the embodiments of the invention, a direction at an angle to the laser-array arrangement direction is taken as the basic data-extraction direction for distance detection, and pixel information is extracted from an image containing both the laser information reflected by the object and ambient-light information, yielding a series of pixel rows. Based on the continuity of the light distribution, the positions of the laser information already determined in neighboring pixel rows are used as references, so that the pixel points corresponding to the laser information can be extracted more simply and accurately. This reduces the interference of ambient light with the extraction of those pixel points, improves the accuracy of distance measurement between the designated device and the object, and thereby solves the technical problem of inaccurate distance measurement caused by ambient-light interference.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute an undue limitation. In the drawings:
FIG. 1 is a schematic diagram of an application scenario of an alternative object distance detection method according to an embodiment of the present invention;
FIG. 2 is a flow chart of an alternative object distance detection method according to an embodiment of the invention;
FIG. 3 is a schematic diagram of an alternative object distance detection method according to an embodiment of the invention;
FIG. 4 is a schematic diagram of another alternative object distance detection method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of another alternative object distance detection method according to an embodiment of the invention;
FIG. 6 is a schematic diagram of another alternative object distance detection method according to an embodiment of the invention;
FIG. 7 is a block diagram of an alternative object distance detection device according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The object distance detection method provided in the embodiments of this specification can be applied to a designated device. The designated device may be, for example, a sweeping robot, or another device such as a self-moving air purifier or a meal-delivery robot. The designated device is equipped with at least a line laser light source and an image acquisition element. The layout of the line laser light source and the image acquisition element on the designated device can be set as needed and is not limited by this specification.
As shown in FIG. 1(a), the line laser light source may include a linear array of multiple lasers 1, each of which emits a single-point laser beam; accordingly, the line laser light source emits a line laser composed of multiple single-point lasers. The number of lasers in the line laser light source can be set as needed, and the beam directions of the single-point lasers in one line laser light source are kept consistent. For convenience, the direction in which the lasers of a single line laser light source are arranged is referred to as the laser-array arrangement direction. The number of line laser light sources in the designated device, their mounting positions, and their beam directions can all be set as needed.
The image acquisition element includes at least an image sensor array. As shown in FIG. 1(b), the image sensor array is a two-dimensional array of image sensors 2. The number of image sensors in an image acquisition element, the number of image acquisition elements in the designated device, and their mounting positions can all be set as needed.
The laser light emitted by the line laser light source may be projected onto an object such as a table, a stool, a wall, or the floor. After being reflected by the object, the projected laser light is captured by the image acquisition element, producing an image containing the laser information reflected by the object. In a practical application scene, the light captured by the image acquisition element contains not only the reflected laser information but also ambient-light information; the captured ambient light may be bright, with a brightness close to that of the laser light reflected by the object, making the two difficult to distinguish and hindering accurate extraction of the laser information.
The image acquisition element may further transmit the captured image to a processor. The processor may be integrated in the designated device, or in a smart terminal or cloud that communicates with the designated device. The processor extracts the pixel points corresponding to the laser information from the image. Each pixel point in the captured image corresponds to an image sensor in the image sensor array, so determining the pixel points corresponding to the laser information also determines the positions of the image sensors that captured it. By combining the position information of these image sensors, the position of the object relative to the designated device can be determined.
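The patent does not disclose, in this passage, the exact geometry used to convert a laser pixel position into a distance. For line-laser rangers, a common approach is triangulation; the sketch below shows only the standard triangulation relation as an illustration, not the patent's actual method, and the parameters `focal_px` and `baseline_m` are assumed names.

```python
def pixel_to_distance(pixel_offset, focal_px, baseline_m):
    """Standard laser-triangulation relation: distance = f * b / d,
    where d is the pixel offset of the laser point from the optical
    axis, f the focal length in pixels, and b the laser-camera
    baseline in meters. Illustrative only; not from the patent.
    """
    if pixel_offset == 0:
        # offset 0 means the point lies on the optical axis;
        # the relation is undefined there
        raise ValueError("pixel offset must be non-zero")
    return focal_px * baseline_m / pixel_offset

# e.g. a 50-pixel offset with f = 500 px and b = 5 cm gives 0.5 m
d = pixel_to_distance(50, 500, 0.05)
```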
The object distance detection method provided in this embodiment may be applied to the processor. FIG. 2 is a flowchart of the object distance detection method according to an embodiment of the present invention; the method may include the following steps:
step S202, acquiring an image acquired by an image acquisition element in the appointed equipment, wherein the image at least comprises light information reflected after laser emitted by a line laser light source in the appointed equipment is projected on an object.
The processor acquires the image captured by the image acquisition element in the designated device. If the processor is integrated in the designated device, the image acquisition element may transmit the image to the processor through a data-transmission interface. If the processor is integrated in a smart terminal or cloud that communicates with the designated device, the image acquisition element may transmit the image by wired or wireless communication. The image may of course also be transmitted in other ways suited to the actual application scene, which is not limited by the embodiments of this specification.
The image contains at least the laser information reflected after laser light emitted by the line laser light source in the designated device is projected onto an object. For example, the designated device may be a sweeping robot that, while traveling, emits a line laser composed of multiple single-point lasers through its integrated line laser light source; the line laser may be projected onto objects such as tables, stools, walls, or the floor. After the projected laser light is reflected by the object, at least part of the reflection is captured by the image acquisition element, yielding an image containing the laser information reflected by the object. The light captured by the image acquisition element usually includes not only the laser light reflected by the object but also ambient light; part of the captured ambient light may be bright, with a brightness close to that of the reflected laser light, making the two difficult to distinguish and hindering accurate extraction of the laser information reflected by the object.
For ease of expression, the image received by the processor may be called the initial image. The processor may preprocess the initial image, for example by binarization and noise removal, to reduce the influence of part of the ambient-light information on laser-information extraction. FIG. 3 shows an image obtained by such preprocessing of an image captured by the image acquisition element, where the white regions 301, 302, and 303 contain the pixel points with pixel value 255 and the black region 304 contains the pixel points with pixel value 0. As shown in FIG. 3, the pixel points in the white regions 301, 302, and 303 cover both the laser information reflected by the object and bright ambient-light information, and the pixel points corresponding to the reflected laser information still need to be extracted from them.
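The binarization step above can be sketched as a simple fixed-threshold pass over a grayscale image. This is a minimal illustration only; the patent does not specify the threshold value or the denoising method, so the `threshold` parameter here is an assumption.

```python
def preprocess(image, threshold=128):
    """Binarize a grayscale image (2-D list of 0-255 intensities):
    pixels at or above `threshold` become 255, the rest become 0.
    The threshold value is illustrative, not from the patent.
    """
    return [[255 if px >= threshold else 0 for px in row]
            for row in image]

img = [
    [10, 200, 30],   # bright column in the middle (e.g. laser stripe)
    [40, 220, 50],
]
binary = preprocess(img)
print(binary)  # [[0, 255, 0], [0, 255, 0]]
```

In practice a library routine such as OpenCV's thresholding would typically replace this loop, and a denoising step (e.g. removing small connected components) would follow.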
Step S204: extract pixel rows from the image along a preset direction, where the angle between the preset direction and the laser-array arrangement direction of the line laser light source is greater than zero.
The processor extracts pixel rows from the image along the preset direction, the angle between the preset direction and the laser-array arrangement direction of the line laser light source being greater than zero. The position information of each pixel point of a pixel row in the image may be stored in association with its pixel value.
In theory, the pixel points corresponding to the laser information in the image extend in a direction parallel to the laser-array arrangement direction. In practice, surface unevenness of the projected object and other external interference can distort this extension direction, making it difficult to extract the pixel points accurately along the laser-array arrangement direction. In the embodiments of this specification, pixel rows are extracted along a direction at an angle to the laser-array arrangement direction, so that the pixel points corresponding to the laser information are distributed across different pixel rows; these pixel points are then extracted from each pixel row according to the continuous distribution of the laser information along the laser-array arrangement direction, which greatly improves the extraction accuracy.
A pixel row may correspond to one row of pixel points in the image, or to two or more rows. If a pixel row corresponds to two or more rows of pixel points, the pixel points can be grouped along the direction perpendicular to the preset direction and their pixel values averaged to obtain the pixel row. When extracting pixel rows, adjacent pixel rows may be separated by at least one row of pixel points, or not separated at all; this is not limited by the embodiments of this specification.
For illustration, assume the rows or columns of the image sensor array are parallel to the laser-array arrangement direction, and construct a planar coordinate system for the image in FIG. 3, yielding FIG. 4. In this example the preset direction is perpendicular to the laser-array arrangement direction, i.e., the x direction. Pixel rows may be extracted from the image along the x direction, each corresponding to one row of pixel points, and may be extracted at intervals: one row of pixel points parallel to the x axis is extracted every N rows (N an integer greater than or equal to 1). As shown in FIG. 4, this yields pixel rows 401, 402, 403, 404, and so on. The pixel rows may of course also be extracted without spacing, i.e., every row of pixel points parallel to the x axis becomes a pixel row. If two adjacent rows of pixel points are extracted as one pixel row, the pixel values of each pair of pixel points along the y direction can be averaged to obtain the pixel row.
Of course, the above is only one preferred way of extracting pixel rows; those skilled in the art can configure the extraction as needed, and this specification does not limit it.
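The row-extraction scheme described above (one pixel row every N image rows, optionally averaging adjacent rows) can be sketched as follows. The parameter names `step` and `merge` are illustrative, not from the patent.

```python
def extract_pixel_rows(image, step=1, merge=1):
    """Extract pixel rows along the x direction (here taken as
    perpendicular to the laser-array arrangement direction).
    One pixel row is taken every `step` image rows; when merge > 1,
    `merge` consecutive image rows are averaged column-by-column
    into a single pixel row. Illustrative sketch only.
    """
    rows = []
    for y in range(0, len(image) - merge + 1, step):
        group = image[y:y + merge]
        # average each column of the group to form one pixel row
        rows.append([sum(col) / merge for col in zip(*group)])
    return rows

img = [[0, 1], [2, 3], [4, 5], [6, 7]]
print(extract_pixel_rows(img, step=2))        # every other image row
print(extract_pixel_rows(img, step=2, merge=2))  # pairs averaged
```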
Step S206: take, from a preset number of pixel rows adjacent to the specified pixel row, at least one pixel row whose target pixel point has already been determined as a reference pixel row, where the target pixel point corresponds to the laser information.
For ease of expression, a pixel point corresponding to the laser information is called a target pixel point, and any pixel row whose target pixel point is to be determined is called a specified pixel row. For the specified pixel row, at least one pixel row whose target pixel point has already been determined, among a preset number of adjacent pixel rows, is first taken as a reference pixel row. As shown in FIG. 4, if pixel row 404 is the specified pixel row and the pixel points corresponding to the laser information in pixel rows 402 and 403 have already been determined, then pixel rows 402 and 403, or either one of them, may serve as reference pixel rows for pixel row 404.
In some embodiments, an upper limit on the number of pixel rows between the reference pixel row and the specified pixel row may be preconfigured. The laser line projected onto the image sensor array is usually distorted by the flatness of the surface the line laser hits and by other external interference, which makes extracting the pixel points corresponding to the laser information harder. If the reference pixel row is too far from the specified pixel row, its target pixel point may deviate considerably from that of the specified pixel row; instead of guiding the extraction of the specified row's target pixel point, it may interfere with it. Setting an upper limit on the gap between the reference and specified pixel rows therefore further improves the accuracy of target-pixel-point determination.
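The reference-row selection with a gap upper limit can be sketched as below. All names (`determined`, `max_gap`, `max_refs`) are illustrative; the patent only specifies that reference rows are adjacent rows with already-determined target pixel points, within a preconfigured gap.

```python
def pick_reference_rows(determined, row_idx, max_gap=3, max_refs=2):
    """Select reference pixel rows for the specified row `row_idx`:
    rows whose target pixel point is already known (`determined`
    maps row index -> target x-coordinate), no farther than
    `max_gap` rows away, nearest first. Sketch only.
    """
    candidates = [i for i in determined
                  if 0 < abs(i - row_idx) <= max_gap]
    candidates.sort(key=lambda i: abs(i - row_idx))
    return candidates[:max_refs]

# rows 0-2 already determined; row 9 is too far to help with row 3
refs = pick_reference_rows({0: 100, 1: 101, 2: 102, 9: 180}, 3)
print(refs)  # [2, 1]
```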
Step S208: determine the target pixel point of the specified pixel row based on the target pixel point of the reference pixel row.
According to the continuous distribution of the laser information along the laser-array arrangement direction, the pixel points corresponding to the laser information can be extracted from each pixel row. For example, the position of the reference row's target pixel point in the image may serve as a position reference, with the laser-array arrangement direction as the extension direction of the target pixel points, to determine the target pixel point of the specified pixel row.
In some embodiments, extremum pixel points, which represent the peaks of pixel-value fluctuation in a pixel row, may be extracted from each pixel row in advance. Correspondingly, the target pixel point of the specified pixel row can be extracted from its extremum pixel points based on the position of the reference row's target pixel point in the image: using that position as a reference and the laser-array arrangement direction as the extension direction, the extremum pixel point closest to the extension direction is taken as the target pixel point of the specified pixel row.
Determining the extremum pixel points first and then determining the target pixel point among them greatly reduces the amount of data to process and improves processing efficiency. The extremum pixel points in a pixel row can be found by comparing the pixel values of its pixel points, or by analyzing the fluctuation of the pixel values along the row; this specification does not limit the method.
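The extremum extraction described above can be sketched as a simple local-maximum scan over a pixel row. This is a minimal illustration; the patent leaves the detection method open, and real implementations would typically add a minimum-height or minimum-prominence criterion to suppress noise peaks.

```python
def find_extrema(pixel_row):
    """Return the indices of local maxima (peaks of pixel-value
    fluctuation) in a 1-D pixel row. Sketch only: no noise
    filtering, and flat plateaus report their left edge.
    """
    peaks = []
    for i in range(1, len(pixel_row) - 1):
        if pixel_row[i - 1] < pixel_row[i] >= pixel_row[i + 1]:
            peaks.append(i)
    return peaks

row = [0, 5, 1, 0, 9, 9, 0]
print(find_extrema(row))  # [1, 4]
```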
The number of extremum pixel points in the specified pixel row may be checked first. When the specified pixel row contains two or more extremum pixel points, at least one adjacent pixel row whose target pixel point has already been determined is taken as a reference pixel row, and the target pixel point of the specified pixel row is determined from the position information of the reference row's target pixel point. When the specified pixel row contains exactly one extremum pixel point, that point may be taken directly as the target pixel point, further reducing the processing load. However, since a pixel row may contain a single extremum pixel point that does not actually correspond to the laser information, the already-determined extension trend of the target pixel points can be used to check whether that extremum pixel point corresponds to the laser information; if not, it can be discarded.
In some embodiments, the processor may extract pixel rows from the image with the direction perpendicular to the laser-array arrangement direction of the line laser light source as the preset direction. The extracted pixel rows are projected into a planar coordinate system to obtain a waveform diagram reflecting the pixel-value fluctuation in each pixel row; the first coordinate corresponds to the position of each pixel point in its pixel row, the second coordinate to its pixel value, and the waveforms are arranged in the order of the relative positions of their pixel rows in the image. The extremum pixel points in the pixel rows can then be determined from the waveform diagram.
FIG. 5 shows the waveform diagram obtained by projecting the pixel rows of FIG. 4 into a planar coordinate system, where the abscissa x corresponds to the position of each pixel point in its pixel row and the ordinate y to its pixel value. Note that the abscissa of a pixel point is not its actual position in the image but its relative position within the pixel row. FIG. 5 shows the waveforms 501, 502, and 503 of three pixel rows, corresponding to pixel rows 401, 402, and 403 respectively.
In some embodiments, the first coordinates of each extremum pixel point in the specified pixel row may be compared with the first coordinates of the target pixel point in the reference pixel row respectively; and extracting the extreme value pixel point with the minimum difference of the first coordinate values from the comparison result as the target pixel point of the specified pixel row.
Referring to fig. 5, the waveforms corresponding to the pixel rows may first be analyzed and the extremum pixel points extracted: waveform 501 contains one extremum pixel point A, waveform 502 contains two extremum pixel points B and C, and waveform 503 contains three extremum pixel points D, E and F. Suppose the abscissa of extremum pixel point B in waveform 502 is x1, that of extremum pixel point C is x2, and that of extremum pixel point A in waveform 501 is x3. Once extremum pixel point A in waveform 501 is determined to be the target pixel point, the abscissas of B and C can each be compared with the abscissa of A. Since x2 is closer to x3, extremum pixel point C is taken as the target pixel point of waveform 502. Similarly, the abscissa of target pixel point C in waveform 502 is compared with the abscissa of each extremum pixel point in waveform 503, and extremum pixel point F is determined as the target pixel point of waveform 503.
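The minimum-difference rule of this worked example can be sketched as follows. The function name is illustrative, and the single-extremum shortcut mirrors the earlier note that a lone extremum pixel point may be taken directly as the target pixel point.

```python
def select_target(candidate_xs, reference_x):
    """Choose the target pixel point's first coordinate in the specified row.

    candidate_xs: first coordinates of the extremum pixel points in the row.
    reference_x:  first coordinate of the target pixel point in the reference row.
    """
    if len(candidate_xs) == 1:
        return candidate_xs[0]  # a single extremum is taken directly
    # otherwise pick the extremum with the smallest first-coordinate difference
    return min(candidate_xs, key=lambda x: abs(x - reference_x))
```

With hypothetical values x1 = 40 for B, x2 = 118 for C and x3 = 120 for A, the rule selects C, matching the figure's walkthrough.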
Because each extracted pixel row is perpendicular to the arrangement direction of the laser array, the first coordinate values of the pixel points corresponding to the laser information are, in theory, the same across the waveform diagram. Even if the extending direction of those pixel points is distorted by surface unevenness of the object onto which the laser is projected or by other external interference factors, the difference between the first coordinate values of the target pixel points in adjacent pixel rows remains small. The pixel points corresponding to the laser information can therefore be determined accurately and efficiently by comparing the first coordinate values of the extremum pixel points in adjacent waveforms of the waveform diagram.
Alternatively, the position information of the target pixel point in the specified pixel row may be determined in combination with the extending trend of at least part of the already determined target pixel points. For example, when there are multiple reference pixel rows, the position information of their target pixel points may be fused to determine the position information of the target pixel point in the specified pixel row: the positions of the target pixel points in the reference pixel rows can be fitted, the extending trend of the target pixel points judged, and that trend used to guide the determination of the target pixel point in the specified pixel row. Of course, the positions of all determined target pixel points may also be fitted, and the overall extending trend used for guidance. Guiding the determination with the extending trend alleviates the problem that, due to surface unevenness of the object or other external interference factors, individual pixel points corresponding to the laser information deviate greatly, or the pixel points as a whole exhibit curvature, which would otherwise affect the accuracy of target pixel point determination in each pixel row; the accuracy of laser information extraction is thereby further ensured.
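One way to realize the trend-fitting idea is a least-squares line through the already determined target pixel points, extrapolated to predict the expected first coordinate in the specified pixel row. This sketch uses only the standard library; the function names and the choice of a straight-line model are illustrative assumptions.

```python
def fit_trend(row_indices, target_xs):
    """Least-squares line x = a*row + b through determined target pixel points."""
    n = len(row_indices)
    mean_r = sum(row_indices) / n
    mean_x = sum(target_xs) / n
    num = sum((r - mean_r) * (x - mean_x) for r, x in zip(row_indices, target_xs))
    den = sum((r - mean_r) ** 2 for r in row_indices)
    a = num / den
    b = mean_x - a * mean_r
    return a, b

def predict_next_x(row_indices, target_xs, next_row):
    """Extrapolate the expected first coordinate in the next (specified) row."""
    a, b = fit_trend(row_indices, target_xs)
    return a * next_row + b
```

The predicted coordinate can then serve as the reference against which the extremum pixel points of the specified pixel row are compared.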
Optionally, after determining the target pixel points corresponding to each pixel row, the target pixel points may be fitted; if two or more initial extension lines are formed, the extending direction and extending length of each initial extension line can be analyzed. If an initial extension line is very short, or its direction differs greatly from the laser array arrangement direction, the probability that it corresponds to the laser information is small, and the corresponding initial extension line can be removed. This further reduces interference and improves the accuracy of extracting the pixel points corresponding to the laser information.
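A sketch of this extension-line screening follows. Each fitted initial extension line is summarized here as a (length, angular deviation from the laser array arrangement direction) pair; the representation and both thresholds are illustrative assumptions, not values from the text.

```python
def filter_extension_lines(lines, min_length, max_angle_dev):
    """Keep only initial extension lines likely to correspond to laser information.

    lines: iterable of (length, angle_dev) pairs, where angle_dev is the line's
    angular deviation (radians) from the laser array arrangement direction.
    Very short lines, or lines deviating too far in direction, are discarded.
    """
    return [ln for ln in lines
            if ln[0] >= min_length and abs(ln[1]) <= max_angle_dev]
```

A short spur caused by a specular reflection, for example, would be dropped by the length check before distance computation.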
In some embodiments, when the projection direction of part of the laser points of the line laser light source intersects the ground, each pixel row may be taken in turn as the specified pixel row with the vertically upward direction as the detection direction. For a sweeping robot, for example, part of the laser points of the line laser light source are projected onto the ground, which is generally flat and subject to little interference; in this case, the target pixel points in each pixel row can be determined sequentially with the vertically upward direction as the detection direction. Specifically, the relative positions of the extremum pixel points in the first several pixel rows may be analyzed first, and the extremum pixel points whose positions differ least across those rows extracted as target pixel points; then, taking these target pixel points as a reference, the next pixel row along the detection direction is taken as the specified pixel row, its reference pixel rows are extracted, and its target pixel point is determined; and so on for the remaining pixel rows. By combining the actual layout of the line laser light source in a self-moving device such as a sweeping robot, first determining the pixel points corresponding to the laser information projected on the ground, and then extracting the subsequent pixel points with these as a reference, the accuracy of laser information extraction can be further improved.
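The bottom-up detection loop might look like the following sketch, where `rows_extrema` lists, from the ground upward, the extremum first coordinates found in each pixel row, and `seed_x` is the coordinate established from the first (ground) rows. The names and the choice to leave a gap when a row has no extremum are assumptions for illustration.

```python
def detect_laser_column(rows_extrema, seed_x):
    """Walk upward from the ground rows (detection direction: bottom to top),
    choosing in each row the extremum closest to the previous target."""
    targets = []
    ref = seed_x
    for candidates in rows_extrema:
        if not candidates:
            targets.append(None)  # no extremum: leave a gap in this row
            continue
        ref = min(candidates, key=lambda x: abs(x - ref))
        targets.append(ref)
    return targets
```

Because the reference is updated row by row, the chosen coordinates track a slowly drifting laser stripe while ignoring isolated ambient-light peaks far from it.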
Step S210, determining the distance between the object and the designated device according to the position information corresponding to the target pixel point in the image.
For example, the distance between the object and the designated device may be determined by a method such as triangulation, the phase method, or the pulse method (also called the laser echo method), in combination with the position information of the target pixel point in the image.
For example, the distance between the object and the designated device can be determined based on the relative positional relationship between the image acquisition element and the line laser light source, together with the position information of the target pixel point in the image. Taking triangulation as an example, as shown in fig. 6, the laser 601 and the camera of the infrared camera 602 lie on the same horizontal line (called the datum line) at a distance s; the focal length of the camera is f, and the included angle between the laser 601 and the datum line is β. Assume that light from laser 601 reflected by object 603 images at point P on the camera imaging plane. By geometry, the triangle formed by the laser 601, the camera and the object 603 is similar to the triangle formed by the camera, the imaging point P and the auxiliary point P1. Let PP1 = x, and let q and d be as shown in fig. 6; from the similar triangles, f/x = q/s, so q = f·s/x. The calculation proceeds in two parts: 1) compute x: x = PQ + QP1 = f/tan β + pixelSize × position, where pixelSize is the pixel unit size and position is the projection imaging position; 2) compute the distance d between the designated device and object 603: d = q/sin β.
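The two-part calculation can be written out directly. Variable names follow the text (s, f, β, pixelSize, position); the sample values used below are purely illustrative and not taken from the patent.

```python
import math

def triangulate_distance(position, s, f, beta, pixel_size):
    """Triangulation distance from the imaging position of the laser spot.

    position:   projection imaging position (in pixels)
    s:          baseline distance between laser and camera
    f:          camera focal length (same length unit as s)
    beta:       angle between the laser and the datum line (radians)
    pixel_size: physical size of one pixel (same length unit as f)

    Follows the text: x = f/tan(beta) + pixel_size * position,
                      q = f * s / x,  d = q / sin(beta).
    """
    x = f / math.tan(beta) + pixel_size * position
    q = f * s / x
    return q / math.sin(beta)
```

As expected from the geometry, a spot imaged farther from the principal point (larger position) yields a smaller distance d.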
According to the scheme provided by this embodiment, a direction at a non-zero angle to the laser array arrangement direction is used as the basic data extraction direction for distance detection, and pixel information is extracted from an image containing laser information and ambient light information reflected by an object to obtain a series of pixel rows. Based on the principle of light distribution continuity, the positions of already determined laser information in the pixel rows are used as references, so that each pixel point corresponding to the laser information can be extracted more simply and accurately. This reduces the interference of ambient light with the extraction of the pixel points corresponding to the laser information, improves the accuracy of distance measurement between the designated device and the object, and thereby solves the technical problem of inaccurate distance measurement caused by ambient light interference.
From the description of the above embodiments, it will be clear to those skilled in the art that the method according to the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or by hardware, although in many cases the former is preferred. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) comprising several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the methods of the various embodiments of the present invention.
Fig. 7 is a structural block diagram of an object distance detecting apparatus according to an embodiment of the present invention; as shown in fig. 7, the apparatus includes:
an acquiring module 701, configured to acquire an image acquired by an image acquisition element in a specified device, where the image at least includes laser information reflected after laser emitted by a line laser light source in the specified device is projected on an object;
a first extraction module 702, configured to extract a pixel row from the image along a preset direction; wherein, the included angle between the preset direction and the laser array arrangement direction of the line laser light source is larger than zero;
a second extracting module 703, configured to extract, as a reference pixel row, at least one pixel row of a predetermined number of pixel rows adjacent to a specified pixel row, where a target pixel point is determined, where the target pixel point corresponds to the laser information;
a first determining module 704, configured to determine a target pixel point corresponding to the specified pixel row based on the target pixel point of the reference pixel row;
a second determining module 705, configured to determine the distance between the object and the designated device according to the position information corresponding to the target pixel point in the image.
According to the scheme provided by this embodiment of the specification, a direction at a non-zero angle to the laser array arrangement direction is used as the basic data extraction direction for distance detection, and pixel information is extracted from an image containing laser information and ambient light information reflected by an object to obtain a series of pixel rows. Based on the principle of light distribution continuity, the positions of already determined laser information in the pixel rows are used as references, so that each pixel point corresponding to the laser information can be extracted more simply and accurately. This reduces the interference of ambient light with the extraction of the pixel points corresponding to the laser information, improves the accuracy of distance measurement between the designated device and the object, and thereby solves the technical problem of inaccurate distance measurement caused by ambient light interference.
Alternatively, specific examples in this embodiment may refer to examples described in the foregoing embodiments and optional implementations, and this embodiment is not described herein.
An embodiment of the present invention also provides a storage medium including a stored program, wherein the program executes the method of any one of the above.
Alternatively, in the present embodiment, the above-described storage medium may be configured to store program code for performing the steps of:
s1, acquiring an image acquired by an image acquisition element in a designated device, wherein the image at least comprises laser information reflected after laser emitted by a line laser light source in the designated device is projected on an object;
s2, extracting pixel rows from the image along a preset direction; wherein, the included angle between the preset direction and the laser array arrangement direction of the line laser light source is larger than zero;
s3, extracting, as a reference pixel row, at least one pixel row in which a target pixel point has been determined among a preset number of pixel rows adjacent to the specified pixel row, wherein the target pixel point corresponds to the laser information;
s4, determining a target pixel point corresponding to the specified pixel row based on the target pixel point of the reference pixel row;
s5, determining the distance between the object and the designated device according to the position information corresponding to the target pixel point in the image.
An embodiment of the invention also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
s1, acquiring an image acquired by an image acquisition element in a designated device, wherein the image at least comprises laser information reflected after laser emitted by a line laser light source in the designated device is projected on an object;
s2, extracting pixel rows from the image along a preset direction; wherein, the included angle between the preset direction and the laser array arrangement direction of the line laser light source is larger than zero;
s3, extracting, as a reference pixel row, at least one pixel row in which a target pixel point has been determined among a preset number of pixel rows adjacent to the specified pixel row, wherein the target pixel point corresponds to the laser information;
s4, determining a target pixel point corresponding to the specified pixel row based on the target pixel point of the reference pixel row;
s5, determining the distance between the object and the designated device according to the position information corresponding to the target pixel point in the image.
Alternatively, in the present embodiment, the storage medium may include, but is not limited to: a U-disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Alternatively, specific examples in this embodiment may refer to examples described in the foregoing embodiments and optional implementations, and this embodiment is not described herein.
It will be appreciated by those skilled in the art that the modules or steps of the invention described above may be implemented by a general-purpose computing device; they may be concentrated on a single computing device or distributed across a network of computing devices. Alternatively, they may be implemented in program code executable by computing devices, so that they may be stored in a storage device and executed by computing devices; in some cases, the steps shown or described may be performed in an order different from that described herein. They may also be fabricated separately into individual integrated circuit modules, or multiple modules or steps among them may be fabricated into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description covers only preferred embodiments of the present invention and is not intended to limit the present invention; various modifications and variations can be made by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the principle of the present invention shall be included in the protection scope of the present invention.

Claims (10)

1. An object distance detection method, the method comprising:
acquiring an image acquired by an image acquisition element in a designated device, wherein the image at least comprises laser information reflected after laser emitted by a line laser light source in the designated device is projected on an object;
extracting pixel rows from the image along a preset direction; wherein, the included angle between the preset direction and the laser array arrangement direction of the line laser light source is larger than zero;
extracting, as a reference pixel row, at least one pixel row in which a target pixel point has been determined among a preset number of pixel rows adjacent to a specified pixel row, wherein the target pixel point corresponds to the laser information;
determining a target pixel point corresponding to the specified pixel row based on the target pixel point of the reference pixel row;
and determining the distance between the object and the designated device according to the position information corresponding to the target pixel point in the image.
2. The method of claim 1, wherein after extracting the pixel rows from the image, further comprising:
extracting extreme value pixel points in the pixel row; wherein the extremum pixel points represent peak points of pixel value fluctuation in the pixel row;
correspondingly, extracting, based on the target pixel points of the reference pixel row, the target pixel point corresponding to the specified pixel row from the extremum pixel points of the specified pixel row.
3. The method according to claim 1 or 2, wherein the preset direction is perpendicular to the laser array arrangement direction of the line laser light source.
4. A method according to claim 3, characterized in that the method further comprises:
projecting the extracted pixel rows into a plane coordinate system to obtain a waveform diagram reflecting pixel value fluctuation in the corresponding pixel rows; the first coordinate of the plane coordinate system corresponds to the position of each pixel point in the pixel row, and the second coordinate corresponds to the pixel value of each pixel point in the pixel row; waveforms corresponding to each pixel row in the waveform chart are sequentially arranged according to the relative positions of the pixel rows in the image;
and determining a target pixel point in the pixel row by using the waveform chart.
5. The method of claim 4, wherein determining a target pixel point in the pixel row by using the waveform chart comprises:
comparing the first coordinates of each extreme value pixel point in the appointed pixel row with the first coordinates of the target pixel point of the reference pixel row respectively;
and extracting the extreme value pixel point with the minimum difference of the first coordinate values from the comparison result as the target pixel point of the specified pixel row.
6. The method according to claim 2, wherein, in the case where the specified pixel row contains two or more extreme pixel points, at least one pixel row of which a target pixel point has been determined among a preset number of pixel rows adjacent to the specified pixel row is extracted as a reference pixel row.
7. The method according to claim 1, wherein each pixel row is sequentially set as a specified pixel row with a vertical upward direction as a detection direction in a case where there is an intersection between a projection direction of a part of the laser spots of the line laser light source and the ground.
8. An object distance detecting apparatus, comprising:
the acquisition module is used for acquiring the image acquired by the image acquisition element in the appointed equipment, wherein the image at least comprises laser information reflected after laser emitted by the line laser light source in the appointed equipment is projected on an object;
a first extraction module, configured to extract a pixel row from the image along a preset direction; wherein, the included angle between the preset direction and the laser array arrangement direction of the line laser light source is larger than zero;
the second extraction module is used for extracting, as a reference pixel row, at least one pixel row in which a target pixel point has been determined among a preset number of pixel rows adjacent to a specified pixel row, wherein the target pixel point corresponds to the laser information;
the first determining module is used for determining a target pixel point corresponding to the specified pixel row based on the target pixel point of the reference pixel row;
and the second determining module is used for determining the distance between the object and the designated device according to the position information corresponding to the target pixel point in the image.
9. A computer readable storage medium, characterized in that the computer readable storage medium comprises a stored program, wherein the program when run performs the method of any of the preceding claims 1 to 7.
10. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to execute the method according to any of the claims 1 to 7 by means of the computer program.
CN202210983482.1A 2022-08-16 2022-08-16 Object distance detection method and device, storage medium and electronic device Pending CN117630957A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210983482.1A CN117630957A (en) 2022-08-16 2022-08-16 Object distance detection method and device, storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN117630957A true CN117630957A (en) 2024-03-01

Family

ID=90025764

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210983482.1A Pending CN117630957A (en) 2022-08-16 2022-08-16 Object distance detection method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN117630957A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination