CN111336938A - Robot and object distance detection method and device thereof - Google Patents


Info

Publication number
CN111336938A
CN111336938A
Authority
CN
China
Prior art keywords
image
target object
distance
size
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911310724.5A
Other languages
Chinese (zh)
Inventor
苏锴坚
谢非
刘松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Banana Intelligent Technology Co ltd
Original Assignee
Shenzhen Banana Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Banana Intelligent Technology Co ltd filed Critical Shenzhen Banana Intelligent Technology Co ltd
Priority to CN201911310724.5A priority Critical patent/CN111336938A/en
Publication of CN111336938A publication Critical patent/CN111336938A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/14Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

An object distance detection method for a robot includes: acquiring a color image captured by a color camera; identifying the image size of a target object included in the color image; determining, from the image size of the target object, whether the distance of the target object meets a preset linear fitting requirement; and, when it does, obtaining the distance between the target object and the robot from a preset correspondence between the image size of the target object and the distance. The distance between the robot and the target object can thus be determined quickly by a simple comparison and lookup, which lowers the demand on hardware computing resources and reduces robot cost. Because the distance is computed directly from the image size, the registration error caused by a small target-object image is reduced and the accuracy of the computed distance is improved.

Description

Robot and object distance detection method and device thereof
Technical Field
The application belongs to the field of robots, and particularly relates to a robot and an object distance detection method and device thereof.
Background
While a robot is moving, it needs to detect objects such as obstacles in its scene, so that it can promptly and effectively plan a path around them and reliably execute its assigned tasks.
At present, obstacles in a scene are usually detected by registering a color image with a depth image: once the two images are matched, the distance measured in the depth image can be associated with the object identified in the color image, yielding the distance to that object.
However, computing object distances by real-time registration consumes considerable hardware computing resources on an embedded device. Moreover, owing to factors such as mechanical structure and camera differences, an object far from the robot occupies only a small display area in the image, so the depth information obtained for it by registration may deviate, and the accuracy of the registered distance is low.
Disclosure of Invention
In view of this, embodiments of the present application provide a robot and a method and an apparatus for detecting the distance of an object, so as to solve the prior-art problems that determining the distance of an object in a robot's scene consumes considerable computing resources and that the obtained distance is not accurate.
A first aspect of an embodiment of the present application provides an object distance detection method for a robot, the method including:
acquiring a color image acquired by a color camera;
identifying an image size of a target object included in the color image;
determining whether the distance of the target object meets a preset linear fitting requirement according to the image size of the target object;
and when the distance of the target object meets the preset linear fitting requirement, acquiring the distance between the target object and the robot according to the preset corresponding relation between the image size of the target object and the distance.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the step of determining whether the distance of the object meets a preset linear fitting requirement according to the size of the image of the target object includes:
comparing the size of the image of the target object with a preset size threshold;
if the image size of the target object is smaller than a preset size threshold, the linear fitting requirement is met;
if the image size of the target object is greater than or equal to the preset size threshold, the linear fitting requirement is not met.
With reference to the first aspect, in a second possible implementation manner of the first aspect, the step of determining whether the distance of the object meets a preset linear fitting requirement according to the size of the image of the target object includes:
determining a first distance corresponding to the size of the acquired image of the target object according to a preset corresponding relation between the size of the image of the target object and the distance;
when the first distance is greater than a preset distance threshold, the linear fitting requirement is met;
and when the first distance is smaller than or equal to a preset distance threshold value, the linear fitting requirement is not met.
With reference to the first aspect, the first possible implementation manner of the first aspect, or the second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, the image size of the target object includes a length of the image of the target object, or an area of the image of the target object.
With reference to the third possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, when the image size of the target object includes an area of the image of the target object, the identifying the image size of the target object included in the color image includes:
acquiring texture information included in the image of the target object;
determining the transformation ratio of the image of the target object according to the texture information;
and determining the size of the target object image according to the transformation ratio.
With reference to the first aspect, in a fifth possible implementation manner of the first aspect, the method further includes:
when the distance of the target object does not meet the preset linear fitting requirement, a depth image is obtained through a depth camera;
registering the color image with the depth image according to the camera intrinsic and extrinsic parameters;
and acquiring the distance between the robot and the target object according to the position of the registered target object in the depth image.
With reference to the first aspect, in a sixth possible implementation manner of the first aspect, before the step of identifying an image size of a target object included in the color image, the method further includes:
and acquiring a color area corresponding to the target object in the color image through histogram equalization processing.
A second aspect of an embodiment of the present application provides an object distance detection apparatus of a robot, including:
the image acquisition unit is used for acquiring a color image acquired by the color camera;
an image size identifying unit configured to identify an image size of a target object included in the color image;
the condition judgment unit is used for determining whether the distance of the target object meets the preset linear fitting requirement according to the image size of the target object;
and the distance acquisition unit is used for acquiring the distance between the target object and the robot according to the preset corresponding relation between the image size of the target object and the distance when the distance of the target object meets the preset linear fitting requirement.
A third aspect of the embodiments of the present application provides a robot, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the object distance detection method of the robot according to any one of the first aspect when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, which stores a computer program, and the computer program, when executed by a processor, implements the steps of the object distance detection method for a robot according to any one of the first aspect.
Compared with the prior art, the embodiments of the present application have the following advantages: a color image captured by a color camera is acquired; the image size of a target object included in the color image is identified; and when the image size indicates that the distance of the target object meets the preset linear fitting requirement, the distance between the target object and the robot is determined from the correspondence between image size and distance. The distance between the robot and the target object can thus be determined quickly with only a simple comparison and lookup.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of an implementation of an object distance detection method for a robot according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart of an implementation of identifying an image size of a target object according to an embodiment of the present disclosure;
fig. 3 is a schematic flow chart illustrating an implementation of determining whether a linear fitting requirement is met according to an embodiment of the present application;
fig. 4 is a schematic flow chart illustrating an implementation of another method for determining whether a linear fitting requirement is met according to the embodiment of the present application;
FIG. 5 is a schematic flow chart of an implementation of determining a distance between a robot and a target object according to an embodiment of the present disclosure;
fig. 6 is a schematic view of an object distance detection apparatus of a robot according to an embodiment of the present disclosure;
fig. 7 is a schematic view of a robot provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Fig. 1 is a schematic flow chart of an implementation of an object distance detection method for a robot according to an embodiment of the present application, which is detailed as follows:
in step S101, a color image acquired by a color camera is acquired;
the robot of the embodiment of the application is provided with a color camera and a depth camera. The color camera and the depth camera can be preset at preset positions of the robot, and can be calibrated through calibration operation.
When the color camera is calibrated, a plurality of chessboard pictures under different visual angles can be shot by the color camera, and the internal parameters of the color camera and the external parameters corresponding to each image are calculated by the shot pictures.
When the depth camera is calibrated, a peripheral infrared light source can irradiate chessboard pictures, the infrared emitter is shielded, the depth camera shoots a plurality of infrared images, and parameter information such as internal parameters and external parameters of the depth camera is determined according to the shot images.
The relative position of the two cameras is described by the rotation and translation matrices of the extrinsic parameters; through this rotation and translation transformation, coordinates in the depth-camera image coordinate system can be transformed into coordinates in the color-camera coordinate system.
When the image including the target object is acquired by the color camera, the position of the target object in the color image can be determined according to the characteristic information of the target object, including color characteristics, shape characteristics and the like.
In a possible implementation manner, histogram equalization processing may be further performed on the image acquired by the color camera. Through histogram equalization processing, the contrast of the image can be effectively improved, and therefore the position of the target object in the image can be acquired more accurately.
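As an illustrative sketch of the histogram-equalization step (not part of the patent; the function name, the NumPy-only implementation, and the synthetic low-contrast image are assumptions), the classic CDF-based mapping can be written as:

```python
import numpy as np

def equalize_histogram(gray):
    """Histogram-equalize an 8-bit grayscale image.

    Spreads the intensity CDF over the full 0-255 range to raise contrast,
    which makes color/shape segmentation of the target object more reliable.
    """
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]            # first non-zero CDF value
    total = gray.size
    # Classic equalization mapping: rescale the CDF to [0, 255]
    lut = np.round((cdf - cdf_min) / max(total - cdf_min, 1) * 255).astype(np.uint8)
    return lut[gray]

# A low-contrast ramp concentrated in [100, 120] gets stretched to [0, 255]
img = np.tile(np.linspace(100, 120, 64, dtype=np.uint8), (64, 1))
out = equalize_histogram(img)
```

In practice a library routine such as OpenCV's histogram equalization would typically be used instead; the sketch only shows the effect the embodiment relies on.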
In step S102, an image size of a target object included in the color image is identified;
in one embodiment, when the image size of the target object included in the color image is identified, histogram equalization processing may be performed on the color image, and after the histogram equalization processing is performed, the contrast of the image may be effectively improved, so that a color region corresponding to the target object in the color image may be obtained more accurately.
The target object is calibrated in advance: the correspondence between the image size of the target object and its distance is determined beforehand from calibration images.
The image size of the target object may include the length of the target object in the image and may also include the area of the target object in the image.
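How such an image size might be measured can be sketched as follows (an assumption for illustration — the patent does not specify the measurement; here length is taken as the longer bounding-box side and area as the pixel count of a segmentation mask):

```python
import numpy as np

def image_size_of_target(mask):
    """Return (length_px, area_px) of the target region in a binary mask.

    `mask` marks pixels identified as the target object (e.g. after color
    segmentation). Length is the longer side of the bounding box; area is
    the number of target pixels.
    """
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return 0, 0
    height = ys.max() - ys.min() + 1
    width = xs.max() - xs.min() + 1
    return int(max(height, width)), int(mask.sum())

mask = np.zeros((48, 64), dtype=np.uint8)
mask[10:20, 5:35] = 1        # a 10 x 30 target region
length, area = image_size_of_target(mask)
```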
When the target object includes one or more calibration surfaces and the orientation of a calibration surface relative to the robot's color camera matches its orientation during pre-calibration, the position of the target object in the image can be obtained directly from the color image acquired by the color camera, and the image size of the target object determined directly from that position. A calibration surface is a plane of the target object for which the relationship between image size and distance was calibrated in advance; the orientation of the calibration surface during calibration is recorded at that time.
When the orientation of the calibration surface of the target object does not match the orientation recorded during calibration, the captured image must be transformed to obtain the image size of the target object. That is, the reported image size is the size of the target object after the image has been converted to the orientation used at calibration.
The process of identifying the image size of the target object by transforming may include, as shown in fig. 2:
in step S201, texture information included in the image of the target object is acquired;
the texture information included in the image of the target object may be acquired by orb (organized FAST and Rotated bright) algorithm. Of course, when the calibration surface is preset, only the texture information included in the calibration surface in the image of the target object needs to be acquired.
The texture information may include information of lines, length of pattern, thickness, etc.
In step S202, determining a transformation ratio of the image of the target object according to the texture information;
the obtained texture information may be compared with the texture information of the preset standard, a difference between the obtained texture information and the texture information of the preset standard may be determined, and a conversion ratio corresponding to when the color image of the currently photographed target object is converted into the standard image may be determined. The transformation ratio may be determined by the line length.
For example, if the length of the line a in the standard image is n1 and the length in the image captured by the color camera is n2, the transformation ratio may be n2/n 1. The image of the target object taken by the color camera can be transformed directly by the transformation scale of n2/n 1.
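The ratio computation of this example can be sketched as follows (the function names and the numeric values are illustrative, not from the patent):

```python
def transform_ratio(line_len_standard, line_len_captured):
    """Scale factor n2/n1 from the standard (calibration) view to the captured view."""
    return line_len_captured / line_len_standard

def normalized_size(captured_size, ratio):
    """Rescale a size measured in the captured view back to the calibration view."""
    return captured_size / ratio

# Line A is 50 px in the standard image (n1) and 40 px as captured (n2)
ratio = transform_ratio(50.0, 40.0)                 # n2 / n1 = 0.8
size_at_calibration = normalized_size(96.0, ratio)  # 96 px measured -> 120 px
```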
In step S203, the size of the target object image is determined according to the transformation ratio.
Through the transformation ratio, the image of the target object captured by the color camera is transformed into an image whose orientation matches the one used during calibration; the size measured in this transformed image is taken as the image size of the target object.
In step S103, determining whether the distance of the target object meets a preset linear fitting requirement according to the image size of the target object;
when the image size of the target object is determined, the method can be used for determining the distance between the target object and the robot according to the relationship of the size of the image shot by the camera. Specifically, the linear fitting requirement whether the image size of the target object meets can be determined through comparison of image sizes or through comparison of distances.
As shown in fig. 3, determining whether the preset linear fitting requirement is met through the image size specifically includes:
in step S301, comparing the size of the image of the target object with a preset size threshold;
the target object is a target object of which the actual size is predetermined. For example, measurements may be made in advance of the size of objects that may appear in the scene. After the actual size of the target object is determined, the size threshold of the image of the target object is correspondingly determined according to the actual size. The actual size of different objects is different, and thus the size threshold of the corresponding image is also different. In one implementation, the size threshold of the image of the object may be determined by a predetermined proportion of the actual size of the object. Alternatively, the fixed size threshold may be set according to the size of the acquired color image.
The target object is an object which needs to be subjected to distance positioning in an image acquired by the robot.
In step S302, if the size of the image of the target object is smaller than a preset size threshold, the linear fitting requirement is met;
comparing the size of the acquired image of the target object with a preset size threshold, if the size of the image of the target object is smaller than the preset size threshold, it indicates that the distance between the target object and the robot is long, and the area range occupied by the image of the target object in the color image is small, and when the image of the target object is matched with the depth image, if an error occurs, the depth value corresponding to the acquired image is very easy to be inaccurate. In this case, the size of the target object may be determined by means of linear fitting, i.e. the distance between the target object and the robot may be determined by the correspondence of image size to distance.
In step S303, if the image size of the target object is greater than or equal to the preset size threshold, the linear fitting requirement is not met.
If the image size of the target object is greater than or equal to the preset size threshold, the image of the target object occupies a larger area of the color image and the corresponding depth information can be acquired accurately through registration, so the linear fitting requirement is not met.
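The decision of steps S301 to S303 can be sketched as follows (a minimal illustration; the per-object threshold rule, the function names, and all numeric values are assumptions, not from the patent):

```python
def size_threshold_for(actual_size_mm, px_per_mm_at_threshold):
    """Hypothetical per-object threshold: the image size the object would
    have at the distance where depth-image registration becomes reliable."""
    return actual_size_mm * px_per_mm_at_threshold

def meets_linear_fit_requirement(image_size_px, threshold_px):
    """S301-S303: an image smaller than the threshold means the object is
    far and registration would be error-prone, so the linear fit is used."""
    return image_size_px < threshold_px

threshold = size_threshold_for(200.0, 0.4)                # 200 mm object -> 80 px
far_case = meets_linear_fit_requirement(55, threshold)    # far object: linear fit
near_case = meets_linear_fit_requirement(120, threshold)  # near object: registration
```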
fig. 4 is a further manner for determining whether the distance between the target object and the robot can be determined by using a linear fitting manner according to the embodiment of the present application, as shown in fig. 4, including:
in step S401, a first distance corresponding to the size of the acquired image of the target object is determined according to a preset correspondence between the size of the image of the target object and the distance;
the actual size of an object in a robot scene is predetermined, and the distance between the target object and the robot is calculated according to the principle that the closer the object is to the robot, the larger the image is, the farther the object is from the robot, and the smaller the image is.
In step S402, when the first distance is greater than a preset distance threshold, a linear fitting requirement is met;
if the calculated distance between the target object and the robot is greater than the preset distance threshold, the distance determined by the image size can be directly used as the distance between the target object and the robot.
The distance threshold may be a fixed value, or it may be adjusted according to the size of the target object. For example, a correspondence between object size and distance threshold may be set such that a larger target object has a larger distance threshold and a smaller target object a smaller one. With such a correspondence, a target object that occupies a larger area of the color image can have its distance acquired more accurately by way of the depth image.
In step S403, when the first distance is smaller than or equal to the preset distance threshold, the linear fitting requirement is not met.
If the calculated first distance is smaller than or equal to the preset distance threshold, a sufficiently large image of the target object is available. That image is registered with the depth image acquired by the depth camera to obtain the position of the target object in the depth image, and the distance between the target object and the robot is determined from that position.
In step S104, when the distance of the target object meets a preset linear fitting requirement, the distance between the target object and the robot is obtained according to a preset corresponding relationship between the image size of the target object and the distance.
The closer the target object is to the robot, the larger its image usually is in the color camera's output and the more accurately the robot can detect its distance. In that case, the distance between the target object and the robot may be determined by registering the depth camera with the color camera, as shown in fig. 5, which includes:
in step S501, when the distance of the target object does not meet a preset linear fitting requirement, a depth image is obtained by a depth camera;
the distance of the target object does not meet the preset linear fitting requirement, the region range of the image of the target object in the color image is large, the preset region range requirement is not met, and more accurate distance can be obtained through depth image registration of the depth camera.
In step S502, the color image is registered with the depth image according to the camera intrinsic and extrinsic parameters;
A transformation matrix between the color image captured by the color camera and the depth image captured by the depth camera can be determined from the cameras' intrinsic and extrinsic parameters; for example, its rotation and translation components can be determined and used to convert coordinates in the color image into coordinates in the depth image.
When registering the color image with the depth image, the vertices of the contour of the target object's image in the color image may be registered, or the center point of the target object may be registered between the color image and the depth image.
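The intrinsic/extrinsic transform underlying this registration can be sketched as follows (a standard back-project/transform/reproject chain; the intrinsic matrix, baseline, and all values are illustrative assumptions, not the patent's calibration):

```python
import numpy as np

def depth_pixel_to_color_pixel(u_d, v_d, z, K_depth, K_color, R, t):
    """Map a depth-image pixel (u_d, v_d) with depth z (metres) into the
    color image, given intrinsics K_depth/K_color and the extrinsic
    rotation R and translation t between the two cameras."""
    # Back-project to a 3-D point in the depth-camera frame
    p_depth = z * np.linalg.inv(K_depth) @ np.array([u_d, v_d, 1.0])
    # Rigid transform into the color-camera frame
    p_color = R @ p_depth + t
    # Project with the color intrinsics
    uvw = K_color @ p_color
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)                      # cameras assumed rotationally aligned...
t = np.array([0.025, 0.0, 0.0])    # ...with a 25 mm horizontal baseline
u, v = depth_pixel_to_color_pixel(320.0, 240.0, 2.0, K, K, R, t)
```

With these values the principal-point pixel at 2 m shifts by f·tx/z = 500·0.025/2 = 6.25 px horizontally.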
In step S503, a distance between the robot and the target object is acquired according to the position of the registered target object in the depth image.
After the color image and the depth image are registered, the distance of the target object is determined from its position in the depth image. For example, the depth at the center position of the target object may be looked up, or the average depth of the region bounded by the determined vertices may be computed, and used as the distance between the target object and the robot.
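The region-average variant can be sketched as follows (an illustrative assumption: excluding zero pixels, which on typical depth sensors mark missing returns, is our addition, not stated in the patent):

```python
import numpy as np

def distance_from_depth(depth_m, bbox):
    """Average valid (non-zero) depth inside the registered bounding box
    (x0, y0, x1, y1); zeros are treated as sensor holes and excluded."""
    x0, y0, x1, y1 = bbox
    patch = depth_m[y0:y1, x0:x1]
    valid = patch[patch > 0]
    return float(valid.mean()) if valid.size else float("nan")

depth = np.full((48, 64), 2.5)     # synthetic depth map: target at 2.5 m
depth[20:24, 30:34] = 0.0          # a hole (no return) inside the target
d = distance_from_depth(depth, (28, 18, 40, 30))
```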
With the method and apparatus of this application, when the target object is far from the robot, its distance can be obtained directly by linear fitting, without complex coordinate conversion through intrinsic and extrinsic parameters. This reduces the consumption of hardware computing resources, avoids the large errors that registration would introduce for a target object with a small image, and improves the accuracy of the calculated distance.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 6 is a schematic diagram of an object distance detection apparatus of a robot according to an embodiment of the present disclosure, detailed as follows:
The object distance detection apparatus of the robot includes:
an image acquisition unit 601 configured to acquire a color image acquired by a color camera;
an image size identifying unit 602 for identifying an image size of a target object included in the color image;
a condition determining unit 603, configured to determine whether the distance of the target object meets a preset linear fitting requirement according to the image size of the target object;
a distance obtaining unit 604, configured to obtain, when the distance of the target object meets a preset linear fitting requirement, a distance between the target object and the robot according to a preset corresponding relationship between an image size of the target object and the distance.
Fig. 7 is a schematic diagram of a robot provided in an embodiment of the present application. As shown in fig. 7, the robot 7 of this embodiment includes: a processor 70, a memory 71, and a computer program 72, such as an object distance detection program for a robot, stored in the memory 71 and executable on the processor 70. The processor 70, when executing the computer program 72, implements the steps in each of the above-described embodiments of the object distance detection method for a robot. Alternatively, the processor 70 implements the functions of the modules/units in the above-described apparatus embodiments when executing the computer program 72.
Illustratively, the computer program 72 may be partitioned into one or more modules/units that are stored in the memory 71 and executed by the processor 70 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 72 in the robot 7. For example, the computer program 72 may be divided into:
the image acquisition unit is used for acquiring a color image acquired by the color camera;
an image size identifying unit configured to identify an image size of a target object included in the color image;
the condition judgment unit, configured to determine, according to the image size of the target object, whether the distance of the target object meets the preset linear fitting requirement;
and the distance acquisition unit is used for acquiring the distance between the target object and the robot according to the preset corresponding relation between the image size of the target object and the distance when the distance of the target object meets the preset linear fitting requirement.
The robot may include, but is not limited to, the processor 70 and the memory 71. Those skilled in the art will appreciate that fig. 7 is merely an example of the robot 7 and does not constitute a limitation of the robot 7; the robot may include more or fewer components than shown, combine certain components, or have different components. For example, the robot may also include input/output devices, network access devices, buses, and the like.
The processor 70 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 71 may be an internal storage unit of the robot 7, such as a hard disk or a memory of the robot 7. The memory 71 may also be an external storage device of the robot 7, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the robot 7. Further, the memory 71 may include both an internal storage unit and an external storage device of the robot 7. The memory 71 is used for storing the computer program and other programs and data required by the robot, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be suitably increased or decreased as required by legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An object distance detection method for a robot, the object distance detection method for a robot comprising:
acquiring a color image acquired by a color camera;
identifying an image size of a target object included in the color image;
determining, according to the image size of the target object, whether the distance of the target object meets a preset linear fitting requirement;
and when the distance of the target object meets the preset linear fitting requirement, acquiring the distance between the target object and the robot according to the preset corresponding relation between the image size of the target object and the distance.
2. The method according to claim 1, wherein the step of determining whether the object distance satisfies a preset linear fitting requirement according to the size of the image of the target object includes:
comparing the size of the image of the target object with a preset size threshold;
if the image size of the target object is smaller than the preset size threshold, determining that the linear fitting requirement is met;
if the image size of the target object is greater than or equal to the preset size threshold, determining that the linear fitting requirement is not met.
3. The method according to claim 1, wherein the step of determining whether the object distance satisfies a preset linear fitting requirement according to the size of the image of the target object includes:
determining a first distance corresponding to the size of the acquired image of the target object according to a preset corresponding relation between the size of the image of the target object and the distance;
when the first distance is greater than a preset distance threshold, determining that the linear fitting requirement is met;
and when the first distance is smaller than or equal to the preset distance threshold, determining that the linear fitting requirement is not met.
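One way to realise the "preset corresponding relation" relied on by claims 1 and 3 is a least-squares fit over offline calibration pairs. The sketch below fits distance against the reciprocal of image area, since apparent area falls off roughly with the square of distance; the calibration samples and the distance threshold are invented for illustration and are not values from this application.

```python
# Build a size -> distance model by least-squares over calibration pairs.
# Sample values are invented; distance is fitted against 1/area.

def linear_fit(xs, ys):
    """Ordinary least-squares fit y = k*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    k = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return k, my - k * mx

# (image area in pixels, measured distance in metres) calibration pairs
samples = [(50_000, 0.5), (25_000, 1.0), (12_500, 2.0), (6_250, 4.0)]
k, b = linear_fit([1.0 / s for s, _ in samples], [d for _, d in samples])

def first_distance(image_size):
    """The first distance of claim 3, read off the fitted correspondence."""
    return k / image_size + b

def meets_linear_fit_requirement(image_size, distance_threshold=0.8):
    """Per claim 3: trust the fit only when the object is far enough away."""
    return first_distance(image_size) > distance_threshold
```

With these samples, a very large image area (a close object) yields a first distance below the threshold and fails the requirement, triggering the depth-camera path.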
4. The object distance detecting method for a robot according to any one of claims 1 to 3, wherein the size of the image of the target object includes a length of the image of the target object or an area of the image of the target object.
5. The object distance detection method for a robot according to claim 4, wherein when the image size of the target object includes an area of the image of the target object, the step of identifying the image size of the target object included in the color image includes:
acquiring texture information included in the image of the target object;
determining the transformation ratio of the image of the target object according to the texture information;
and determining the size of the target object image according to the transformation ratio.
6. The object distance detection method for a robot according to claim 1, characterized by further comprising:
when the distance of the target object does not meet the preset linear fitting requirement, a depth image is obtained through a depth camera;
registering the color image with the depth image according to the camera intrinsic and extrinsic parameters;
and acquiring the distance between the robot and the target object according to the position of the registered target object in the depth image.
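The registration step of claim 6 can be illustrated with a pinhole-camera sketch that maps a depth pixel into color-image coordinates. All camera parameters below (focal lengths, principal points, baseline) are assumed example values, and the extrinsic transform is simplified to a pure horizontal translation with identity rotation.

```python
# Hedged sketch of the claim-6 fallback: registering a depth pixel into
# the color image via pinhole intrinsics and a simplified extrinsic
# (identity rotation, translation along x). All parameters are assumed.

def register_depth_to_color(u_d, v_d, z,
                            fx_d=580.0, fy_d=580.0, cx_d=320.0, cy_d=240.0,
                            fx_c=600.0, fy_c=600.0, cx_c=320.0, cy_c=240.0,
                            baseline=0.025):
    """Map depth pixel (u_d, v_d) with depth z (metres) to color coordinates."""
    # Back-project the pixel to a 3-D point in the depth-camera frame.
    x = (u_d - cx_d) * z / fx_d
    y = (v_d - cy_d) * z / fy_d
    # Extrinsic transform: horizontal translation only (identity rotation).
    x_c = x + baseline
    # Re-project the 3-D point into the color camera.
    u_c = fx_c * x_c / z + cx_c
    v_c = fy_c * y / z + cy_c
    return u_c, v_c
```

Once the depth image is registered this way, the depth value at the target object's position in the registered image directly gives the robot-to-object distance.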
7. The object distance detection method of a robot according to claim 1, characterized in that, prior to the step of identifying the image size of the target object included in the color image, the method further comprises:
and acquiring a color area corresponding to the target object in the color image through histogram equalization processing.
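The pre-processing step of claim 7 can be illustrated with a plain-Python histogram equalization of a single intensity channel. A real implementation would more likely call an image-processing library (e.g. OpenCV's `equalizeHist` on the value channel), and the lookup-table construction below is only one common variant.

```python
# Sketch of claim 7's pre-processing: histogram equalization of one
# channel so that the target object's color region stands out under
# uneven lighting. Pure Python, for illustration only.

def equalize_histogram(channel, levels=256):
    """channel: flat list of integer intensities in [0, levels)."""
    hist = [0] * levels
    for v in channel:
        hist[v] += 1
    # Cumulative distribution of the intensities.
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    # Remap the CDF onto the full intensity range.
    cdf_min = next(c for c in cdf if c > 0)
    n = len(channel)
    scale = (levels - 1) / max(n - cdf_min, 1)
    lut = [round((c - cdf_min) * scale) for c in cdf]
    return [lut[v] for v in channel]
```

After equalization, a simple color threshold over the stretched channel can extract the color area corresponding to the target object.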
8. An object distance detection device for a robot, comprising:
the image acquisition unit is used for acquiring a color image acquired by the color camera;
an image size identifying unit configured to identify an image size of a target object included in the color image;
the condition judgment unit, configured to determine, according to the image size of the target object, whether the distance of the target object meets the preset linear fitting requirement;
and the distance acquisition unit is used for acquiring the distance between the target object and the robot according to the preset corresponding relation between the image size of the target object and the distance when the distance of the target object meets the preset linear fitting requirement.
9. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, carries out the steps of the object distance detection method of the robot according to any of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the object distance detection method of a robot according to any one of claims 1 to 7.
CN201911310724.5A 2019-12-18 2019-12-18 Robot and object distance detection method and device thereof Pending CN111336938A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911310724.5A CN111336938A (en) 2019-12-18 2019-12-18 Robot and object distance detection method and device thereof


Publications (1)

Publication Number Publication Date
CN111336938A true 2020-06-26

Family

ID=71181364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911310724.5A Pending CN111336938A (en) 2019-12-18 2019-12-18 Robot and object distance detection method and device thereof

Country Status (1)

Country Link
CN (1) CN111336938A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011016421A (en) * 2009-07-08 2011-01-27 Higashi Nippon Transportec Kk Obstacle detector, platform door system having the same and obstacle detecting method
CN102980517A (en) * 2012-11-15 2013-03-20 天津市亚安科技股份有限公司 Monitoring measurement method
CN106225687A (en) * 2016-09-06 2016-12-14 乐视控股(北京)有限公司 The measuring method of dimension of object and device
CN107907884A (en) * 2017-10-02 2018-04-13 广东美的制冷设备有限公司 Object distance detection method, device, air conditioner and storage medium
CN109492639A (en) * 2018-11-12 2019-03-19 北京拓疆者智能科技有限公司 " loaded " position three-dimensional coordinate acquisition methods, system and image recognition apparatus
CN109781008A (en) * 2018-12-30 2019-05-21 北京猎户星空科技有限公司 A kind of distance measurement method, device, equipment and medium


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111880575A (en) * 2020-08-10 2020-11-03 重庆依塔大数据研究院有限公司 Control method and device based on color tracking, storage medium and robot
CN111880575B (en) * 2020-08-10 2023-03-24 重庆依塔大数据研究院有限公司 Control method and device based on color tracking, storage medium and robot
CN113029089A (en) * 2021-03-17 2021-06-25 国网安徽省电力有限公司铜陵供电公司 Video image inner target distance estimation method based on electric power field auxiliary information

Similar Documents

Publication Publication Date Title
CN110689579B (en) Rapid monocular vision pose measurement method and measurement system based on cooperative target
CN108416791B (en) Binocular vision-based parallel mechanism moving platform pose monitoring and tracking method
CN111339951A (en) Body temperature measuring method, device and system
CN109784250B (en) Positioning method and device of automatic guide trolley
CN112489140B (en) Attitude measurement method
CN108182708B (en) Calibration method and calibration device of binocular camera and terminal equipment
CN112686950B (en) Pose estimation method, pose estimation device, terminal equipment and computer readable storage medium
CN112927306B (en) Calibration method and device of shooting device and terminal equipment
CN107230212B (en) Vision-based mobile phone size measuring method and system
CN110807807A (en) Monocular vision target positioning pattern, method, device and equipment
CN114862929A (en) Three-dimensional target detection method and device, computer readable storage medium and robot
CN111311671B (en) Workpiece measuring method and device, electronic equipment and storage medium
CN111336938A (en) Robot and object distance detection method and device thereof
CN113124763A (en) Optical axis calibration method, device, terminal, system and medium for optical axis detection system
CN111383264A (en) Positioning method, positioning device, terminal and computer storage medium
CN112308930A (en) Camera external parameter calibration method, system and device
CN112102378B (en) Image registration method, device, terminal equipment and computer readable storage medium
WO2024012463A1 (en) Positioning method and apparatus
CN113050022B (en) Image positioning method and device based on rotary antenna and terminal equipment
CN116679267A (en) Combined calibration method, device, equipment and storage medium based on radar and image
CN114415129A (en) Visual and millimeter wave radar combined calibration method and device based on polynomial model
CN113635299B (en) Mechanical arm correction method, terminal device and storage medium
CN111311690B (en) Calibration method and device of depth camera, terminal and computer storage medium
CN111223139B (en) Target positioning method and terminal equipment
CN114744721A (en) Charging control method of robot, terminal device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200626