WO2020034963A1 - Charging device identification method, mobile robot and charging device identification system - Google Patents

Charging device identification method, mobile robot and charging device identification system

Info

Publication number
WO2020034963A1
WO2020034963A1 (PCT/CN2019/100429)
Authority
WO
WIPO (PCT)
Prior art keywords
charging device
area
image
target
depth
Prior art date
Application number
PCT/CN2019/100429
Other languages
English (en)
French (fr)
Inventor
蒋腻聪
朱建华
沈冰伟
郭斌
蒋海青
Original Assignee
杭州萤石软件有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 杭州萤石软件有限公司
Priority to US 17/266,164 (granted as US11715293B2)
Priority to EP 19850252.8 (granted as EP3836084B1)
Publication of WO2020034963A1

Classifications

    • G06T 7/73 (Image analysis): Determining position or orientation of objects or cameras using feature-based methods
    • G06V 20/10 (Scenes; scene-specific elements): Terrestrial scenes
    • G06F 18/22 (Pattern recognition; analysing): Matching criteria, e.g. proximity measures
    • G06F 18/251 (Pattern recognition; fusion techniques): Fusion techniques of input or preprocessed data
    • G06T 5/50 (Image enhancement or restoration): Using two or more images, e.g. averaging or subtraction
    • G06T 5/92 (Image enhancement or restoration): Dynamic range modification of images or parts thereof based on global image properties
    • G06T 7/50 (Image analysis): Depth or shape recovery
    • G06T 7/60 (Image analysis): Analysis of geometric attributes
    • G06T 7/80 (Image analysis): Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06V 10/225 (Image preprocessing): Selection of a specific region containing or referencing a pattern, based on a marking or identifier characterising the area
    • G06V 20/20 (Scenes; scene-specific elements): In augmented reality scenes
    • H02J 7/0047 (Circuit arrangements for charging or depolarising batteries): With monitoring or indicating devices or circuits
    • G06T 2207/10024 (Image acquisition modality): Color image
    • G06T 2207/10028 (Image acquisition modality): Range image; depth image; 3D point clouds
    • G06T 2207/10048 (Image acquisition modality): Infrared image
    • G06T 2207/20208 (Image enhancement details): High dynamic range [HDR] image processing
    • G06T 2207/20221 (Image combination): Image fusion; image merging
    • G06T 2207/30204 (Subject of image): Marker


Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Power Engineering (AREA)
  • Geometry (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Charge And Discharge Circuits For Batteries Or The Like (AREA)

Abstract

The present application provides a charging device identification method, a mobile robot and a charging device identification system. The charging device identification method may include: collecting an infrared image and a depth image in a current field of view through a depth camera; determining, according to the infrared image, whether there is a suspected charging device region satisfying a first specified condition; if so, determining, according to the depth image, whether the suspected charging device regions contain a target charging device region whose height relative to the depth camera is within a specified range; and if so, identifying the charging device according to the target charging device region. The first specified condition is that the gray value of every pixel in a region is greater than a second specified value and the number of pixels in the region is greater than a third specified value.

Description

Charging Device Identification Method, Mobile Robot and Charging Device Identification System
CROSS-REFERENCE TO RELATED APPLICATION
This patent application claims priority to Chinese patent application No. 201810929450.7, filed on August 15, 2018 and entitled "Charging Device Identification Method, Mobile Robot and Charging Device Identification System", which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
The present application relates to the field of mobile robots, and in particular to a charging device identification method, a mobile robot and a charging device identification system.
BACKGROUND
As mobile robots are applied more and more widely, users increasingly expect them to remain on duty for long periods and to operate autonomously for longer. How to replenish a mobile robot's power source has therefore become a topic of wide interest. At present, mobile robots charge autonomously through automatic contact charging technology to replenish their power source.
To charge autonomously, a mobile robot needs to identify the charging device in its environment when its battery level falls below a preset threshold, and then move from its current position to the position of the charging device to dock with the charging device and charge.
SUMMARY
In view of this, the present application provides a charging device identification method, a mobile robot and a charging device identification system.
A first aspect of the present application provides a charging device identification method. A marker is provided on the charging device, the reflectance of the marker is greater than a first specified value, and the method is applied to a mobile robot. The method includes:
collecting an infrared image and a depth image in a current field of view through a depth camera;
determining, according to the infrared image, whether there is a suspected charging device region satisfying a first specified condition, the first specified condition being that the gray value of every pixel in a region is greater than a second specified value and the number of pixels in the region is greater than a third specified value;
if so, determining, according to the depth image, whether the suspected charging device regions contain a target charging device region whose height relative to the depth camera is within a specified range; and
if so, identifying the charging device according to the target charging device region.
A second aspect of the present application provides a mobile robot, including:
a depth camera configured to collect an infrared image and a depth image in a current field of view; and
a processor configured to:
determine, according to the infrared image, whether there is a suspected charging device region satisfying a first specified condition, the first specified condition being that the gray value of every pixel in a region is greater than a second specified value and the number of pixels in the region is greater than a third specified value;
if so, determine, according to the depth image, whether the suspected charging device regions contain a target charging device region whose height relative to the depth camera is within a specified range; and
if so, identify the charging device of the mobile robot according to the target charging device region.
A third aspect of the present application provides a charging device identification system, including any mobile robot provided in the second aspect of the present application and a charging device for charging the mobile robot, wherein
a marker is provided on the charging device, and the reflectance of the marker is greater than a first specified value.
With the charging device identification method, mobile robot and charging device identification system provided in the present application, a marker whose reflectance is greater than a first specified value is provided on the charging device; suspected charging device regions are first screened out from the infrared image, a target charging device region matching the geometric information of the charging device is then selected from the screened regions according to the depth image, and the charging device is finally identified according to the target charging device region. In this way, the suspected regions are verified against the geometric information of the charging device and the charging device is identified from the verified target region, which effectively improves the identification accuracy.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a flowchart of a first embodiment of the charging device identification method provided by the present application;
FIG. 2 is a schematic diagram of a mobile robot according to an exemplary embodiment of the present application;
FIG. 3 is a flowchart of a second embodiment of the charging device identification method provided by the present application;
FIG. 4 is a schematic diagram of the installation relationship between the depth camera and the RGB camera in a mobile robot according to an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of a charging device according to an exemplary embodiment of the present application;
FIG. 6 is a flowchart of a third embodiment of the charging device identification method provided by the present application;
FIG. 7A is a schematic diagram of a normal-exposure infrared image according to an exemplary embodiment of the present application;
FIG. 7B is a schematic diagram of an HDR infrared image according to an exemplary embodiment of the present application;
FIG. 7C is a schematic diagram of a fused infrared image obtained from a normal-exposure infrared image and an HDR infrared image according to an exemplary embodiment of the present application;
FIG. 8 is a flowchart of a fourth embodiment of the charging device identification method provided by the present application;
FIG. 9 is a schematic structural diagram of a first embodiment of the mobile robot provided by the present application;
FIG. 10 is a schematic structural diagram of a second embodiment of the mobile robot provided by the present application;
FIG. 11 is a schematic diagram of a charging device identification system according to an exemplary embodiment of the present application.
DETAILED DESCRIPTION
Exemplary embodiments will be described in detail here, examples of which are shown in the accompanying drawings. When the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present application as detailed in the appended claims.
The terms used in the present application are for the purpose of describing particular embodiments only and are not intended to limit the present application. The singular forms "a", "said" and "the" used in the present application and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in the present application to describe various pieces of information, such information should not be limited by these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present application, first information may also be referred to as second information, and similarly, second information may also be referred to as first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon" or "in response to determining".
At present, a charging device is marked by placing black-and-white markers on it; an RGB camera mounted on the robot then captures images in which these markers are recognized, completing the identification of the charging device. However, because the images captured by an RGB camera are easily affected by illumination intensity and by other objects, identifying the charging device from such images yields a low accuracy rate.
The present application provides a charging device identification method, a mobile robot and a charging device identification system to solve the low accuracy of existing identification methods.
Several specific embodiments are given below to describe the technical solution of the present application in detail. The following embodiments may be combined with one another, and the same or similar concepts or processes may not be repeated in some embodiments.
FIG. 1 is a flowchart of a first embodiment of the charging device identification method provided by the present application. Referring to FIG. 1, the method provided in this embodiment may include the following steps.
S101: Collect an infrared image and a depth image in the current field of view through a depth camera.
Specifically, FIG. 2 is a schematic diagram of a mobile robot according to an exemplary embodiment of the present application. Referring to FIG. 2, a depth camera 910 is provided on the mobile robot. In an embodiment, the depth camera 910 may be disposed on the chassis 10 of the mobile robot. Infrared images and depth images can be collected by this depth camera. In addition, the depth camera 910 may be a structured-light depth camera or a time-of-flight (TOF) depth camera.
The mobile robot may collect the infrared image and the depth image through the depth camera when it detects that its current battery level is below a preset threshold, so as to identify the charging device in the environment from the collected images. Of course, when the charging device is not identified from the collected infrared and depth images, an infrared image and a depth image may be collected again through the depth camera, so as to identify the charging device from the newly collected images. This is not limited in this embodiment.
It should be noted that the depth camera consists of an infrared array sensor with a lens and an infrared light source emitter. It illuminates the environment with the infrared light source emitter and generates an infrared image from the reflected light received by the infrared array sensor. At the same time, the depth camera can obtain the distance between the lens and objects from the time of flight of the light and generate a depth image. For the specific structure and working principle of the depth camera, refer to the description in the related art, which is not repeated here.
S102: Determine, according to the infrared image, whether there is a suspected charging device region satisfying a first specified condition; the first specified condition is that the gray value of every pixel in the region is greater than a second specified value and the number of pixels in the region is greater than a third specified value.
It should be noted that a marker is provided on the charging device and the reflectance of the marker is greater than a first specified value, that is, the marker is made of a highly reflective material. The first specified value may be set according to actual needs and is not limited in this embodiment. When infrared light strikes the marker, most of it is reflected because of the marker's high reflectance, so the marker appears as a highlighted region, i.e. a region of high gray values, in the infrared image collected by the depth camera.
Further, a normal-exposure infrared image may be collected in step S101. If a charging device is present in the current field of view, there must be a highlighted region in the normal-exposure infrared image. Therefore, to detect whether a charging device is present in the current field of view, step S102 may determine whether the normal-exposure infrared image contains a suspected charging device region satisfying the first specified condition.
As introduced above, step S102 thus screens out suspected charging device regions, each of which is a highlighted region in the normal-exposure infrared image.
It should be noted that the second specified value may be set according to actual needs; its specific value is not limited in this embodiment. For example, in an embodiment, the second specified value is 200. The third specified value may likewise be set according to actual needs; for example, it may be 100.
Specifically, when it is determined that no suspected charging device region satisfies the first specified condition, there is no charging device in the current field of view; in this case, an infrared image and a depth image may be collected again through the depth camera in order to identify the charging device.
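The screening in step S102 can be sketched in a few lines. The sketch below is a minimal example assuming OpenCV and NumPy; the function name is hypothetical, and the default thresholds are the example values (200 and 100) of the second and third specified values given above:

```python
import cv2
import numpy as np

def find_suspected_regions(ir_image, gray_thresh=200, min_pixels=100):
    """Screen an infrared image for suspected charging device regions:
    every pixel must exceed gray_thresh (the second specified value) and a
    region must contain more than min_pixels pixels (the third value)."""
    img = np.asarray(ir_image, dtype=np.float32)
    # Keep only pixels bright enough to come from the retro-reflective marker.
    _, mask = cv2.threshold(img, gray_thresh, 255, cv2.THRESH_BINARY)
    # Group the bright pixels into connected components.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(
        mask.astype(np.uint8), connectivity=8)
    regions = []
    for k in range(1, num):  # label 0 is the background
        if stats[k, cv2.CC_STAT_AREA] > min_pixels:
            vs, us = np.nonzero(labels == k)
            regions.append(np.stack([us, vs], axis=1))  # (u, v) pixel pairs
    return regions
```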
S103: If so, determine, according to the depth image, whether the suspected charging device regions contain a target charging device region whose height relative to the depth camera is within a specified range.
For example, in an embodiment, M suspected charging device regions are found, where M is greater than or equal to 1. Step S103 then determines whether, among these M suspected regions, there is a target charging device region whose height relative to the depth camera is within the specified range.
It should be noted that the specified range may be set according to actual needs, for example according to the actual height of the marker relative to the depth camera. In an embodiment, the specified range may be [a - b, a + b], where a is the height of the depth camera's optical center above the ground minus the height of the marker's center point above the ground, and b is an error margin.
Accordingly, when the specified range is [a - b, a + b], the height of a suspected charging device region relative to the depth camera may be characterized by the y coordinate of the region's center point in the depth camera coordinate system.
In this case, the specific implementation of this step may include the following steps.
(1) Obtain the depth information of each pixel of the suspected charging device region from the depth image.
Specifically, the depth information of every pixel in a suspected charging device region can be obtained from the depth image. For example, for a pixel $(u_1, v_1)$ in a suspected region, where $(u_1, v_1)$ is the pixel's coordinate information, the pixel's depth information can be looked up in the depth image from this coordinate information.
It should be noted that, for convenience, the i-th suspected charging device region is denoted $S_i = [(u_{i1}, v_{i1}), \ldots, (u_{iN}, v_{iN})]$, where N is the number of pixels contained in the i-th suspected region. Through this step, the depth information of each pixel of the i-th suspected region is obtained from the depth image: the j-th pixel of the i-th suspected region has coordinate information $(u_{ij}, v_{ij})$ and depth information $d_{ij}$, with j ranging from 1 to N.
(2) Determine the position of each pixel in the depth camera coordinate system from the pixel's depth information and coordinate information.
In this step, the position of each pixel of the suspected charging device region in the depth camera coordinate system may be determined from the camera projection model, using the pixel's depth information and its coordinate information.
The camera projection model can be expressed by the first formula:
$$ d_{ij}\begin{bmatrix} u_{ij} \\ v_{ij} \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & s & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_{ij} \\ y_{ij} \\ z_{ij} \end{bmatrix} $$
where $d_{ij}$ is the depth information of the j-th pixel of the i-th suspected charging device region;
$(u_{ij}, v_{ij})$ is the coordinate information of the j-th pixel of the i-th suspected charging device region;
$(x_{ij}, y_{ij}, z_{ij})$ is the position of the j-th pixel of the i-th suspected charging device region in the depth camera coordinate system;
$f_x$ and $f_y$ are the equivalent focal lengths along the u and v axes, $c_x$ and $c_y$ are the coordinates of the depth camera's optical center, and $s$ is the non-perpendicularity (skew) factor of the u and v axes; in some embodiments, $s = 0$. Here $f_x$, $f_y$, $c_x$, $c_y$ and $s$ are intrinsic parameters of the depth camera, all of which can be obtained through camera calibration.
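Solving the first formula for $(x_{ij}, y_{ij}, z_{ij})$, with $z_{ij} = d_{ij}$, gives a simple back-projection. A minimal sketch assuming NumPy (the function name is illustrative):

```python
import numpy as np

def backproject(u, v, d, fx, fy, cx, cy, s=0.0):
    """Invert the first formula: with z = d, solve
    d*u = fx*x + s*y + cx*z and d*v = fy*y + cy*z for (x, y, z)."""
    z = d
    y = (v - cy) * z / fy
    x = ((u - cx) * z - s * y) / fx
    return np.array([x, y, z])
```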
(3) Compute the height of the suspected charging device region relative to the depth camera from the positions of its pixels in the depth camera coordinate system.
Specifically, the height of a suspected charging device region relative to the depth camera may be characterized by the y coordinate of the region's center point in the depth camera coordinate system.
For convenience, denote the height of the i-th suspected charging device region relative to the depth camera as $Y_i$; $Y_i$ is then computed by the following formula:
$$ Y_i = \frac{1}{N}\sum_{j=1}^{N} y_{ij} $$
(4) Determine whether the suspected charging device regions contain a target charging device region whose height relative to the depth camera is within the specified range.
Specifically, this step uses the heights relative to the depth camera computed in step (3) to determine whether any of the suspected charging device regions is a target charging device region whose height relative to the depth camera falls within the specified range.
It should be noted that when the height of a suspected charging device region relative to the depth camera is within the specified range, that suspected region matches the geometric information of the charging device.
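Steps (3) and (4) then reduce to averaging the y coordinates and checking the range. A minimal sketch assuming NumPy, where `points_xyz` is the N×3 array of back-projected points and a and b are the calibrated quantities defined above:

```python
import numpy as np

def region_height(points_xyz):
    # y coordinate of the region's center point in the depth camera frame.
    return float(np.mean(points_xyz[:, 1]))

def is_target_region(points_xyz, a, b):
    # Is the region height within the specified range [a - b, a + b]?
    return a - b <= region_height(points_xyz) <= a + b
```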
S104: If so, identify the charging device according to the target charging device region.
Specifically, as introduced above, the target charging device region is a region that satisfies the first specified condition and matches the geometric information of the charging device. Therefore, in an embodiment, the target charging device region may be determined to be the region where the charging device is located.
It should be noted that when no suspected charging device region has a height relative to the depth camera within the specified range, none of the suspected regions matches the geometric information of the charging device, and none of them is the region where the charging device is located. In this case, the infrared image and the depth image may be collected again in order to identify the charging device.
In the method provided by this embodiment, a marker whose reflectance is greater than a first specified value is provided on the charging device. To identify the charging device, an infrared image and a depth image in the current field of view are collected; whether there is a suspected charging device region satisfying the first specified condition is determined from the infrared image; whether the suspected regions contain a target charging device region whose height relative to the depth camera is within a specified range is then determined from the depth image; and the charging device is identified according to the target charging device region. Verifying the suspected regions against the geometric information of the charging device and then identifying the charging device from the verified target region effectively improves the identification accuracy.
FIG. 3 is a flowchart of a second embodiment of the charging device identification method provided by the present application. On the basis of the above embodiments, the method provided in this embodiment may further include: collecting an RGB image in the current field of view through a red-green-blue (RGB) camera when the infrared image and the depth image in the current field of view are collected through the depth camera, or when it is determined that the suspected charging device regions contain a target charging device region.
Further, in the method provided by this embodiment, step S104 may include:
S301: Extract the target RGB image region corresponding to the target charging device region from the RGB image.
Specifically, referring to FIG. 2, the mobile robot may further be provided with an RGB camera 930, which may be disposed on the chassis 10 of the mobile robot 1. On the basis of the above embodiments, the method provided in this embodiment may collect the RGB image in the current field of view through the RGB camera when the infrared image and the depth image are collected through the depth camera, or when it is determined that the suspected charging device regions contain a target charging device region.
Further, the specific implementation of this step may include:
(1) Determine the coordinate information of the pixel in the RGB image corresponding to each pixel of the target charging device region, according to the position of each pixel of the target charging device region in the depth camera coordinate system, the transformation matrix from the depth camera coordinate system to the RGB camera coordinate system, and the intrinsic matrix of the RGB camera.
Specifically, each pixel of the target charging device region can be projected through the RGB camera's projection model to obtain its corresponding pixel in the RGB image.
FIG. 4 is a schematic diagram of the installation relationship between the depth camera and the RGB camera in a mobile robot according to an exemplary embodiment of the present application. Referring to FIG. 4:
the transformation matrix from the coordinate system of the depth camera 910 to the coordinate system of the RGB camera 930 is $T_{t2t1}$, a 4×4 matrix. Further, the intrinsic matrix of the RGB camera 930 is $K_{RGB}$, a 3×3 matrix, and the target charging device region is the i-th suspected charging device region. The coordinate information of the pixel in the RGB image corresponding to each pixel of the target charging device region can then be computed from the following formulas:
$$ K_{ij} = K_{RGB} \cdot (T_{t2t1} \cdot P_{ij}), \qquad P_{ij} = \begin{bmatrix} x_{ij} & y_{ij} & z_{ij} & 1 \end{bmatrix}^{T} $$
where $(x_{ij}, y_{ij}, z_{ij})$ is the position of the j-th pixel of the target charging device region in the depth camera coordinate system and $P_{ij}$ is its homogeneous form.
In addition, for convenience, write $T_{t2t1} \cdot P_{ij} = A$; the matrix A is then 4×1. After A is computed, delete its last row vector to obtain the 3×1 matrix B; finally, multiply $K_{RGB}$ by B to obtain $K_{ij}$, a 3×1 matrix. For convenience, write
$$ K_{ij} = \begin{bmatrix} a_{ij} & b_{ij} & c_{ij} \end{bmatrix}^{T} $$
The coordinate information of the pixel in the RGB image corresponding to the j-th pixel of the target charging device region is then obtained from $K_{ij}$: denoting that pixel $(u'_{ij}, v'_{ij})$, we have $u'_{ij} = a_{ij}/c_{ij}$ and $v'_{ij} = b_{ij}/c_{ij}$. In this way, the coordinate information of the corresponding pixel in the RGB image is obtained for every pixel of the target charging device region.
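A minimal sketch of this projection, assuming NumPy; the function name is illustrative, and $T_{t2t1}$ and $K_{RGB}$ are the calibrated matrices described above:

```python
import numpy as np

def depth_point_to_rgb_pixel(p_xyz, T_t2t1, K_rgb):
    """Map a point from the depth camera frame to RGB pixel coordinates."""
    P = np.append(np.asarray(p_xyz, dtype=float), 1.0)  # homogeneous 4-vector
    A = T_t2t1 @ P      # 4x1: the point expressed in the RGB camera frame
    k = K_rgb @ A[:3]   # delete the last row of A, then apply K_RGB
    return k[0] / k[2], k[1] / k[2]   # (u', v') = (a/c, b/c)
```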
(2) Determine the target RGB image region from the coordinate information of the corresponding pixels of the target charging device region in the RGB image.
Specifically, the corresponding pixels in the RGB image of all the pixels of the target charging device region constitute the target RGB image region. The RGB image can thus be cropped to obtain the target RGB image region, which is taken to be the region of the RGB image where the charging device is suspected to be.
S302: Determine whether the target RGB image region matches the pre-stored standard RGB image of the marker 21.
Specifically, FIG. 5 is a schematic diagram of a charging device according to an exemplary embodiment of the present application. Referring to FIG. 5, one marker is provided on the charging device; in the example shown in FIG. 5, the marker is provided on the side of the charging device that carries the charging socket. The standard RGB image of the marker is pre-stored in the mobile robot. In this step, when the similarity between the target RGB image region and the standard RGB image of the marker is greater than a preset threshold, the target RGB image region is determined to match the pre-stored standard RGB image of the marker.
Of course, in a possible implementation of the present application, the charging device may instead be provided with at least two markers of different shapes, the at least two markers being disposed on different sides of the charging device, with the center points of the at least two markers at equal distances from the bottom of the charging device. For example, a marker may be placed on both the front (the side with the charging socket) and the back (the side facing away from the front) of the charging device, e.g. a circular marker on the front and a square marker on the back. In that case, the standard RGB images of both markers are pre-stored in the mobile robot. In this step, when the similarity between the target RGB image region and the standard RGB image of any one marker is greater than the preset threshold, the target RGB image region is determined to match the pre-stored standard RGB image.
It should be noted that when at least two markers are provided on the charging device and the similarity between the target RGB image region and the standard RGB image of some marker is greater than the preset threshold, the target marker whose similarity exceeds the threshold can be found from these similarities. Then, when the mobile robot moves to the target marker, it can be controlled to move to the charging socket according to the positional relationship between the target marker and the charging socket, so that the mobile robot docks with the charging device and charges autonomously.
It should be noted that the similarity between the target RGB image region and a standard RGB image may be computed with a suitable similarity measure. For example, in a possible implementation, the comparison follows the NCC (Normalized Cross-Correlation) algorithm, and the similarity between the target RGB image region and the standard RGB image is computed by the following formula:
$$ S(A,B) = \frac{\sum_{i,j} A(i,j)\,B(i,j)}{\sqrt{\sum_{i,j} A(i,j)^2}\;\sqrt{\sum_{i,j} B(i,j)^2}} $$
where $A(i,j)$ is the gray value of the (i, j)-th pixel of the target RGB image region A;
$B(i,j)$ is the gray value of the (i, j)-th pixel of the standard RGB image B;
and $S(A,B)$ is the similarity between the target RGB image region A and the standard RGB image B. If $S(A,B) = 0$, the two are not similar; if $S(A,B)$ is close to 1, they are similar. In some embodiments, the two may be judged similar when $S(A,B)$ is greater than a specified value, e.g. when $S(A,B) > 0.7$, where the specified value may be set as needed.
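A minimal NCC sketch, assuming NumPy and two equally sized grayscale arrays (the function name is illustrative):

```python
import numpy as np

def ncc_similarity(patch_a, patch_b):
    """NCC similarity S(A, B) between two equally sized grayscale arrays."""
    a = patch_a.astype(np.float64).ravel()
    b = patch_b.astype(np.float64).ravel()
    denom = np.sqrt(np.sum(a * a)) * np.sqrt(np.sum(b * b))
    return float(np.sum(a * b) / denom) if denom > 0 else 0.0

# A match is declared when the similarity exceeds the preset threshold,
# e.g. ncc_similarity(target_region, standard_image) > 0.7.
```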
S303: If so, determine that the target charging device region is the region where the charging device is located.
Specifically, when step S302 determines that the target RGB image region matches the pre-stored standard RGB image of the marker, this step determines that the target charging device region is the region where the charging device is located.
It should be noted that when the target RGB image region does not match the standard RGB image, the target RGB image region is considered not to be the region where the charging device is located, i.e. no charging device is considered present in the captured field of view. In this case, the infrared image and the depth image may be collected again until the charging device is identified.
In the method provided by this embodiment, when a target charging device region is found among the suspected regions, it is not directly determined to be the region where the charging device is located; instead, the target RGB image region corresponding to the target charging device region is further extracted from the collected RGB image and compared against the pre-stored standard RGB image of the marker, and only when they match is the target charging device region taken to be the region where the charging device is located. This further improves the identification accuracy.
FIG. 6 is a flowchart of a third embodiment of the charging device identification method provided by the present application. On the basis of the above embodiments, in the method provided by this embodiment the infrared image may include a normal-exposure infrared image and a high-dynamic-range (HDR) infrared image; the HDR infrared image is collected when it is determined that the number of pixels in the normal-exposure infrared image whose gray value is greater than the second specified value is greater than a fourth specified value. In this case, step S102 may include:
S601: Fuse the normal-exposure infrared image and the HDR infrared image to obtain a fused infrared image.
Specifically, the fourth specified value may be set according to actual needs; its specific value is not limited in this embodiment. For example, in an embodiment, the fourth specified value may be 100.
Specifically, in the method provided by this embodiment, a normal-exposure infrared image and a depth image are first collected by the depth camera, and the HDR infrared image is then collected when the number of pixels in the normal-exposure infrared image whose gray value is greater than the second specified value is greater than the fourth specified value. It should be noted that when there is no pixel in the normal-exposure infrared image whose gray value is greater than the second specified value, or the number of such pixels is not greater than the fourth specified value, no charging device is considered present in the current field of view, and the infrared image and the depth image may be collected again until the charging device is identified.
Specifically, for the principle and process of fusion, refer to the description in the related art, which is not repeated here.
For example, in an embodiment, the gray values at corresponding positions of the normal-exposure infrared image and the HDR infrared image may be combined by per-pixel weighting to obtain the fused infrared image. The weighting can be expressed by the following formula:
$$ R(i,j) = a \cdot Z(i,j) + b \cdot H(i,j) $$
where $R(i,j)$ is the gray value of the (i, j)-th pixel of the fused image;
$Z(i,j)$ is the gray value of the (i, j)-th pixel of the normal-exposure image;
$H(i,j)$ is the gray value of the (i, j)-th pixel of the HDR infrared image;
$a$ is the weight coefficient of the (i, j)-th pixel of the normal-exposure image;
$b$ is the weight coefficient of the (i, j)-th pixel of the HDR infrared image.
It should be noted that the weight coefficient of a pixel can be determined as follows: when the gray value of the pixel is greater than a preset value (the preset value may be any value between 150 and 200), the weight coefficient of the pixel is set to one specified value, e.g. 1; when the gray value of the pixel is not greater than the preset value, the weight coefficient is set to another specified value, e.g. 0.7.
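Under these choices the fusion reduces to a few lines. The sketch below assumes NumPy; it adopts the weighted-sum reading of the formula above, and the preset value 180 is only one example from the stated 150-200 range:

```python
import numpy as np

def fuse_ir(normal_ir, hdr_ir, preset=180, w_hi=1.0, w_lo=0.7):
    """Fuse the normal-exposure and HDR infrared images with per-pixel
    weights: 1 for pixels brighter than preset, 0.7 otherwise."""
    z = normal_ir.astype(np.float64)
    h = hdr_ir.astype(np.float64)
    a = np.where(z > preset, w_hi, w_lo)  # weights for the normal exposure
    b = np.where(h > preset, w_hi, w_lo)  # weights for the HDR image
    return a * z + b * h                  # fused gray values R(i, j)
```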
S602: Determine whether the fused infrared image contains a suspected charging device region satisfying the first specified condition.
Specifically, contour detection may be performed on the fused infrared image to determine whether a suspected charging device region satisfying the first specified condition exists in it.
FIG. 7A is a schematic diagram of a normal-exposure infrared image according to an exemplary embodiment of the present application; FIG. 7B is a schematic diagram of an HDR infrared image according to an exemplary embodiment of the present application; FIG. 7C is a schematic diagram of the fused infrared image obtained from the normal-exposure infrared image and the HDR infrared image according to an exemplary embodiment of the present application.
Referring to FIGS. 7A-7C, when the charging device shown in FIG. 5 appears in the mobile robot's field of view, a circular marker is present on the front of the charging device. In this case, as shown in FIG. 7A, the normal-exposure infrared image contains a suspected charging device region, namely the highlighted region in FIG. 7A. To guard against interference from other highly reflective materials in the environment, the HDR infrared image is further collected; the collected HDR image is shown in FIG. 7B.
It should be noted that, compared with the normal-exposure infrared image, the HDR infrared image captures the current field of view with a shorter exposure time. Thus, when a charging device appears in the robot's field of view, a bright region still remains in the HDR infrared image because of the highly reflective marker on the charging device. For other low-reflectance objects, even if they appear as highlighted regions in the normal-exposure infrared image, the gray values of their regions drop sharply in the HDR infrared image.
Further, the fused infrared image is obtained by fusing the normal-exposure infrared image with the HDR infrared image. As shown in FIG. 7C, when a charging device is present in the robot's field of view, the region where it is located appears as a particularly pronounced peak in the fused infrared image. The method provided by this embodiment therefore detects suspected charging device regions in the fused infrared image: suspected regions can be found quickly and accurately, and non-charging-device regions are eliminated at the outset, reducing the subsequent amount of computation.
FIG. 8 is a flowchart of a fourth embodiment of the charging device identification method provided by the present application. Referring to FIG. 8, the method provided in this embodiment may include:
S801: Collect a normal-exposure infrared image, a depth image and an RGB image in the current field of view.
S802: Determine whether the number of pixels in the normal-exposure image whose gray value is greater than the second specified value is greater than the fourth specified value; if so, perform step S803; if not, perform step S801.
S803: Collect an HDR infrared image in the current field of view.
S804: Fuse the normal-exposure infrared image and the HDR image to obtain a fused infrared image.
S805: Determine whether the fused infrared image contains a suspected charging device region satisfying the first specified condition; if so, perform step S806; if not, perform step S801.
S806: Determine, according to the depth image, whether the suspected charging device regions contain a target charging device region whose height relative to the depth camera is within the specified range; if so, perform step S807; if not, perform step S801.
S807: Extract the target RGB image region corresponding to the target charging device region from the RGB image.
S808: Determine whether the target RGB image region matches the pre-stored standard RGB image of the marker; if so, perform step S809; if not, perform step S801.
S809: Determine that the target charging device region is the region where the charging device is located.
Specifically, for the principle and process of each step, refer to the descriptions in the preceding embodiments, which are not repeated here.
In the method provided by this embodiment, after suspected charging device regions are found, it is first determined whether they contain a target charging device region; when a target region exists, it is then determined whether the target RGB image region corresponding to the target region in the RGB image matches the standard RGB image; and only on a match is the target charging device region determined to be the region where the charging device is located. After these two verifications, the finally determined region is the region where the charging device is located, which greatly improves the identification accuracy.
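Put together, steps S801-S809 amount to the loop below. This is a sketch only: the `robot` interface and its attributes are hypothetical placeholders for the platform's capture and calibration facilities, the helper functions reuse the earlier sketches, and the thresholds are the example values from the text:

```python
import numpy as np

def identify_charging_device(robot):
    """Loop mirroring steps S801-S809 (all robot methods hypothetical)."""
    while True:
        ir = robot.capture_normal_ir()                          # S801
        depth, rgb = robot.capture_depth(), robot.capture_rgb()
        if np.count_nonzero(ir > 200) <= 100:                   # S802
            continue
        fused = fuse_ir(ir, robot.capture_hdr_ir())             # S803, S804
        for region in find_suspected_regions(fused):            # S805
            pts = np.array([backproject(u, v, depth[v, u], *robot.intrinsics)
                            for u, v in region])
            if not is_target_region(pts, robot.a, robot.b):     # S806
                continue
            patch = robot.extract_rgb_region(rgb, pts)          # S807
            if ncc_similarity(patch, robot.standard_image) > 0.7:  # S808
                return region                   # S809: charging device found
```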
It should be noted that, once the region where the charging device is located has been determined, the method provided by this embodiment can control the mobile robot to move to the charging device according to the positions of the pixels of that region in the depth camera coordinate system, so that the mobile robot charges autonomously. Further, because the identification accuracy of the method provided by this embodiment is relatively high, the mobile robot can be steered to the charging device quickly, which effectively improves the robot's recharging efficiency.
The charging device identification method provided by the present application has been described above; the mobile robot and the charging device identification system provided by the present application are described below.
FIG. 9 is a schematic structural diagram of a first embodiment of the mobile robot provided by the present application. Referring to FIG. 9, the mobile robot provided in this embodiment may include a depth camera 910 and a processor 920. The depth camera 910 is configured to collect an infrared image and a depth image in the current field of view. The processor 920 is configured to: determine, according to the infrared image, whether there is a suspected charging device region satisfying a first specified condition, the first specified condition being that the gray value of every pixel in the region is greater than a second specified value and the number of pixels in the region is greater than a third specified value; if so, determine, according to the depth image, whether the suspected charging device regions contain a target charging device region whose height relative to the depth camera is within a specified range; and if so, identify the charging device according to the target charging device region.
Specifically, the mobile robot provided in this embodiment can execute the technical solution of the method embodiment shown in FIG. 1; its implementation principle and technical effect are similar and are not repeated here.
FIG. 10 is a schematic structural diagram of a second embodiment of the mobile robot provided by the present application. Referring to FIG. 10, on the basis of the above embodiment, the mobile robot provided in this embodiment further includes a red-green-blue (RGB) camera 930. The RGB camera 930 is configured to collect an RGB image in the current field of view when the depth camera 910 collects the infrared image and the depth image, or when the processor 920 determines that the suspected charging device regions contain a target charging device region. The processor 920 is configured to extract the target RGB image region corresponding to the target charging device region from the RGB image, determine whether the target RGB image region matches the pre-stored standard RGB image of the marker, and, when determining that they match, determine that the target charging device region is the region where the charging device is located.
It should be noted that the mobile robot provided in this embodiment may further include a memory 940 configured to store the infrared image, the depth image and the RGB image.
Further, the infrared image includes a normal-exposure infrared image and a high-dynamic-range (HDR) infrared image; the HDR infrared image is collected when it is determined that the number of pixels in the normal-exposure infrared image whose gray value is greater than the second specified value is greater than a fourth specified value.
In this case, the processor 920 is configured to: fuse the normal-exposure infrared image and the HDR image to obtain a fused infrared image; and determine whether the fused infrared image contains a suspected charging device region satisfying the first specified condition.
Further, the processor 920 is configured to: obtain the depth information of each pixel of the suspected charging device region from the depth image; determine the position of each pixel in the depth camera coordinate system from the pixel's depth information and coordinate information; compute the height of the suspected charging device region relative to the depth camera from the positions of its pixels in the depth camera coordinate system; and determine whether the suspected charging device regions contain a target charging device region whose height relative to the depth camera is within the specified range.
Further, the processor 920 is configured to: determine the coordinate information of the pixel in the RGB image corresponding to each pixel of the target charging device region, according to the position of each pixel of the target charging device region in the depth camera coordinate system, the transformation matrix from the depth camera coordinate system to the RGB camera coordinate system, and the intrinsic matrix of the RGB camera; and determine the target RGB image region from the coordinate information of those corresponding pixels.
Further, the charging device is provided with one marker, and the target RGB image region matching the pre-stored standard RGB image of the marker includes: the similarity between the target RGB image region and the standard RGB image of the marker being greater than a preset threshold.
Alternatively, the charging device is provided with at least two markers of different shapes, the at least two markers being disposed on different sides of the charging device with the center points of the at least two markers at equal distances from the bottom of the charging device. In this case, the target RGB image region matching the pre-stored standard RGB image of the marker includes: the similarity between the target RGB image region and the standard RGB image of any one of the markers being greater than a preset threshold.
FIG. 11 is a schematic diagram of a charging device identification system according to an exemplary embodiment of the present application. Referring to FIG. 11, the charging device identification system provided in this embodiment may include any mobile robot 1 provided by the present application and a charging device 2 for charging the mobile robot 1. The charging device 2 is provided with a marker 21, and the reflectance of the marker 21 is greater than a first specified value.
Specifically, as introduced above, at least one marker 21 may be provided on the charging device 2, on a designated side, for example on the side of the charging device 2 that carries the charging socket. Of course, at least two markers 21 of different shapes may also be provided on the charging device 2, the at least two markers 21 being disposed on different sides of the charging device 2 with the center points of the at least two markers 21 at equal distances from the bottom surface of the charging device 2.
The above are only preferred embodiments of the present application and are not intended to limit the present application; any modification, equivalent substitution, improvement, etc. made within the spirit and principles of the present application shall fall within the scope of protection of the present application.

Claims (11)

  1. 一种充电设备识别方法,所述充电设备上设置有标记物,所述标记物的反射率大于第一指定值,所述方法应用于移动机器人,所述方法包括:
    通过深度相机采集当前视野下的红外图像和深度图像;
    依据所述红外图像判断是否存在满足第一指定条件的充电设备疑似区域;所述第一指定条件表示区域中的各个像素点的灰度值均大于第二指定值,且该区域中的像素点的数量大于第三指定值;
    若是,依据所述深度图像判断所述充电设备疑似区域中是否存在相对于所述深度相机的高度处于指定范围的目标充电设备区域;
    若是,依据所述目标充电设备区域识别所述充电设备。
  2. 根据权利要求1所述的方法,其特征在于,
    所述方法还包括:在通过所述深度相机采集所述当前视野下的所述红外图像和所述深度图像时,或者是,在确定所述充电设备疑似区域中存在所述目标充电设备疑似区域时,通过红绿蓝RGB相机采集所述当前视野下的RGB图像;
    依据所述目标充电设备区域识别所述充电设备,包括:
    从所述RGB图像中提取所述目标充电设备区域对应的目标RGB图像区域;
    判断所述目标RGB图像区域与预存的所述标记物的标准RGB图像是否匹配;
    若是,确定所述目标充电设备区域为所述充电设备所在的区域。
  3. 根据权利要求1所述的方法,其特征在于,
    所述红外图像包括正常曝光红外图像和高动态范围HDR红外图像;所述HDR红外图像是在确定所述正常曝光红外图像中存在的灰度值大于所述第二指定值的像素点的数量大于第四指定值时采集的;
    依据所述红外图像判断是否存在满足所述第一指定条件的所述充电设备疑似区域,包括:
    将所述正常曝光红外图像和所述HDR图像进行融合处理,得到融合红外图像;
    判断所述融合红外图像中是否存在满足所述第一指定条件的所述充电设备疑似区域。
  4. 根据权利要求1所述的方法,其特征在于,依据所述深度图像判断所述充电设 备疑似区域中是否存在相对于所述深度相机的高度处于所述指定范围的所述目标充电设备区域,包括:
    依据所述深度图像,获取所述充电设备疑似区域中的每个像素点的深度信息;
    依据所述充电设备疑似区域中每个像素点的深度信息和该像素点的坐标信息,确定该像素点在深度相机坐标系下的位置信息;
    依据所述充电设备疑似区域中每个像素点在所述深度相机坐标系下的位置信息,计算所述充电设备疑似区域相对于所述深度相机的高度;
    判断所述充电设备疑似区域相对于所述深度相机的高度是否处于所述指定范围。
  5. 根据权利要求2所述的方法,其特征在于,从所述RGB图像中提取所述目标充电设备区域对应的目标RGB图像区域,包括:
    根据所述目标充电设备区域中的每个像素点在深度相机坐标系下的位置信息,以及所述深度相机坐标系相对于RGB相机坐标系的变换矩阵和所述RGB相机的内参矩阵,确定所述目标充电设备区域中的每个像素点在所述RGB图像中的对应像素点的坐标信息;
    根据所述目标充电设备区域中的每个像素点在所述RGB图像中的对应像素点的坐标信息,确定所述目标RGB图像区域。
  6. 根据权利要求2所述的方法,其特征在于,
    所述充电设备设置有一个所述标记物,
    所述目标RGB图像区域与预存的所述标记物的标准RGB图像匹配,包括:所述目标RGB图像区域与所述标记物的标准RGB图像的相似度大于预设阈值。
  7. 根据权利要求2所述的方法,其特征在于,
    所述充电设备设置有至少两个形状不同的所述标记物,所述至少两个标记物设置在所述充电设备的不同侧面上,且所述至少两个标记物的中心点距离所述充电设备底面的距离相等;
    所述目标RGB图像区域与预存的所述标记物的标准RGB图像匹配,包括:所述目标RGB图像区域与任意一个所述标记物的标准RGB图像之间的相似度大于预设阈值。
  8. 一种移动机器人,包括:
    深度相机,用于采集当前视野下的红外图像和深度图像;和
    处理器,用于:
    依据所述红外图像判断是否存在满足第一指定条件的充电设备疑似区域;所述第一指定条件表示区域中的各个像素点的灰度值均大于第二指定值,且该区域中的像素点的个数大于第三指定值;
    若是,依据所述深度图像判断所述充电设备疑似区域中是否存在相对于所述深度相机的高度处于指定范围的目标充电设备区域;
    若是,依据所述目标充电设备区域识别所述移动机器人的充电设备。
  9. 根据权利要求8所述的移动机器人,还包括红绿蓝RGB相机,
    所述RGB相机,用于在所述深度相机采集所述当前视野下的所述红外图像和所述深度图像时,或者是,在所述处理器确定所述充电设备疑似区域中存在所述目标充电设备区域时,采集所述当前视野下的RGB图像;
    所述处理器,用于从所述RGB图像中提取所述目标充电设备区域对应的目标RGB图像区域,并判断所述目标RGB图像区域与预存的所述标记物的标准RGB图像是否匹配,以及在判断所述目标RGB图像区域与预存的所述标记物的标准RGB图像匹配时,确定所述目标充电设备区域为所述充电设备所在的区域。
  10. The mobile robot according to claim 8, wherein
    the infrared image comprises a normal-exposure infrared image and a high dynamic range (HDR) infrared image, the HDR infrared image being captured when it is determined that the number of pixels in the normal-exposure infrared image whose gray values are greater than the second specified value exceeds a fourth specified value; and
    the processor is configured to fuse the normal-exposure infrared image and the HDR infrared image to obtain a fused infrared image, and to determine whether the suspected charging device area satisfying the first specified condition exists in the fused infrared image.
  11. A charging device identification system, comprising:
    the mobile robot according to any one of claims 8 to 10; and
    a charging device for charging the mobile robot, wherein the charging device is provided with a marker whose reflectivity is greater than a first specified value.
PCT/CN2019/100429 2018-08-15 2019-08-13 Charging device identification method, mobile robot and charging device identification system WO2020034963A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/266,164 US11715293B2 (en) 2018-08-15 2019-08-13 Methods for identifying charging device, mobile robots and systems for identifying charging device
EP19850252.8A EP3836084B1 (en) 2018-08-15 2019-08-13 Charging device identification method, mobile robot and charging device identification system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810929450.7A CN110838144B (zh) 2018-08-15 Charging device identification method, mobile robot and charging device identification system
CN201810929450.7 2018-08-15

Publications (1)

Publication Number Publication Date
WO2020034963A1 true WO2020034963A1 (zh) 2020-02-20

Family

ID=69524708

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/100429 WO2020034963A1 (zh) 2018-08-15 2019-08-13 Charging device identification method, mobile robot and charging device identification system

Country Status (4)

Country Link
US (1) US11715293B2 (zh)
EP (1) EP3836084B1 (zh)
CN (1) CN110838144B (zh)
WO (1) WO2020034963A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112465959A (zh) * 2020-12-17 2021-03-09 Substation three-dimensional real-scene model inspection method based on local scene update

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113572968B (zh) * 2020-04-24 2023-07-18 杭州萤石软件有限公司 Image fusion method and apparatus, camera device and storage medium
CN114943941A (zh) * 2021-02-07 2022-08-26 华为技术有限公司 Target detection method and apparatus
CN114915036A (zh) * 2021-02-09 2022-08-16 北京小米移动软件有限公司 Charging control method for a legged robot, and legged robot
CN113671944B (zh) * 2021-07-05 2024-04-16 上海高仙自动化科技发展有限公司 Control method, control apparatus, intelligent robot and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106875444A (zh) * 2017-01-19 2017-06-20 浙江大华技术股份有限公司 Target object positioning method and apparatus
CN107392962A (zh) * 2017-08-14 2017-11-24 深圳市思维树科技有限公司 Robot charging docking system and method based on pattern recognition
CN107633528A (zh) * 2017-08-22 2018-01-26 北京致臻智造科技有限公司 Rigid body recognition method and system
CN108171212A (zh) * 2018-01-19 2018-06-15 百度在线网络技术(北京)有限公司 Method and apparatus for detecting a target
US20180210448A1 (en) * 2017-01-25 2018-07-26 Lg Electronics Inc. Method of identifying functional region in 3-dimensional space, and robot implementing the method

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100656701B1 (ko) * 2004-10-27 2006-12-13 삼성광주전자 주식회사 Robot cleaner system and method for returning to an external charging device
US8515580B2 (en) * 2011-06-17 2013-08-20 Microsoft Corporation Docking process for recharging an autonomous mobile device
JP2015535373A (ja) 2012-10-05 2015-12-10 アイロボット コーポレイション Robot management system for determining a docking station pose, including a mobile robot, and method of using the same
US9398287B2 (en) * 2013-02-28 2016-07-19 Google Technology Holdings LLC Context-based depth sensor control
CN104036226B (zh) * 2013-03-04 2017-06-27 联想(北京)有限公司 Target object information acquisition method and electronic device
KR102095817B1 (ko) * 2013-10-31 2020-04-01 엘지전자 주식회사 Mobile robot, charging station for the mobile robot, and mobile robot system including the same
US9704043B2 (en) * 2014-12-16 2017-07-11 Irobot Corporation Systems and methods for capturing images and annotating the captured images with information
KR101772084B1 (ko) * 2015-07-29 2017-08-28 엘지전자 주식회사 Mobile robot and control method thereof
CN106444777B (zh) * 2016-10-28 2019-12-17 北京进化者机器人科技有限公司 Automatic homing and charging method and system for a robot
CN106647747B (zh) * 2016-11-30 2019-08-23 北京儒博科技有限公司 Robot charging method and apparatus
CN106826815B (zh) * 2016-12-21 2019-05-31 江苏物联网研究发展中心 Target object recognition and positioning method based on color images and depth images
CN107124014A (zh) * 2016-12-30 2017-09-01 深圳市杉川机器人有限公司 Charging method and charging system for a mobile robot
CN106826821A (zh) * 2017-01-16 2017-06-13 深圳前海勇艺达机器人有限公司 Method and system for automatic return charging of a robot based on image vision guidance
CN106980320B (zh) * 2017-05-18 2020-06-19 上海思岚科技有限公司 Robot charging method and apparatus
CN107590836B (zh) * 2017-09-14 2020-05-22 斯坦德机器人(深圳)有限公司 Kinect-based dynamic recognition and positioning method and system for charging piles
CN108124142A (zh) * 2018-01-31 2018-06-05 西北工业大学 Image target recognition system and method based on an RGB depth camera and a hyperspectral camera

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106875444A (zh) * 2017-01-19 2017-06-20 浙江大华技术股份有限公司 Target object positioning method and apparatus
US20180210448A1 (en) * 2017-01-25 2018-07-26 Lg Electronics Inc. Method of identifying functional region in 3-dimensional space, and robot implementing the method
CN107392962A (zh) * 2017-08-14 2017-11-24 深圳市思维树科技有限公司 Robot charging docking system and method based on pattern recognition
CN107633528A (zh) * 2017-08-22 2018-01-26 北京致臻智造科技有限公司 Rigid body recognition method and system
CN108171212A (zh) * 2018-01-19 2018-06-15 百度在线网络技术(北京)有限公司 Method and apparatus for detecting a target

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112465959A (zh) * 2020-12-17 2021-03-09 Substation three-dimensional real-scene model inspection method based on local scene update
CN112465959B (zh) * 2020-12-17 2022-07-01 Substation three-dimensional real-scene model inspection method based on local scene update

Also Published As

Publication number Publication date
US11715293B2 (en) 2023-08-01
US20210312178A1 (en) 2021-10-07
CN110838144A (zh) 2020-02-25
EP3836084A1 (en) 2021-06-16
CN110838144B (zh) 2022-09-30
EP3836084A4 (en) 2021-08-18
EP3836084B1 (en) 2023-10-04

Similar Documents

Publication Publication Date Title
WO2020034963A1 (zh) Charging device identification method, mobile robot and charging device identification system
US9443143B2 (en) Methods, devices and systems for detecting objects in a video
JP4612635B2 (ja) Moving object detection using computer vision adaptable to low illumination depth
WO2019120011A1 (zh) Target detection method and apparatus
CN107390721B (zh) Robot following control method and apparatus, and robot
US20040125207A1 (en) Robust stereo-driven video-based surveillance
CN107667527A (zh) Imager for detecting visible light and projected patterns
WO2020244414A1 (zh) Obstacle detection method and apparatus, storage medium, and mobile robot
WO2020237565A1 (zh) Target tracking method and apparatus, movable platform, and storage medium
CN108021926A (zh) Vehicle scratch detection method and system based on a panoramic surround-view system
WO2019174484A1 (zh) Charging dock identification method and mobile robot
KR20200055239A (ko) Method and system for controlling a robot group
CN114905512B (zh) Panoramic tracking and obstacle avoidance method and system for an intelligent inspection robot
JP2011209896A (ja) Obstacle detection device, obstacle detection method, and obstacle detection program
WO2019165613A1 (zh) Control method for a mobile device, device, and storage apparatus
Fahn et al. A high-definition human face tracking system using the fusion of omni-directional and PTZ cameras mounted on a mobile robot
KR101233938B1 (ko) Object tracking robot and method thereof
CN212044739U (zh) Positioning device based on inertial data and visual features, and robot
CN110770739A (zh) Control method and apparatus based on image recognition, and control device
AU2020317303B2 (en) Information processing device, data generation method, and program
Thornton et al. Multi-sensor detection and tracking of humans for safe operations with unmanned ground vehicles
CN116506735B (zh) Universal camera interference method and system based on an active vision camera
JP7488222B2 (ja) Relative position estimation device and program
US12008778B2 (en) Information processing apparatus, control method for same, non-transitory computer-readable storage medium, and vehicle driving support system
CN114794992A (zh) Charging dock, robot recharging method, and sweeping robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19850252

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019850252

Country of ref document: EP

Effective date: 20210311