EP3836084B1 - Charging device identification method, mobile robot and charging device identification system - Google Patents
- Publication number
- EP3836084B1 (granted from application EP19850252.8A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- charging device
- area
- target
- image
- suspected
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- method: title, claims, description (46)
- marker: claims, description (67)
- matrix material: claims, description (18)
- response: claims, description (13)
- reflectometry: claims, description (11)
- fusion processing: claims, description (7)
- transformation: claims, description (6)
- diagram: description (18)
- docking: description (16)
- optical: description (4)
- calculation method: description (3)
- installation: description (2)
- material: description (2)
- supplement: description (2)
- dependent: description (1)
- detection method: description (1)
- effects: description (1)
- engineering process: description (1)
- long-term: description (1)
- verification: description (1)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/251—Fusion techniques of input or preprocessed data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/92—Dynamic range modification of images or parts thereof based on global image properties
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
- G06V10/225—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition based on a marking or identifier characterising the area
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- H—ELECTRICITY
- H02—GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
- H02J—CIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
- H02J7/00—Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
- H02J7/0047—Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries with monitoring or indicating devices or circuits
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20208—High dynamic range [HDR] image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Definitions
- the present application relates to the field of mobile robots, and in particular, to a method of identifying a charging device, a mobile robot and a system for identifying a charging device.
- Typically, mobile robots are charged autonomously through automatic contact charging technology to replenish their power supply.
- In order to realize autonomous charging, a mobile robot needs to identify a charging device in the environment when its battery level is lower than a preset threshold, and move from its current location to the location of the charging device to dock with the charging device for autonomous charging.
- US 2012/323365 A1 discloses an approach for autonomously docking of a mobile robot at a docking station for purposes of recharging batteries of the mobile robot.
- the mobile robot uses vision-based navigation and a known map of the environment to navigate toward the docking station. Once sufficiently proximate to the docking station, the mobile robot captures infrared images of the docking station, and granularly aligns itself with the docking station based upon the captured infrared images of the docking station. As the robot continues to drive towards the docking station, the robot monitors infrared sensors for infrared beams emitted from the docking station. If the infrared sensors receive the infrared beams, the robot continues to drive forward until the robot successfully docks with the docking station.
- US 2014/100693 A1 discloses a mobile robot system that includes a docking station having at least two pose-defining fiducial markers.
- the pose-defining fiducial markers have a predetermined spatial relationship with respect to one another and/or to a reference point on the docking station such that a docking path to the base station can be determined from one or more observations of the at least two pose-defining fiducial markers.
- a mobile robot in the system includes a pose sensor assembly.
- a controller is located on the chassis and is configured to analyze an output signal from the pose sensor assembly. The controller is configured to determine a docking station pose, to locate the docking station pose on a map of a surface traversed by the mobile robot and to path plan a docking trajectory.
- US 2015/115876 A1 discloses a charging apparatus that is configured to charge a mobile robot.
- the mobile robot is configured to emit an optical pattern.
- the charging apparatus includes a main body configured to perform charging of the mobile robot as the mobile robot docks with the charging apparatus, and two or more position markers located at the main body and spaced apart from each other.
- the position markers are configured to create indications distinguishable from a surrounding region when the optical pattern is emitted to surfaces of the position markers.
- the invention is defined by a method of identifying a charging device, and a mobile robot according to the independent claims. Preferred embodiments are defined in the dependent claims.
- the present application provides a method of identifying a charging device, a mobile robot and a system for identifying a charging device.
- a first aspect of the present application provides a method of identifying a charging device.
- a marker is provided on the charging device.
- a reflectivity of the marker is greater than a first specified value.
- The method is applied to a mobile robot, and includes: capturing an infrared image and a depth image of a current field of view with a depth camera; determining, according to the infrared image, whether there are one or more suspected charging device areas that satisfy first specified conditions; if there are, determining, according to the depth image, whether there is a target charging device area whose height relative to the depth camera is within a specified range in the one or more suspected charging device areas; and if there is, identifying the charging device according to the target charging device area.
- A second aspect of the present application provides a mobile robot, including: a depth camera configured to capture an infrared image and a depth image of a current field of view, and a processor configured to identify a charging device according to the method provided in the first aspect.
- a third aspect of the present application provides a system for identifying a charging device, including: a mobile robot provided in the second aspect of the present application, and a charging device configured to charge the mobile robot, where a marker is provided on the charging device, and a reflectivity of the marker is greater than a first specified value.
- a marker with a reflectivity greater than a first specified value is provided on the charging device, and one or more suspected charging device areas are screened preliminarily through an infrared image. Further, according to a depth image, a target charging device area that satisfies geometric information of the charging device is screened from the one or more suspected charging device areas that are screened preliminarily. Finally, the charging device is identified according to the target charging device area. In this way, the one or more suspected charging device areas are verified based on the geometric information of the charging device, and the charging device is identified based on a verified target charging device area, which may effectively improve the accuracy of identification.
- Although the terms first, second, third, and the like may be used in the present disclosure to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one category of information from another. For example, without departing from the scope of the present disclosure, first information may be referred to as second information; and similarly, second information may also be referred to as first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon" or "in response to determining".
- In related art, a charging device is marked by providing a black-and-white marker on the charging device, and an RGB camera provided on a robot is used to capture an image, so that the marker is identified from the captured image and identification of the charging device is completed.
- This application provides a method of identifying a charging device, a mobile robot and a system for identifying a charging device to solve the problem of low accuracy of the existing identification method.
- FIG. 1 is a flowchart illustrating a method of identifying a charging device according to an exemplary embodiment of the present application.
- the method provided in this embodiment may include: At S101, an infrared image and a depth image of a current field of view are captured with a depth camera.
- FIG. 2 is a schematic diagram illustrating a mobile robot according to an exemplary embodiment of the present application.
- a depth camera 910 is provided on the mobile robot.
- the depth camera 910 may be provided on a chassis 10 of the mobile robot.
- An infrared image and a depth image may be captured with the depth camera.
- the depth camera 910 may be a structured light depth camera or a TOF (Time of Flight) depth camera.
- When detecting that its current battery level is lower than a preset threshold, the mobile robot may capture an infrared image and a depth image with the depth camera in order to identify a charging device in the environment through the captured infrared image and depth image.
- When no charging device is identified based on the captured infrared image and depth image, the mobile robot may re-capture an infrared image and a depth image with the depth camera in order to identify a charging device through the re-captured infrared image and depth image.
- No limitation is imposed thereon in this embodiment.
- the depth camera includes an infrared array sensor with a lens and an infrared light emitter.
- the depth camera uses the infrared light emitter to illuminate the environment, and then generates an infrared image based on reflected light received by the infrared array sensor.
- the depth camera may obtain distance information between the lens and an object according to the flight time of light to generate the depth image.
- At S102, it is determined, according to the infrared image, whether there are one or more suspected charging device areas that satisfy first specified conditions. The first specified conditions indicate that a gray-scale value of each pixel in an area is greater than a second specified value, and a number of pixels in the area is greater than a third specified value.
- a marker is provided on the charging device, and a reflectivity of the marker is greater than a first specified value, that is, the marker is made of a highly reflective material.
- the first specified value may be set according to actual needs, which is not limited in this embodiment.
- In this embodiment, a normal exposure infrared image may be captured. When there is a charging device in the current field of view, there is necessarily a highly bright area in the normal exposure infrared image. Therefore, in order to detect whether there is a charging device in the current field of view, at step S102, it may be determined whether there are one or more suspected charging device areas that satisfy the first specified conditions in the normal exposure infrared image.
- the one or more suspected charging device areas may be screened, and the one or more suspected charging device areas are the highly bright areas in the normal exposure infrared image.
- The second specified value may be set according to actual needs and is not limited particularly in this embodiment; for example, the second specified value may be 200.
- The third specified value may also be set according to actual needs; for example, the third specified value may be 100.
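- For illustration only (not part of the claimed method), the screening of suspected charging device areas described above can be sketched with OpenCV: threshold the infrared image at the second specified value, then keep connected components with more pixels than the third specified value. Function and variable names are illustrative, and the threshold values follow the examples given in this embodiment.

```python
import cv2
import numpy as np

GRAY_THRESHOLD = 200   # example "second specified value"
MIN_AREA_PIXELS = 100  # example "third specified value"

def find_suspected_areas(ir_image: np.ndarray) -> list:
    """Return boolean masks of highly bright areas in a grayscale IR image."""
    _, bright = cv2.threshold(ir_image, GRAY_THRESHOLD, 255, cv2.THRESH_BINARY)
    num_labels, labels = cv2.connectedComponents(bright.astype(np.uint8))
    areas = []
    for label in range(1, num_labels):  # label 0 is the background
        mask = labels == label
        if mask.sum() > MIN_AREA_PIXELS:
            areas.append(mask)
    return areas
```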
- If no suspected charging device area that satisfies the first specified conditions is found, an infrared image and a depth image may be re-captured with the depth camera to identify a charging device.
- At step S103, it is determined whether there is a target charging device area whose height relative to the depth camera is within a specified range among the M suspected charging device areas, M being the number of areas screened at step S102.
- The specified range may be set according to actual needs, for example, according to the actual height of the marker relative to the depth camera. In an example, the specified range may be [a - b, a + b], where a is the difference obtained by subtracting the height of the center point of the marker above the ground from the height of the optical center of the depth camera above the ground, and b is an error tolerance.
- the height of a suspected charging device area relative to the depth camera may be characterized by a y coordinate of a center point of the suspected charging device area in a depth camera coordinate system.
- the specific implementation process of this step may include the following steps:
- (1) The depth information of each pixel in the one or more suspected charging device areas is obtained according to the depth image. For example, for a pixel with coordinate information (u1, v1) in the i-th suspected charging device area, the depth information of the pixel may be retrieved from the depth image using this coordinate information.
- (2) The position information of each pixel in the depth camera coordinate system is determined according to the depth information and the coordinate information of each pixel in the one or more suspected charging device areas.
- (3) A height of each of the one or more suspected charging device areas relative to the depth camera is calculated according to the position information of the respective pixels in the one or more suspected charging device areas in the depth camera coordinate system.
- a height of a suspected charging device area relative to the depth camera may be characterized by the y coordinate of a center point of the suspected charging device area in the depth camera coordinate system.
- For example, the height of the i-th suspected charging device area relative to the depth camera is denoted Y_i.
- (4) Based on the height of each suspected charging device area relative to the depth camera calculated in step (3), it is determined whether there is a target charging device area whose height relative to the depth camera is within the specified range among the suspected charging device areas.
- Such a target charging device area is an area that satisfies the geometric information of the charging device.
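- The height check in steps (1) to (4) can be sketched as follows, assuming a standard pinhole model for the depth camera with intrinsics fy and cy (names illustrative); the mean y coordinate of an area's pixels is used here as an approximation of the y coordinate of its center point.

```python
import numpy as np

def area_height(mask: np.ndarray, depth: np.ndarray,
                fy: float, cy: float) -> float:
    """Mean y coordinate (in the depth camera frame) of an area's pixels."""
    v, u = np.nonzero(mask)               # pixel rows/columns of the area
    z = depth[v, u].astype(np.float64)    # depth of each pixel
    valid = z > 0                         # ignore pixels with no depth reading
    y = (v[valid] - cy) * z[valid] / fy   # pinhole back-projection of y
    return float(y.mean())

def is_target_area(mask, depth, fy, cy, height_range) -> bool:
    """Check whether the area's height relative to the camera is in range."""
    h = area_height(mask, depth, fy, cy)
    return height_range[0] <= h <= height_range[1]
```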
- the charging device is identified according to the target charging device area.
- the target charging device area is an area that satisfies the first specified conditions, and the target charging device area satisfies the geometric information of the charging device. Therefore, the target charging device area may be determined as an area where the charging device is located.
- a marker with a reflectivity greater than a first specified value is provided on a charging device.
- an infrared image and a depth image of a current field of view are captured, and whether there are one or more suspected charging device areas that satisfy first specified conditions is determined according to the infrared image, then whether there is a target charging device area whose height relative to the depth camera is within a specified range in the suspected charging device areas is determined according to the depth image.
- the charging device is identified according to the target charging device area.
- the suspected charging device areas are verified based on geometric information of the charging device, and the charging device is identified based on a verified target charging device area, which may effectively improve the accuracy of identification.
- FIG. 3 is a flowchart illustrating a method of identifying a charging device according to an embodiment of the present application.
- the method provided in this embodiment may further include: when capturing the infrared image and the depth image of the current field of view with the depth camera, or when it is determined that there is the target charging device area in the one or more suspected charging device areas, capturing a red, green and blue (RGB) image of the current field of view with a RGB camera.
- step S104 includes:
- a target RGB image area corresponding to the target charging device area is extracted from the RGB image.
- The mobile robot is also provided with an RGB camera 930.
- the RGB camera 930 is provided on the chassis 10 of the mobile robot 1.
- the RGB image of the current field of view is captured with the RGB camera.
- Specifically, each pixel in the target charging device area may be projected through an RGB projection model to obtain its corresponding pixel in the RGB image.
- FIG. 4 is a schematic diagram illustrating an installation relationship between a depth camera and a RGB camera on a mobile robot according to an exemplary embodiment of the present application. Reference may be made to FIG. 4 .
- A transformation matrix of the depth camera 910 coordinate system relative to the RGB camera 930 coordinate system is T_t2t1, which is a 4×4 matrix.
- An intrinsic parameter matrix of the RGB camera 930 is K_RGB, which is a 3×3 matrix.
- Suppose the target charging device area is the i-th suspected charging device area. For the j-th pixel in the target charging device area, projecting its position in the depth camera coordinate system through T_t2t1 and K_RGB yields K_ij, a 3×1 matrix, from which the coordinate information of the corresponding pixel in the RGB image may be obtained.
- In this way, the coordinate information in the RGB image of the pixel corresponding to each pixel in the target charging device area may be obtained.
- the target RGB image area is determined according to the coordinate information of the respective corresponding pixels to each pixel in the target charging device area in the RGB image.
- corresponding pixels to all pixels in the target charging device area in the RGB image constitute the target RGB image area.
- matting processing may be performed on the RGB image to obtain the target RGB image area. That is, the target RGB image area is considered as a suspected area where a charging device is located in the RGB image.
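- For illustration, the projection described above may be sketched as follows, where points_xyz holds the positions of the target area's pixels in the depth camera coordinate system (an N×3 array), and T_t2t1 and K_RGB are the 4×4 transformation matrix and 3×3 intrinsic matrix introduced above; the function name is illustrative.

```python
import numpy as np

def project_to_rgb(points_xyz: np.ndarray, T_t2t1: np.ndarray,
                   K_RGB: np.ndarray) -> np.ndarray:
    """Map Nx3 depth-frame points to Nx2 (u, v) pixel coordinates in the RGB image."""
    n = points_xyz.shape[0]
    homog = np.hstack([points_xyz, np.ones((n, 1))])  # Nx4 homogeneous points
    in_rgb = (T_t2t1 @ homog.T)[:3, :]                # 3xN points in the RGB camera frame
    pix = K_RGB @ in_rgb                              # each column is a 3x1 result (K_ij)
    return (pix[:2, :] / pix[2, :]).T                 # perspective division -> Nx2 (u, v)
```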
- FIG. 5 is a schematic diagram illustrating a charging device according to an exemplary embodiment of the present application.
- the charging device is provided with a marker.
- a marker is provided on a side of the charging device with a charging socket.
- a standard RGB image of the marker is pre-stored in the mobile robot. In this step, when a similarity between the target RGB image area and the standard RGB image of the marker is greater than a preset threshold, it is determined that the target RGB image area matches the pre-stored standard RGB image of the marker.
- At least two markers with different shapes may be provided on the charging device.
- the at least two markers are provided on different sides of the charging device, and distances of center points of the at least two markers from a bottom of the charging device are equal.
- a marker may be provided on both a front side (a side with a charging socket) and a back side (a side opposite to the front side) of the charging device.
- a circular marker is provided on the front side, and a square marker is provided on the back side.
- standard RGB images of the two markers are pre-stored in the mobile robot. In this step, when a similarity between the target RGB image area and the standard RGB image of any of the markers is greater than the preset threshold, it is determined that the target RGB image area matches the pre-stored standard RGB image of the marker.
- Specifically, a target marker whose standard RGB image has a similarity with the target RGB image area greater than the preset threshold may be determined.
- When moving to the target marker, the mobile robot may be controlled to move to the charging socket according to a positional relationship between the target marker and the charging socket, so that the mobile robot may dock with the charging device to realize autonomous charging.
- For example, the similarity may be calculated using Normalized Cross Correlation (NCC).
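- A sketch of an NCC-based similarity check, assuming the candidate area is resized to the stored template's size; the 0.9 threshold is an assumed example, not a value from this application.

```python
import cv2
import numpy as np

def ncc_similarity(candidate: np.ndarray, template: np.ndarray) -> float:
    """Single NCC score between a candidate area and a stored marker image."""
    resized = cv2.resize(candidate, (template.shape[1], template.shape[0]))
    score = cv2.matchTemplate(resized, template, cv2.TM_CCORR_NORMED)
    return float(score.max())

def matches_any_marker(candidate, standard_images, threshold=0.9) -> bool:
    """True if similarity to any pre-stored standard RGB image exceeds the threshold."""
    return any(ncc_similarity(candidate, std) >= threshold
               for std in standard_images)
```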
- At S303, it is determined that the target charging device area is an area where the charging device is located. That is, when it is determined through step S302 that the target RGB image area matches the pre-stored standard RGB image of the marker, the target charging device area is determined to be the area where the charging device is located.
- When the target RGB image area does not match the standard RGB image, it is considered that the target RGB image area is not an area where a charging device is located; that is, it may be considered that there is no charging device in the current field of view. At this time, an infrared image and a depth image may be re-captured until the charging device is identified.
- In the method provided in this embodiment, when it is determined that there is a target charging device area in the one or more suspected charging device areas, the target charging device area is not directly determined as an area where a charging device is located. Instead, a target RGB image area corresponding to the target charging device area is extracted from a captured RGB image, and it is determined whether the target RGB image area matches a pre-stored standard RGB image of a marker. Only when it is determined that the target RGB image area matches the standard RGB image is the target charging device area considered the area where the charging device is located. In this way, the accuracy of identification may be improved.
- FIG. 6 is a flowchart illustrating a method of identifying a charging device according to an embodiment of the present application.
- the infrared image may include a normal exposure infrared image and a high dynamic range (HDR) infrared image.
- the HDR infrared image is captured when it is determined that a number of pixels with a gray-scale value greater than the second specified value in the normal exposure infrared image is greater than a fourth specified value.
- step S102 may include:
- fusion processing is performed on the normal exposure infrared image and the HDR infrared image to obtain a fused infrared image.
- The fourth specified value may be set according to actual needs and is not limited particularly in this embodiment; for example, the fourth specified value may be equal to 100.
- In the method provided in this embodiment, a normal exposure infrared image and a depth image are first captured with the depth camera. Then, an HDR infrared image is captured when the number of pixels with a gray-scale value greater than the second specified value in the normal exposure infrared image is greater than the fourth specified value.
- weight convolution may be performed on gray-scale values at corresponding positions in the normal exposure infrared image and the HDR infrared image to obtain the fused infrared image.
- In an example, the weight coefficient of a pixel may be determined in the following way: when the gray-scale value of the pixel is greater than a preset value (which may be any value between 150 and 200, for example), the weight coefficient of the pixel is determined as a specified value, for example 1; when the gray-scale value of the pixel is not greater than the preset value, the weight coefficient of the pixel is determined as another specified value, for example 0.7.
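- A sketch of the fusion step under the example weights above (the preset value 180 is chosen from the 150 to 200 range given in the text; normalizing by the summed weights is an added assumption, since the exact combination rule is not spelled out here).

```python
import numpy as np

def fuse_infrared(normal: np.ndarray, hdr: np.ndarray,
                  preset: int = 180) -> np.ndarray:
    """Weighted per-pixel combination of normal exposure and HDR IR images."""
    normal = normal.astype(np.float32)
    hdr = hdr.astype(np.float32)
    w_normal = np.where(normal > preset, 1.0, 0.7)  # weight 1.0 above the preset value
    w_hdr = np.where(hdr > preset, 1.0, 0.7)        # weight 0.7 otherwise
    fused = (w_normal * normal + w_hdr * hdr) / (w_normal + w_hdr)
    return np.clip(fused, 0, 255).astype(np.uint8)
```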
- contour detection may be performed on a fused infrared image to determine whether there are one or more suspected charging device areas that satisfy first specified conditions in the fused infrared image.
- FIG. 7A is a schematic diagram illustrating a normal exposure infrared image according to an exemplary embodiment of the present application.
- FIG. 7B is a schematic diagram illustrating an HDR infrared image according to an exemplary embodiment of the present application.
- FIG. 7C is a schematic diagram illustrating a fused infrared image obtained using a normal exposure infrared image and an HDR infrared image according to an exemplary embodiment of the present application.
- Referring to FIGS. 7A-7C, when the charging device shown in FIG. 5 appears in the field of view of a mobile robot, since a circular marker is provided on the front side of the charging device, there are one or more suspected charging device areas in the normal exposure infrared image, namely the highly bright areas in FIG. 7A. In order to prevent interference caused by other highly reflective materials in the environment, an HDR infrared image is further captured; the captured HDR infrared image is shown in FIG. 7B.
- The HDR infrared image of the current field of view is captured with a shorter exposure time.
- When a charging device appears in the field of view of the mobile robot, since the charging device bears a marker with a higher reflectivity, a highly bright area still exists in the HDR infrared image.
- In contrast, for other objects in the environment, the gray-scale values of the areas where those objects are located are greatly reduced in the HDR infrared image.
- a fused infrared image is obtained by performing fusion processing on the normal exposure infrared image and the HDR infrared image.
- Referring to FIG. 7C, when there is a charging device in the field of view of the mobile robot, the area where the charging device is located shows a particularly obvious peak in the fused infrared image. Therefore, in the method provided in this embodiment, the fused infrared image is used to determine whether there are one or more suspected charging device areas. In this way, the suspected charging device areas may be found quickly and accurately, and non-charging-device areas may be excluded at the outset to reduce the subsequent amount of calculation.
- FIG. 8 is a flowchart illustrating a method of identifying a charging device according to an exemplary embodiment of the present application.
- the method provided in this embodiment may include: At S801, a normal exposure infrared image, a depth image and a RGB image of a current field of view are captured.
- At step S802, it is determined whether the number of pixels with a gray-scale value greater than a second specified value in the normal exposure infrared image is greater than a fourth specified value. If so, step S803 is performed; otherwise, step S801 is performed.
- At step S803, an HDR infrared image is captured. Then, at step S804, fusion processing is performed on the normal exposure infrared image and the HDR infrared image to obtain a fused infrared image.
- At step S805, it is determined whether there are one or more suspected charging device areas that satisfy first specified conditions in the fused infrared image. If so, step S806 is performed; otherwise, step S801 is performed.
- At step S806, it is determined according to the depth image whether there is a target charging device area whose height relative to the depth camera is within a specified range in the one or more suspected charging device areas. If so, step S807 is performed; otherwise, step S801 is performed.
- At step S807, a target RGB image area corresponding to the target charging device area is extracted from the RGB image.
- At step S808, it is determined whether the target RGB image area matches a pre-stored standard RGB image of a marker. If so, step S809 is performed; otherwise, step S801 is performed.
- At step S809, it is determined that the target charging device area is the area where the charging device is located.
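- Tying steps S801 to S809 together, the flow can be sketched as below, reusing the helper functions from the earlier sketches (find_suspected_areas, fuse_infrared, is_target_area, matches_any_marker); depth_cam, rgb_cam and params are assumed interfaces, not part of this application.

```python
def identify_charging_device(depth_cam, rgb_cam, params):
    """Return the mask of the charging device area, looping until identified."""
    while True:
        normal_ir, depth = depth_cam.capture_ir_and_depth()              # S801
        rgb = rgb_cam.capture()                                          # S801
        if (normal_ir > params.gray_threshold).sum() <= params.hdr_trigger:
            continue                                                     # S802: back to S801
        hdr_ir = depth_cam.capture_hdr_ir()                              # S803
        fused = fuse_infrared(normal_ir, hdr_ir)                         # S804
        for mask in find_suspected_areas(fused):                         # S805
            if not is_target_area(mask, depth, params.fy, params.cy,
                                  params.height_range):                  # S806
                continue
            # extract_rgb_area is an assumed helper built on project_to_rgb above
            area = extract_rgb_area(mask, depth, rgb, params)            # S807
            if matches_any_marker(area, params.standard_images):         # S808
                return mask                                              # S809: device located
```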
- In the method provided in this embodiment, after it is determined that there are one or more suspected charging device areas, it is determined whether there is a target charging device area among them. When it is determined that there is the target charging device area, it is determined whether the target RGB image area corresponding to the target charging device area in the RGB image matches the standard RGB image. Further, when it is determined that the target RGB image area matches the standard RGB image, it is determined that the target charging device area is the area where the charging device is located. In this way, after two verifications, the finally determined area is the area where the charging device is located, and the accuracy of identification is therefore greatly improved.
- When the area where the charging device is located is determined, the mobile robot may be controlled to move to the charging device according to the position information of each pixel in that area in the depth camera coordinate system, so as to realize autonomous charging of the mobile robot. Further, since the accuracy of identifying the charging device with the method provided in this embodiment is relatively high, the mobile robot may be controlled to move to the charging device more quickly, and the recharging efficiency of the mobile robot may be effectively improved.
- FIG. 9 is a schematic structural diagram illustrating a mobile robot according to the present application.
- the mobile robot provided includes a depth camera 910 and a processor 920.
- the depth camera 910 is configured to capture an infrared image and a depth image of a current field of view.
- the processor 920 is configured specifically to determine, according to the infrared image, whether there are one or more suspected charging device areas that satisfy first specified conditions, where the first specified conditions indicate that a gray-scale value of each pixel in an area is greater than a second specified value, and a number of pixels in the area is greater than a third specified value; if there are the one or more suspected charging device areas, determine, according to the depth image, whether there is a target charging device area whose height relative to the depth camera is within a specified range in the one or more suspected charging device areas; if there is the target charging device area in the one or more suspected charging device areas, identify a charging device according to the target charging device area.
- the mobile robot provided may be used to implement the technical solution of the method embodiment shown in FIG. 1 .
- Their implementation principles and technical effects are similar, and will not be repeated here.
- FIG. 10 is a schematic structural diagram illustrating a mobile robot according to an embodiment of the present application.
- The mobile robot provided in this embodiment further includes a red, green and blue (RGB) camera 930.
- the RGB camera 930 is configured to capture a RGB image of the current field of view when the depth camera 910 captures the infrared image and the depth image of the current field of view, or when the processor 920 determines that there is the target charging device area in the one or more suspected charging device areas.
- the processor 920 is configured to extract a target RGB image area corresponding to the target charging device area from the RGB image, determine whether the target RGB image area matches a pre-stored standard RGB image of a marker, and when it is determined that the target RGB image area matches the pre-stored standard RGB image of the marker, determine that the target charging device area is an area where the charging device is located.
- the mobile robot provided may further include a memory 940.
- the memory 940 is configured to store the infrared image, the depth image and the RGB image.
- the infrared image includes a normal exposure infrared image and a high dynamic range (HDR) infrared image
- the HDR infrared image is captured when it is determined that a number of pixels with a gray-scale value greater than the second specified value in the normal exposure infrared image is greater than a fourth specified value.
- the processor 920 is configured specifically to perform fusion processing on the normal exposure infrared image and the HDR infrared image to obtain a fused infrared image, and determine whether there are the one or more suspected charging device areas that satisfy the first specified conditions in the fused infrared image.
- the processor 920 is configured specifically to: obtain depth information of each pixel in the one or more suspected charging device areas according to the depth image; determine, according to the depth information and coordinate information of each pixel in the suspected charging device areas, position information of each pixel in a depth camera coordinate system; calculate, according to the position information of each pixel in the suspected charging device areas in the depth camera coordinate system, a height of each of the suspected charging device areas relative to the depth camera; and determine whether there is a target charging device area whose height relative to the depth camera falls within the specified range in the one or more suspected charging device areas.
- the processor 920 is configured specifically to: determine coordinate information of a corresponding pixel to each pixel in the target charging device area in the RGB image according to position information of each pixel in the target charging device area in a depth camera coordinate system, a transformation matrix of the depth camera coordinate system relative to a RGB camera coordinate system, and an intrinsic parameter matrix of the RGB camera; and determine the target RGB image area according to the coordinate information of the corresponding pixel to each pixel in the target charging device area in the RGB image.
- the charging device is provided with one marker. That the target RGB image area matches the pre-stored standard RGB image of the marker comprises: a similarity between the target RGB image area and a standard RGB image of the marker being greater than a preset threshold.
- the charging device is provided with at least two markers with different shapes.
- the at least two markers are provided on different sides of the charging device. Distances of center points of the at least two markers from a bottom of the charging device are equal.
- that the target RGB image area matches the pre-stored standard RGB image of the marker comprises: a similarity between the target RGB image area and a standard RGB image of any of the markers being greater than a preset threshold.
- FIG. 11 is a schematic diagram illustrating a system for identifying a charging device according to an exemplary embodiment of the present application.
- the system for identifying the charging device provided in this embodiment may include any mobile robot 1 provided in this application, and a charging device 2 configured to charge the mobile robot 1, where the charging device 2 is provided with a marker 21, and a reflectivity of the marker 21 is greater than a first specified value.
- At least one marker 21 may be provided on the charging device 2, and the at least one marker 21 may be provided on a specified side.
- the at least one marker 21 is provided on a side of the charging device 2 with a charging socket.
- at least two markers 21 with different shapes may be provided on the charging device 2.
- the at least two markers 21 are provided on different sides of the charging device 2, and distances of center points of the at least two markers 21 from a bottom of the charging device 2 are equal.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Power Engineering (AREA)
- Geometry (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Charge And Discharge Circuits For Batteries Or The Like (AREA)
Claims (13)
- 1. A method of identifying a charging device, wherein a marker is provided on the charging device, a reflectivity of the marker is greater than a first specified value, and the method is applied to a mobile robot and comprises: capturing (S101) an infrared image and a depth image of a current field of view with a depth camera; determining (S102), according to the infrared image, whether there are one or more suspected charging device areas that satisfy first specified conditions, wherein the first specified conditions indicate that a gray-scale value of each of the pixels in an area is greater than a second specified value, and a number of the pixels in the area is greater than a third specified value; in response to determining that the one or more suspected charging device areas exist, determining (S103), according to the depth image, whether there is a target charging device area, among the one or more suspected charging device areas, whose height relative to the depth camera is within a specified range; and in response to determining that the target charging device area exists in the one or more suspected charging device areas, identifying (S104) the charging device according to the target charging device area; wherein the method further comprises: when capturing (S101) the infrared image and the depth image of the current field of view with the depth camera, or when it is determined that the target charging device area exists in the one or more suspected charging device areas, capturing a red, green and blue (RGB) image of the current field of view with an RGB camera; and wherein identifying (S104) the charging device according to the target charging device area comprises: extracting (S301) a target RGB image area corresponding to the target charging device area from the RGB image; determining (S302) whether the target RGB image area matches a pre-stored standard RGB image of the marker; and in response to determining that the target RGB image area matches the pre-stored standard RGB image of the marker, determining (S303) that the target charging device area is an area where the charging device is located.
- 2. The method according to claim 1, wherein the infrared image comprises a normal exposure infrared image and a high dynamic range (HDR) infrared image, and the HDR infrared image is captured when it is determined that a number of pixels with a gray-scale value greater than the second specified value in the normal exposure infrared image is greater than a fourth specified value; and wherein determining, according to the infrared image, whether the one or more suspected charging device areas that satisfy the first specified conditions exist comprises: performing fusion processing on the normal exposure infrared image and the HDR infrared image to obtain a fused infrared image; and determining whether the one or more suspected charging device areas that satisfy the first specified conditions exist in the fused infrared image.
- 3. The method according to claim 1, wherein determining, according to the depth image, whether the target charging device area whose height relative to the depth camera is within the specified range exists in the one or more suspected charging device areas comprises: obtaining depth information of each of the pixels in the one or more suspected charging device areas according to the depth image; determining, according to the depth information and coordinate information of each of the pixels in the one or more suspected charging device areas, position information of each of the pixels in a depth camera coordinate system; calculating, according to the position information of each of the pixels in the one or more suspected charging device areas in the depth camera coordinate system, respective heights of the one or more suspected charging device areas relative to the depth camera; and determining whether the target charging device area whose height relative to the depth camera is within the specified range exists in the one or more suspected charging device areas.
- 4. The method according to claim 1, wherein extracting the target RGB image area corresponding to the target charging device area from the RGB image comprises: determining coordinate information, in the RGB image, of a pixel corresponding to each of the pixels in the target charging device area according to position information of each of the pixels in the target charging device area in a depth camera coordinate system, a transformation matrix of the depth camera coordinate system relative to an RGB camera coordinate system, and an intrinsic parameter matrix of the RGB camera; and determining the target RGB image area according to the coordinate information, in the RGB image, of the pixel corresponding to each of the pixels in the target charging device area.
- 5. The method according to claim 1, wherein the charging device is provided with one marker, and the target RGB image area matching the pre-stored standard RGB image of the marker comprises: a similarity between the target RGB image area and the standard RGB image of the marker being greater than a preset threshold.
- 6. The method according to claim 1, wherein the charging device is provided with at least two markers with different shapes, the at least two markers are provided on different sides of the charging device, and distances of center points of the at least two markers from a bottom of the charging device are equal; and the target RGB image area matching the pre-stored standard RGB image of the marker comprises: a similarity between the target RGB image area and a standard RGB image of any one of the markers being greater than a preset threshold.
- 7. A mobile robot, comprising: a depth camera (910) configured to capture an infrared image and a depth image of a current field of view; and a processor (920) configured to: determine, according to the infrared image, whether there are one or more suspected charging device areas that satisfy first specified conditions, wherein the first specified conditions indicate that a gray-scale value of each of the pixels in an area is greater than a second specified value, and a number of the pixels in the area is greater than a third specified value; in response to determining that the one or more suspected charging device areas exist, determine, according to the depth image, whether there is a target charging device area, among the one or more suspected charging device areas, whose height relative to the depth camera (910) is within a specified range; and in response to determining that the target charging device area exists in the one or more suspected charging device areas, identify a charging device for the mobile robot according to the target charging device area; the mobile robot further comprising: a red, green and blue (RGB) camera configured to capture an RGB image of the current field of view when the depth camera (910) captures the infrared image and the depth image of the current field of view, or when the processor (920) determines that the target charging device area exists in the one or more suspected charging device areas; wherein the processor (920) is configured to extract a target RGB image area corresponding to the target charging device area from the RGB image, determine whether the target RGB image area matches a pre-stored standard RGB image of a marker (21), and, when it is determined that the target RGB image area matches the pre-stored standard RGB image of the marker (21), determine that the target charging device area is an area where the charging device is located.
- 8. The mobile robot according to claim 7, wherein the infrared image comprises a normal exposure infrared image and a high dynamic range (HDR) infrared image, and the HDR infrared image is captured when it is determined that a number of pixels with a gray-scale value greater than the second specified value in the normal exposure infrared image is greater than a fourth specified value; and the processor (920) is configured to perform fusion processing on the normal exposure infrared image and the HDR infrared image to obtain a fused infrared image, and determine whether the one or more suspected charging device areas that satisfy the first specified conditions exist in the fused infrared image.
- 9. The mobile robot according to claim 7, wherein, when determining, according to the depth image, whether the target charging device area whose height relative to the depth camera (910) is within the specified range exists in the one or more suspected charging device areas, the processor (920) is configured to: obtain depth information of each of the pixels in the one or more suspected charging device areas according to the depth image; determine, according to the depth information and coordinate information of each of the pixels in the one or more suspected charging device areas, position information of each of the pixels in a depth camera (910) coordinate system; calculate, according to the position information of each of the pixels in the one or more suspected charging device areas in the depth camera (910) coordinate system, respective heights of the one or more suspected charging device areas relative to the depth camera (910); and determine whether the target charging device area whose height relative to the depth camera (910) is within the specified range exists in the one or more suspected charging device areas.
- 10. The mobile robot according to claim 7, wherein, when extracting the target RGB image area corresponding to the target charging device area from the RGB image, the processor (920) is configured to: determine coordinate information, in the RGB image, of a pixel corresponding to each of the pixels in the target charging device area according to position information of each of the pixels in the target charging device area in a depth camera (910) coordinate system, a transformation matrix of the depth camera (910) coordinate system relative to an RGB camera (930) coordinate system, and an intrinsic parameter matrix of the RGB camera (930); and determine the target RGB image area according to the coordinate information, in the RGB image, of the pixel corresponding to each of the pixels in the target charging device area.
- 11. The mobile robot according to claim 7, wherein the charging device is provided with one marker (21), and the target RGB image area matching the pre-stored standard RGB image of the marker (21) comprises: a similarity between the target RGB image area and the standard RGB image of the marker (21) being greater than a preset threshold.
- 12. The mobile robot according to claim 7, wherein the charging device is provided with at least two markers (21) with different shapes, the at least two markers (21) are provided on different sides of the charging device, and distances of center points of the at least two markers (21) from a bottom of the charging device are equal; and the target RGB image area matching the pre-stored standard RGB image of the marker (21) comprises: a similarity between the target RGB image area and a standard RGB image of any one of the markers (21) being greater than a preset threshold.
- 13. A system for identifying a charging device, comprising: a mobile robot according to any one of claims 7 to 12; and a charging device configured to charge the mobile robot, wherein a marker (21) is provided on the charging device, and a reflectivity of the marker (21) is greater than a first specified value.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810929450.7A CN110838144B (zh) | 2018-08-15 | 2018-08-15 | Charging device identification method, mobile robot and charging device identification system |
PCT/CN2019/100429 WO2020034963A1 (fr) | 2019-08-13 | 2020-02-20 | Charging device identification method, mobile robot and charging device identification system |
Publications (3)
Publication Number | Publication Date |
---|---|
EP3836084A1 (fr) | 2021-06-16 |
EP3836084A4 (fr) | 2021-08-18 |
EP3836084B1 (fr) | 2023-10-04 |
Family
ID=69524708
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19850252.8A Active EP3836084B1 (fr) | 2018-08-15 | 2019-08-13 | Procédé d'identification de dispositif de charge, robot mobile et système d'identification de dispositif de charge |
Country Status (4)
Country | Link |
---|---|
US (1) | US11715293B2 (fr) |
EP (1) | EP3836084B1 (fr) |
CN (1) | CN110838144B (fr) |
WO (1) | WO2020034963A1 (fr) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- CN113572968B (zh) * | 2020-04-24 | 2023-07-18 | 杭州萤石软件有限公司 | Image fusion method and apparatus, camera device, and storage medium |
- CN112465959B (zh) * | 2020-12-17 | 2022-07-01 | 国网四川省电力公司电力科学研究院 | Substation three-dimensional real-scene model inspection method based on local scene updating |
- CN114943941A (zh) * | 2021-02-07 | 2022-08-26 | 华为技术有限公司 | Target detection method and apparatus |
- CN114915036A (zh) * | 2021-02-09 | 2022-08-16 | 北京小米移动软件有限公司 | Charging control method for a legged robot, and legged robot |
- CN113671944B (zh) * | 2021-07-05 | 2024-04-16 | 上海高仙自动化科技发展有限公司 | Control method, control apparatus, intelligent robot and readable storage medium |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- KR100656701B1 (ko) * | 2004-10-27 | 2006-12-13 | 삼성광주전자 주식회사 | Robot cleaner system and method for returning to external charging apparatus |
- US8515580B2 (en) | 2011-06-17 | 2013-08-20 | Microsoft Corporation | Docking process for recharging an autonomous mobile device |
- EP2903787B1 (fr) | 2012-10-05 | 2019-05-15 | iRobot Corporation | Robot management systems for determining a docking station pose, including mobile robots, and methods of using the same |
- US9398287B2 (en) * | 2013-02-28 | 2016-07-19 | Google Technology Holdings LLC | Context-based depth sensor control |
- CN104036226B (zh) * | 2013-03-04 | 2017-06-27 | 联想(北京)有限公司 | Target object information acquisition method and electronic device |
- KR102095817B1 (ko) * | 2013-10-31 | 2020-04-01 | 엘지전자 주식회사 | Mobile robot, charging stand for the mobile robot, and mobile robot system including the same |
- US9704043B2 (en) * | 2014-12-16 | 2017-07-11 | Irobot Corporation | Systems and methods for capturing images and annotating the captured images with information |
- KR101772084B1 (ko) * | 2015-07-29 | 2017-08-28 | 엘지전자 주식회사 | Mobile robot and control method thereof |
- CN106444777B (zh) * | 2016-10-28 | 2019-12-17 | 北京进化者机器人科技有限公司 | Robot automatic return charging method and system |
- CN106647747B (zh) * | 2016-11-30 | 2019-08-23 | 北京儒博科技有限公司 | Robot charging method and device |
- CN106826815B (zh) * | 2016-12-21 | 2019-05-31 | 江苏物联网研究发展中心 | Target object recognition and positioning method based on color image and depth image |
- CN107124014A (zh) * | 2016-12-30 | 2017-09-01 | 深圳市杉川机器人有限公司 | Charging method and charging system for a mobile robot |
- CN106826821A (zh) * | 2017-01-16 | 2017-06-13 | 深圳前海勇艺达机器人有限公司 | Method and system for automatic robot return charging based on image vision guidance |
- CN106875444B (zh) | 2017-01-19 | 2019-11-19 | 浙江大华技术股份有限公司 | Target object positioning method and device |
- KR102033143B1 (ko) * | 2017-01-25 | 2019-10-16 | 엘지전자 주식회사 | Method of identifying functional regions in three-dimensional space, and robot implementing the same |
- CN106980320B (zh) * | 2017-05-18 | 2020-06-19 | 上海思岚科技有限公司 | Robot charging method and device |
- CN107392962A (zh) | 2017-08-14 | 2017-11-24 | 深圳市思维树科技有限公司 | Robot charging docking system and method based on pattern recognition |
- CN107633528A (zh) | 2017-08-22 | 2018-01-26 | 北京致臻智造科技有限公司 | Rigid body recognition method and system |
- CN107590836B (zh) * | 2017-09-14 | 2020-05-22 | 斯坦德机器人(深圳)有限公司 | Kinect-based dynamic recognition and positioning method and system for a charging pile |
- CN108171212A (zh) | 2018-01-19 | 2018-06-15 | 百度在线网络技术(北京)有限公司 | Method and apparatus for detecting a target |
- CN108124142A (zh) * | 2018-01-31 | 2018-06-05 | 西北工业大学 | Image target recognition system and method based on an RGB depth camera and a hyperspectral camera |
2018
- 2018-08-15: priority application CN201810929450.7A filed in China; granted as CN110838144B (zh); status: Active
2019
- 2019-08-13: international application PCT/CN2019/100429 filed; published as WO2020034963A1 (fr); status: unknown
- 2019-08-13: European application EP19850252.8A filed; granted as EP3836084B1 (fr); status: Active
- 2019-08-13: US application US 17/266,164 filed; granted as US11715293B2 (en); status: Active
Also Published As
Publication number | Publication date |
---|---|
US20210312178A1 (en) | 2021-10-07 |
EP3836084A4 (fr) | 2021-08-18 |
CN110838144A (zh) | 2020-02-25 |
WO2020034963A1 (fr) | 2020-02-20 |
EP3836084A1 (fr) | 2021-06-16 |
CN110838144B (zh) | 2022-09-30 |
US11715293B2 (en) | 2023-08-01 |
Similar Documents
Publication | Title |
---|---|
EP3836084B1 (fr) | Charging device identification method, mobile robot and charging device identification system |
CN111989544B (zh) | System and method for indoor vehicle navigation based on optical targets |
EP3187895B1 (fr) | Variable resolution flight radar system |
KR101967088B1 (ko) | Imager for detecting visible-light and infrared projected patterns |
US11579254B2 (en) | Multi-channel lidar sensor module |
US11688030B2 (en) | Shading topography imaging for robotic unloading |
CN112346453A (zh) | Robot automatic recharging method and apparatus, robot, and storage medium |
CN111964680B (zh) | Real-time positioning method for an inspection robot |
Bai et al. | Stereovision based obstacle detection approach for mobile robot navigation |
US20240051152A1 (en) | Autonomous solar installation using artificial intelligence |
US7653247B2 (en) | System and method for extracting corner point in space using pixel information, and robot using the system |
JP5874252B2 (ja) | Method and apparatus for measuring relative position with respect to a target object |
KR20190129551A (ko) | Object guidance system and method for unmanned vehicles |
Iz et al. | An image-based path planning algorithm using a UAV equipped with stereo vision |
Krause et al. | Remission based improvement of extrinsic parameter calibration of camera and laser scanner |
CN111114367A (zh) | Automatic charging method and system for electric vehicles |
Jiang et al. | Target object identification and localization in mobile manipulations |
JP7488222B2 (ja) | Relative position estimation device and program |
Bellone et al. | A kinect-based parking assistance system |
US20240208675A1 (en) | Positioning device, moving object, positioning method and storage medium |
US11815626B2 (en) | Method for detecting intensity peaks of a specularly reflected light beam |
US20240335080A1 (en) | Mobile robot |
CN118050732A (zh) | Long-range iTOF depth camera and robot |
CN116258998A (zh) | Obstacle recognition method, system and storage medium |
CN118052858A (zh) | Structured-light depth camera and sweeping robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
- | STAA | Information on the status of an EP patent application or granted EP patent | Status: the international publication has been made |
- | PUAI | Public reference made under Article 153(3) EPC to a published international application that has entered the European phase | Original code: 0009012 |
- | STAA | Information on the status of an EP patent application or granted EP patent | Status: request for examination was made |
20210311 | 17P | Request for examination filed | - |
- | AK | Designated contracting states | Kind code of ref document: A1; designated states: AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
20210720 | A4 | Supplementary search report drawn up and despatched | - |
- | RIC1 | Information provided on IPC code assigned before grant | Ipc: G06T 7/73 (2017.01) AFI 20210714 BHEP; G06K 9/00 (2006.01) ALI 20210714 BHEP; G06K 9/20 (2006.01) ALI 20210714 BHEP |
- | DAV | Request for validation of the European patent (deleted) | - |
- | DAX | Request for extension of the European patent (deleted) | - |
- | GRAP | Despatch of communication of intention to grant a patent | Original code: EPIDOSNIGR1 |
- | STAA | Information on the status of an EP patent application or granted EP patent | Status: grant of patent is intended |
- | RIC1 | Information provided on IPC code assigned before grant | Ipc: G06V 20/10 (2022.01) ALI 20230328 BHEP; G06V 10/22 (2022.01) ALI 20230328 BHEP; G06T 7/73 (2017.01) AFI 20230328 BHEP |
20230426 | INTG | Intention to grant announced | - |
- | GRAS | Grant fee paid | Original code: EPIDOSNIGR3 |
- | GRAA | (Expected) grant | Original code: 0009210 |
- | STAA | Information on the status of an EP patent application or granted EP patent | Status: the patent has been granted |
- | RAP3 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: HANGZHOU EZVIZ SOFTWARE CO., LTD. |
- | AK | Designated contracting states | Kind code of ref document: B1; designated states: AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
20230828 | P01 | Opt-out of the competence of the Unified Patent Court (UPC) registered | - |
- | REG | Reference to a national code | Country: GB; legal event code: FG4D |
- | REG | Reference to a national code | Country: CH; legal event code: EP |
- | REG | Reference to a national code | Country: IE; legal event code: FG4D |
- | REG | Reference to a national code | Country: DE; legal event code: R096; ref document number: 602019038840 |
- | REG | Reference to a national code | Country: LT; legal event code: MG9D |
20231004 | REG | Reference to a national code | Country: NL; legal event code: MP |
20231004 | REG | Reference to a national code | Country: AT; legal event code: MK05; ref document number: 1618542; kind code: T |
- | PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time limit. Effective 20231004: AT, CZ, DK, EE, ES, HR, IT, LT, LV, NL, PL, RO, RS, SE, SK, SM; effective 20240104: BG, NO; effective 20240105: GR; effective 20240204: IS; effective 20240205: PT |
- | REG | Reference to a national code | Country: DE; legal event code: R097; ref document number: 602019038840 |
- | PLBE | No opposition filed within time limit | Original code: 0009261 |
- | STAA | Information on the status of an EP patent application or granted EP patent | Status: no opposition filed within time limit |
20240705 | 26N | No opposition filed | - |
- | PGFP | Annual fee paid to national office [announced via postgrant information from national office to EPO] | DE: payment date 20240806, year of fee payment 6; GB: payment date 20240823, year 6; FR: payment date 20240829, year 6 |