WO2019109230A1 - Visual sweeping robot and recharging method thereof - Google Patents

Visual sweeping robot and recharging method thereof

Info

Publication number
WO2019109230A1
Authority
WO
WIPO (PCT)
Prior art keywords
cleaning robot
charging
visual
image
recharging
Prior art date
Application number
PCT/CN2017/114507
Other languages
English (en)
Chinese (zh)
Inventor
张立新
周毕兴
Original Assignee
深圳市沃特沃德股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市沃特沃德股份有限公司 filed Critical 深圳市沃特沃德股份有限公司
Priority to PCT/CN2017/114507 priority Critical patent/WO2019109230A1/fr
Publication of WO2019109230A1 publication Critical patent/WO2019109230A1/fr

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods

Definitions

  • the invention relates to the field of sweeping robots, in particular to a visual sweeping robot and a charging method thereof.
  • The infrared sensor has a small emission angle and a short transmission range for the coded signal, and even slight occlusion prevents the infrared signal from getting through. If the sweeper's cleaning environment is relatively large, the time the sweeper needs to detect the infrared guide signal while moving becomes very long, and the sweeper is very likely to fail to return to the base, running out of power and becoming stranded along the way.
  • the main object of the present invention is to provide a method for recharging a visual sweeping robot, so that the sweeping robot can quickly and accurately find the refill base for charging.
  • the sweeper photographs and stores an environmental image of the surrounding environment during the cleaning process
  • The preset condition is that the robot can recognize the identification mark provided on the charging base; the step of moving to the charging position of the charging base includes:
  • The identification mark is a two-dimensional code.
  • The step of recognizing the identification mark on the charging base and moving to the charging position of the charging base with the identification mark as a reference includes:
  • The step of recognizing the identification mark provided on the charging base and moving to the charging position of the charging base with the identification mark as a reference includes:
  • S315: Recognize two identification marks on the charging base that are disposed symmetrically about the charging electrode as the axis of symmetry.
  • The step of the sweeping robot capturing and storing images of the surrounding environment during the cleaning process includes:
  • The step of comparing the stored environment images with the pre-stored charging base image includes:
  • The feature points of the stored environment images are matched with the feature points of the pre-stored charging base image using a feature matching method;
  • The acquiring of the position information of the cleaning robot includes:
  • The sweeping robot acquires and stores its pose at each moment during the cleaning process by using a visual sensor or a laser sensor.
  • The step of moving the cleaning robot to the charging position of the charging base according to the current position information and the environment image with the highest similarity includes:
  • the invention also provides a visual cleaning robot, comprising:
  • a photographing module configured for the sweeper to take an environment image of the surrounding environment during the cleaning process
  • a comparison module, configured to compare the stored environment images with the pre-stored charging base image when recharging is required;
  • a moving module, configured to move to the charging position of the charging base according to the current position information and the environment image with the highest similarity, when the environment image with the highest similarity to the charging base image has been determined;
  • a condition module, configured to repeatedly invoke the comparison module and the moving module until the preset condition is met.
  • The preset condition is that the robot can recognize the identification mark provided on the charging base; the moving module includes:
  • a charging unit, configured to recognize the identification mark on the charging base and move to the charging position of the charging base with the identification mark as a reference.
  • The identification mark is a two-dimensional code.
  • the charging unit includes:
  • a line segment subunit, configured to merge pixels with similar gradient information on the identification mark into line segments;
  • a polygon subunit, configured to connect the merged line segments to form polygons;
  • The charging unit includes:
  • a charging subunit, configured to recognize two identification marks on the charging base that are disposed symmetrically about the charging electrode as the axis of symmetry.
  • the shooting module includes:
  • a feature point unit for extracting feature points of the image and storing.
  • comparison module includes:
  • a matching unit, configured to match the feature points of the stored environment images with the feature points of the pre-stored charging base image;
  • a matching value unit, configured to count the number of inlier points and generate a matching value.
  • the mobile module further includes:
  • a storage unit, configured to acquire and store the pose of the cleaning robot at each moment during the cleaning process by using the visual sensor or the laser sensor.
  • the mobile module further includes:
  • An expansion unit marking a current position of the cleaning robot, expanding a position that is not marked near the current position of the cleaning robot, and generating a child node;
  • An evaluation value unit for calculating an evaluation function value for each child node and marking a child node having the smallest evaluation function value
  • a path unit, configured to stop expanding when the child node with the smallest evaluation value is the target node, and to connect all the marked minimum-evaluation child nodes to generate a path;
  • a mobile unit for moving to the recharging charging position in accordance with the path.
  • The beneficial effect of the invention is that the sweeping robot can find the charging base by collecting environment images during the sweeping process, which improves the speed of finding the charging base.
  • Two two-dimensional codes are arranged on the charging base symmetrically about the charging electrode, so that the cleaning robot can dock with the charging base without relying on any electrical signal emitted by the charging base.
  • FIG. 1 is a schematic diagram showing the steps of a refilling method of a visual cleaning robot according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram showing the steps of a refilling method of a visual cleaning robot according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram showing the steps of a refilling method of a visual cleaning robot according to an embodiment of the present invention
  • FIG. 4 is a schematic diagram showing the steps of a refilling method of a visual cleaning robot according to an embodiment of the present invention
  • FIG. 5 is a schematic diagram showing the steps of a refilling method of a visual cleaning robot according to an embodiment of the present invention
  • FIG. 6 is a schematic diagram showing the steps of a refilling method of a visual cleaning robot according to an embodiment of the present invention.
  • FIG. 7 is a schematic structural view of a visual sweeping robot according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural view of a visual sweeping robot according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural view of a visual sweeping robot according to an embodiment of the present invention.
  • FIG. 10 is a schematic structural view of a visual cleaning robot according to an embodiment of the present invention.
  • FIG. 11 is a schematic structural view of a visual cleaning robot according to an embodiment of the present invention.
  • FIG. 12 is a schematic structural view of a visual cleaning robot according to an embodiment of the present invention.
  • FIG. 13 is a schematic structural view of a visual cleaning robot according to an embodiment of the present invention.
  • FIG. 14 is a schematic structural view of a visual cleaning robot according to an embodiment of the present invention.
  • a method for recharging a visual cleaning robot including the steps of:
  • the sweeper photographs and stores an environmental image of the surrounding environment during the cleaning process
  • When the sweeping robot starts to sweep the floor, it does not necessarily start from the charging base: the user may place it directly in a room to start cleaning, or the robot may set out from a corner of the room.
  • While cleaning, the visual sensor of the sweeping robot, such as a camera, collects images of the surrounding environment; some sweeping robots have cameras arranged around the body, in which case those cameras can be used to collect environment images simultaneously.
  • While collecting each environment image, the robot also records its position at that moment, recording the position of the sweeping robot and using the starting point of the sweeping robot as the reference.
  • In this way the path from the starting point to each point where an environment image was collected can be recorded.
  • The sweeping robot records its own motion trajectory (displacement and heading) in real time during cleaning.
  • The preset rule for collecting environment images may be, for example, collecting once every fixed time interval or once every fixed travel distance.
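As a concrete sketch of this bookkeeping (the class and field names below are illustrative and not taken from the patent), each environment image can be stored together with the pose at which it was captured; keeping only the extracted features instead of the raw image also matches the space-saving point made later in this description:

```python
from dataclasses import dataclass, field
from typing import List, Tuple
import numpy as np

@dataclass
class EnvironmentRecord:
    pose: Tuple[float, float, float]   # (x, y, heading) relative to the starting point
    keypoints: list                    # feature keypoints extracted from the image
    descriptors: np.ndarray            # feature descriptors, stored instead of the raw image

@dataclass
class CleaningLog:
    records: List[EnvironmentRecord] = field(default_factory=list)

    def add(self, pose, keypoints, descriptors):
        # Called according to the preset rule, e.g. once per time interval or per travel distance.
        self.records.append(EnvironmentRecord(pose, keypoints, descriptors))
```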
  • A recharge command is generated when, for example, the battery level of the cleaning robot falls below a preset threshold, or after the robot receives a command from the user to end cleaning.
  • When recharging, the collected environment images are compared with the image of the charging base that the user previously stored in the sweeping robot, or that the sweeping robot captured and stored while it was at the charging position, and the environment image with the highest similarity to the charging base image is found.
  • The robot then moves, by path planning, to the position corresponding to that highest-similarity environment image, and the above steps are repeated until the identification mark provided on the charging base can be recognized at the position corresponding to the environment image.
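A high-level sketch of this loop is given below; `tag_visible`, `dock_using_tag`, `best_matching_record`, `plan_path` and `drive_along` are hypothetical helper names standing in for the marker detection, matching and path-planning steps described elsewhere in this document, not functions defined by the patent:

```python
def return_to_base(robot, cleaning_log, base_features, max_hops=20):
    """Repeatedly move toward the stored view most similar to the charging base
    until the identification mark on the base becomes visible, then dock."""
    for _ in range(max_hops):
        if tag_visible(robot.camera_frame()):          # preset condition reached
            return dock_using_tag(robot)               # fine positioning on the mark
        record = best_matching_record(cleaning_log, base_features)
        path = plan_path(robot.current_pose(), record.pose)   # e.g. A* over visited points
        drive_along(robot, path)
    raise RuntimeError("charging base not found within the allowed number of hops")
```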
  • The cleaning robot then positions itself accurately using the identification mark provided on the charging base as a reference, so that it can align with the charging electrode and start charging.
  • When recharging, the stored environment images are compared with the pre-stored charging base image.
  • The pre-stored charging base image includes an image of the charging base that the user stored in the visual sweeping robot in advance, and an image of the charging base captured and stored by the visual sweeping robot while it was recharging at the charging position.
  • The visual cleaning robot updates its stored charging base image when the position of the charging base changes.
  • The preset condition is that the robot can recognize the identification mark provided on the charging base; the step of moving to the charging position of the charging base includes:
  • The determination may be based on the distance value measured from the camera image, or on the distance to the charging base measured by a ranging sensor; with the identification mark as a reference, the exact position of the charging base is confirmed, and the cleaning robot moves to the charging position of the charging base.
  • The identification mark is a two-dimensional code.
  • The two-dimensional code is a black-and-white pattern that is easy to recognize and simple to produce.
  • When the identification mark is placed on the charging base, it is convenient for the sweeping robot to recognize it and align with it.
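For illustration, such a mark can be generated with an off-the-shelf tool; the snippet below uses the third-party Python `qrcode` package, and the encoded payload is an arbitrary example rather than anything specified by the patent:

```python
import qrcode

# Any short payload works; the robot only needs to detect and localize the code.
img = qrcode.make("DOCK-LEFT")
img.save("dock_left_tag.png")   # print in black and white and attach beside the charging electrode
```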
  • The step of recognizing the identification mark on the charging base and moving to the charging position of the charging base with the identification mark as a reference includes:
  • The gradient direction and gradient magnitude of each pixel in the captured image are calculated; then, using a similarity metric over pixel gradients, adjacent pixels with similar gradient information are merged into a whole.
  • This can be viewed as a graph in which each node is a pixel and the weight of an edge is the gradient similarity between the two pixels (regions) it connects.
  • The detected line segments are connected into polygons according to a spatial adjacency criterion; quadrilaterals are obtained by limiting the length of the polygon and the number of its corner points, and spatially adjacent quadrilaterals are merged.
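The fragment below is a simplified OpenCV sketch of the same goal; rather than the graph-based merging of gradient-similar pixels described above, it uses edge detection and contour approximation, a common practical shortcut, to extract candidate quadrilaterals:

```python
import cv2

def find_candidate_quads(gray, min_area=400.0):
    """Return convex 4-corner contours that may be identification marks."""
    edges = cv2.Canny(gray, 50, 150)    # gradient-based edge map
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    quads = []
    for c in contours:
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if (len(approx) == 4 and cv2.isContourConvex(approx)
                and cv2.contourArea(approx) > min_area):
            quads.append(approx.reshape(4, 2))
    return quads
```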
  • The homography matrix represents the 2D transformation by which points in the two-dimensional code's coordinate system are projected into the camera coordinate system, and it can be obtained with the Direct Linear Transform (DLT) algorithm.
  • The camera's intrinsic parameters, including the focal length and the principal-point offset, are denoted by P.
  • The extrinsic parameters are denoted by E. The homography matrix can then be written as follows:
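The formula itself is not reproduced in this text. A plausible reconstruction, following the standard derivation for planar tags (the tag's points satisfy z = 0, so the third rotation column drops out), is:

```latex
H = s\,P\,E = s\,P\,[\,\mathbf{r}_1 \;\; \mathbf{r}_2 \;\; \mathbf{t}\,],
\qquad
P = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}
```

where r1 and r2 are the first two columns of the rotation matrix, t is the translation, and s is an unknown scale factor.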
  • Since each column of the rotation matrix must have unit norm, and using the known orientation relationship between the two-dimensional code and the camera (the two-dimensional code appears in front of the camera), the magnitude and sign of s can be determined.
  • The third column of the rotation matrix can be recovered by computing the cross product of the two known columns, because the rotation matrix must be orthogonal. The relative pose of the two-dimensional code with respect to the camera is thereby obtained. The robot then moves to the charging position of the charging base according to this relative pose and charges.
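A Python/NumPy sketch of this decomposition follows; the homography H would first be estimated by DLT (for example with `cv2.findHomography` on the four tag-corner correspondences), and the variable names are illustrative:

```python
import numpy as np

def tag_pose_from_homography(H, K):
    """Recover the rotation R and translation t of the tag relative to the camera,
    assuming the tag's points lie in the plane z = 0 of the tag coordinate system."""
    M = np.linalg.inv(K) @ H                          # proportional to [r1 r2 t]
    s = 2.0 / (np.linalg.norm(M[:, 0]) + np.linalg.norm(M[:, 1]))
    if s * M[2, 2] < 0:                               # tag must be in front of the camera (t_z > 0)
        s = -s
    r1, r2, t = s * M[:, 0], s * M[:, 1], s * M[:, 2]
    r3 = np.cross(r1, r2)                             # third column from orthogonality of R
    R = np.column_stack((r1, r2, r3))
    U, _, Vt = np.linalg.svd(R)                       # re-orthonormalize against noise
    return U @ Vt, t
```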
  • The step of recognizing the identification mark provided on the charging base and moving to the charging position of the charging base with the identification mark as a reference includes:
  • S315: Recognize two identification marks on the charging base that are disposed symmetrically about the charging electrode as the axis of symmetry.
  • Two identification marks are disposed on the charging base, located at the same height and distributed symmetrically on both sides of the charging electrode. The cleaning robot moves onto the axis of symmetry of the two identification marks and continuously adjusts itself to stay on that axis, which makes alignment with the charging electrode more accurate. After aligning with the charging electrode, the robot keeps driving straight and can dock with the charging contacts smoothly, or wait on the base and begin charging as soon as the base is powered.
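A rough sketch of that continuous adjustment is a simple proportional controller on the image positions of the two marks; the gain value and the use of pixel coordinates are assumptions for illustration, not taken from the patent:

```python
def steering_correction(left_mark_x, right_mark_x, image_width, gain=0.004):
    """Proportional turn command that keeps the midpoint of the two marks centred
    in the image, i.e. keeps the robot on the marks' axis of symmetry while it drives forward."""
    midpoint = 0.5 * (left_mark_x + right_mark_x)
    error = midpoint - image_width / 2.0   # positive: the axis lies to one side of the image centre
    return -gain * error                   # sign convention depends on the drive controller
```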
  • The step of the sweeping robot capturing and storing images of the surrounding environment during the cleaning process includes:
  • Extracting features from the environment image reduces the memory the image occupies, saving storage space; comparisons can then be performed on the feature points, which reduces the comparison workload.
  • The step of comparing the stored environment images with the pre-stored charging base image includes:
  • The feature points of the stored environment images are matched with the feature points of the pre-stored charging base image using a feature matching method;
  • The feature points extracted from the environment images collected by the cleaning robot during sweeping are compared one by one with the feature points of the pre-stored charging base image, using the feature matching method. An inlier ('inner point') is a feature point in one of the two images that also has a matching feature point in the other; the more inliers there are, the higher the matching value.
  • The environment image with the highest similarity, i.e. the one with the highest matching value, is then selected, and the object corresponding to the feature points with the highest matching value is confirmed to be the charging base.
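A minimal sketch of this matching step is shown below, using ORB features and RANSAC inlier counting; the patent does not name a particular feature detector or matcher, so these specific choices are assumptions:

```python
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=500)

def extract(gray):
    """ORB keypoints/descriptors; only these need to be stored per environment image."""
    return orb.detectAndCompute(gray, None)

def match_value(kps_env, desc_env, kps_base, desc_base):
    """Number of RANSAC inliers between an environment image and the charging base image."""
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(desc_env, desc_base)
    if len(matches) < 4:
        return 0
    src = np.float32([kps_env[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kps_base[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    _, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return 0 if mask is None else int(mask.sum())
```

The stored environment image whose match value against the charging base image is largest is the one the robot returns to.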
  • The acquiring of the position information of the cleaning robot includes:
  • The sweeping robot acquires and stores its pose at each moment during the cleaning process by using a visual sensor or a laser sensor.
  • Using the visual sensor means estimating the position of the cleaning robot while it moves from the camera images and the camera's intrinsic parameters, acquiring the robot's pose in real time so that the robot knows its position in the environment.
  • The laser sensor determines the pose of the cleaning robot by measuring the distance to surrounding objects, and the pose acquired at each moment is stored in the cleaning robot.
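The patent does not spell out the pose estimator itself. Purely as an illustration of "store the pose at each moment", the sketch below integrates incremental motion (dead reckoning); a real system would correct this drifting estimate with the visual or laser sensor:

```python
import math

class PoseTracker:
    """Accumulate (x, y, heading) from incremental motion and keep a time-stamped history."""
    def __init__(self):
        self.x = self.y = self.theta = 0.0   # the starting point is the reference
        self.history = []

    def update(self, distance, delta_theta, timestamp):
        self.theta += delta_theta
        self.x += distance * math.cos(self.theta)
        self.y += distance * math.sin(self.theta)
        self.history.append((timestamp, self.x, self.y, self.theta))
```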
  • The step of moving the cleaning robot to the charging position of the charging base according to the current position information and the environment image with the highest similarity includes:
  • When the cleaning robot needs to return to a position where an environment image was collected, path planning is performed. The shortest path is searched with the A* (A-Star) algorithm, the most effective direct search method for finding the shortest path in a static road network.
  • The search first marks the starting position of the sweeping robot and expands its unmarked child nodes.
  • The child nodes are the locations where the sweeping robot collected environment images during cleaning; an evaluation function value is then calculated for each child node.
  • The values are sorted by size, and the node with the smallest evaluation value is found and marked. If the current node is the target node, that is, the position of the environment image to which the robot needs to return, the search stops.
  • the specific calculation steps are:
  • h(n) = (|Δx| + |Δy|) × 10, where |Δx| and |Δy| are the numbers of horizontal and vertical grid steps between the current node and the target node; h(n) is computed in this way for each of the 8 nodes surrounding the current node with respect to the target node.
  • The robot then returns along this path to the charging base for charging.
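A compact sketch of this search on an 8-connected grid is shown below; the heuristic follows the description above (10 per horizontal or vertical cell, with diagonal steps costed at 14, as in the classic formulation), and the grid representation is an assumption for illustration:

```python
import heapq

def a_star(grid, start, goal):
    """grid[y][x] == 0 means free, 1 means blocked; start and goal are (x, y) tuples."""
    def h(n):
        return (abs(n[0] - goal[0]) + abs(n[1] - goal[1])) * 10

    open_heap = [(h(start), start)]
    g_score = {start: 0}
    came_from = {}
    closed = set()
    while open_heap:
        _, current = heapq.heappop(open_heap)
        if current in closed:
            continue
        closed.add(current)
        if current == goal:                       # target node reached: stop expanding
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        x, y = current
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx == 0 and dy == 0:
                    continue
                nx, ny = x + dx, y + dy
                if not (0 <= ny < len(grid) and 0 <= nx < len(grid[0])) or grid[ny][nx]:
                    continue
                step = 14 if dx != 0 and dy != 0 else 10   # straight 10, diagonal 14
                ng = g_score[current] + step
                if ng < g_score.get((nx, ny), float("inf")):
                    g_score[(nx, ny)] = ng
                    came_from[(nx, ny)] = current
                    heapq.heappush(open_heap, (ng + h((nx, ny)), (nx, ny)))
    return None
```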
  • The recharging method of the visual cleaning robot of the present invention enables the cleaning robot to find the charging base by collecting environment images during the sweeping process, which improves the speed of finding the charging base.
  • Two two-dimensional codes are arranged on the charging base symmetrically about the charging electrode, so that the cleaning robot can dock with the charging base without relying on any electrical signal emitted by the charging base.
  • the present invention also provides a visual sweeping robot comprising:
  • a photographing module 1 configured to photograph and store an environment image of a surrounding environment of the sweeper during the cleaning process
  • a comparison module 2, configured to compare the stored environment images with the pre-stored charging base image when recharging is required;
  • a moving module 3, configured to move the robot to the charging position of the charging base according to the current position information and the environment image with the highest similarity, when the environment image with the highest similarity to the charging base image has been determined;
  • a condition module 4, configured to repeatedly invoke the comparison module and the moving module until the preset condition is met.
  • When the sweeping robot starts to sweep the floor, it does not necessarily start from the charging base: the user may place it directly in a room to start cleaning, or the robot may set out from a corner of the room.
  • While cleaning, the photographing module 1 of the sweeping robot collects images of the surrounding environment; some sweeping robots have cameras arranged around the body, in which case those cameras can collect environment images simultaneously. While collecting each environment image, the robot also records its position at that moment, using the starting point of the sweeping robot as the reference.
  • In this way the sweeping robot can record the path from the starting point to each point where an image was collected.
  • The sweeping robot records its own motion trajectory in real time during cleaning.
  • The preset rule for collecting environment images may be, for example, collecting once every fixed time interval or once every fixed travel distance.
  • A recharge command is generated when, for example, the battery level of the cleaning robot falls below a preset threshold, or after the robot receives a command from the user to end cleaning.
  • When recharging, the comparison module 2 compares the collected environment images with the image of the charging base that the user previously stored in the sweeping robot, and finds the environment image with the highest similarity to the charging base; the moving module 3 then moves the robot accordingly.
  • When recharging, the stored environment images are compared with the pre-stored charging base image.
  • The pre-stored charging base image includes an image of the charging base that the user stored in the visual sweeping robot in advance, and an image of the charging base captured and stored by the visual sweeping robot while it was recharging at the charging position.
  • The visual cleaning robot updates its stored charging base image when the position of the charging base changes.
  • The preset condition is that the robot can recognize the identification mark provided on the charging base; the moving module 3 further includes:
  • a charging unit 31, configured to recognize the identification mark provided on the charging base and move to the charging position of the charging base with the identification mark as a reference.
  • The determination may be based on the distance value measured from the camera image, or on the distance to the charging base measured by a ranging sensor; the charging unit 31 confirms the exact position of the charging base with the identification mark as a reference, and the cleaning robot moves to the charging position of the charging base.
  • The identification mark is a two-dimensional code.
  • The two-dimensional code is a black-and-white pattern that is easy to recognize and simple to produce.
  • When the identification mark is placed on the charging base, it is convenient for the sweeping robot to recognize it and align with it.
  • the charging unit 31 includes:
  • a line segment subunit 311, configured to merge pixels with similar gradient information on the identification mark into line segments;
  • a calculating subunit 313, configured to calculate the relative positional relationship between the cleaning robot and the identification mark according to the intrinsic parameters of the visual sensor of the visual cleaning robot;
  • a recharging subunit 314, configured to move to the charging position of the charging base according to the relative positional relationship.
  • The line segment subunit 311 calculates the gradient direction and gradient magnitude of each pixel in the captured image, and then, using a similarity metric over pixel gradients, merges adjacent pixels with similar gradient information into a whole.
  • This can be viewed as a graph in which each node is a pixel and the weight of an edge is the gradient similarity between the two pixels (regions) it connects.
  • The polygon subunit 312 connects the detected line segments into polygons according to a spatial adjacency criterion, obtains quadrilaterals by limiting the length of the polygon and the number of its corner points, and merges spatially adjacent quadrilaterals.
  • The calculating subunit 313 compares the code read from the candidate quadrilateral with the preset code pattern (computing the distance between them), which yields a more accurate detection of the target.
  • The homography matrix represents the 2D transformation by which points in the two-dimensional code's coordinate system are projected into the camera coordinate system, and it can be obtained with the Direct Linear Transform (DLT) algorithm.
  • The camera's intrinsic parameters, including the focal length and the principal-point offset, are denoted by P.
  • The extrinsic parameters are denoted by E. The homography matrix can then be written in the same form as given above.
  • Since each column of the rotation matrix must have unit norm, and using the known orientation relationship between the two-dimensional code and the camera (the two-dimensional code appears in front of the camera), the magnitude and sign of s can be determined.
  • The third column of the rotation matrix can be recovered by computing the cross product of the two known columns, because the rotation matrix must be orthogonal. The relative pose of the two-dimensional code with respect to the camera is thereby obtained.
  • The recharging subunit 314 then controls the sweeping robot to move to the charging position of the charging base according to this relative pose for charging.
  • the charging unit 31 includes:
  • The charging subunit 315 is configured to recognize two identification marks on the charging base that are disposed symmetrically about the charging electrode as the axis of symmetry.
  • Two identification marks are disposed on the charging base, located at the same height and distributed symmetrically on both sides of the charging electrode. The cleaning robot moves onto the axis of symmetry of the two identification marks, and the charging subunit 315 controls the sweeping robot to keep adjusting itself so that it stays on that axis, which makes alignment with the charging electrode more accurate.
  • After aligning with the charging electrode and driving straight, the robot can dock with the charging contacts smoothly, or wait on the base and begin charging as soon as power is available. If the household mains is out and the charging base has no power, the robot can still dock with the charging electrode; once mains power is restored and the charging base is powered again, the sweeping robot begins charging immediately.
  • the photographing module 1 further includes:
  • the feature point unit 11 is configured to extract feature points of the image and store them.
  • The feature point unit 11 performs feature extraction on the environment image, which reduces the memory the image occupies, saving storage space; comparisons can then be performed on the feature points, which reduces the comparison workload.
  • the comparison module 2 includes:
  • a matching unit 21 configured to match feature points of the stored environment image with feature points of the pre-stored refill image
  • a matching value unit 22, configured to count the number of inlier points and generate a matching value.
  • The feature points extracted from the environment images collected by the cleaning robot during sweeping are compared with the feature points of the charging base image: the matching unit 21 compares the extracted feature points one by one with those of the pre-stored charging base image, using the feature matching method. An inlier ('inner point') is a feature point in one of the two images that also has a matching feature point in the other.
  • The more inliers there are, the higher the matching value generated by the matching value unit 22.
  • The environment image with the highest similarity, i.e. the one with the highest matching value, is then selected, and the object corresponding to the feature points with the highest matching value is confirmed to be the charging base.
  • the mobile module 3 further includes:
  • a storage unit 32, configured to acquire and store the pose of the sweeping robot at each moment during the cleaning process by using a visual sensor or a laser sensor.
  • Using the visual sensor means estimating the position of the cleaning robot while it moves from the camera images and the camera's intrinsic parameters, acquiring the robot's pose in real time so that the robot knows its position in the environment.
  • The laser sensor determines the pose of the cleaning robot by measuring the distance to surrounding objects, and the storage unit 32 stores the pose acquired at each moment in the cleaning robot.
  • the mobile module 3 further includes:
  • the expansion unit 33 marks the current position of the cleaning robot, and expands a position that is not marked near the current position of the cleaning robot to generate a child node;
  • An evaluation value unit 34 configured to calculate an evaluation function value for each child node, and mark a child node with the smallest evaluation function value
  • a path unit 35, configured to stop expanding when the child node with the smallest evaluation value is the target node, and to connect all the marked minimum-evaluation child nodes to generate a path;
  • the mobile unit 36 is configured to move to the recharging charging position according to the path.
  • When the cleaning robot needs to return to a position where an environment image was collected, path planning is performed. The shortest path is searched with the A* (A-Star) algorithm, the most effective direct search method for finding the shortest path in a static road network.
  • The search first uses the expansion unit 33 to mark the starting position of the cleaning robot and expand its unmarked child nodes.
  • The child nodes are the positions where the cleaning robot collected environment images during cleaning; the evaluation value unit 34 then calculates the evaluation function value for each child node, sorts the values by size, and finds and marks the node with the smallest evaluation value. If the current node is the target node, that is, the position of the environment image to which the robot needs to return, the search stops.
  • the specific calculation steps are:
  • h(n) = (|Δx| + |Δy|) × 10, where |Δx| and |Δy| are the numbers of horizontal and vertical grid steps between the current node and the target node; h(n) is computed in this way for each of the 8 nodes surrounding the current node with respect to the target node.
  • The visual sweeping robot of the present invention can find the charging base by collecting environment images during the sweeping process, which improves the speed of finding the charging base.
  • Two two-dimensional codes are arranged on the charging base symmetrically about the charging electrode, so that the cleaning robot can dock with the charging base without relying on any electrical signal emitted by the charging base.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to a visual sweeping robot and a recharging method thereof. The recharging method comprises: acquiring images of the environment during sweeping; comparing the stored environment images with a preset image of the charging base during recharging; selecting the environment image with the highest similarity and returning to the position where that environment image was captured; and docking with the charging base in alignment for charging. The method of the invention enables the sweeping robot to quickly find the position of the charging base for charging.
PCT/CN2017/114507 2017-12-04 2017-12-04 Robot de balayage visuel et procédé de recharge associé WO2019109230A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/114507 WO2019109230A1 (fr) 2017-12-04 2017-12-04 Robot de balayage visuel et procédé de recharge associé

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/114507 WO2019109230A1 (fr) 2017-12-04 2017-12-04 Robot de balayage visuel et procédé de recharge associé

Publications (1)

Publication Number Publication Date
WO2019109230A1 true WO2019109230A1 (fr) 2019-06-13

Family

ID=66750731

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/114507 WO2019109230A1 (fr) 2017-12-04 2017-12-04 Robot de balayage visuel et procédé de recharge associé

Country Status (1)

Country Link
WO (1) WO2019109230A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1853552A (zh) * 2005-04-20 2006-11-01 Lg电子株式会社 具有自动返回充电座功能的清洁机器人及其使用方法
US20170105592A1 (en) * 2012-10-05 2017-04-20 Irobot Corporation Robot management systems for determining docking station pose including mobile robots and methods using same
CN105825160A (zh) * 2015-01-05 2016-08-03 苏州宝时得电动工具有限公司 基于图像识别的定位装置及其定位方法
CN107427177A (zh) * 2015-02-13 2017-12-01 三星电子株式会社 清洁机器人及其控制方法
CN106451635A (zh) * 2016-11-02 2017-02-22 深圳乐行天下科技有限公司 一种智能回充方法及装置
CN107392962A (zh) * 2017-08-14 2017-11-24 深圳市思维树科技有限公司 一种基于图案识别的机器人充电对接系统和方法
CN107945233A (zh) * 2017-12-04 2018-04-20 深圳市沃特沃德股份有限公司 视觉扫地机器人及其回充方法

Similar Documents

Publication Publication Date Title
CN107945233B (zh) 视觉扫地机器人及其回充方法
CN111126304B (zh) 一种基于室内自然场景图像深度学习的增强现实导航方法
CN109631855B (zh) 基于orb-slam的高精度车辆定位方法
CN110568447B (zh) 视觉定位的方法、装置及计算机可读介质
Pathak et al. Online three‐dimensional SLAM by registration of large planar surface segments and closed‐form pose‐graph relaxation
CN107843251B (zh) 移动机器人的位姿估计方法
Stückler et al. Integrating depth and color cues for dense multi-resolution scene mapping using rgb-d cameras
CN109509230A (zh) 一种应用于多镜头组合式全景相机的slam方法
CN110827353B (zh) 一种基于单目摄像头辅助的机器人定位方法
Collins et al. Matching perspective views of coplanar structures using projective unwarping and similarity matching
CN108519102A (zh) 一种基于二次投影的双目视觉里程计算方法
JP2017117386A (ja) 自己運動推定システム、自己運動推定システムの制御方法及びプログラム
Zhu et al. Real-time global localization with a pre-built visual landmark database
CN109313822B (zh) 基于机器视觉的虚拟墙构建方法及装置、地图构建方法、可移动电子设备
JP6410231B2 (ja) 位置合わせ装置、位置合わせ方法及び位置合わせ用コンピュータプログラム
JP2009217456A (ja) ランドマーク装置および移動ロボットの制御システム
CN111656138A (zh) 构建地图及定位方法、客户端、移动机器人及存储介质
Nirmal et al. Homing with stereovision
Nüchter et al. Skyline-based registration of 3D laser scans
Yong-guo et al. The navigation of mobile robot based on stereo vision
WO2019109230A1 (fr) Robot de balayage visuel et procédé de recharge associé
WO2023159668A1 (fr) Système et procédé de capture de scènes à grande échelle à l'aide de dispositifs de mesure inertielle portables et de capteurs de détection et d'estimation de la distance par la lumière
Shere et al. Temporally consistent 3D human pose estimation using dual 360deg cameras
KR102516450B1 (ko) 맵 생성 방법 및 이를 이용한 이미지 기반 측위 시스템
Ortiz-Coder et al. Accurate 3d reconstruction using a videogrammetric device for heritage scenarios

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17933868

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17933868

Country of ref document: EP

Kind code of ref document: A1