CN112022025A - Automatic robot back flushing method and system based on visual positioning - Google Patents


Info

Publication number
CN112022025A
CN112022025A (application CN202010818063.3A)
Authority
CN
China
Prior art keywords
robot
screen image
angle
aruco code
charging pile
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010818063.3A
Other languages
Chinese (zh)
Inventor
宋君毅
姜冬辉
伍祁林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Elephant Robotics Technology Co ltd
Original Assignee
Shenzhen Elephant Robotics Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Elephant Robotics Technology Co ltd filed Critical Shenzhen Elephant Robotics Technology Co ltd
Priority to CN202010818063.3A priority Critical patent/CN112022025A/en
Publication of CN112022025A publication Critical patent/CN112022025A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011: Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/02: Docking stations; Docking operations
    • A47L2201/022: Recharging of batteries
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04: Automatic control of the travelling movement; Automatic obstacle detection

Landscapes

  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention belongs to the field of computer vision and discloses an automatic robot recharging (docking) method and system based on visual positioning. The method comprises the following steps. Step 1: acquire a screen image from the robot's viewfinder and preprocess it. Step 2: judge whether the preprocessed screen image contains an ArUco code. Step 3: judge whether the center coordinate point of the ArUco code obtained in step 2 lies in the middle region of the screen. Step 4: identify the charging pile at the position of the ArUco code, compute the relative position of the robot and the charging pile to obtain their relative distance, and complete automatic docking. The invention performs pose estimation by means of the ArUco code, thereby establishing a 3-dimensional coordinate system; it then continually adjusts the camera's orientation according to the ArUco code's position on the screen to locate the charging pile, and approaches the pile by successive small advances until it reaches it. Compared with existing docking algorithms, the method is low in cost and much easier to implement.

Description

Automatic robot back flushing method and system based on visual positioning
Technical Field
The invention belongs to the field of computer vision, and in particular relates to an automatic robot recharging (docking) method and system based on visual positioning.
Background
Automatic recharging (docking) is the technology by which a robot finds its charging pile and returns to it to charge. At present, charging a robot is genuinely inconvenient and often requires outside help: for users, constantly having to help the robot charge is a burden; for the robot, suddenly losing power mid-use degrades the experience and makes the product feel unintelligent. Automatic docking technology has therefore come into view. There are currently four ways to implement it:
(1) Infrared-based methods, used in roughly 70% of existing sweeping robots. Their advantages are clear: high positioning accuracy and mature technology. The drawback is equally clear: infrared light cannot pass through objects, so the signal may be blocked even by fine obstacles such as dust and debris.
(2) Ultrasonic methods, which, like a bat, emit sound waves and measure their reflections to triangulate the current position. This approach is rarely used at present and is costly.
(3) Bluetooth-based methods. A Bluetooth positioning system is not limited by line of sight and can localize within straight-line range even when an obstacle is in the way. However, positioning requires searching for multiple Bluetooth signals, so deployment is complex.
(4) Camera-based methods, which use a camera to recognize the charging pile, estimate the camera's position and angle relative to the pile via pose estimation, and finally navigate a path to it. This approach needs a neural network algorithm to recognize the pile; if the pile is moved, its background changes, the recognition rate drops, and stability suffers.
Disclosure of Invention
The invention aims to provide an automatic robot recharging (docking) method and system based on visual positioning, overcoming the defects of the prior-art docking methods described above.
In order to realize the task, the invention adopts the following technical scheme:
a robot automatic backlash method based on visual positioning comprises the following steps:
step 1: acquiring a screen image of a robot viewing frame, and preprocessing the screen image;
step 2: judging whether the preprocessed screen image contains the arUco code, if so, calculating the pose of the robot and obtaining a central coordinate point of the arUco code, and executing the step 3; if not, adjusting the angle of the robot according to the first angle range, and then returning to the step 1 to acquire the screen image of the robot viewing frame again;
and step 3: judging whether the central coordinate point of the arUco code obtained in the step 2 is in the middle area of the screen, if so, executing a step 4; if not, adjusting the angle of the robot according to the second angle range, and then returning to the step 1 to acquire the screen image of the view finder of the robot again;
the middle area of the screen is a strip longitudinal area with the center point of the screen image being +/-5 pix on the left and right;
and 4, step 4: identifying the position of a charging pile at the arUco code, obtaining the relative position of the robot and the charging pile to obtain the relative distance, and if the relative distance is smaller than the minimum distance, moving the robot forward to the charging pile to finish automatic recoiling;
if the relative distance is larger than or equal to the minimum distance, the included angle between the robot and the charging pile is obtained, the robot rotates the included angle in the direction close to the charging pile, moves forwards for minimum displacement, then rotates 90 degrees in the direction far away from the charging pile, and then returns to the step 1 to acquire the screen image of the robot viewing frame again.
Further, the first angle range is [15, 20]° and the second angle range is [4, 6]°.
Further, the first angle is 15°, the minimum distance is 20 cm, and the minimum displacement is 10 cm.
Further, the ArUco code is arranged directly above the charging pile.
An automatic robot recharging (docking) system based on visual positioning comprises a screen-image acquisition and preprocessing module, an ArUco code identification module, a distance acquisition module, an angle adjustment module, and a distance adjustment module;
the screen-image acquisition and preprocessing module acquires a screen image from the robot's viewfinder and preprocesses it;
the ArUco code identification module judges whether the preprocessed screen image contains an ArUco code, calculates the robot's pose, and obtains the center coordinate point of the ArUco code; it also judges whether that center coordinate point lies in the middle region of the screen;
the distance acquisition module identifies the charging pile at the position of the ArUco code using a distance sensor, and computes the relative position of the robot and the charging pile to obtain their relative distance;
the angle adjustment module adjusts the robot's angle according to the first angle range, the second angle range, or the included angle between the robot and the charging pile, as follows:
if the ArUco code identification module judges that the preprocessed screen image does not contain an ArUco code, the robot's angle is adjusted within the first angle range, and the screen-image acquisition and preprocessing module is then called to reacquire the viewfinder image;
if the ArUco code identification module judges that the obtained center coordinate point of the ArUco code is not in the middle region of the screen, the robot's angle is adjusted within the second angle range until the center coordinate point is in the middle region; the middle region of the screen is a vertical strip extending ±5 px to the left and right of the screen image's center point;
if the ArUco code identification module judges that the relative distance is smaller than the minimum distance, the distance adjustment module is called to move the robot forward to the charging pile, completing automatic docking;
if the ArUco code identification module judges that the relative distance is greater than or equal to the minimum distance, the included angle between the robot and the charging pile is obtained, the robot rotates through that angle toward the charging pile, the distance adjustment module is called to move forward by the minimum displacement, the robot then rotates 90° away from the charging pile, and the screen-image acquisition and preprocessing module is called to reacquire the viewfinder image;
the distance adjustment module controls the robot's movement according to instructions from the angle adjustment module.
Further, the first angle range is [15, 20]° and the second angle range is [4, 6]°.
Further, the first angle is 15°, the minimum distance is 20 cm, and the minimum displacement is 10 cm.
Further, the ArUco code is arranged directly above the charging pile.
Compared with the prior art, the invention has the following technical characteristics:
(1) The invention performs pose estimation by means of the ArUco code, thereby establishing a 3-dimensional coordinate system. It then continually adjusts the camera's orientation according to the ArUco code's position on the screen to locate the charging pile, and approaches the pile by successive small advances until it reaches it. Compared with existing docking algorithms, the method is low in cost and much easier to implement.
(2) Diversified usage. The scheme uses the ArUco code for positioning and identification; by changing the ArUco code, different products can be located. It is used here for charging piles, but could mark any point of interest, such as a cat's food bowl.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is a flow chart of an image collection process;
FIG. 3 is a flow chart of searching for arUco codes;
FIG. 4 is a flow chart of fine adjustment of self-angle;
FIG. 5 is a navigation strategy diagram in embodiment 1;
FIG. 6 is an image captured from the robot camera's viewfinder;
FIG. 7 is a screenshot of the robot camera capturing the marker code.
Detailed Description
arUco code: the arUco code is a two-dimensional code similar to a two-dimensional code formed by combining two-dimensional matrixes with black borders around and with the ID determined in the two-dimensional matrixes. The black frame accelerates the detection speed in the marked image, and the internal two-dimensional code can uniquely mark the identification mark. In the system, the four-angle position of the black frame is detected mainly by using the quick detection speed (compared with a two-dimensional code) of the black frame to judge the center position of the black frame, and further judge the position condition of the black frame relative to the camera.
This embodiment discloses an automatic robot recharging (docking) method based on visual positioning, comprising the following steps:
Step 1: acquire a screen image from the robot's viewfinder and preprocess it;
Step 2: judge whether the preprocessed screen image contains an ArUco code; if so, calculate the robot's pose, obtain the center coordinate point of the ArUco code, and go to step 3; if not, adjust the robot's angle within the first angle range, then return to step 1 to reacquire the viewfinder image;
Step 3: judge whether the center coordinate point of the ArUco code obtained in step 2 lies in the middle region of the screen; if so, go to step 4; if not, adjust the robot's angle within the second angle range, then return to step 1 to reacquire the viewfinder image;
the middle region of the screen is a vertical strip extending ±5 px to the left and right of the screen image's center point;
Step 4: identify the charging pile at the position of the ArUco code and compute the relative position of the robot and the charging pile to obtain their relative distance; if the relative distance is smaller than the minimum distance, the robot moves forward to the charging pile, completing automatic docking;
if the relative distance is greater than or equal to the minimum distance, obtain the included angle between the robot and the charging pile; the robot rotates through that angle toward the charging pile, moves forward by the minimum displacement, rotates 90° away from the charging pile, and then returns to step 1 to reacquire the viewfinder image.
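The decision logic of steps 1 through 4 can be condensed into a single pure function. This is only a sketch: the function name, the action tuples, and the sign convention for rotation are our own, while the default values (first angle 15°, second angle in [4, 6]°, ±5 px band, minimum distance 20 cm, minimum displacement 10 cm) are the preferred values stated in the description:

```python
def next_action(marker_center_x, screen_width, distance_cm, included_angle_deg,
                first_angle=15, second_angle=5, min_dist=20, min_disp=10):
    """Decide the robot's next move for one iteration of steps 1-4.
    marker_center_x is None when no ArUco code was detected in the frame."""
    if marker_center_x is None:                      # step 2: no marker seen
        return ("rotate", first_angle)
    mid = screen_width / 2.0
    if abs(marker_center_x - mid) > 5:               # step 3: not in the middle band
        # turn toward the side of the screen the marker sits on
        return ("rotate", second_angle if marker_center_x > mid else -second_angle)
    if distance_cm < min_dist:                       # step 4: close enough, dock
        return ("dock", distance_cm)
    # step 4: approach maneuver: turn toward the pile, advance, turn 90 deg away
    return ("approach", included_angle_deg, min_disp, 90)
```

Each returned tuple would then be executed by the robot's motion layer before the loop reacquires the viewfinder image.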
Specifically, the screen image of the robot's viewfinder refers to the picture inside the robot camera's viewfinder.
Specifically, if in step 4 the included angle between the robot and the charging pile is too large (greater than 40°), the robot cannot complete the turn in one motion; it instead rotates in batches within the first angle range until the full turn is complete.
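That batch rotation might look as follows (a sketch; the 15° batch step is the preferred first angle from the description, and the function name is illustrative):

```python
def split_rotation(total_deg, step_deg=15):
    """Split a large turn into batches no larger than step_deg, so a turn
    the robot cannot complete in one motion is executed incrementally."""
    steps = []
    remaining = abs(total_deg)
    sign = 1 if total_deg >= 0 else -1
    while remaining > 0:
        step = min(step_deg, remaining)
        steps.append(sign * step)
        remaining -= step
    return steps
```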
Specifically, the middle region of the screen is the area within ±5 px of the screen center in the horizontal direction, i.e., a vertical strip in the screen. Since the camera's vertical offset is determined from distance, which a distance sensor can measure accurately, it is not discussed here; only the left-right coordinate position is judged.
Specifically, the first angle range is [15, 20]°; preferably, the first angle is 15°.
Specifically, the second angle range is [4, 6]°.
Specifically, the ArUco code is placed directly above the charging pile at a height of about 5 cm; the exact position depends on the height the camera can normally see with the robot in its standing state. The ArUco code should be in an open place, with roughly a 3 × 3 m² clear area for the robot to maneuver in.
An automatic robot recharging (docking) system based on visual positioning comprises a screen-image acquisition and preprocessing module, an ArUco code identification module, a distance acquisition module, an angle adjustment module, and a distance adjustment module;
the screen-image acquisition and preprocessing module acquires a screen image from the robot's viewfinder and preprocesses it;
the ArUco code identification module judges whether the preprocessed screen image contains an ArUco code, calculates the robot's pose, and obtains the center coordinate point of the ArUco code; it also judges whether that center coordinate point lies in the middle region of the screen;
the distance acquisition module identifies the charging pile at the position of the ArUco code using a distance sensor, and computes the relative position of the robot and the charging pile to obtain their relative distance;
the angle adjustment module adjusts the robot's angle according to the first angle range, the second angle range, or the included angle between the robot and the charging pile, as follows:
if the ArUco code identification module judges that the preprocessed screen image does not contain an ArUco code, the robot's angle is adjusted within the first angle range, and the screen-image acquisition and preprocessing module is then called to reacquire the viewfinder image;
if the ArUco code identification module judges that the obtained center coordinate point of the ArUco code is not in the middle region of the screen, the robot's angle is adjusted within the second angle range until the center coordinate point is in the middle region; the middle region of the screen is a vertical strip extending ±5 px to the left and right of the screen image's center point;
if the ArUco code identification module judges that the relative distance is smaller than the minimum distance, the distance adjustment module is called to move the robot forward to the charging pile, completing automatic docking;
if the ArUco code identification module judges that the relative distance is greater than or equal to the minimum distance, the included angle between the robot and the charging pile is obtained, the robot rotates through that angle toward the charging pile, the distance adjustment module is called to move forward by the minimum displacement, the robot then rotates 90° away from the charging pile, and the screen-image acquisition and preprocessing module is called to reacquire the viewfinder image;
the distance adjustment module controls the robot's movement according to instructions from the angle adjustment module.
Example 1
This embodiment discloses an automatic robot recharging (docking) method based on visual positioning. The implementation environment is OpenCV, and the method is built on OpenCV's ArUco marker recognition, a very mature OpenCV module. Pose estimation is performed by means of the ArUco code, thereby establishing a 3-dimensional coordinate system. The camera's orientation is then continually adjusted according to the ArUco code's position on the screen to locate the charging pile; by repeatedly advancing, the robot gradually approaches and finally reaches the pile. The navigation strategy is shown in FIG. 5.
The method comprises four stages: image collection and processing, data processing, motion-trajectory planning, and motion-trajectory adjustment. The position of the ArUco code is learned by collecting and processing images; the robot then adjusts its own angle until the ArUco code sits in the middle of the screen, then rotates leftward, advances about 10 cm, and rotates rightward by 90°, thereby correcting the camera's lateral offset so that the camera faces the plane of the ArUco code head-on, with the ArUco code at the foot of the perpendicular. It then continues collecting images and searching for the ArUco code, looping in this way until the camera is within 20 cm directly in front of the ArUco code, and finally moves forward that last 20 cm.
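The rotate-advance-rotate maneuver can be pictured with a simple planar dead-reckoning update. This is a sketch under the assumption of an ideal planar robot; the coordinate convention (counterclockwise-positive turns, heading in degrees) is our own, not the patent's:

```python
import math

def move(pose, turn_deg, forward_cm):
    """Apply a turn (counterclockwise positive) then a straight advance
    to a planar pose (x, y, heading-in-degrees)."""
    x, y, heading = pose
    heading = (heading + turn_deg) % 360
    rad = math.radians(heading)
    return (x + forward_cm * math.cos(rad), y + forward_cm * math.sin(rad), heading)

# e.g. turn toward the pile by the included angle, advance ~10 cm,
# then turn 90 deg back the other way before re-scanning:
pose = (0.0, 0.0, 0.0)
pose = move(pose, 30, 10)    # hypothetical 30 deg included angle, 10 cm advance
pose = move(pose, -90, 0)    # rotate 90 deg away; no translation
```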
Specifically, in the image collection and processing stage, the camera image is read with cv2.VideoCapture().read() and converted to grayscale.
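In OpenCV the graying step is typically cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY); its effect can be reproduced with the standard luma weights, shown here in plain numpy so the snippet runs without OpenCV (the function name is illustrative):

```python
import numpy as np

def to_gray(bgr):
    """Convert a BGR image to grayscale with the same luma weights used by
    cv2.COLOR_BGR2GRAY: Y = 0.299 R + 0.587 G + 0.114 B."""
    b = bgr[..., 0].astype(np.float64)
    g = bgr[..., 1].astype(np.float64)
    r = bgr[..., 2].astype(np.float64)
    return np.rint(0.299 * r + 0.587 * g + 0.114 * b).astype(np.uint8)
```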
Specifically, step 2 judges whether an ArUco code is present by acquiring the coordinates of the marker's four corners in the screen picture: if the corner coordinates are null, the ArUco code was not recognized; if it was recognized, the pose data are calculated and returned. As shown in FIG. 2.
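That null-coordinates check can be sketched as follows. In OpenCV, cv2.aruco.detectMarkers returns an empty corner list when nothing is found; the helper below takes one marker's four corner points and is illustrative, not the patent's code:

```python
def marker_center(corners):
    """Return the center point of a detected marker, or None when the
    detector returned no corners (the 'null coordinates' case)."""
    if not corners:            # nothing recognized in this frame
        return None
    xs = [p[0] for p in corners]
    ys = [p[1] for p in corners]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```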
Specifically, in the motion-trajectory planning stage, the pose data are first acquired, and whether the ArUco code has been recognized is judged by whether the pose data are empty. If it has been recognized, this stage's task ends and the next stage begins; if not, the robot keeps adjusting its own angle until the ArUco code is recognized. As shown in FIG. 3.
Before implementing this scheme, two-dimensional codes (QR codes) were tried as positioning targets, but the QR-code scheme was abandoned for accuracy reasons, as the comparison in the table shows:
[Table: accuracy comparison of the QR-code and ArUco-code schemes, reproduced in the original as an image]
When selecting the recognition target, the charging pile itself did not recognize well, so an easier-to-recognize marker was sought. Three schemes were tested: the charging pile itself, a QR code, and an ArUco code. Testing showed that recognizing a QR code and then computing from it yields data with a period of about 0.8 s; recognition is slow and real-time performance cannot be guaranteed. Recognizing the charging pile directly is hard because a single camera can hardly build a 3D model of the object, so data accuracy cannot be guaranteed. The ArUco code gave the highest recognition speed and accuracy, so the method uses ArUco code recognition. An ArUco code is a binary square fiducial marker that provides correspondences for recovering the camera pose; its internal binary coding is robust and is commonly used for error detection and correction.
Testing showed the camera's recognition range is about 85°, i.e., roughly 4 to 5 rotations cover a full circle and capture the entire surroundings. However, because the camera's capture lags, the method alternates large and small angles: while the ArUco code has not been recognized, one large rotation is made after every 4 small rotations, with the small angle not exceeding 20° and the large angle not exceeding 60°. Although this increases the number of rotations, efficiency is not reduced; with fewer, larger rotations the scanned picture blurs, which hinders capturing the ArUco code. Once the ArUco code appears on the screen, only fine adjustment of the robot's own angle is needed, so the rotation angle is chosen randomly within [4, 6]°; this avoids endlessly overshooting the adjustment so that the ArUco code never settles in the middle position.
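The alternating scan schedule and the random [4, 6]° fine adjustment can be sketched as below. The 15° and 60° defaults are example choices consistent with the stated caps (small ≤ 20°, large ≤ 60°), not values fixed by the text:

```python
import random

def scan_schedule(n_turns, small_deg=15, large_deg=60):
    """Produce n_turns rotation steps for the search phase: after every
    4 small turns, take one large turn (small <= 20 deg, large <= 60 deg)."""
    assert small_deg <= 20 and large_deg <= 60
    steps, small_count = [], 0
    for _ in range(n_turns):
        if small_count == 4:
            steps.append(large_deg)
            small_count = 0
        else:
            steps.append(small_deg)
            small_count += 1
    return steps

def fine_adjust_angle():
    """Fine adjustment once the marker is on screen: random angle in [4, 6] deg."""
    return random.uniform(4, 6)
```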

Claims (8)

1. An automatic robot recharging (docking) method based on visual positioning, characterized by comprising the following steps:
Step 1: acquire a screen image from the robot's viewfinder and preprocess it;
Step 2: judge whether the preprocessed screen image contains an ArUco code; if so, calculate the robot's pose, obtain the center coordinate point of the ArUco code, and go to step 3; if not, adjust the robot's angle within the first angle range, then return to step 1 to reacquire the viewfinder image;
Step 3: judge whether the center coordinate point of the ArUco code obtained in step 2 lies in the middle region of the screen; if so, go to step 4; if not, adjust the robot's angle within the second angle range, then return to step 1 to reacquire the viewfinder image;
the middle region of the screen is a vertical strip extending ±5 px to the left and right of the screen image's center point;
Step 4: identify the charging pile at the position of the ArUco code and compute the relative position of the robot and the charging pile to obtain their relative distance; if the relative distance is smaller than the minimum distance, the robot moves forward to the charging pile, completing automatic docking;
if the relative distance is greater than or equal to the minimum distance, obtain the included angle between the robot and the charging pile; the robot rotates through that angle toward the charging pile, moves forward by the minimum displacement, rotates 90° away from the charging pile, and then returns to step 1 to reacquire the viewfinder image.
2. The vision-positioning-based automatic robot recharging method of claim 1, wherein the first angle range is [15, 20]° and the second angle range is [4, 6]°.
3. The vision-positioning-based automatic robot recharging method of claim 2, wherein the first angle is 15°, the minimum distance is 20 cm, and the minimum displacement is 10 cm.
4. The vision-positioning-based automatic robot recharging method of claim 1, wherein the ArUco code is arranged directly above the charging pile.
5. An automatic robot recharging (docking) system based on visual positioning, characterized by comprising a screen-image acquisition and preprocessing module, an ArUco code identification module, a distance acquisition module, an angle adjustment module, and a distance adjustment module;
the screen-image acquisition and preprocessing module acquires a screen image from the robot's viewfinder and preprocesses it;
the ArUco code identification module judges whether the preprocessed screen image contains an ArUco code, calculates the robot's pose, and obtains the center coordinate point of the ArUco code; it also judges whether that center coordinate point lies in the middle region of the screen;
the distance acquisition module identifies the charging pile at the position of the ArUco code using a distance sensor, and computes the relative position of the robot and the charging pile to obtain their relative distance;
the angle adjustment module adjusts the robot's angle according to the first angle range, the second angle range, or the included angle between the robot and the charging pile, as follows:
if the ArUco code identification module judges that the preprocessed screen image does not contain an ArUco code, the robot's angle is adjusted within the first angle range, and the screen-image acquisition and preprocessing module is then called to reacquire the viewfinder image;
if the ArUco code identification module judges that the obtained center coordinate point of the ArUco code is not in the middle region of the screen, the robot's angle is adjusted within the second angle range until the center coordinate point is in the middle region; the middle region of the screen is a vertical strip extending ±5 px to the left and right of the screen image's center point;
if the ArUco code identification module judges that the relative distance is smaller than the minimum distance, the distance adjustment module is called to move the robot forward to the charging pile, completing automatic docking;
if the ArUco code identification module judges that the relative distance is greater than or equal to the minimum distance, the included angle between the robot and the charging pile is obtained, the robot rotates through that angle toward the charging pile, the distance adjustment module is called to move forward by the minimum displacement, the robot then rotates 90° away from the charging pile, and the screen-image acquisition and preprocessing module is called to reacquire the viewfinder image;
the distance adjustment module controls the robot's movement according to instructions from the angle adjustment module.
6. The vision-positioning-based automatic robot recharging system of claim 5, wherein the first angle range is [15, 20]° and the second angle range is [4, 6]°.
7. The vision-positioning-based automatic robot recharging system of claim 6, wherein the first angle is 15°, the minimum distance is 20 cm, and the minimum displacement is 10 cm.
8. The vision-positioning-based automatic robot recharging system of claim 5, wherein the ArUco code is disposed directly above the charging pile.
CN202010818063.3A 2020-08-14 2020-08-14 Automatic robot back flushing method and system based on visual positioning Pending CN112022025A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010818063.3A CN112022025A (en) 2020-08-14 2020-08-14 Automatic robot back flushing method and system based on visual positioning


Publications (1)

Publication Number Publication Date
CN112022025A true CN112022025A (en) 2020-12-04

Family

ID=73577950

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010818063.3A Pending CN112022025A (en) 2020-08-14 2020-08-14 Automatic robot back flushing method and system based on visual positioning

Country Status (1)

Country Link
CN (1) CN112022025A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170225891A1 (en) * 2016-02-05 2017-08-10 inVia Robotics, LLC Robotic Navigation and Mapping
CN106647747A (en) * 2016-11-30 2017-05-10 北京智能管家科技有限公司 Robot charging method and device
CN206369965U (en) * 2016-12-26 2017-08-01 旗瀚科技有限公司 A kind of Automatic-searching charging pile robot based on Machine Vision Recognition
US20190278288A1 (en) * 2018-03-08 2019-09-12 Ubtech Robotics Corp Simultaneous localization and mapping methods of mobile robot in motion area
CN108594822A (en) * 2018-05-10 2018-09-28 哈工大机器人(昆山)有限公司 Robot localization method, robot charging method based on Quick Response Code and system
CN108983603A (en) * 2018-06-27 2018-12-11 广州视源电子科技股份有限公司 Butt joint method of robot and object and robot thereof
CN109683605A (en) * 2018-09-25 2019-04-26 上海肇观电子科技有限公司 Robot and its automatic recharging method, system, electronic equipment, storage medium
CN109460044A (en) * 2019-01-10 2019-03-12 轻客小觅智能科技(北京)有限公司 A kind of robot method for homing, device and robot based on two dimensional code
CN110801181A (en) * 2019-11-07 2020-02-18 珠海格力电器股份有限公司 Sweeping robot system, sweeping robot control method and sweeping robot control device
CN111070205A (en) * 2019-12-04 2020-04-28 上海高仙自动化科技发展有限公司 Pile alignment control method and device, intelligent robot and storage medium

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112734844A (en) * 2021-01-08 2021-04-30 河北工业大学 Monocular 6D pose estimation method based on octahedron
CN113706452A (en) * 2021-07-09 2021-11-26 武汉思恒达科技有限公司 Charging pile automatic detection system based on image recognition and detection method thereof
CN113879162A (en) * 2021-09-27 2022-01-04 国网北京市电力公司 Robot control method, device and system
CN114601373A (en) * 2021-10-18 2022-06-10 北京石头世纪科技股份有限公司 Control method and device for cleaning robot, cleaning robot and storage medium
CN114397886A (en) * 2021-12-20 2022-04-26 烟台杰瑞石油服务集团股份有限公司 Charging method and charging system
CN114397886B (en) * 2021-12-20 2024-01-23 烟台杰瑞石油服务集团股份有限公司 Charging method and charging system

Similar Documents

Publication Publication Date Title
CN112022025A (en) Automatic robot back flushing method and system based on visual positioning
CN108171733B (en) Method of registering two or more three-dimensional 3D point clouds
JP4488804B2 (en) Stereo image association method and three-dimensional data creation apparatus
CN108369743B (en) Mapping a space using a multi-directional camera
CN105841687B (en) indoor positioning method and system
US6470271B2 (en) Obstacle detecting apparatus and method, and storage medium which stores program for implementing the method
CN110728715A (en) Camera angle self-adaptive adjusting method of intelligent inspection robot
CN110163963B (en) Mapping device and mapping method based on SLAM
WO2011005783A2 (en) Image-based surface tracking
WO2015024407A1 Power robot navigation system and method based on binocular vision
CN101957197A (en) Location measurement method and position measuring instrument
US12078488B2 (en) Mapping and tracking methods and systems principally for use in connection with swimming pools and spas
CN110597265A (en) Recharging method and device for sweeping robot
EP3529978B1 (en) An image synthesis system
JP4132068B2 (en) Image processing apparatus, three-dimensional measuring apparatus, and program for image processing apparatus
JP2007147341A (en) Method and apparatus for creating three-dimensional data
CN111571618B (en) Autonomous pickup robot based on visual algorithm and pickup method thereof
Nagy et al. Online targetless end-to-end camera-LiDAR self-calibration
CN110998241A (en) System and method for calibrating an optical system of a movable object
CN106370160A (en) Robot indoor positioning system and method
CN104469170A (en) Binocular shooting device and image processing method and device
CN110445982A (en) A kind of tracking image pickup method based on six degree of freedom equipment
KR102065337B1 (en) Apparatus and method for measuring movement information of an object using a cross-ratio
CN112433542B (en) Automatic robot recharging method and system based on visual positioning
CN112304250B (en) Three-dimensional matching equipment and method between moving objects

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201204