CN112265463B - Control method and device of self-moving equipment, self-moving equipment and medium


Publication number
CN112265463B
Authority
CN
China
Prior art keywords
coordinate system
self
determining
charging device
pose
Prior art date
Legal status
Active
Application number
CN202011110956.9A
Other languages
Chinese (zh)
Other versions
CN112265463A (en)
Inventor
高梓翔
Current Assignee
Beijing Orion Star Technology Co Ltd
Original Assignee
Beijing Orion Star Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Orion Star Technology Co Ltd filed Critical Beijing Orion Star Technology Co Ltd
Priority to CN202011110956.9A priority Critical patent/CN112265463B/en
Publication of CN112265463A publication Critical patent/CN112265463A/en
Application granted granted Critical
Publication of CN112265463B publication Critical patent/CN112265463B/en


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60L PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
    • B60L53/00 Methods of charging batteries, specially adapted for electric vehicles; Charging stations or on-board charging equipment therefor; Exchange of energy storage elements in electric vehicles
    • B60L53/30 Constructional details of charging stations
    • B60L53/35 Means for automatic or assisted adjustment of the relative position of charging devices and vehicles
    • B60L53/37 Means for automatic or assisted adjustment of the relative position of charging devices and vehicles using optical position determination, e.g. using cameras
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation
    • Y02T90/10Technologies relating to charging of electric vehicles
    • Y02T90/12Electric charging stations

Abstract

The application provides a control method and apparatus for a self-moving device, the self-moving device, and a medium. The method includes: acquiring an image collected by an image sensor carried by the self-moving device, where the image shows a guide identifier of a charging device; determining the pose of the self-moving device relative to the charging device according to the guide identifier; and controlling the self-moving device to move to the charging device for charging according to the pose. In this way, the self-moving device can find the guide identifier of the charging device through image recognition while moving, and can be driven to the charging device based on that identifier, realizing the recharging function. The exact position of the charging device does not need to be known, nor marked in the device's map; even if the charging device is moved, the device's pose relative to it can still be recognized and the device can still be guided to it.

Description

Control method and device of self-moving equipment, self-moving equipment and medium
Technical Field
The present application relates to the field of image processing technologies, and in particular to a control method and apparatus for a self-moving device, the self-moving device, and a medium.
Background
The self-moving device moves to a charging pile and docks with it for charging; this process is known as recharging. The charging pile is the charging device that charges the self-moving device. To recharge, the self-moving device needs to detect the position of the charging pile and then adjust its position relative to the pile so that it can dock with the pile accurately.
In the related art, in order for the self-moving device to dock accurately with the charging pile, the position of the pile must be marked in the device's map, and two parallel marking lines must be arranged on the pile. After the device moves, according to the map, to within a set radius of the pile's marked position, it detects the positions of the two marking lines and continuously adjusts its own position while moving toward them, so that it becomes aligned with the charging pile.
However, this approach requires knowing the exact position of the charging pile in advance and marking it in the device's map; if the charging pile is moved, the self-moving device may fail to recharge.
Disclosure of Invention
The present application is directed to solving, at least in part, one of the technical problems in the related art.
The application provides a control method and apparatus for a self-moving device, the self-moving device, and a medium. During movement, the self-moving device searches for the guide identifier of a charging device through image recognition and, based on that identifier, is controlled to move to the charging device for charging, thereby realizing the recharging function. The exact position of the charging device (such as a charging pile) does not need to be known, nor does it need to be annotated in the device's map. Even if the position of the charging device changes, the pose of the self-moving device relative to it can still be recognized and the device can still be guided to it, so recharging is unaffected and the applicability of the method is improved.
An embodiment of a first aspect of the present application provides a control method for a self-moving device, including:
acquiring an image collected by an image sensor carried by the self-moving device, where the image shows a guide identifier of a charging device;
determining the pose of the self-moving device relative to the charging device according to the guide identifier;
and controlling the self-moving equipment to move to the charging device for charging according to the pose.
An embodiment of a second aspect of the present application provides a control apparatus for a self-moving device, including:
an acquisition module, configured to acquire an image collected by an image sensor carried by the self-moving device, where the image shows a guide identifier of a charging device;
a determination module, configured to determine the pose of the self-moving device relative to the charging device according to the guide identifier;
and the control module is used for controlling the self-moving equipment to move to the charging device for charging according to the pose.
To achieve the above object, an embodiment of a third aspect of the present application provides a self-moving device, including: an image sensor, a memory, a processor, and a computer program stored in the memory and executable on the processor. When the processor executes the program, the control method of the self-moving device proposed in the embodiment of the first aspect of the present application is implemented.
In order to achieve the above object, a fourth aspect of the present application provides a non-transitory computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements a control method of a self-moving device as set forth in the first aspect of the present application.
In order to achieve the above object, an embodiment of a fifth aspect of the present application proposes a computer program product; when instructions in the computer program product are executed by a processor, the control method of a self-moving device proposed in the embodiment of the first aspect of the present application is performed.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart illustrating a control method of a self-moving device according to a first embodiment of the present application;
fig. 2 is a flowchart illustrating a control method of a self-moving device according to a second embodiment of the present application;
FIG. 3 is a first schematic drawing of a guide identifier in an embodiment of the present application;
FIG. 4 is a second schematic drawing of a guide identifier in an embodiment of the present application;
FIG. 5 is a third schematic drawing of a guide identifier in an embodiment of the present application;
fig. 6 is a flowchart illustrating a control method of a self-moving device according to a third embodiment of the present application;
fig. 7 is a flowchart illustrating a control method of a self-moving device according to a fourth embodiment of the present application;
FIG. 8 is a fourth schematic drawing of a guide identifier in an embodiment of the present application;
fig. 9 is a flowchart illustrating a control method of a self-moving device according to a fifth embodiment of the present application;
fig. 10 is a flowchart illustrating a control method of a self-moving device according to a sixth embodiment of the present application;
fig. 11 is a flowchart illustrating a control method of a self-moving device according to a seventh embodiment of the present application;
fig. 12 is a schematic structural diagram of a control apparatus of a self-moving device according to an eighth embodiment of the present application.
Detailed Description
Reference will now be made in detail to the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the accompanying drawings are illustrative and intended to explain the present application and should not be construed as limiting the present application.
A control method and apparatus for a self-moving device, and a medium according to embodiments of the present application are described below with reference to the accompanying drawings.
Fig. 1 is a flowchart illustrating a control method of a self-moving device according to an embodiment of the present disclosure.
The execution subject of the embodiments of the present application may be the control apparatus of the self-moving device provided by the present application. The control apparatus may be configured in the self-moving device, for example as a local controller, so that the self-moving device can execute the control function. The self-moving device may be any device capable of autonomous movement with navigation and obstacle avoidance, such as an intelligent robot.
As shown in fig. 1, the control method of the self-moving apparatus may include the steps of:
Step 101, acquiring an image collected by an image sensor carried by the self-moving device, where the image shows a guide identifier of the charging device.
In the embodiment of the application, an image sensor can be mounted on the self-moving device, and images can be collected through the image sensor while the device moves. The guide identifier of the charging device is shown in the collected image. The image sensor may be a Charge Coupled Device (CCD), a Complementary Metal Oxide Semiconductor (CMOS) sensor, a Thin Film Transistor (TFT) sensor, or another type of image sensor.
In the embodiment of the application, when the self-moving device detects that its battery level is below a preset threshold, it can move through the space it is in and collect images through the carried image sensor, so that the control apparatus of the self-moving device obtains the collected images from the image sensor.
The collected image shows a guide identifier of the charging device, and the guide identifier is used for determining the pose of the self-moving device relative to the charging device.
It can be understood that the shape, size, color, and other features of the guide identifier of the charging device are known. Therefore, in a possible implementation of the embodiment of the present application, whether the guide identifier is shown in the acquired image may be determined based on an image recognition algorithm such as a Region of Interest (ROI) extraction algorithm or an object detection algorithm. When the image shows the guide identifier, the subsequent steps can be executed. When it does not, the self-moving device can be controlled to keep moving through its space, the images collected by the image sensor during the movement can be obtained, and recognition can be repeated until the guide identifier is found in a collected image.
For example, the collected image may be processed by an object detection algorithm such as the Single Shot MultiBox Detector (SSD), You Only Look Once (YOLO), or Faster R-CNN, and the region where the guide identifier is located in the collected image may be determined; the present application is not limited in this respect.
In another possible implementation of the embodiment of the application, connected-domain detection may be performed on the image acquired by the image sensor to obtain a plurality of connected domains, and the region where the guide identifier is located may be determined according to the geometric features of each connected domain. The geometric features may include the size, aspect ratio, color distribution, and other characteristics of the connected domain.
In another possible implementation of the embodiment of the present application, the image acquired by the image sensor may be preprocessed, for example by Gaussian blurring, binarization, or edge extraction, and the region where the guide identifier is located may be determined according to the value of each pixel in the preprocessed image. Alternatively, the region may be determined directly from the values of the pixels in the raw image.
For example, the pixels whose values exceed a preset threshold in the acquired image may be determined, connected domains may be formed from those pixels, and the region where the guide identifier is located may then be identified according to the geometric features of each connected domain. That is, the region can be identified according to the brightness within each connected domain, and connected domains whose shape resembles the guide identifier but whose brightness does not meet the condition can be excluded, improving the accuracy of the detection result.
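The thresholding-and-connected-domain step described above can be sketched in a few lines. This is a minimal illustration rather than the application's implementation: it labels 4-connected regions of above-threshold pixels in a small grayscale grid and filters them by area, one of the geometric features mentioned; the function name and the threshold values are assumptions.

```python
from collections import deque

def find_candidate_regions(image, threshold=128, min_area=3):
    """Label 4-connected regions of pixels above `threshold` and
    keep those whose area is at least `min_area` pixels."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if image[y][x] > threshold and not seen[y][x]:
                # Breadth-first flood fill collects one connected domain.
                queue, pixels = deque([(y, x)]), []
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and image[ny][nx] > threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(pixels) >= min_area:
                    regions.append(pixels)
    return regions

# A bright 2x2 blob (area 4) and an isolated bright pixel (area 1):
img = [
    [0, 200, 200, 0],
    [0, 200, 200, 0],
    [0,   0,   0, 255],
]
print(len(find_candidate_regions(img)))  # the lone pixel is filtered out
```

A real pipeline would add the aspect-ratio and brightness tests mentioned above on each surviving region before treating it as a guide-identifier candidate.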
And 102, determining the pose of the self-moving equipment relative to the charging device according to the guide identifier.
It should be understood that when the observation position differs, the position, size, and degree of deformation of the guide identifier of the charging device in the image collected by the image sensor also differ. Based on these characteristics, the pose of the self-moving device relative to the charging device can be recognized from the guide identifier shown in the collected image.
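A standard way to recover a planar pose from corresponding marker points, offered here as an illustrative sketch rather than the method claimed by the application, is to estimate the 2D rigid transform (rotation plus translation) that maps the known marker layout onto the detected image points; for centered 2D point sets the rotation angle has a closed form. All function names below are assumptions.

```python
import math

def estimate_pose_2d(model_pts, observed_pts):
    """Estimate the rotation angle (radians) and translation that map
    model_pts onto observed_pts under a 2D rigid transform."""
    n = len(model_pts)
    mx = sum(p[0] for p in model_pts) / n
    my = sum(p[1] for p in model_pts) / n
    ox = sum(p[0] for p in observed_pts) / n
    oy = sum(p[1] for p in observed_pts) / n
    # Closed-form rotation from the centered correspondences.
    s_cos = s_sin = 0.0
    for (ax, ay), (bx, by) in zip(model_pts, observed_pts):
        ax, ay, bx, by = ax - mx, ay - my, bx - ox, by - oy
        s_cos += ax * bx + ay * by   # dot term
        s_sin += ax * by - ay * bx   # cross term
    theta = math.atan2(s_sin, s_cos)
    # Translation takes the rotated model centroid onto the observed one.
    tx = ox - (mx * math.cos(theta) - my * math.sin(theta))
    ty = oy - (mx * math.sin(theta) + my * math.cos(theta))
    return theta, tx, ty

# Markers rotated by 30 degrees and shifted by (2, 1):
model = [(0, 0), (1, 0), (0, 1)]
ang = math.radians(30)
obs = [(x * math.cos(ang) - y * math.sin(ang) + 2,
        x * math.sin(ang) + y * math.cos(ang) + 1) for x, y in model]
theta, tx, ty = estimate_pose_2d(model, obs)
print(round(math.degrees(theta), 3), round(tx, 3), round(ty, 3))
```

In practice the observed points come from the detected mark points and the model points from the known physical layout of the guide identifier, and a full solution would also account for camera projection.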
Step 103, controlling the self-moving device to move to the charging device for charging according to the determined pose.
In the embodiment of the application, after the pose of the self-moving device relative to the charging device is determined, the self-moving device can be controlled to move to the charging device for charging according to the pose.
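Once the pose is known, "moving according to the pose" can be illustrated with a generic proportional controller. This sketch is not taken from the application: the gains, the stopping radius, and the velocity interface are all assumed values for illustration.

```python
import math

def drive_command(dx, dy, heading, k_lin=0.5, k_ang=1.5, stop_dist=0.05):
    """Return (linear_velocity, angular_velocity) steering the device
    toward the charging device at offset (dx, dy) in the device frame.
    `heading` is the device's current heading error in radians."""
    distance = math.hypot(dx, dy)
    if distance < stop_dist:
        return 0.0, 0.0          # close enough: stop and dock
    bearing = math.atan2(dy, dx) - heading
    # Wrap the bearing error into (-pi, pi].
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))
    return k_lin * distance, k_ang * bearing

v, w = drive_command(2.0, 0.0, 0.0)
print(v, w)  # target straight ahead: turn rate is zero
```

Calling this in a loop with freshly estimated poses would continuously re-aim the device at the dock, which matches the continuous-adjustment behavior described in the background.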
With the control method of the self-moving device of this embodiment, the device can first be moved to the vicinity of the charging device when only its approximate position is known in advance. It then only needs to find the guide identifier of the charging device through image recognition and, based on that identifier, be controlled to move to the charging device for charging, realizing the recharging function. The exact position of the charging device (such as a charging pile) does not need to be known or marked in the device's map; even if the position of the charging device changes, the pose of the self-moving device relative to it can still be recognized and the device guided to it, so recharging is unaffected and the applicability of the method is improved.
In a possible implementation of the embodiment of the present application, the guide identifier of the charging device may include a coordinate system portion and a coding portion: the coordinate system portion is used to determine a candidate coordinate system, and the coding portion is decoded to obtain the code corresponding to the guide identifier. When determining the pose of the self-moving device relative to the charging device according to the guide identifier, a candidate coordinate system can be determined from the coordinate system portion; a target code can be decoded from the coordinate positions, in the candidate coordinate system, of the plurality of mark points (denoted first mark points) contained in the coding portion; whether the target code matches the code of the self-moving device can be judged; and, if so, the pose of the self-moving device relative to the charging device can be determined according to the plurality of first mark points.
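The decode-and-match step can be illustrated with a toy scheme; the actual encoding is not specified here, so everything below (grid size, bit order, function names) is an assumption. Each cell of the coding area, addressed in the candidate coordinate system, contributes one bit, set when a first mark point occupies it.

```python
def decode_target(marker_coords, grid_w=4, grid_h=2):
    """Read a binary code from first-mark-point positions expressed
    in the candidate coordinate system. Cell (x, y) maps to bit
    y * grid_w + x; an occupied cell contributes a 1 bit."""
    bits = 0
    for x, y in marker_coords:
        if 0 <= x < grid_w and 0 <= y < grid_h:
            bits |= 1 << (y * grid_w + x)
    return bits

def matches_device(code, device_code):
    """The device proceeds to dock only when the decoded target code
    matches its own code."""
    return code == device_code

# Markers at cells (0,0), (2,0), (1,1) -> bits 0, 2, 5 -> 0b100101 = 37
code = decode_target([(0, 0), (2, 0), (1, 1)])
print(code, matches_device(code, 37))
```

Matching the decoded code against the device's own code lets several devices share a space while each recharges only at its own charging device.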
The above process is described in detail with reference to example two.
Fig. 2 is a flowchart illustrating a control method of a self-moving device according to a second embodiment of the present application.
As shown in fig. 2, the control method of the self-moving apparatus may include the steps of:
Step 201, acquiring an image collected by an image sensor carried by the self-moving device, where the image shows a guide identifier of the charging device, and the guide identifier includes a coordinate system portion and a coding portion.
It is understood that the shape, size, color, and other characteristics of the coordinate system portion and the coding portion in the guide identifier are known. Therefore, in one possible implementation of the embodiment of the present application, the coordinate system portion and the coding portion in the captured image may be identified based on an object detection algorithm. For example, the image collected by the image sensor may be processed by an object detection algorithm such as SSD, YOLO, or Faster R-CNN to determine the coordinate system portion and the coding portion, which is not limited in this application.
It should be understood that, in order to improve the accuracy of the recognition result and the processing efficiency, the guide identifier in the captured image may be recognized first, for example based on an ROI extraction algorithm or an object detection algorithm, and the coordinate system portion and the coding portion may then be recognized within the region where the guide identifier is located.
In another possible implementation of the embodiment of the present application, the image acquired by the image sensor may be preprocessed, for example by Gaussian blurring, binarization, or edge extraction, and the coordinate system portion and the coding portion may be determined according to the values of the pixels in the preprocessed image. Alternatively, the coordinate system portion and the coding portion may be determined directly from the values of the pixels in the raw image.
For example, the pixels whose values exceed a preset threshold in the acquired image may be determined, connected domains may be formed from those pixels, and the coordinate system portion and the coding portion may then be identified according to the geometric features of each connected domain. That is, identification can be performed according to the brightness within each connected domain, and connected domains whose shape resembles the coordinate system portion or the coding portion but whose brightness does not meet the condition can be excluded, improving the accuracy of the detection result.
Step 202, determining a candidate coordinate system according to the coordinate system part.
In the embodiment of the present application, the candidate coordinate system is described as a two-dimensional coordinate system established on the acquired image, specifically a coordinate system established from the coordinate system portion of the guide identifier. The unit of the candidate coordinate system may be one pixel, or may be set according to actual requirements; for example, the unit may be a set length such as 0.001 cm or 0.01 cm, which is not limited in this application. Of course, the candidate coordinate system may also be a three-dimensional coordinate system, which is likewise not limited in this application.
In the embodiment of the application, the candidate coordinate system can be established according to the coordinate system part in the guide identifier.
In a possible implementation of the embodiment of the present application, the coordinate system portion may include an asymmetric pattern, and the coordinate axes of the candidate coordinate system may be determined according to a set reference line in the asymmetric pattern, with the direction of each coordinate axis determined according to the position of a set local pattern within the asymmetric pattern.
For example, there may be two connecting lines in the asymmetric pattern, and the two connecting lines may serve as the X-axis and Y-axis of the candidate coordinate system. The directions of the X-axis and Y-axis may then be determined from the position of the set local pattern in the asymmetric pattern: for example, the quadrant in which the set local pattern lies may be defined as the first quadrant, which fixes the directions of both axes. Of course, the quadrant in which the local pattern lies may instead be defined as the second, third, or fourth quadrant, which is not limited in this application.
It should be noted that determining the direction of the coordinate axes from the position of the set local pattern is only an example. In practical applications, the direction may also be determined directly from the image characteristics of the set local pattern; for example, if the set local pattern is an arrow, the positive direction of a coordinate axis may be determined from the direction of the arrow.
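Under the first-quadrant convention just described, the axis signs can be fixed by checking which side of each candidate axis the set local pattern falls on. The following is a minimal sketch with assumed names: the axes are given as direction vectors, and each is reversed if the pattern's projection onto it is negative.

```python
def orient_axes(origin, x_axis, y_axis, pattern_pt):
    """Flip the candidate axes' signs so that `pattern_pt` (the centre
    of the set local pattern) lies in the first quadrant. Axes are
    given as (dx, dy) direction vectors in image coordinates."""
    px, py = pattern_pt[0] - origin[0], pattern_pt[1] - origin[1]
    # Project the pattern onto each axis; a negative projection means
    # the axis points away from the pattern, so reverse it.
    if px * x_axis[0] + py * x_axis[1] < 0:
        x_axis = (-x_axis[0], -x_axis[1])
    if px * y_axis[0] + py * y_axis[1] < 0:
        y_axis = (-y_axis[0], -y_axis[1])
    return x_axis, y_axis

xa, ya = orient_axes((0, 0), (1, 0), (0, -1), (3, 2))
print(xa, ya)  # the y-axis is flipped so the pattern sits in quadrant I
```

The same projection test works for any chosen quadrant by flipping the sign conditions accordingly.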
In another possible implementation of the embodiment of the present application, the coordinate system portion may include an asymmetric pattern, and a set coordinate point in the candidate coordinate system may be determined according to the position of a set key point in the asymmetric pattern. For example, if it is known that the key point maps to the coordinate point (-1, 1) in the candidate coordinate system, the candidate coordinate system can be established directly from that correspondence.
In yet another possible implementation, the coordinate system portion may include an asymmetric pattern; the coordinate axes of the candidate coordinate system may be determined according to the set reference line in the asymmetric pattern, and a set coordinate point in the candidate coordinate system may be determined according to the position of the set key point in the pattern, with the direction of the coordinate axes determined according to the position of the set local pattern.
As an example, referring to FIG. 3, FIG. 3 is a first schematic drawing of a guide identifier in an embodiment of the present application. The guide identifier includes a set local pattern 21 and a coding portion 22 composed of a plurality of mark points (referred to as first mark points in this application). The set reference line may be the symmetry axis of the set local pattern 21 and may serve as the X-axis of the candidate coordinate system, with the arrow direction of the set local pattern 21 as the positive X direction. At least one set key point is arranged in the asymmetric pattern; the key point may be any point in the pattern, and may be the origin of the candidate coordinate system or a point in the positive or negative direction of the Y-axis, which is not limited in this application. For example, the set key point may be arranged in the positive direction of the Y-axis (not shown in FIG. 3), so that the Y-axis and its direction can be determined from the key point; the candidate coordinate system can then be established from the key point together with the X-axis.
In yet another possible implementation manner of the embodiment of the present application, the coordinate system portion may include at least five second mark points, and at least three collinear second mark points may be connected in the acquired image respectively to obtain two connecting lines, the second mark point at the intersection of the two connecting lines is determined as an origin of the candidate coordinate system, and the two connecting lines are determined as coordinate axes of the candidate coordinate system; wherein, the direction of the coordinate axis is determined according to the distance between the second mark point on the coordinate axis and the origin.
As an example, referring to FIG. 4, FIG. 4 is a second schematic drawing of a guide identifier in an embodiment of the present application, where the letter A denotes a second mark point and the letter B denotes a first mark point. The distance between each second mark point and the origin can be determined; the positive direction of a coordinate axis is taken toward the side with the longer distance, and the negative direction toward the side with the shorter distance. Note that FIG. 4 only illustrates this convention; in practical applications the assignment may be reversed, with the negative direction toward the longer side and the positive direction toward the shorter side, which is not limited in this application. For convenience of explanation, this application takes the longer side as the positive direction and the shorter side as the negative direction.
It should also be noted that the example above assumes one of the second mark points is located at the origin of the candidate coordinate system. In practical applications, no second mark point need lie at the origin; the second mark points need only lie on the coordinate axes. For example, when the coordinate system portion includes at least five second mark points, at least three collinear second mark points may be connected in the captured image to obtain a first line segment, the second mark points not on the first line segment may be connected to obtain a second line segment, the two line segments may be taken as the coordinate axes of the candidate coordinate system, their intersection as the origin, and the directions of the axes determined from the positions of the second mark points. For example, the direction containing more second mark points may be taken as the positive direction of an axis.
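The longer-side rule for orienting an axis can be sketched directly: among the second mark points lying on one axis line, the positive direction is taken toward the marker farthest from the origin. The function name and the point representation are assumptions for illustration.

```python
import math

def axis_direction(origin, axis_points):
    """Given second mark points lying on one axis line, point the
    positive direction toward the marker farthest from the origin,
    matching the longer-side convention described above."""
    far = max(axis_points,
              key=lambda p: (p[0] - origin[0]) ** 2 + (p[1] - origin[1]) ** 2)
    dx, dy = far[0] - origin[0], far[1] - origin[1]
    norm = math.hypot(dx, dy)
    return dx / norm, dy / norm  # unit direction vector

# Markers at -1 and +2 along the horizontal line through the origin:
print(axis_direction((0, 0), [(-1, 0), (2, 0)]))
```

Running this once per axis line yields both oriented axes of the candidate coordinate system.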
In yet another possible implementation manner of the embodiment of the present application, the coordinate system portion may include at least three second mark points, and two adjacent second mark points may be connected to obtain a plurality of connection lines, a target included angle with a smallest difference from a preset included angle is determined from an included angle formed by any two connection lines in the plurality of connection lines, the two connection lines forming the target included angle are determined as coordinate axes of the candidate coordinate system, and directions of the coordinate axes are determined according to positions where the second mark points are located.
As an example, referring to fig. 5, fig. 5 is a schematic diagram of a third guide mark in the embodiment of the present application, where the letter A represents a second mark point and the letter B represents a first mark point. The 3 second mark points A may form an "L" shape, and the preset included angle may be 90°; the two connecting lines forming 90° (or an included angle whose difference from 90° is within a preset range) are determined as the coordinate axes of the candidate coordinate system. The second mark points may be in the positive directions of the coordinate axes, or may also be in the negative directions of the coordinate axes, and fig. 5 illustrates only the case where the second mark points are in the positive directions of the coordinate axes.
It should be noted that fig. 5 only exemplifies the case where the number of the second mark points is 3; in practical applications, the number of the second mark points may also be 4, 5, 6, and the like, which is not limited in the present application.
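The axis-selection step described above, in which the pair of connecting lines whose included angle is closest to the preset angle is chosen as the coordinate axes, can be sketched as follows. This is an illustrative Python sketch, not part of the patent embodiments; the function name and the assumption that marker points are given as 2D pixel coordinates in connection order are hypothetical.

```python
import itertools
import math

def pick_axes(points, preset_deg=90.0):
    """Connect adjacent marker points, then return the two connecting
    lines whose included angle differs least from the preset angle."""
    # Connecting lines between consecutive marker points.
    segs = [(points[i], points[i + 1]) for i in range(len(points) - 1)]
    best = None
    for (a1, a2), (b1, b2) in itertools.combinations(segs, 2):
        v1 = (a2[0] - a1[0], a2[1] - a1[1])
        v2 = (b2[0] - b1[0], b2[1] - b1[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        n1, n2 = math.hypot(*v1), math.hypot(*v2)
        # Included angle between the two connecting lines, in degrees.
        ang = math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))
        diff = abs(ang - preset_deg)
        if best is None or diff < best[0]:
            best = (diff, ((a1, a2), (b1, b2)))
    return best[1]
```

For the "L"-shaped layout of fig. 5, the two perpendicular arms would be returned as the candidate coordinate axes.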
The plurality of second mark points may be identical to or different from the plurality of first mark points in the encoding section in terms of shape, size, color, and other characteristics.
It should be noted that, when the plurality of second mark points in the coordinate system portion and the plurality of first mark points in the encoding portion are completely the same in features such as shape, size, and color, the image recognition algorithm in the above embodiment can only recognize each mark point in the captured image; in this case, the plurality of first mark points and the plurality of second mark points need to be further distinguished from among the recognized mark points. For example, referring to fig. 4, the image features of the plurality of first mark points B and the plurality of second mark points A are the same.
As a possible implementation manner of the embodiment of the application, in order to improve the accuracy of the recognition result, the mark points may first be recognized from the acquired image, and the second mark points may then be identified among them as the mark points whose position distribution conforms to the set geometric constraint condition, so that all mark points other than the second mark points can be taken as the first mark points.
In the embodiment of the application, the set geometric constraint condition is used for constraining the value ranges of the relative angle, the relative distance and the like between the second mark points. For example, the geometric constraint condition set as described above may be an asymmetric geometric constraint condition, and when the position distribution of the second marker point conforms to the asymmetric geometric constraint condition, the coordinate axes of the candidate coordinate system in the captured image and the directions of the coordinate axes may be determined based on the geometric constraint condition.
It should be noted that the above is only an exemplary embodiment, and the present application is not limited thereto; other image recognition methods known in the art may also be used, as long as each mark point in the image can be recognized. For example, the image collected by the image sensor may be preprocessed, such as by Gaussian blurring, binarization, and edge extraction, and each mark point may be determined according to the value of each pixel point in the preprocessed image, or each mark point may be determined directly according to the value of each pixel point in the image collected by the image sensor. For example, the pixel points in the acquired image whose values exceed a preset threshold may be determined, connected domains may be determined from those pixel points, and each connected domain may be taken as one mark point. Alternatively, each pixel point in the acquired image whose value exceeds the preset threshold may be directly taken as a mark point, which is not limited in the present application.
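The threshold-and-connected-domain variant can be sketched as follows. This is a minimal illustrative Python sketch, assuming the image is a grayscale 2D array and using 4-connectivity; the function name and threshold value are hypothetical, and a real implementation would typically use an optimized library routine.

```python
def find_marker_points(image, threshold):
    """Threshold the grayscale image, group bright pixels into
    4-connected domains, and treat each domain as one marker point."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    markers = []
    for y in range(h):
        for x in range(w):
            if image[y][x] > threshold and not seen[y][x]:
                # Flood-fill one connected domain of bright pixels.
                stack, comp = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    comp.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx]
                                and image[ny][nx] > threshold):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                markers.append(comp)
    return markers
```

Each returned component is a list of pixel coordinates belonging to one candidate mark point.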
And step 203, decoding to obtain the target code according to the coordinate positions of the plurality of first mark points in the candidate coordinate system.
In the embodiment of the application, after the candidate coordinate system is established in the acquired image according to the coordinate system part, the coordinate position of the coding part in the guide identifier in the candidate coordinate system can be determined, and then the target code can be obtained by decoding according to the coordinate position of the coding part in the guide identifier in the candidate coordinate system.
In a possible implementation manner of the embodiment of the present application, the encoding portion may include a plurality of first marker points, a coordinate position of each first marker point in the candidate coordinate system may be determined, and the target code may be obtained by decoding according to the coordinate position of each first marker point in the candidate coordinate system.
It can be understood that the first marker may include a plurality of pixel points, and for each first marker, the coordinate position of the first marker in the candidate coordinate system may be determined according to the coordinate positions of the plurality of pixel points included in the first marker.
As an example, for each first mark point, the coordinate positions of multiple pixel points included in the first mark point in the candidate coordinate system may be determined, and the coordinate positions of the multiple pixel points included in the first mark point in the candidate coordinate system may be averaged to determine the coordinate position of the first mark point in the candidate coordinate system.
As another example, for each first marker, the coordinate positions of multiple pixel points included in the first marker in the candidate coordinate system may be determined, a mathematical equation may be fitted to the shape of the first marker according to the coordinate positions of the multiple pixel points included in the first marker in the candidate coordinate system, the centroid of the first marker may be determined according to the fitted mathematical equation, and the coordinate position of the centroid may be used as the coordinate position of the first marker in the candidate coordinate system.
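The first of these two approaches, averaging the coordinate positions of a mark point's pixels, can be sketched as follows; a hypothetical illustrative helper, not the patent's prescribed implementation.

```python
def marker_centroid(pixel_coords):
    """Average the candidate-coordinate-system positions of a mark
    point's pixels to obtain the mark point's coordinate position."""
    n = len(pixel_coords)
    return (sum(p[0] for p in pixel_coords) / n,
            sum(p[1] for p in pixel_coords) / n)
```

The fitting-based variant would instead fit, e.g., a circle or ellipse to the pixel set and take the fitted centroid, which is more robust when the mark point is partially occluded.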
In a possible implementation manner of the embodiment of the present application, when the encoding portion includes a plurality of first mark points, after determining a coordinate position of each first mark point in the candidate coordinate system, each first mark point may be mapped into the standard coordinate system to obtain a coordinate position of each first mark point in the standard coordinate system, and the corresponding target code is determined according to the coordinate position of each first mark point in the standard coordinate system. For example, the coordinate positions of the first mark points in the standard coordinate system may be combined by using a preset combination rule to obtain the target code, for example, the abscissa and the ordinate of each first mark point in the standard coordinate system are sequentially arranged to obtain the target code.
The standard coordinate system is the coordinate system in which the guide mark is located, that is, a coordinate system pre-established on the guide mark according to the coordinate system portion on the guide mark of the charging device in the three-dimensional space. It should be understood that the image captured by the image sensor may be distorted, so the candidate coordinate system may be distorted or deformed; for an observed coordinate system, the coordinate axes may not be straight, and the coordinate system changes with the change of the observation position. The standard coordinate system, by contrast, is the distortion-free coordinate system corresponding to the candidate coordinate system, that is, the standard coordinate system does not change with the change of the observation position.
For example, the number of the first mark points is 3, and the coordinate positions of the 3 first mark points in the standard coordinate system are (1, 1), (2, 2) and (3, 3), respectively. The abscissa and the ordinate of the coordinate position of each of the 3 first mark points in the standard coordinate system are arranged sequentially according to a preset combination rule, and the obtained target code may be, for example, 112233, 11-22-33, 1-1-2-2-3-3, and the like.
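One such combination rule, arranging the abscissa and ordinate of each mark point in sequence with a separator, can be sketched as follows; the function name and separator choice are hypothetical illustrations of the rule described above.

```python
def encode_points(points, sep="-"):
    """Arrange the abscissa and ordinate of each mark point in
    sequence to form the target code (one possible combination rule)."""
    parts = []
    for x, y in points:
        parts.extend([str(x), str(y)])
    return sep.join(parts)
```

With the three example points this yields the "1-1-2-2-3-3" form of the target code.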
It will be appreciated that when the coordinate axes in the candidate coordinate system have directions, the codes have different meanings in different quadrants, which may also increase the coding capacity.
And 204, if the target code is matched with the code of the self-moving equipment, determining the pose of the self-moving equipment relative to the charging device according to the first mark point.
In the embodiment of the application, each self-moving device may have a corresponding code, and each self-moving device may be charged by using a charging apparatus having the same code as the self-moving device. The code of the charging device is an actual code corresponding to the guide identifier of the charging device.
For example, the server may configure the correspondence between the self-moving device and the code in real time or periodically, or the server may further configure the charging device to change its code. For example, after the charging device changes its guide identifier, the code corresponding to the charging device will change; at this time, the server may reconfigure the code corresponding to the self-moving device according to the updated code, so that the self-moving device can still recharge successfully. After the code of the charging device is changed, other self-moving devices may also be charged by the charging device, so that a redeployment function of the charging device may be implemented and the applicability of the method may be improved.
It should be noted that, since the candidate coordinate system may be distorted and the target code is determined according to the coordinate position of the coding portion of the guide mark in the candidate coordinate system, the target code may deviate from the code corresponding to the charging device. Therefore, in the present application, in order to improve the success rate of recharging of the self-moving device, it may be determined whether the target code matches the code of the self-moving device. For example, the matching degree between the target code and the code of the self-moving device may be calculated, and it is determined whether the matching degree exceeds a preset matching degree threshold; if so, it is determined that the target code matches the code of the self-moving device, and if not, it is determined that the target code does not match the code of the self-moving device.
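The matching-degree check can be sketched as follows. The metric here (fraction of identical coordinate tokens) and the threshold value are hypothetical; the patent leaves the concrete matching-degree measure open.

```python
def codes_match(target_code, device_code, threshold=0.8, sep="-"):
    """Compare the coordinate tokens of two codes; declare a match
    when the fraction of identical tokens reaches the threshold."""
    t = target_code.split(sep)
    d = device_code.split(sep)
    if len(t) != len(d):
        return False
    same = sum(1 for a, b in zip(t, d) if a == b)
    return same / len(t) >= threshold
```

A distortion-induced deviation in a single coordinate (e.g. 3.1 instead of 3) can thus still be accepted with a suitably chosen threshold.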
In the embodiment of the application, when the target code matches the code of the self-moving device, it indicates that the charging apparatus corresponds to the self-moving device and the self-moving device can be charged by the charging apparatus, so the pose of the self-moving device relative to the charging apparatus can be determined according to the plurality of first mark points included in the coding portion. If the target code does not match the code of the self-moving device, it indicates that the charging apparatus does not correspond to the self-moving device and the self-moving device cannot use the charging apparatus for charging; at this time, the acquired image may be discarded, the image sensor may be controlled to acquire the next frame of image, and step 101 and the subsequent steps are repeated.
And step 205, controlling the mobile equipment to move to a charging device for charging according to the determined pose.
According to the control method of the self-moving device, when the code of the self-moving device matches the target code, the self-moving device can determine its pose relative to the charging apparatus according to the first mark points and thus move to the charging apparatus for charging according to the determined pose; when the codes do not match, the pose is not calculated, which, in addition to saving computing resources, avoids the situation where the self-moving device cannot be charged because it attempts to use a charging apparatus that does not correspond to it.
In a possible implementation manner of the embodiment of the present application, the pose of the self-moving device relative to the charging apparatus may be determined according to an actual position relationship between a plurality of first marker points included in the encoding portion and an image position relationship of each first marker point in the acquired image.
The above process is described in detail with reference to example three. Fig. 6 is a flowchart illustrating a control method of a self-moving device according to a third embodiment of the present application. As shown in fig. 6, based on the embodiment shown in fig. 2, step 204 may specifically include the following steps:
step 301, inquiring the actual position relationship among the first mark points.
In the embodiment of the present application, the actual position relationship between the first mark points refers to a position relationship of each first mark point in a standard coordinate system where the guide identifier is located.
It should be understood that each guide mark is known, and the coding portion and the coordinate system portion in the guide mark are also known; when the charging device is provided with the guide mark, the standard coordinate system in which the guide mark is located can be determined, and the standard coordinate position of the coding portion in the standard coordinate system can also be determined, that is, the standard coordinate position of each first mark point of the coding portion in the standard coordinate system can be calculated. Therefore, in the present application, for each guide identifier, the standard coordinate position of each first mark point in the coding portion of the guide identifier in the standard coordinate system where the guide identifier is located may be calculated in advance, and the actual code corresponding to the guide identifier is determined according to these standard coordinate positions. For example, a preset combination rule may be adopted to combine the standard coordinate positions of the first mark points in the standard coordinate system to obtain the actual code of the guide identifier; for example, the abscissa and the ordinate of the standard coordinate position of each first mark point in the standard coordinate system may be arranged sequentially to obtain the actual code of the guide identifier. After the actual code corresponding to each guide identifier is calculated, it may be stored.
Therefore, in the application, the actual codes corresponding to the guide identifiers can be queried according to the target code to obtain the target actual code matched with the target code. For example, the similarity between the target code and the actual code corresponding to each guide identifier can be calculated, and the actual code corresponding to the maximum similarity is taken as the target actual code; the target actual code can then be decoded to obtain the standard coordinate positions of the first mark points in the standard coordinate system where the guide identifier is located.
The similarity may be euclidean distance similarity, manhattan distance similarity, cosine cos similarity, or the like, which is not limited in the present application.
For example, when the number of the first mark points included in the coding portion is 3, the coordinate positions of the 3 first mark points in the candidate coordinate system are mapped into the standard coordinate system, and the resulting coordinate positions of the 3 first mark points in the standard coordinate system are (1, 1), (2, 2) and (3.1, 3.1); combining these 3 coordinate positions with the preset combination rule yields the target code 1-1-2-2-3.1-3.1. If the actual codes corresponding to the predetermined guide identifiers are 1-1-2-2-3-3, 2-2-4-4-6-6 and 1-1-4-4-7-7, respectively, the actual code with the highest matching degree with the target code can be determined to be 1-1-2-2-3-3, and decoding this actual code yields the standard coordinate positions of the 3 first mark points in the standard coordinate system where the guide identifier is located, namely (1, 1), (2, 2) and (3, 3), respectively.
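This nearest-actual-code lookup can be sketched with Euclidean similarity as follows; the function names are hypothetical, and the code assumes the "x-y-x-y-…" combination rule from the example above.

```python
import math

def best_actual_code(target_pts, actual_codes):
    """Return the stored actual code whose decoded mark-point
    coordinates are closest (Euclidean) to the target points."""
    def decode(code):
        # "1-1-2-2-3-3" -> [(1.0, 1.0), (2.0, 2.0), (3.0, 3.0)]
        v = [float(t) for t in code.split("-")]
        return list(zip(v[0::2], v[1::2]))

    def dist(ps, qs):
        return math.sqrt(sum((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
                             for p, q in zip(ps, qs)))

    return min(actual_codes, key=lambda c: dist(target_pts, decode(c)))
```

With the example data, the slightly distorted observation (3.1, 3.1) still resolves to the stored code 1-1-2-2-3-3.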
In the embodiment of the application, after the standard coordinate position of each first mark point in the standard coordinate system where the guide identifier is located in the coding part is obtained through query, the actual position relationship between the first mark points can be determined according to the standard coordinate position of each first mark point in the standard coordinate system where the guide identifier is located.
Step 302, determining the image position relationship of each first mark point in the image.
In this embodiment of the present application, the image position relationship of each first marker in the acquired image may be an image position relationship of each first marker in a candidate coordinate system, an image coordinate system, or a pixel coordinate system.
For example, the coordinate origin of the image coordinate system is the center point of the image, with the X-axis pointing horizontally to the right and the Y-axis pointing vertically downward, in units of pixels. The coordinate origin of the pixel coordinate system is the upper left corner of the image, with the X-axis pointing horizontally to the right and the Y-axis pointing vertically downward, in units of pixels.
In the embodiment of the application, after the candidate coordinate system, the image coordinate system or the pixel coordinate system is determined, the coordinate position of each first marking point in the candidate coordinate system, the image coordinate system or the pixel coordinate system may be determined, so that the image position relationship of each first marking point in the image may be determined according to the coordinate position of each first marking point in the candidate coordinate system, the image coordinate system or the pixel coordinate system.
And step 303, determining the pose of the mobile equipment relative to the charging device according to the actual position relation and the image position relation.
It should be understood that, at each observation position, the image position relationship of each first marker point in the acquired image in the image may be different, and according to the difference between the actual position relationship and the image position relationship, the relative position relationship of the self-moving device with respect to the charging apparatus may be determined, which is denoted as the pose in this application. Therefore, the accuracy of the pose calculation result can be improved.
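As an illustrative sketch of this step, not the patent's prescribed algorithm: if the actual position relationship and the image position relationship are both expressed as 2D point sets, a least-squares rigid alignment (Kabsch-style) recovers a relative rotation and translation. A full implementation would instead typically solve a perspective-n-point problem using the camera intrinsics; all names below are hypothetical.

```python
import numpy as np

def estimate_pose_2d(actual_pts, image_pts):
    """Least-squares rotation R and translation t such that
    R @ p + t best maps the actual layout onto the observed one."""
    P = np.asarray(actual_pts, dtype=float)
    Q = np.asarray(image_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    D = np.diag([1.0, d])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

The difference between the two point sets thus yields a relative rotation and translation, i.e. the pose in the sense used by this application.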
In another possible implementation manner of the embodiment of the application, the guide identifier of the charging device may include at least three reference mark points, and the position relationship of each reference mark point in the image may be directly determined, the actual position relationship between each reference mark point is determined, and the pose of the mobile device relative to the charging device is determined according to the image position relationship and the actual position relationship.
The above process is described in detail with reference to example four. Fig. 7 is a flowchart illustrating a control method of a self-moving device according to a fourth embodiment of the present disclosure. As shown in fig. 7, the control method of the self-moving apparatus may include the steps of:
step 401, acquiring an image collected from an image sensor carried by mobile equipment; the image shows a guide mark of the charging device, and the guide mark comprises at least three reference mark points.
In the embodiment of the present application, the recognition principle of each reference mark point is similar to the recognition principle of the first mark point and the second mark point, and at least three reference mark points included in the guidance identifier may be recognized and obtained according to the image recognition technology described in the second embodiment, which is not described herein again.
Step 402, determining the image position relation of each reference mark point in the image.
In the embodiment of the present application, the image position relationship of each reference mark point in the image may be an image position relationship of each reference mark point in a candidate coordinate system, an image coordinate system, or a pixel coordinate system, which is not limited in the present application.
Wherein, the candidate coordinate system can be determined according to each reference mark point. For example, two adjacent reference mark points may be connected to obtain a plurality of connection lines, a target angle having a minimum difference from a preset angle is determined from an angle formed by any two connection lines of the plurality of connection lines, the two connection lines forming the target angle are determined as coordinate axes of a candidate coordinate system, and directions of the coordinate axes are determined according to positions of the reference mark points.
As an example, referring to fig. 8, fig. 8 is a schematic diagram of a guidance identifier in the embodiment of the present application. The guidance mark comprises 3 reference mark points C, which may form an "L" shape, and the preset included angle may be 90°; the two connecting lines forming 90° (or an included angle whose difference from 90° is within a preset range) are determined as the coordinate axes of the candidate coordinate system. The reference mark points may be in the positive directions of the coordinate axes, or may also be in the negative directions of the coordinate axes, and fig. 8 illustrates only the case where the reference mark points are in the positive directions of the coordinate axes.
It should be noted that fig. 8 only exemplifies the case where the number of the reference mark points is 3; in practical applications, the number of the reference mark points may also be 4, 5, 6, and the like, which is not limited in the present application.
In the embodiment of the application, after the candidate coordinate system, the image coordinate system or the pixel coordinate system is determined, the coordinate position of each reference mark point in the candidate coordinate system, the image coordinate system or the pixel coordinate system can be determined, so that the image position relation of each reference mark point in the image can be determined according to the coordinate position of each reference mark point in the candidate coordinate system, the image coordinate system or the pixel coordinate system.
And step 403, inquiring the actual position relation between each reference mark point.
In a possible implementation manner of the embodiment of the present application, the actual position relationship between the reference mark points may refer to the position relationship of the reference mark points in the standard coordinate system where the guide identifier is located. The standard coordinate system is a coordinate system pre-established on the guide mark according to the reference mark points on the guide mark of the charging device in the three-dimensional space. It should be understood that the image captured by the image sensor may have distortion, so the candidate coordinate system may be distorted or deformed; for an observed coordinate system, the coordinate axes may not be straight as the observation position changes. The standard coordinate system, by contrast, is the distortion-free coordinate system corresponding to the candidate coordinate system, that is, the standard coordinate system does not change with the change of the observation position.
In the embodiment of the application, the target code can be obtained by decoding according to the coordinate position of each reference mark point in the candidate coordinate system, and the standard coordinate position of each reference mark point in the standard coordinate system where the guide identifier is located can be obtained by querying according to the target code, so that the actual position relationship between the reference mark points is determined according to the standard coordinate positions of the reference mark points in the standard coordinate system where the guide identifier is located.
It should be understood that each guide mark is known, and the reference mark points in the guide mark are also known, and when the charging device sets the guide mark, the standard coordinate system in which the guide mark is located can be determined, and the standard coordinate positions of the reference mark points in the standard coordinate system can also be determined.
Therefore, in the present application, for each guidance identifier, the standard coordinate position of each reference mark point in the standard coordinate system where the guidance identifier is located may be calculated in advance, and the actual code corresponding to the guidance identifier is determined according to these standard coordinate positions. For example, a preset combination rule may be adopted to combine the standard coordinate positions of the reference mark points in the standard coordinate system to obtain the actual code of the guide identifier; for example, the abscissa and the ordinate of the standard coordinate position of each reference mark point in the standard coordinate system may be arranged sequentially to obtain the actual code of the guide identifier. After the actual code corresponding to each guide identifier is calculated, it may be stored.
Therefore, in the present application, the actual codes corresponding to the guide identifiers may be queried according to the target code to obtain the target actual code matched with the target code. For example, the similarity between the target code and the actual code corresponding to each guide identifier may be calculated, and the actual code corresponding to the maximum similarity is taken as the target actual code; the target actual code may then be decoded to obtain the standard coordinate position of each reference mark point in the standard coordinate system.
In the embodiment of the application, after the standard coordinate position of each reference mark point in the standard coordinate system where the guide identifier is located is obtained through query, the actual position relation between the reference mark points can be determined according to the standard coordinate position of each reference mark point in the standard coordinate system where the guide identifier is located.
That is to say, in the present application, each reference mark point may serve as a coordinate system portion in the above embodiment, or may serve as a coding portion, that is, a candidate coordinate system may be determined according to each reference mark point, or a target code may be obtained by decoding according to a coordinate position of each reference mark point in the determined candidate coordinate system. Therefore, the standard coordinate position of each reference mark point in the standard coordinate system of the guide identifier can be obtained through query according to the target code, and the actual position relation between the reference mark points is determined according to the standard coordinate position of each reference mark point in the standard coordinate system of the guide identifier.
In another possible implementation manner of the present application, the actual position relationship between each reference marker point may refer to a position relationship of each reference marker point in a three-dimensional space.
In the embodiment of the present application, an actual positional relationship matching the image positional relationship of each reference marker point may be determined from a plurality of candidate positional relationships.
Specifically, the actual position relationship between each reference marker point in each guide marker may be determined in advance for each guide marker to obtain a plurality of candidate position relationships, so that the actual position relationship matching the image position relationship of each reference marker point may be determined in the plurality of candidate position relationships.
For example, the number of the reference mark points is three. Assuming that the shape formed in the three-dimensional space by the three reference mark points in the guide identifier 1 is a positive "L" shape and the coordinates of the three reference mark points are (0, 0), (0, 1) and (1, 0), respectively, the actual position relationship of the three reference mark points is determined as candidate position relationship 1 according to these coordinates. Assuming that the shape formed in the three-dimensional space by the three reference mark points in the guide identifier 2 is an inverse "L" shape and the coordinates of the three reference mark points are (0, 0), (0, -1) and (-1, 0), respectively, the actual position relationship of the three reference mark points is determined as candidate position relationship 2 according to these coordinates. In practical applications, assuming that the three reference mark points in an image acquired by the image sensor are as shown in fig. 8, it can be determined from the image position relationships corresponding to the three reference mark points that the shape formed by them is a positive "L" shape, and the actual position relationship matching the image position relationships of the at least three reference mark points is therefore determined to be candidate position relationship 1.
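A simple way to pick the matching candidate position relationship can be sketched as follows. This sketch compares the sign pattern of the two arms of the "L" (corner taken as the middle point) and assumes the captured image is roughly axis-aligned, as in fig. 8; the candidate names and point ordering are hypothetical.

```python
def match_candidate(image_pts, candidates):
    """Return the name of the stored candidate position relationship
    whose arm-direction sign pattern matches the observed layout."""
    def arms(pts):
        c = pts[1]  # corner point of the "L" (middle of the triple)
        return [(p[0] - c[0], p[1] - c[1]) for p in (pts[0], pts[2])]

    def sign(v):
        return tuple((x > 0) - (x < 0) for x in v)

    target = [sign(a) for a in arms(image_pts)]
    for name, pts in candidates.items():
        if [sign(a) for a in arms(pts)] == target:
            return name
    return None
```

A positive "L" and an inverse "L" produce opposite sign patterns, so the observed layout resolves to the correct candidate position relationship.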
And step 404, determining the pose of the mobile equipment relative to the charging device according to the actual position relation and the image position relation.
The execution process of step 404 may refer to the execution process of step 303 in the above embodiment, which is not described herein again.
And 405, controlling the mobile equipment to move to a charging device for charging according to the determined pose.
In the embodiment of the present application, the pose of the self-moving device relative to the charging device can be determined in different manners, which improves the applicability of the method.
In a possible implementation manner of the embodiment of the present application, each self-moving device may have a corresponding code, and each self-moving device may be charged only by a charging device whose code matches its own. The code of the charging device is the code corresponding to the guide identifier of the charging device.
In the embodiment of the present application, in order to improve the success rate of recharging the self-moving device, it may be determined whether the code corresponding to the actual position relationship matches the code of the self-moving device. For example, the matching degree between the code corresponding to the actual position relationship and the code of the self-moving device may be calculated, and it may be determined whether the matching degree exceeds a preset matching-degree threshold. If it does, the code corresponding to the actual position relationship is determined to match the code of the self-moving device; otherwise, the two codes are determined not to match.
When the actual position relationship between the reference mark points is their position relationship in the standard coordinate system where the guide identifier is located, the code of the actual position relationship may be the target code or the actual code corresponding to the guide identifier. When the actual position relationship between the reference mark points is their position relationship in three-dimensional space, the code of the actual position relationship is obtained by combining the coordinates of the reference mark points in three-dimensional space.
In the embodiment of the present application, when the code corresponding to the actual position relationship matches the code of the self-moving device, the charging device is the one corresponding to the self-moving device and can be used to charge it, so the pose of the self-moving device relative to the charging device can be determined according to the image position relationship and the actual position relationship. When the code corresponding to the actual position relationship does not match the code of the self-moving device, the charging device is not the one corresponding to the self-moving device and cannot be used to charge it. In this case, the acquired image may be discarded, the image sensor may be controlled to acquire the next frame of image, and step 101 and the subsequent steps may be executed again.
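By way of illustration only, the matching-degree check described above may be sketched as follows. The per-position comparison and the threshold value are assumptions, since the embodiment does not fix a particular matching-degree measure:

```python
def code_matching_degree(code_a: str, code_b: str) -> float:
    """Fraction of positions at which two equal-length codes agree
    (a hypothetical matching-degree measure, chosen for illustration)."""
    if len(code_a) != len(code_b):
        return 0.0
    same = sum(1 for a, b in zip(code_a, code_b) if a == b)
    return same / len(code_a)

MATCH_THRESHOLD = 0.9  # assumed preset matching-degree threshold

def codes_match(relation_code: str, device_code: str) -> bool:
    """True if the code of the actual position relationship matches
    the code of the self-moving device."""
    return code_matching_degree(relation_code, device_code) >= MATCH_THRESHOLD

print(codes_match("112233", "112233"))  # True
print(codes_match("112233", "445566"))  # False
```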
In a possible implementation manner of the embodiment of the present application, the guide identifier may be disposed on a charging pile of the charging device. The pose of the image sensor relative to the coordinate system of the charging pile may be determined by a PNP algorithm according to the actual position relationship and the image position relationship, and the pose of the self-moving device relative to the charging device may then be determined according to that pose, the external parameter matrix between the charging pile and the charging contact of the charging device, and the external parameter matrix between the image sensor and the self-moving device.
The above process is described in detail with reference to the fifth embodiment. Fig. 9 is a flowchart illustrating a control method of a self-moving device according to a fifth embodiment of the present application. As shown in fig. 9, based on the embodiment shown in fig. 6 or fig. 7, step 303 or step 404 may specifically include the following steps:
Step 501, determining the pose of the coordinate system of the image sensor relative to the coordinate system of the charging pile by using a PNP algorithm according to the actual position relationship and the image position relationship.
In this embodiment, the coordinate system of the image sensor may be a camera coordinate system whose origin is defined at the optical center of the lens, with units of millimeters, the X axis pointing horizontally to the right and the Y axis pointing vertically downward.
In the embodiment of the present application, the coordinate system of the charging pile is a coordinate system calibrated in advance, for example, a coordinate system calibrated in advance on the charging pile. For example, the origin of the coordinate system of the charging pile may be the centroid of the charging pile, with the Y axis vertically upward and the X axis horizontally rightward or leftward.
In the embodiment of the present application, the pose (also referred to as a pose transformation) of the coordinate system of the image sensor relative to the coordinate system of the charging pile may be determined by a PNP algorithm according to the actual position relationship and the image position relationship. This pose may include a first rotation matrix R1 and a first displacement vector t1 (also called a first translation vector); for example, the pose may be a transformation matrix T_t1 composed of the first rotation matrix R1 and the first displacement vector t1.
The PNP (Perspective-n-Point) algorithm may include P3P, EPNP, UPNP, DLT (Direct Linear Transform), and optimization-based solutions.
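For illustration, the rotation matrix and displacement vector returned by a PNP solver can be assembled into the transformation matrix T_t1 described above. In practice R1 and t1 might come from a solver such as OpenCV's solvePnP (not shown here, to keep the sketch self-contained); the numeric values below are example assumptions, not values from the original disclosure:

```python
def make_transform(R, t):
    """Assemble the 4x4 homogeneous transformation matrix T_t1 from a
    3x3 first rotation matrix R1 and a 3-element first displacement
    vector t1, both given as row-major Python lists."""
    return [
        [R[0][0], R[0][1], R[0][2], t[0]],
        [R[1][0], R[1][1], R[1][2], t[1]],
        [R[2][0], R[2][1], R[2][2], t[2]],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Example: identity rotation, 0.5 m translation along the camera Z axis.
R1 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t1 = [0.0, 0.0, 0.5]
T_t1 = make_transform(R1, t1)
```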
Step 502, querying the first external parameter matrix and the second external parameter matrix.
The first external parameter matrix is the transformation matrix from the coordinate system of the charging pile to the coordinate system of the charging contact of the charging device, that is, the external parameter matrix between the charging pile and the charging contact. The second external parameter matrix is the transformation matrix from the coordinate system of the image sensor to the coordinate system of the self-moving device, that is, the external parameter matrix between the image sensor and the self-moving device.
In the embodiment of the present application, the coordinate system of the charging contact of the charging device is a coordinate system calibrated in advance, for example, on the charging contact. For example, the origin of the coordinate system of the charging contact may be the midpoint of the line connecting the two charging contacts, with the Y axis vertically upward and the X axis horizontally rightward or leftward.
In the embodiment of the present application, the coordinate system of the self-moving device is a coordinate system calibrated in advance, for example, on the self-moving device. For example, the origin of the coordinate system of the self-moving device may be the centroid of the self-moving device, with the Y axis vertically upward and the X axis horizontally rightward or leftward.
In the embodiment of the present application, the first external parameter matrix and the second external parameter matrix can be obtained by calibration in advance, so that both can be queried directly.
Step 503, transforming the pose of the coordinate system of the image sensor relative to the coordinate system of the charging pile according to the first external parameter matrix and the second external parameter matrix, to obtain the pose of the self-moving device relative to the charging device.
In the embodiment of the present application, the pose of the coordinate system of the image sensor relative to the coordinate system of the charging pile can be transformed according to the first external parameter matrix and the second external parameter matrix to obtain the pose of the self-moving device relative to the charging device. For example, denote the first external parameter matrix as E1 and the second external parameter matrix as E2, and let the pose of the coordinate system of the image sensor relative to the coordinate system of the charging pile be composed of the first rotation matrix R1 and the first displacement vector t1. The pose of the self-moving device relative to the charging device may then include a second rotation matrix and a second displacement vector of the coordinate system of the self-moving device relative to the coordinate system of the charging contact, where the second rotation matrix may be E2 × R1 × E1 and the second displacement vector may be E2 × t1 × E1.
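One plausible reading of this composition, sketched under the assumption that E1, E2 and the pose are all expressed as 4x4 homogeneous transformation matrices, chains the three matrices as E2 · T_t1 · E1. The pure-translation example values below are assumptions for illustration and do not come from the original disclosure:

```python
def matmul4(A, B):
    """Multiply two 4x4 homogeneous transformation matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Hypothetical extrinsics, expressed as pure translations for simplicity:
# E1: charging pile -> charging contact; E2: image sensor -> self-moving device.
E1 = [[1, 0, 0, 0.0], [0, 1, 0, -0.1], [0, 0, 1, 0.0], [0, 0, 0, 1]]
E2 = [[1, 0, 0, 0.0], [0, 1, 0, 0.2], [0, 0, 1, 0.05], [0, 0, 0, 1]]
# T_t1: pose of the image sensor relative to the charging pile (from step 501).
T_t1 = [[1, 0, 0, 0.0], [0, 1, 0, 0.0], [0, 0, 1, 0.5], [0, 0, 0, 1]]

# Chain the transforms to obtain the pose of the self-moving device
# relative to the charging contact of the charging device.
pose = matmul4(matmul4(E2, T_t1), E1)
```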
It should be noted that, in the prior art, the positioning information of the self-moving device does not include ID information. In a scenario where multiple self-moving devices are used, this places high requirements on the placement of the charging piles and the self-moving devices, and once one self-moving device recharges at the wrong pile, recharging of the other self-moving devices in the same scenario may fail.
In the present application, the guide identifier is disposed on the charging pile of the charging device, and each charging pile may have a corresponding guide identifier, that is, each charging pile may correspond to a code, so that each self-moving device may correspond to one charging pile. Only when the code of the self-moving device matches the code corresponding to the guide identifier of the charging pile can that charging pile be used to charge the self-moving device; when the codes do not match, the charging pile cannot be used to charge the self-moving device. Therefore, joint recharging of multiple self-moving devices can be achieved, and recharging failures of the other self-moving devices can be avoided.
As a possible implementation manner, when the encoding portion includes a plurality of first mark points, the coordinate positions of the first mark points in the candidate coordinate system may be mapped into the standard coordinate system where the guide identifier is located, so as to obtain the coordinate positions of the first mark points in the standard coordinate system, and the target encoding may be determined according to the coordinate positions of the first mark points in the standard coordinate system. The above process is described in detail with reference to the sixth embodiment.
Fig. 10 is a flowchart illustrating a control method of a self-moving device according to a sixth embodiment of the present application. As shown in fig. 10, on the basis of the embodiment shown in fig. 2, step 203 may specifically include the following steps:
Step 601, performing coordinate system transformation on the candidate coordinate system and the standard coordinate system to obtain an affine transformation matrix between the candidate coordinate system and the standard coordinate system.
In the embodiment of the present application, the standard coordinate system is a coordinate system calibrated in advance on the guide identifier of the charging device in the space where the self-moving device is located. Specifically, the coordinate system is calibrated in advance on the guide identifier according to the coordinate system portion of the guide identifier. The standard coordinate system is the distortion-free coordinate system corresponding to the candidate coordinate system.
In the embodiment of the present application, an affine transformation can be understood as mapping the original coordinate axes to new coordinate axes through scaling, rotation and translation. After the candidate coordinate system and the standard coordinate system are determined, the affine transformation matrix between them may be determined according to a two-dimensional geometric transformation.
Step 602, transforming the coordinate position of each first mark point in the candidate coordinate system into the standard coordinate system by using the affine transformation matrix, to obtain the coordinate position of each first mark point in the standard coordinate system.
In the embodiment of the present application, for each first mark point in the candidate coordinate system, the affine transformation matrix may be used to transform the coordinate position of that first mark point into the standard coordinate system, so as to obtain its coordinate position in the standard coordinate system.
Step 603, determining a corresponding target code according to the coordinate position of each first mark point in the standard coordinate system.
In the embodiment of the present application, the corresponding target code can be determined according to the coordinate position of each first mark point in the standard coordinate system. For example, the coordinate positions of the first mark points in the standard coordinate system may be combined to obtain the corresponding target code. For example, when the number of first mark points is 3 and their coordinate positions in the standard coordinate system are (1, 1), (2, 2) and (3, 3) respectively, the target code may be 112233, 11-22-33, 1-1-2-2-3-3, or the like.
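Steps 601 to 603 may be sketched as follows. The affine matrix shown is a hypothetical pure translation chosen for illustration; the concatenation scheme follows the (1, 1), (2, 2), (3, 3) → 112233 example above:

```python
def apply_affine(M, pt):
    """Apply a 2x3 affine transformation matrix M to a 2-D point,
    rounding to integer grid positions in the standard coordinate system."""
    x, y = pt
    return (round(M[0][0] * x + M[0][1] * y + M[0][2]),
            round(M[1][0] * x + M[1][1] * y + M[1][2]))

def target_code(points_std):
    """Combine standard-coordinate positions into the target code,
    as in the example: (1,1), (2,2), (3,3) -> "112233"."""
    return "".join(f"{x}{y}" for x, y in points_std)

# Hypothetical affine matrix: a pure translation by (-9, -9), standing in
# for the matrix obtained from the candidate/standard coordinate systems.
M = [[1, 0, -9], [0, 1, -9]]
pts_candidate = [(10, 10), (11, 11), (12, 12)]  # first mark points (candidate system)
pts_standard = [apply_affine(M, p) for p in pts_candidate]
print(target_code(pts_standard))  # -> "112233"
```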
In a possible implementation manner of the embodiment of the present application, in order to improve processing efficiency and reduce workload, after determining a candidate coordinate system in an acquired image according to a coordinate system portion in the acquired image, each first marker point in the encoding portion may be determined only from within a set distribution range.
Specifically, the coordinate system portion may first be determined according to the value of each pixel in the preprocessed image, or directly according to the value of each pixel in the acquired image. The candidate coordinate system is then determined according to the coordinate system portion, the set distribution range of the encoding portion is determined in the candidate coordinate system, and finally each first mark point is identified within the set distribution range in the acquired or preprocessed image. If no first mark point can be identified within the set distribution range, the acquired image is discarded; if first mark points are identified within the set distribution range, step 102 and the subsequent steps may be executed.
That is, since the guide identifier is known, the distribution range of the first mark points is also known, and in order to improve the processing efficiency of the image, only the mark points in the set distribution range may be identified as first mark points. For example, when the guide identifier is as shown in fig. 3, the set distribution range may be the second quadrant of the candidate coordinate system, and only the mark points in the second quadrant of the acquired image are used as the first mark points of the encoding portion. Alternatively, when the guide identifier is as shown in fig. 4, the set distribution range may be the fourth quadrant of the candidate coordinate system, that is, the mark points in the fourth quadrant of the acquired image are used as the first mark points of the encoding portion.
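A minimal sketch of this set-distribution-range filtering is given below, using the fig. 3 convention (second quadrant of the candidate coordinate system); the detected point values are assumptions for illustration:

```python
def in_second_quadrant(pt):
    """Set distribution range for the guide identifier of fig. 3: keep only
    mark points in the second quadrant (x < 0, y > 0) of the candidate
    coordinate system, so other detections are ignored cheaply."""
    x, y = pt
    return x < 0 and y > 0

# Hypothetical detections expressed in candidate-coordinate positions.
detected = [(-2, 3), (1, 1), (-1, 4), (2, -3)]
first_mark_points = [p for p in detected if in_second_quadrant(p)]
print(first_mark_points)  # -> [(-2, 3), (-1, 4)]
```

If `first_mark_points` were empty, the acquired image would be discarded, as described above.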
It should be noted that the guide identifier may be affected by illumination: when the ambient brightness is too high or too low, the accuracy of the image recognition result suffers. Therefore, in the present application, in order to improve the accuracy of the recognition result for the coordinate system portion and the encoding portion, or for the at least three reference mark points in the acquired image, and thereby improve the accuracy of the pose calculation result, an infrared LED light source that emits infrared light outwards may be provided at the positions of the coordinate system portion (such as the five second mark points in fig. 3) and the encoding portion (such as the three first mark points in fig. 3 and fig. 4), or of the at least three reference mark points, when the guide identifier is provided on the charging device or the charging pile. The image sensor may be an infrared camera, so that the image containing the guide identifier can be acquired without being affected by the ambient brightness in the space where the self-moving device is located, which improves the accuracy of the recognition result. Further, the wavelength of the infrared light may be 940 nm so that it is invisible to the naked eye, thereby avoiding disturbing the user.
In a possible implementation manner of the embodiment of the application, a path can be planned according to a pose of the self-moving device relative to the charging device, and the self-moving device is controlled to move to the charging device according to the planned path. The above process is described in detail with reference to the seventh embodiment.
Fig. 11 is a flowchart illustrating a control method of a self-moving device according to a seventh embodiment of the present application. As shown in fig. 11, on the basis of the foregoing embodiment, step 103 or 205 may specifically include the following steps:
Step 701, planning a target path between the self-moving device and the charging device according to the pose of the self-moving device relative to the charging device.
In the embodiment of the present application, a target path between the self-moving device and the charging device may be planned according to the pose of the self-moving device relative to the charging device, such that the self-moving device approaches along the recharging direction and its charging contact comes into contact with that of the charging device, so that the self-moving device is charged successfully. The target path may be an arc, a part of a cosine curve, or another segmented path, which is not limited in the present application.
Step 702, controlling the self-moving device to move along the target path, and controlling it to stop moving when the self-moving device receives a charging signal of the charging device or reaches the charging device.
In the embodiment of the present application, after the target path is obtained through planning, the self-moving device can be controlled to move along the target path until it receives a charging signal of the charging device, or until it reaches the charging device, at which point it is controlled to stop moving. If no charging signal is detected for a long time, a prompt may be issued or the recharging process may be retried, that is, step 101 and the subsequent steps are executed again.
In a possible implementation manner of the embodiment of the present application, after the target path is obtained through planning, the self-moving device may move along the target path. During this movement, the pose of the self-moving device relative to the charging device may be updated continuously through image recognition, or updated whenever a preset condition is met, for example every preset time period or every preset distance travelled. The planned target path is then updated according to the updated pose, until the self-moving device receives a charging signal of the charging device or reaches the charging device. By dynamically updating the planned path according to the updated pose, the self-moving device can adjust its own state and move along the updated target path, which improves the success rate of recharging.
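The update loop described above may be sketched as follows, assuming hypothetical `device` and `planner` interfaces (none of these names appear in the original disclosure); here the pose is re-estimated after every preset distance travelled:

```python
def recharge(device, planner, update_every_m=0.2):
    """Sketch of the recharge loop: re-estimate the pose from image
    recognition after every `update_every_m` metres travelled and re-plan
    the target path. `device` and `planner` are hypothetical interfaces."""
    pose = device.estimate_pose()    # pose relative to the charging device
    path = planner.plan(pose)        # target path (e.g. an arc)
    travelled = 0.0
    while not device.charging_signal_received() and not device.at_charger():
        travelled += device.follow(path)   # one control step; returns distance moved
        if travelled >= update_every_m:    # preset-distance update condition
            pose = device.estimate_pose()
            path = planner.plan(pose)
            travelled = 0.0
    device.stop()  # stop moving once charging starts or the charger is reached
```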
In order to implement the foregoing embodiments, the present application further provides a control apparatus for a self-moving device. Fig. 12 is a schematic structural diagram of a control apparatus of a self-moving device according to an eighth embodiment of the present application. As shown in fig. 12, the control apparatus 100 of the self-moving device includes:
An obtaining module 110, configured to obtain an image acquired by an image sensor mounted on the self-moving device, wherein the image shows the guide identifier of the charging device.
A determining module 120, configured to determine a pose of the self-moving device relative to the charging device according to the guide identifier.
A control module 130, configured to control the self-moving device to move to the charging device for charging according to the pose.
In a possible implementation manner of the embodiment of the present application, the guide identifier includes a coordinate system portion and an encoding portion, and the determining module 120 is specifically configured to: determine a candidate coordinate system according to the coordinate system portion; decode a target code according to the coordinate positions of a plurality of first mark points in the candidate coordinate system; and if the target code matches the code of the self-moving device, determine the pose of the self-moving device relative to the charging device according to the first mark points.
In a possible implementation manner of the embodiment of the present application, the determining module 120 is specifically configured to: query the actual position relationship between the first mark points; determine the image position relationship of each first mark point in the image; and determine the pose of the self-moving device relative to the charging device according to the actual position relationship and the image position relationship.
In another possible implementation manner of the embodiment of the present application, the guide identifier includes at least three reference mark points, and the determining module 120 is specifically configured to: determine the image position relationship of the at least three reference mark points of the guide identifier in the image; query the actual position relationship between the at least three reference mark points; and determine the pose of the self-moving device relative to the charging device according to the actual position relationship and the image position relationship.
In a possible implementation manner of the embodiment of the present application, the determining module 120 is specifically configured to: determine, among the plurality of candidate position relationships, the actual position relationship matching the image position relationship of the at least three reference mark points; and if the code corresponding to the actual position relationship matches the code of the self-moving device, determine the pose of the self-moving device relative to the charging device according to the image position relationship and the actual position relationship.
In a possible implementation manner of the embodiment of the present application, the guide identifier is disposed on a charging pile of the charging device, and the determining module 120 is specifically further configured to: determine the pose of the coordinate system of the image sensor relative to the coordinate system of the charging pile by using a PNP algorithm according to the actual position relationship and the image position relationship; query the first external parameter matrix and the second external parameter matrix, where the first external parameter matrix is the transformation matrix from the coordinate system of the charging pile to the coordinate system of the charging contact of the charging device, and the second external parameter matrix is the transformation matrix from the coordinate system of the image sensor to the coordinate system of the self-moving device; and transform the pose of the coordinate system of the image sensor relative to the coordinate system of the charging pile according to the first external parameter matrix and the second external parameter matrix, to obtain the pose of the self-moving device relative to the charging device.
In a possible implementation manner of the embodiment of the present application, the coordinate system portion includes at least five second mark points; the determining module 120 is specifically configured to: respectively connecting at least three collinear second mark points in the image to obtain two connecting lines; and determining the second mark point at the intersection point of the two connecting lines as the origin of the candidate coordinate system, and determining the two connecting lines as the coordinate axes of the candidate coordinate system, wherein the directions of the coordinate axes are determined according to the distance between the second mark point on the coordinate axes and the origin.
In one possible implementation of the embodiment of the present application, the coordinate system portion includes an asymmetric pattern, and the determining module 120 is specifically configured to: determine the coordinate axes of the candidate coordinate system according to the set reference lines in the asymmetric pattern, wherein the directions of the coordinate axes are determined according to the positions of the set local patterns in the asymmetric pattern; and/or determine the set coordinate points in the candidate coordinate system according to the positions of the set key points in the asymmetric pattern.
In a possible implementation manner of the embodiment of the present application, the determining module 120 is specifically configured to: carrying out coordinate system transformation on the candidate coordinate system and a set standard coordinate system to obtain an affine transformation matrix between the candidate coordinate system and the standard coordinate system; transforming the coordinate position of each first mark point under the candidate coordinate system to a standard coordinate system by adopting an affine transformation matrix so as to obtain the coordinate position of each first mark point under the standard coordinate system; and determining the corresponding target code according to the coordinate position of each first mark point in the standard coordinate system.
In a possible implementation manner of the embodiment of the present application, the control module 130 is specifically configured to: plan a target path between the self-moving device and the charging device according to the pose; and control the self-moving device to move along the target path, stopping when the self-moving device receives a charging signal of the charging device or reaches the charging device.
Further, in a possible implementation manner of the embodiment of the present application, the control module 130 is further configured to: update the pose of the self-moving device relative to the charging device while the self-moving device moves along the target path; and update the planned target path according to the updated pose.
It should be noted that the foregoing explanation on the embodiment of the control method for the self-moving device is also applicable to the control apparatus for the self-moving device in this embodiment, and is not repeated herein.
In order to implement the foregoing embodiments, the present application further provides a self-moving device, including: an image sensor, a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the control method of the self-moving device as proposed in the foregoing embodiments of the present application is implemented.
In order to implement the above embodiments, the present application also proposes a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the control method of the self-moving device as proposed in the foregoing embodiments of the present application.
In order to implement the foregoing embodiments, the present application also proposes a computer program product; when the instructions in the computer program product are executed by a processor, the control method of the self-moving device as proposed in the foregoing embodiments of the present application is performed.
In the description herein, references to the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like, mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Moreover, various embodiments or examples and features of various embodiments or examples described in this specification can be combined and combined by one skilled in the art without being mutually inconsistent.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Further, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are well known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
One of ordinary skill in the art can appreciate that all or part of the steps of the above method embodiments may be completed by a program instructing related hardware; the program may be stored in a computer-readable storage medium and, when executed, performs one or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing module, each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as a separate product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (18)

1. A control method of a self-moving device, the method comprising:
acquiring an image collected by an image sensor carried by the self-moving device; wherein the image shows a guide identifier of a charging device;
determining a pose of the self-moving device relative to the charging device according to the guide identifier;
controlling the self-moving device to move to the charging device for charging according to the pose;
the guide identifier comprises a coordinate system part and a coding part; the determining the pose of the self-moving device relative to the charging device according to the guide identifier comprises:
determining a candidate coordinate system according to the coordinate system part;
obtaining a target code by decoding according to coordinate positions, in the candidate coordinate system, of a plurality of first mark points contained in the coding part; and
if the target code is matched with the code of the self-moving equipment, determining the pose of the self-moving equipment relative to the charging device according to the first mark point;
the determining the pose of the self-moving equipment relative to the charging device according to the first mark point comprises the following steps:
inquiring the actual position relation among the first mark points;
determining the image position relation of each first mark point in the image;
determining the pose of the self-moving equipment relative to the charging device according to the actual position relation and the image position relation;
the guide mark is arranged on a charging pile of the charging device; the determining the pose of the self-moving device relative to the charging device according to the actual position relationship and the image position relationship includes:
determining the pose of the coordinate system of the image sensor relative to the coordinate system of the charging pile by adopting a PNP algorithm according to the actual position relation and the image position relation;
querying a first external parameter matrix and a second external parameter matrix; wherein the first external parameter matrix is a transformation matrix from the coordinate system of the charging pile to a coordinate system of a charging contact of the charging device, and the second external parameter matrix is a transformation matrix from the coordinate system of the image sensor to a coordinate system of the self-moving device;
and transforming the pose of the coordinate system of the image sensor relative to the coordinate system of the charging pile according to the first external parameter matrix and the second external parameter matrix to obtain the pose of the self-moving equipment relative to the charging device.
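The transformation chain in claim 1, from the pose obtained in the image-sensor frame to the device pose at the charging contact, can be sketched with 4x4 homogeneous matrices. This is an illustrative sketch only, not the patented implementation: the frame names, matrix values, and identity rotations are assumptions chosen to keep the composition readable.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative poses (identity rotations, pure translations).
# T_pile_cam: pose of the image-sensor frame in the charging-pile frame,
# e.g. the output of a PnP solve on the detected mark points.
T_pile_cam = make_transform(np.eye(3), np.array([0.0, 0.2, 1.5]))

# First external parameter matrix: charging-pile frame -> charging-contact frame.
T_contact_pile = make_transform(np.eye(3), np.array([0.0, -0.1, 0.0]))

# Second external parameter matrix: image-sensor frame -> self-moving-device frame.
T_device_cam = make_transform(np.eye(3), np.array([0.05, 0.0, 0.1]))

# Pose of the device relative to the charging contact:
# contact <- pile <- camera <- device (invert the camera-to-device extrinsic).
T_contact_device = T_contact_pile @ T_pile_cam @ np.linalg.inv(T_device_cam)

print(T_contact_device[:3, 3])
```

The key design point is that each extrinsic matrix only ever needs to be calibrated once, after which the runtime pose follows from a single matrix product.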
2. The control method according to claim 1, wherein the guide identifier comprises at least three reference mark points; the determining the pose of the self-moving equipment relative to the charging device according to the guide identifier comprises:
determining the image position relation of the at least three reference mark points in the image;
inquiring the actual position relation among the at least three reference mark points;
and determining the pose of the self-moving equipment relative to the charging device according to the actual position relation and the image position relation.
3. The control method according to claim 2, wherein the querying for the actual positional relationship between the at least three reference marker points comprises:
determining an actual position relation matched with the image position relation of the at least three reference mark points in a plurality of candidate position relations;
the determining the pose of the self-moving device relative to the charging device according to the image position relationship and the actual position relationship comprises:
and if the code corresponding to the actual position relation is matched with the code of the self-moving equipment, determining the pose of the self-moving equipment relative to the charging device according to the actual position relation and the image position relation.
4. The control method according to claim 1, wherein the coordinate system part comprises at least five second mark points; the determining a candidate coordinate system according to the coordinate system part comprises:
respectively connecting at least three collinear second mark points in the image to obtain two connecting lines;
and determining a second mark point at the intersection point of the two connecting lines as the origin of the candidate coordinate system, and determining the two connecting lines as the coordinate axes of the candidate coordinate system, wherein the directions of the coordinate axes are determined according to the distance between the second mark point on the coordinate axes and the origin.
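The origin and axis selection of claim 4 can be sketched as follows, assuming five idealized 2-D mark points: the origin is the point shared by the two collinear triples, and each axis direction points toward the farther point on its line. The point layout and tolerance are illustrative assumptions, not values from the patent.

```python
import numpy as np
from itertools import combinations

# Five illustrative second mark points: three collinear points on each of two
# connecting lines, sharing one point that becomes the origin.
points = {
    "a": np.array([0.0, 0.0]),   # shared point (origin)
    "b": np.array([1.0, 0.0]),
    "c": np.array([2.0, 0.0]),   # line 1: a-b-c
    "d": np.array([0.0, 1.0]),
    "e": np.array([0.0, 3.0]),   # line 2: a-d-e
}

def collinear(p, q, r, tol=1e-9):
    """True if three 2-D points lie on one line (zero 2-D cross product)."""
    u, v = q - p, r - p
    return abs(u[0] * v[1] - u[1] * v[0]) < tol

# Find the two collinear triples (the two connecting lines).
lines = [trio for trio in combinations(points, 3)
         if collinear(*(points[k] for k in trio))]
assert len(lines) == 2

# The origin is the mark point at the intersection of the two connecting lines.
origin_key = (set(lines[0]) & set(lines[1])).pop()

# Axis direction is fixed by the distance of the on-axis points from the origin:
# point the unit vector toward the farther mark point on each line.
axes = []
for line in lines:
    others = [k for k in line if k != origin_key]
    far = max(others, key=lambda k: np.linalg.norm(points[k] - points[origin_key]))
    v = points[far] - points[origin_key]
    axes.append(v / np.linalg.norm(v))

print(origin_key, axes)
```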
5. The control method according to claim 1, wherein the coordinate system part comprises an asymmetric pattern; the determining a candidate coordinate system according to the coordinate system part comprises:
determining coordinate axes of the candidate coordinate system according to the set reference lines in the asymmetric patterns, wherein the directions of the coordinate axes are determined according to the positions of the set local patterns in the asymmetric patterns;
and/or,
and determining the coordinate points set in the candidate coordinate system according to the positions of the set key points in the asymmetric pattern.
6. The control method according to claim 1, wherein the obtaining a target code by decoding according to the coordinate positions, in the candidate coordinate system, of the plurality of first mark points contained in the coding part comprises:
carrying out coordinate system transformation on the candidate coordinate system and a set standard coordinate system to obtain an affine transformation matrix between the candidate coordinate system and the standard coordinate system;
transforming the coordinate position of each first mark point in the candidate coordinate system to the coordinate position of each first mark point in the standard coordinate system by using the affine transformation matrix to obtain the coordinate position of each first mark point in the standard coordinate system;
and determining corresponding target codes according to the coordinate positions of the first mark points in the standard coordinate system.
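The affine mapping and decoding of claim 6 can be sketched as follows: fit an affine transform from the candidate frame to the set standard frame, map each first mark point across, and read a code bit from each resulting grid cell. The anchor correspondences, grid size, and bit-assignment rule are assumptions for illustration; the patent does not fix a particular code layout.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2-D affine transform mapping src points onto dst points.

    Solves [x y 1] @ A = [x' y'] for the 3x2 parameter matrix A.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    ones = np.ones((len(src), 1))
    A, *_ = np.linalg.lstsq(np.hstack([src, ones]), dst, rcond=None)
    return A  # shape (3, 2)

def apply_affine(A, pts):
    pts = np.asarray(pts, dtype=float)
    ones = np.ones((len(pts), 1))
    return np.hstack([pts, ones]) @ A

# Candidate-frame anchor positions and their positions in the set standard
# coordinate system (illustrative values; the standard frame is half-scale here).
candidate_anchors = [(0, 0), (2, 0), (0, 2)]
standard_anchors = [(0, 0), (1, 0), (0, 1)]

A = fit_affine(candidate_anchors, standard_anchors)

# First mark points observed in the candidate frame; after transforming to the
# standard frame, each point's grid cell is read as one set bit of the code.
first_marks_candidate = [(0, 0), (2, 0), (4, 0), (0, 2)]
standard_positions = np.rint(apply_affine(A, first_marks_candidate)).astype(int)

# Toy decoding rule (an assumption, not from the patent): bit index = x + 4 * y
# on a 4x4 grid, with the set bits OR-ed into the target code.
target_code = 0
for x, y in standard_positions:
    target_code |= 1 << (x + 4 * y)

print(bin(target_code))
```

Rounding to the nearest grid cell before decoding makes the read-out tolerant of small detection noise left over after the affine fit.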
7. The control method according to claim 1, wherein the controlling of the self-moving apparatus to move toward the charging device according to the pose includes:
planning a target path between the self-moving equipment and the charging device according to the pose;
and controlling the self-moving equipment to move along the target path until it is monitored that the self-moving equipment receives a charging signal of the charging device, or controlling the self-moving equipment to stop moving when the self-moving equipment reaches the charging device.
8. The control method according to claim 7, wherein after planning a target path between the self-moving device and the charging apparatus according to the pose, the method further comprises:
updating the pose of the self-moving equipment relative to the charging device in the process of moving the self-moving equipment along the target path;
and updating and planning the target path according to the updated pose.
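The re-planning loop of claim 8 can be sketched as a simple closed loop: each movement step is followed by a fresh pose estimate, and the target path is re-planned from the updated pose. The straight-line planner, drift model, step size, and tolerance below are all illustrative assumptions.

```python
import math

def plan_path(pose, goal, step=0.25):
    """Straight-line target path from the current position to the goal
    (a sketch; the patent does not fix a particular planner)."""
    (x, y), (gx, gy) = pose, goal
    dist = math.hypot(gx - x, gy - y)
    n = max(1, int(dist / step))
    return [(x + (gx - x) * i / n, y + (gy - y) * i / n) for i in range(1, n + 1)]

def dock(pose, goal, observe, tol=1e-6, max_iters=50):
    """Move toward the charging device, re-estimating the pose after every
    step and re-planning the target path from the updated pose (claim 8)."""
    for _ in range(max_iters):
        if math.hypot(goal[0] - pose[0], goal[1] - pose[1]) < tol:
            return pose                     # charging position reached
        path = plan_path(pose, goal)        # plan / re-plan from current pose
        pose = observe(path[0])             # move one waypoint; sensor re-measures
    return pose

# Illustrative run: the "sensor" reports the commanded waypoint plus a small,
# decaying drift, so re-planning keeps pulling the device back on course.
drift = [0.05, -0.03, 0.01]
def noisy_observe(waypoint):
    d = drift.pop(0) if drift else 0.0
    return (waypoint[0] + d, waypoint[1])

final = dock((0.0, 0.0), (1.0, 0.0), noisy_observe)
print(final)
```

Because each iteration plans from the freshly measured pose rather than dead reckoning, accumulated drift never grows beyond one step.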
9. A control apparatus for a self-moving device, comprising:
an acquisition module, configured to acquire an image collected by an image sensor carried by a self-moving device; wherein the image shows a guide identifier of a charging device;
the determination module is used for determining the pose of the self-moving equipment relative to the charging device according to the guide identification;
the control module is used for controlling the self-moving equipment to move to the charging device for charging according to the pose;
the guide mark comprises a coordinate system part and a coding part; the determining module is specifically configured to:
determining a candidate coordinate system according to the coordinate system part; obtaining a target code by decoding according to coordinate positions, in the candidate coordinate system, of a plurality of first mark points contained in the coding part; and if the target code is matched with the code of the self-moving equipment, determining the pose of the self-moving equipment relative to the charging device according to the first mark points;
the determining module is specifically configured to:
inquiring the actual position relation among the first mark points; determining the image position relation of each first mark point in the image; determining the pose of the self-moving equipment relative to the charging device according to the actual position relation and the image position relation;
the guide mark is arranged on a charging pile of the charging device; the determining module is specifically configured to:
determining the pose of the coordinate system of the image sensor relative to the coordinate system of the charging pile by adopting a PNP algorithm according to the actual position relation and the image position relation;
querying a first external parameter matrix and a second external parameter matrix; wherein the first external parameter matrix is a transformation matrix from the coordinate system of the charging pile to a coordinate system of a charging contact of the charging device, and the second external parameter matrix is a transformation matrix from the coordinate system of the image sensor to a coordinate system of the self-moving device;
and transforming the pose of the coordinate system of the image sensor relative to the coordinate system of the charging pile according to the first external parameter matrix and the second external parameter matrix to obtain the pose of the self-moving equipment relative to the charging device.
10. The control device of claim 9, wherein the guide mark comprises at least three reference mark points; the determining module is specifically configured to:
determining the image position relation of at least three reference mark points in the guide identifier in the image; inquiring the actual position relation among the at least three reference mark points; and determining the pose of the self-moving equipment relative to the charging device according to the actual position relation and the image position relation.
11. The control device according to claim 10, wherein the determination module is specifically configured to:
determining an actual position relation matched with the image position relation of the at least three reference mark points in a plurality of candidate position relations; and if the code corresponding to the actual position relation is matched with the code of the self-moving equipment, determining the pose of the self-moving equipment relative to the charging device according to the image position relation and the actual position relation.
12. The control device of claim 9, wherein the coordinate system portion comprises at least five second marker points; the determining module is specifically configured to:
respectively connecting at least three collinear second mark points in the image to obtain two connecting lines; determining a second mark point at the intersection point of the two connecting lines as the origin of the candidate coordinate system, and determining the two connecting lines as the coordinate axes of the candidate coordinate system; and the direction of the coordinate axis is determined according to the distance between the second marking point on the coordinate axis and the origin.
13. The control device of claim 9, wherein the coordinate system portion comprises an asymmetric pattern; the determining module is specifically configured to:
determining coordinate axes of the candidate coordinate system according to the set reference lines in the asymmetric patterns, wherein the directions of the coordinate axes are determined according to the positions of the set local patterns in the asymmetric patterns;
and/or,
and determining the coordinate points set in the candidate coordinate system according to the positions of the set key points in the asymmetric pattern.
14. The control device according to claim 9, wherein the determination module is specifically configured to:
performing coordinate system transformation on the candidate coordinate system and a set standard coordinate system to obtain an affine transformation matrix between the candidate coordinate system and the standard coordinate system; transforming the coordinate position of each first mark point in the candidate coordinate system to the coordinate position of each first mark point in the standard coordinate system by using the affine transformation matrix to obtain the coordinate position of each first mark point in the standard coordinate system; and determining corresponding target codes according to the coordinate positions of the first mark points in the standard coordinate system.
15. The control device of claim 9, wherein the control module is specifically configured to:
planning a target path between the self-moving equipment and the charging device according to the pose; and controlling the self-moving equipment to move along the target path until it is monitored that the self-moving equipment receives a charging signal of the charging device, or controlling the self-moving equipment to stop moving when the self-moving equipment reaches the charging device.
16. The control device of claim 15, wherein the control module is further configured to:
updating the pose of the self-moving equipment relative to the charging device in the process of moving the self-moving equipment along the target path; and updating and planning the target path according to the updated pose.
17. A self-moving device, comprising an image sensor, a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the control method of the self-moving device according to any one of claims 1 to 8.
18. A non-transitory computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the control method of the self-moving device according to any one of claims 1 to 8.
CN202011110956.9A 2020-10-16 2020-10-16 Control method and device of self-moving equipment, self-moving equipment and medium Active CN112265463B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011110956.9A CN112265463B (en) 2020-10-16 2020-10-16 Control method and device of self-moving equipment, self-moving equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011110956.9A CN112265463B (en) 2020-10-16 2020-10-16 Control method and device of self-moving equipment, self-moving equipment and medium

Publications (2)

Publication Number Publication Date
CN112265463A CN112265463A (en) 2021-01-26
CN112265463B CN112265463B (en) 2022-07-26

Family

ID=74338600

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011110956.9A Active CN112265463B (en) 2020-10-16 2020-10-16 Control method and device of self-moving equipment, self-moving equipment and medium

Country Status (1)

Country Link
CN (1) CN112265463B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113872287A (en) * 2021-09-26 2021-12-31 追觅创新科技(苏州)有限公司 Charging device, self-moving device, charging method, charging system and storage medium
CN114397886B (en) * 2021-12-20 2024-01-23 烟台杰瑞石油服务集团股份有限公司 Charging method and charging system
CN115871492B (en) * 2022-06-06 2023-10-31 武汉路特斯科技有限公司 Charging equipment control method, device, equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107314771A (en) * 2017-07-04 2017-11-03 合肥工业大学 Unmanned plane positioning and attitude angle measuring method based on coded target
CN108829112A (en) * 2018-08-24 2018-11-16 北京猎户星空科技有限公司 Charging method, device, equipment and the storage medium of robot

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100669250B1 (en) * 2005-10-31 2007-01-16 한국전자통신연구원 System and method for real-time calculating location
WO2011133527A1 (en) * 2010-04-19 2011-10-27 Interim Designs Inc. Automated electric vehicle charging system and method
CN106647747B (en) * 2016-11-30 2019-08-23 北京儒博科技有限公司 A kind of robot charging method and device
US10189642B2 (en) * 2017-01-30 2019-01-29 Walmart Apollo, Llc Systems and methods for distributed autonomous robot interfacing using live image feeds
CN107966155A (en) * 2017-12-25 2018-04-27 北京地平线信息技术有限公司 Object positioning method, object positioning system and electronic equipment
CN108195381B (en) * 2017-12-26 2020-06-30 中国科学院自动化研究所 Indoor robot vision positioning system
CN108803608B (en) * 2018-06-08 2021-11-30 广州市远能物流自动化设备科技有限公司 Docking positioning method for parking AGV and automobile and parking AGV
CN109739237B (en) * 2019-01-09 2020-08-18 华南理工大学 AGV visual navigation and positioning method based on novel coding marks
US11365973B2 (en) * 2019-01-23 2022-06-21 Hewlett Packard Enterprise Development Lp Drone-based scanning for location-based services
CN110884488B (en) * 2019-11-28 2022-05-31 东风商用车有限公司 Auxiliary positioning system for automatic driving engineering vehicle and using method thereof
CN111679671A (en) * 2020-06-08 2020-09-18 南京聚特机器人技术有限公司 Method and system for automatic docking of robot and charging pile

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107314771A (en) * 2017-07-04 2017-11-03 合肥工业大学 Unmanned plane positioning and attitude angle measuring method based on coded target
CN108829112A (en) * 2018-08-24 2018-11-16 北京猎户星空科技有限公司 Charging method, device, equipment and the storage medium of robot

Also Published As

Publication number Publication date
CN112265463A (en) 2021-01-26

Similar Documents

Publication Publication Date Title
CN112265463B (en) Control method and device of self-moving equipment, self-moving equipment and medium
CN112013858B (en) Positioning method, positioning device, self-moving equipment and storage medium
CN110243360B (en) Method for constructing and positioning map of robot in motion area
CN111932675B (en) Map building method and device, self-moving equipment and storage medium
CN107328420B (en) Positioning method and device
US10930015B2 (en) Method and system for calibrating multiple cameras
US9402151B2 (en) Method for recognizing position of mobile robot by using features of arbitrary shapes on ceiling
CN112013850B (en) Positioning method, positioning device, self-moving equipment and storage medium
US8265425B2 (en) Rectangular table detection using hybrid RGB and depth camera sensors
US8508527B2 (en) Apparatus and method of building map for mobile robot
CN108022264B (en) Method and equipment for determining camera pose
CN110807350A (en) System and method for visual SLAM for scan matching
Mishra et al. Extrinsic Calibration of a 3D-LIDAR and a Camera
KR20110011424A (en) Method for recognizing position and controlling movement of a mobile robot, and the mobile robot using the same
CN111964680B (en) Real-time positioning method of inspection robot
CN110108269A (en) AGV localization method based on Fusion
CN113643380A (en) Mechanical arm guiding method based on monocular camera vision target positioning
Manivannan et al. Vision based intelligent vehicle steering control using single camera for automated highway system
WO2019113859A1 (en) Machine vision-based virtual wall construction method and device, map construction method, and portable electronic device
Jung et al. Graph SLAM for AGV using geometrical arrangement based on lamp and SURF features in a factory environment
KR20120019661A (en) Robot and method of tracing object
CN113984081B (en) Positioning method, positioning device, self-mobile equipment and storage medium
Hagiwara et al. Fpga-based stereo vision system using gradient feature correspondence
US20240087162A1 (en) Map processing device and method thereof
Davis et al. Reflective fiducials for localization with 3D light detection and ranging scanners

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant