CN113378750B - Charging pile docking method and device, computer equipment and storage medium - Google Patents

Charging pile docking method and device, computer equipment and storage medium

Info

Publication number
CN113378750B
CN113378750B (application CN202110695800.XA)
Authority
CN
China
Prior art keywords: charging pile, environment map, position information, channel color, color image
Prior art date
Legal status
Active
Application number
CN202110695800.XA
Other languages
Chinese (zh)
Other versions
CN113378750A (en
Inventor
陈思钡
唐彬
甘泉
谌振宇
Current Assignee
Beijing Haqi Robot Technology Co ltd
Original Assignee
Beijing Haqi Robot Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Haqi Robot Technology Co ltd filed Critical Beijing Haqi Robot Technology Co ltd
Priority to CN202110695800.XA priority Critical patent/CN113378750B/en
Publication of CN113378750A publication Critical patent/CN113378750A/en
Application granted granted Critical
Publication of CN113378750B publication Critical patent/CN113378750B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 - Control of position or course in two dimensions
    • G05D 1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a charging pile docking method and device, a computer device and a storage medium. The method is performed by a robot and comprises: acquiring an environment map, and generating a travel instruction according to the position information of the charging pile in the environment map; if it is determined that the robot has traveled to within a preset distance range, acquiring a depth image and a three-channel color image, and calculating the current pose information of the charging pile from the depth image and the three-channel color image; and determining a docking pose according to the current pose information of the charging pile so as to dock with the charging pile. With the technical scheme of the invention, the robot can dock with the charging pile automatically, accurately and dynamically.

Description

Charging pile docking method and device, computer equipment and storage medium
Technical Field
Embodiments of the invention relate to computer vision technology, and in particular to a charging pile docking method and device, a computer device and a storage medium.
Background
With the popularization of robots that require autonomous charging, such as sweeping robots and meal-delivery robots, the autonomous charging function has become particularly important. To charge autonomously, a robot must be able to find its charging dock in a relatively complex environment.
In the prior art, when a robot searches for a charging pile to recharge, the charging pile is mainly located with a laser sensor. After the robot moves near the charging pile according to its recorded position, the charging pile continuously sends signals to the robot through an infrared sensor, an ultrasonic sensor, Bluetooth or the like, and the robot adjusts its route according to the received signals to dock with the charging pile. However, infrared signals are easily blocked in a complex environment; when ultrasonic signals are applied to a moving object, jitter during motion affects positioning accuracy; and Bluetooth is limited in range. Therefore, docking with the charging pile by means of such sensors has poor applicability in complex environments and reduces the accuracy with which the robot locates the charging pile.
Disclosure of Invention
Embodiments of the invention provide a charging pile docking method and device, a computer device and a storage medium, so as to realize automatic, accurate and dynamic docking with a charging pile.
In a first aspect, an embodiment of the present invention provides a charging pile docking method, the method comprising:
acquiring an environment map, and generating a travel instruction according to the position information of the charging pile in the environment map;
if it is determined that the robot has traveled to within a preset distance range, acquiring a depth image and a three-channel color image, and calculating the current pose information of the charging pile from the depth image and the three-channel color image;
and determining a docking pose according to the current pose information of the charging pile so as to dock with the charging pile.
In a second aspect, an embodiment of the present invention further provides a charging pile docking device, the device comprising:
an environment map acquisition module, used for acquiring an environment map and generating a travel instruction according to the charging pile position information in the environment map;
a current pose information acquisition module, used for acquiring a depth image and a three-channel color image if it is determined that the robot has traveled to within the preset distance range, and calculating the current pose information of the charging pile from the depth image and the three-channel color image;
and a charging pile docking module, used for determining a docking pose according to the current pose information of the charging pile so as to dock with the charging pile.
In a third aspect, an embodiment of the present invention further provides a computer device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the processor implements the charging pile docking method according to any one of the embodiments of the present invention when executing the program.
In a fourth aspect, embodiments of the present invention also provide a storage medium containing computer executable instructions, which when executed by a computer processor, are for performing a charging pile docking method according to any of the embodiments of the present invention.
According to the embodiments of the invention, the environment map is acquired, the robot travels toward the charging pile according to the position information of the charging pile in the environment map, and when it is determined that the robot has traveled to within the preset distance range, the current pose information of the charging pile is determined from the depth image and three-channel color image containing the charging pile, and the docking pose is adjusted according to the current pose information so as to dock with the charging pile. This solves the problems in the prior art that docking with a charging pile by means of sensors has poor applicability in complex environments and low docking accuracy, and realizes automatic, accurate and dynamic docking with the charging pile.
Drawings
Fig. 1 is a flowchart of a charging pile docking method according to a first embodiment of the present invention;
Fig. 2 is a flowchart of a charging pile docking method according to a second embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a charging pile docking device according to a third embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a computer device according to a fourth embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Example 1
Fig. 1 is a flowchart of a charging pile docking method according to an embodiment of the present invention, where the method may be implemented by a charging pile docking device, and the device may be implemented by software and/or hardware and is generally integrated in a robot.
As shown in fig. 1, the technical solution of the embodiment of the present invention specifically includes the following steps:
s110, acquiring an environment map, and generating a traveling instruction according to the charging pile position information in the environment map.
The environment map is generated in advance by the robot from captured environment images: during operation, the robot scans the surrounding environment with a depth camera mounted on its body and builds the environment map from the environment images. At the same time, feature recognition is performed on the environment images, and when the charging pile is recognized, its identifier information and position information are recorded into the environment map.
When the robot enters an idle mode, or its battery level is low and it needs to return to the charging pile, it loads the pre-established environment map and reads the charging pile position information recorded in it. A return route and a travel instruction are then generated from the robot's current position information and the charging pile position information; the travel instruction is sent to the power unit controller, which drives the robot along the return route toward the charging pile (a minimal sketch of this step is given at the end of this subsection). The embodiment of the invention does not limit the specific way in which the return route is planned; meanwhile, a navigation obstacle-avoidance module may be enabled to avoid obstacles on the return route while traveling toward the charging pile.
In the embodiment of the invention, when the robot is about to return to the charging pile, the environment map is loaded, the charging pile position information in the environment map is acquired, and the robot advances to the charging pile position.
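As an illustration only, the following Python sketch shows one way the travel-instruction step described above could look. It assumes a 2D map in which the charging pile position is stored as (x, y) coordinates and reduces the "travel instruction" to a heading and distance command; the names EnvironmentMap, pile_position and TravelInstruction are hypothetical and not taken from the patent, and a real system would run a path planner rather than a straight-line segment.

```python
import math
from dataclasses import dataclass

@dataclass
class TravelInstruction:
    """Hypothetical command sent to the power unit controller."""
    heading_rad: float   # direction to face, in the map frame
    distance_m: float    # straight-line distance to the goal

@dataclass
class EnvironmentMap:
    """Minimal stand-in for the pre-built environment map."""
    pile_position: tuple  # (x, y) of the charging pile, in metres

def generate_travel_instruction(env_map: EnvironmentMap,
                                robot_xy: tuple) -> TravelInstruction:
    """Generate a travel instruction toward the charging pile.

    The return route is reduced here to a single straight-line segment;
    any real implementation would plan around obstacles in the map.
    """
    dx = env_map.pile_position[0] - robot_xy[0]
    dy = env_map.pile_position[1] - robot_xy[1]
    return TravelInstruction(heading_rad=math.atan2(dy, dx),
                             distance_m=math.hypot(dx, dy))

if __name__ == "__main__":
    env_map = EnvironmentMap(pile_position=(5.0, 2.0))
    cmd = generate_travel_instruction(env_map, robot_xy=(0.0, 0.0))
    print(f"heading {cmd.heading_rad:.2f} rad, distance {cmd.distance_m:.2f} m")
```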
S120, if it is determined that the robot has traveled to within the preset distance range, acquiring a depth image and a three-channel color image, and calculating the current pose information of the charging pile from the depth image and the three-channel color image.
Satisfying the preset distance range condition means that the distance between the robot and the charging pile is smaller than or equal to a preset distance; this distance is updated in real time while the robot travels back along the return route. When the robot reaches the preset distance from the charging pile, a depth image and a three-channel color image of the charging pile are captured by the depth camera. A depth image, also called a range image, is an image whose pixel values are the distances (depths) from the image collector to points in the scene, and it directly reflects the geometry of the visible surfaces of objects. The depth camera can output a depth image and an RGB (Red, Green, Blue) image, i.e. a depth image and a three-channel color image; feature points can be obtained from the depth image and the three-channel color image, and their pose information calculated. In the embodiment of the invention, the pose information of the charging pile can be obtained from the depth image and the RGB image; the pose estimation algorithm used is not limited by this embodiment.
The pose information includes position information and orientation (attitude) information; the current pose information of the charging pile can be used to represent the distance and angle between the charging pile and the robot. In the embodiment of the invention, when the robot travels along the return route to within the preset distance range of the charging pile, the depth camera captures a depth image of the charging pile and the current pose information of the charging pile is obtained from it.
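As a hedged illustration of how the "distance and angle between the charging pile and the robot" might be derived, the sketch below takes an estimated 3D position of the charging pile expressed in the robot (camera) frame, reduces it to a range and bearing, and applies the preset-distance check. The frame convention (x right, z forward) and the threshold value are assumptions, not values taken from the patent.

```python
import math

PRESET_DISTANCE_M = 1.5  # assumed value for the "preset distance range"

def pile_range_and_bearing(pile_xyz_robot):
    """Reduce the pile position in the robot frame to (distance, bearing).

    Assumes the robot/camera frame has x to the right and z forward,
    so the bearing is the pile's angle in the ground plane.
    """
    x, _, z = pile_xyz_robot
    distance = math.hypot(x, z)
    bearing = math.atan2(x, z)  # 0 means straight ahead
    return distance, bearing

def within_preset_range(pile_xyz_robot, threshold=PRESET_DISTANCE_M):
    """True once the robot has traveled to within the preset distance."""
    distance, _ = pile_range_and_bearing(pile_xyz_robot)
    return distance <= threshold

if __name__ == "__main__":
    est = (0.4, 0.0, 1.2)  # example pile position in the robot frame, metres
    d, b = pile_range_and_bearing(est)
    print(f"distance {d:.2f} m, bearing {math.degrees(b):.1f} deg,"
          f" in range: {within_preset_range(est)}")
```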
S130, determining a docking pose according to the current pose information of the charging pile so as to dock with the charging pile.
In the embodiment of the invention, the expected pose information for docking, i.e. the pose of the charging pile relative to the robot when docking succeeds, can be determined in advance. From the current pose information and the expected pose information of the charging pile, the docking pose of the robot when docking with the charging pile can be adjusted, i.e. the robot's position and/or orientation relative to the charging pile can be adjusted.
In the embodiment of the invention, in the process of docking the robot with the charging pile, the depth image is acquired in real time, the current pose information of the charging pile is acquired in real time, and the docking pose is continuously adjusted according to the current pose information of the charging pile acquired in real time until the robot and the charging pile are successfully docked.
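The following sketch illustrates, under stated assumptions, the idea of deriving a pose correction from the current and expected pile poses: each pose is represented as a 2D position plus heading of the pile in the robot frame, and the correction is the remaining offset the robot still has to drive and turn. The pose representation and the convergence tolerances are assumptions for illustration; the patent does not fix a particular control algorithm.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose2D:
    """Pile pose in the robot frame: position (m) and heading (rad)."""
    x: float
    y: float
    theta: float

def docking_correction(current: Pose2D, expected: Pose2D):
    """Return (forward_m, lateral_m, turn_rad) still needed to reach the
    expected docking pose. Recomputed every frame until converged."""
    return (current.x - expected.x,
            current.y - expected.y,
            math.atan2(math.sin(current.theta - expected.theta),
                       math.cos(current.theta - expected.theta)))

def docked(current: Pose2D, expected: Pose2D,
           pos_tol=0.02, ang_tol=math.radians(2.0)) -> bool:
    """Assumed success criterion: within 2 cm and 2 degrees."""
    fwd, lat, turn = docking_correction(current, expected)
    return math.hypot(fwd, lat) <= pos_tol and abs(turn) <= ang_tol

if __name__ == "__main__":
    expected = Pose2D(0.30, 0.0, 0.0)  # pile pose at successful docking
    current = Pose2D(0.55, 0.08, math.radians(5))
    print(docking_correction(current, expected), docked(current, expected))
```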
According to the technical scheme of this embodiment, the environment map is acquired, the robot travels toward the charging pile according to the charging pile position information in the environment map, and when it is determined that the robot has traveled to within the preset distance range, the current pose information of the charging pile is determined from the depth image and three-channel color image containing the charging pile, and the docking pose is adjusted according to the current pose information so as to dock with the charging pile. This solves the problems in the prior art that docking with a charging pile by means of sensors has poor applicability in complex environments and low docking accuracy, and realizes automatic, accurate and dynamic docking with the charging pile.
Example 2
Fig. 2 is a flowchart of a charging pile docking method provided in a second embodiment of the present invention. On the basis of the above embodiment, the process of adjusting the docking pose is further detailed, and the processes of searching for the charging pile according to predicted position information, generating the environment map, and marking the charging pile position information and the predicted position information in the environment map are added.
Correspondingly, as shown in fig. 2, the technical scheme of the embodiment of the invention specifically includes the following steps:
S210, acquiring a depth image and a three-channel color image, calculating pose information of the feature points according to the depth image and the three-channel color image, and acquiring point cloud data according to the depth image.
For example, an RGB-D depth camera (an RGB image sensor plus a depth sensor) may be mounted at the front and rear of the robot body, and images of the robot's surroundings are acquired through the depth camera during map building. Feature points can be obtained from the depth image and the three-channel color image and their pose information calculated, and point cloud data can be obtained from the depth image. A point cloud is a massive set of points expressing the spatial distribution and surface characteristics of targets under the same spatial reference frame; in the embodiment of the invention it is the set of feature points in the environment image, and the specific way the point cloud data is generated is not limited.
According to the depth image and the three-channel color image, the pose information of the robot can be estimated, and the specific mode of pose estimation according to the depth image is not limited in this embodiment.
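As a minimal sketch of how point cloud data can be obtained from a depth image, the code below back-projects each depth pixel through an assumed pinhole camera model. The intrinsic parameters (fx, fy, cx, cy) and the millimetre depth units are assumptions for illustration; the patent itself does not prescribe a particular camera model or point-cloud pipeline.

```python
import numpy as np

def depth_to_point_cloud(depth_mm: np.ndarray,
                         fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Back-project a depth image (uint16, millimetres) into an Nx3 point
    cloud in the camera frame using a pinhole model. Zero-depth pixels
    (no measurement) are discarded."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float32) / 1000.0  # convert to metres
    valid = z > 0
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=1)

if __name__ == "__main__":
    fake_depth = np.full((480, 640), 1500, dtype=np.uint16)  # flat wall at 1.5 m
    cloud = depth_to_point_cloud(fake_depth, fx=570.0, fy=570.0, cx=320.0, cy=240.0)
    print(cloud.shape)  # (307200, 3)
```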
S220, generating an environment map according to the point cloud data and the pose information of the feature points.
Specifically, a three-dimensional sparse map can be built from the pose information of the feature points; the map-construction method is not limited by this embodiment. The three-dimensional sparse map can be used to localize the robot; a three-dimensional dense map is then further generated from the pose information and used as the environment map, which makes it convenient to find target areas in which the charging pile could be placed and to determine the predicted position of the charging pile.
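Purely as an illustration of the sparse-map/dense-map split described above, the sketch below keeps a sparse map of feature-point positions (used for localization) alongside a dense map accumulated from point clouds (used as the environment map). The data structures and the voxel-downsampling step are assumptions; any real SLAM backend would differ.

```python
import numpy as np

class SparseDenseMap:
    """Sparse feature map for localization plus a dense environment map."""

    def __init__(self, voxel_size: float = 0.05):
        self.feature_points = []     # sparse map: list of (x, y, z) landmarks
        self.dense_voxels = set()    # dense map: occupied voxel indices
        self.voxel_size = voxel_size

    def add_keyframe(self, features: np.ndarray, cloud: np.ndarray):
        """features, cloud: Nx3 arrays already expressed in the map frame."""
        self.feature_points.extend(map(tuple, features))
        # Voxel-downsample the point cloud into the dense environment map.
        voxels = np.floor(cloud / self.voxel_size).astype(int)
        self.dense_voxels.update(map(tuple, voxels))

    def dense_map(self) -> np.ndarray:
        """Return dense-map voxel centres, usable as the environment map."""
        v = np.array(sorted(self.dense_voxels), dtype=float)
        return (v + 0.5) * self.voxel_size

if __name__ == "__main__":
    m = SparseDenseMap()
    m.add_keyframe(np.random.rand(10, 3), np.random.rand(1000, 3))
    print(len(m.feature_points), m.dense_map().shape)
```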
S230, judging whether the charging pile identification exists according to the three-channel color image, if so, executing S240, otherwise, returning to executing S230.
The charging pile identifier may be barcode-type information such as a two-dimensional code or a barcode, or may be feature information of the charging pile itself; for example, contour extraction may be performed on an image of the charging pile in advance and a feature value of the charging pile contour used as the identifier.
While constructing the environment map, the robot performs feature recognition on the acquired RGB images and determines the target RGB image that includes the charging pile identifier.
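The patent leaves the identifier open (a two-dimensional code, a barcode, or a contour feature of the pile). As one possible illustration only, the OpenCV-based sketch below first tries QR-code detection and, failing that, compares extracted contours against a reference pile contour using shape matching; OpenCV 4, the shape-matching threshold and the source of the reference contour are all assumptions, not part of the patent.

```python
import cv2
import numpy as np

def find_pile_identifier(bgr_image: np.ndarray, reference_contour=None,
                         shape_threshold: float = 0.1):
    """Look for the charging pile identifier in a BGR frame.

    Tries a QR code first; if none is found and a reference contour of the
    pile is provided, falls back to contour shape matching. Returns the
    decoded text or the matched contour, else None.
    """
    # 1) Two-dimensional code (QR) branch.
    data, points, _ = cv2.QRCodeDetector().detectAndDecode(bgr_image)
    if data:
        return data

    # 2) Contour-feature branch (only if a reference contour is available).
    if reference_contour is None:
        return None
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        score = cv2.matchShapes(c, reference_contour,
                                cv2.CONTOURS_MATCH_I1, 0.0)
        if score < shape_threshold:  # smaller score means more similar shapes
            return c
    return None
```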
And S240, determining the charging pile position information according to the three-channel color image and the depth image, and marking the charging pile identification and the charging pile position information in the environment map.
Pose estimation is performed on the three-channel color image containing the charging pile identifier, combined with the depth image, to determine the position of the charging pile, and the charging pile identifier and the charging pile position information are recorded into the environment map. In this way, when the robot later searches for the charging pile, it can navigate directly to the stored charging pile position and can use the stored identifier to verify whether the charging pile is actually present at that position.
S250, judging whether a target area of the environment map meets the charging pile placement condition, if so, executing S260, otherwise, executing S270.
While constructing the environment map, the robot performs feature recognition on the environment and takes recognized wall-surface areas as target areas. The predicted position information of the charging pile is determined from target areas that satisfy the charging pile placement condition; the specific environment-recognition method is not limited by this embodiment.
Accordingly, S250 may further include:
S251, judging whether the height of the target area is larger than the height of the charging pile, if so, executing S252, otherwise, executing S270.
For a target area to satisfy the charging pile placement condition, its length and height must exceed those of the charging pile; as long as the height of the target area is greater than the height of the charging pile, the pile can be accommodated in the vertical sense. Meanwhile, if the charging pile is lower than the robot, the height of the target area must also be greater than the height of the robot, so that the robot can successfully dock with and charge at the charging pile.
S252, judging whether the difference between the length of the target area and the length of the charging pile is larger than or equal to a preset value, if so, executing S253, otherwise, executing S270.
As for the length of the target area, because clearance must be reserved for the robot when it docks with and charges at the pile, the length of the target area must exceed the length of the charging pile by a certain margin. Likewise, if the charging pile is shorter than the robot, the difference between the length of the target area and the length of the robot must be greater than or equal to the preset value, so that the robot can successfully dock and charge (see the sketch after S253).
S253, determining that the target area meets the charging pile placement condition.
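The sketch below, offered only as an illustration of steps S251 to S253, encodes the two checks as a single predicate. The Footprint type and the default margin value are hypothetical; the patent specifies the comparisons but not concrete dimensions.

```python
from dataclasses import dataclass

@dataclass
class Footprint:
    """Hypothetical container for an object's length and height, in metres."""
    length_m: float
    height_m: float

def satisfies_placement_condition(area: Footprint,
                                  pile: Footprint,
                                  robot: Footprint,
                                  preset_margin_m: float = 0.5) -> bool:
    """Return True if the target area can hold the charging pile (S251-S253).

    Height check (S251): the area must be taller than the pile, and also
    taller than the robot when the pile is lower than the robot, which
    amounts to being taller than whichever of the two is higher.
    Length check (S252): the area must be longer than the pile (or the
    robot, whichever is longer) by at least the preset margin.
    """
    if area.height_m <= max(pile.height_m, robot.height_m):
        return False
    return area.length_m - max(pile.length_m, robot.length_m) >= preset_margin_m

if __name__ == "__main__":
    wall = Footprint(length_m=2.0, height_m=0.8)
    pile = Footprint(length_m=0.3, height_m=0.4)
    robot = Footprint(length_m=0.5, height_m=0.6)
    print(satisfies_placement_condition(wall, pile, robot))  # True
```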
And S260, determining the predicted position information of the charging pile according to the target area, and marking the predicted position information in the environment map.
After a target area satisfying the charging pile placement condition is selected, the predicted position of the charging pile can be determined from that area and marked in the environment map. Then, when the robot cannot find the charging pile at the charging pile position recorded in the environment map, i.e. when the charging pile has been moved, the robot can search for it according to the predicted position information.
Accordingly, S260 may further include:
s261, determining a predicted point in the target area, and taking the position information of the predicted point as the predicted position information of the charging pile.
For example, the center point of the bottom edge of the target area may be used as the predicted point; alternatively, if the quotient of the length of the target area divided by the length of the charging pile is greater than or equal to 2, a number of predicted points equal to that quotient may be selected. This embodiment does not limit the choice (see the sketch after S262).
S262, marking the predicted position information in the environment map.
After the predicted point is determined, the predicted position information is recorded in the environment map.
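The sketch below illustrates the predicted-point rule just described, under the assumption that the target area's bottom edge is given by its two endpoints in map coordinates. When the area is at least twice as long as the pile, several evenly spaced points along the bottom edge are returned; the even-spacing rule is an assumption, since the patent only fixes the number of points.

```python
def predicted_points(edge_start, edge_end, area_length_m, pile_length_m):
    """Return predicted charging-pile positions along the bottom edge of a
    target area (S261). edge_start/edge_end are (x, y) map coordinates."""
    quotient = int(area_length_m // pile_length_m)
    # Default: a single predicted point at the centre of the bottom edge.
    n = quotient if quotient >= 2 else 1
    points = []
    for i in range(1, n + 1):
        t = i / (n + 1)  # evenly spaced interior points (assumed spacing rule)
        points.append((edge_start[0] + t * (edge_end[0] - edge_start[0]),
                       edge_start[1] + t * (edge_end[1] - edge_start[1])))
    return points

if __name__ == "__main__":
    # 2.0 m wall segment and a 0.6 m pile: quotient 3, so three predicted points.
    print(predicted_points((0.0, 0.0), (2.0, 0.0), 2.0, 0.6))
```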
S270, acquiring an environment map, and generating a travel instruction according to the charging pile position information in the environment map.
In the embodiment of the invention, when the robot finishes its work, or its battery level is low and it needs to dock with the charging pile, the environment map is loaded first and the charging pile position information in the environment map is obtained. A return route is then determined according to the charging pile position and the current position of the robot, and the robot travels toward the charging pile position along the return route.
S280, acquiring a depth image and a three-channel color image in real time.
During the return journey, the robot acquires the depth image and the RGB image in real time and performs feature recognition on the RGB image. If a target RGB image includes the charging pile identifier, the charging pile position information is recalculated from the depth image and the three-channel color image and updated, and docking proceeds according to the updated charging pile position information.
S290, judging whether the three-channel color image comprises a charging pile identification, if so, executing S2100, otherwise, executing S2110.
And S2100, determining the position information of the charging pile according to the depth image and the three-channel color image, and updating the position information of the charging pile in the environment map.
After it is determined that the three-channel color image includes the charging pile identifier, the pose of the charging pile in the three-channel color image is estimated with the aid of the depth image, the position information of the charging pile is obtained, and the charging pile position information originally stored in the environment map is updated to the newly acquired position information.
S2110, judging whether the robot has traveled to within the preset distance range, if yes, executing S2120, otherwise, returning to execute S2110.
Regardless of whether the charging pile position information was updated during the return journey, when the robot has traveled to within the preset distance of the charging pile position, feature recognition is performed on the acquired three-channel color image to judge whether it contains the charging pile. If it does, the docking pose is adjusted according to the depth image; if it does not, the charging pile is searched for according to the predicted position information in the environment map.
S2120, judging whether the charging pile is included in the three-channel color image, if yes, executing S2130, otherwise executing S2150.
S2130, calculating to obtain the current pose information of the charging pile according to the depth image and the three-channel color image.
If the three-channel color image contains the charging pile, the charging pile has not been moved. Pose estimation is then performed on the three-channel color image and the depth image acquired in real time, and the current pose information of the charging pile is determined.
S2140, acquiring expected pose information of the charging pile, and determining a docking pose according to the current pose information and the expected pose information.
The expected pose information is the pose of the charging pile relative to the robot when docking succeeds. A docking route is generated from the current pose information and the expected pose information, and the robot adjusts its position and orientation to travel along the docking route. The robot acquires the depth image in real time, calculates the current pose information of the charging pile in real time, and continuously adjusts the docking route until docking between the robot and the charging pile is finally achieved.
S2150, obtaining predicted position information of the charging pile in the environment map, and generating an addressing route according to the predicted position information.
When the three-channel color image does not include the charging pile, the predicted position information in the environment map is obtained and an addressing route is generated; the robot travels along the addressing route until the charging pile is found, and then docks with it in the manner of S2110-S2140. This embodiment does not limit the way the addressing route is generated from the predicted position information of the predicted points.
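Since the patent leaves the addressing-route construction open, the sketch below shows one plausible choice: visit the predicted points in nearest-first order starting from the robot's current position. The greedy ordering is an assumption made purely for illustration.

```python
import math

def addressing_route(robot_xy, predicted_points):
    """Order the predicted charging-pile positions nearest-first (greedy),
    starting from the robot's current position. Returns the visit order."""
    remaining = list(predicted_points)
    route, current = [], robot_xy
    while remaining:
        nearest = min(remaining, key=lambda p: math.dist(current, p))
        route.append(nearest)
        remaining.remove(nearest)
        current = nearest
    return route

if __name__ == "__main__":
    print(addressing_route((0.0, 0.0), [(3.0, 1.0), (1.0, 0.5), (2.0, 2.0)]))
```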
In the embodiment of the invention, the predicted positions are recorded in the environment map in advance, so that when the charging pile is moved, the robot can search for its new position according to the predicted positions, responding flexibly to movement of the charging pile.
S2160, traveling along the addressing route; execution then returns to S2110.
According to the technical scheme of this embodiment, an environment map is built in advance, and the charging pile position information and the predicted position information are recorded while the map is built. When docking with the charging pile is needed, the environment map is loaded and the robot travels toward the charging pile according to the charging pile position information in the map; if the charging pile is recognized from the three-channel color image during travel, the charging pile position information is updated. When it is determined that the robot has traveled to within the preset distance range of the charging pile, then if the three-channel color image contains the charging pile, the current pose information of the charging pile is determined from the depth image and the three-channel color image and the docking pose is adjusted accordingly so as to dock with the charging pile; if the three-channel color image does not contain the charging pile, the charging pile is addressed according to the predicted position information in the environment map, and once it is found, its current pose information is determined from the depth image and the three-channel color image and the docking pose is adjusted accordingly so as to dock with it. This solves the problems in the prior art that docking with a charging pile by means of sensors has poor applicability in complex environments and low docking accuracy, and realizes automatic, accurate and dynamic docking with the charging pile.
Example 3
Fig. 3 is a schematic structural diagram of a charging pile docking device according to a third embodiment of the present invention. The device is provided in a robot and comprises an environment map acquisition module 310, a current pose information acquisition module 320 and a charging pile docking module 330. Wherein:
the environment map acquisition module 310 is used for acquiring an environment map and generating a travel instruction according to the charging pile position information in the environment map;
the current pose information acquisition module 320 is used for acquiring a depth image and a three-channel color image if it is determined that the robot has traveled to within the preset distance range, and calculating the current pose information of the charging pile from the depth image and the three-channel color image;
and the charging pile docking module 330 is used for determining a docking pose according to the current pose information of the charging pile so as to dock with the charging pile.
According to the technical scheme of this embodiment, the environment map is acquired, the robot travels toward the charging pile according to the charging pile position information in the environment map, and when it is determined that the robot has traveled to within the preset distance range, the current pose information of the charging pile is determined from the depth image and three-channel color image containing the charging pile, and the docking pose is adjusted according to the current pose information so as to dock with the charging pile. This solves the problems in the prior art that docking with a charging pile by means of sensors has poor applicability in complex environments and low docking accuracy, and realizes automatic, accurate and dynamic docking with the charging pile.
On the basis of the above embodiment, the apparatus further includes:
The data acquisition module is used for acquiring a depth image and a three-channel color image, calculating pose information of the feature points according to the depth image and the three-channel color image, and acquiring point cloud data according to the depth image;
the environment map generation module is used for generating an environment map according to the point cloud data and the pose information of the feature points;
and the charging pile labeling module is used for determining the charging pile position information according to the three-channel color image and the depth image if it is determined from the three-channel color image that the charging pile identification exists, and labeling the charging pile identification and the charging pile position information in the environment map.
On the basis of the above embodiment, the apparatus further includes:
the addressing route generation module is used for acquiring the predicted position information of the charging pile in the environment map and generating an addressing route according to the predicted position information if the charging pile is not included in the three-channel color image;
and the charging pile addressing module is used for advancing along the addressing route and acquiring a three-channel color image until the three-channel color image comprises the charging pile when the preset distance range condition is met.
On the basis of the above embodiment, the apparatus further includes:
And the predicted position information labeling module is used for determining predicted position information of the charging pile according to the target area if the target area of the environment map meets the charging pile placement condition, and labeling the predicted position information in the environment map.
On the basis of the above embodiment, the prediction position information labeling module includes:
the charging pile placement condition judging unit is used for determining that the target area meets the charging pile placement condition if the height of the target area is larger than that of the charging pile and the difference between the length of the target area and the length of the charging pile is larger than or equal to a preset value;
And the predicted position information determining unit is used for determining predicted points in the target area and taking the position information of the predicted points as the predicted position information of the charging pile.
Based on the above embodiment, the charging pile docking module 330 includes:
and the docking pose determining unit is used for acquiring expected pose information of the charging pile and determining the docking pose according to the current pose information and the expected pose information.
On the basis of the above embodiment, the apparatus further includes:
and the charging pile position information updating module is used for acquiring the depth image and the three-channel color image in real time, determining the charging pile position information according to the depth image and the three-channel color image if the three-channel color image comprises the charging pile identification, and updating the charging pile position information in the environment map.
The charging pile docking device provided by the embodiments of the invention can execute the charging pile docking method provided by any embodiment of the invention, and has the functional modules and beneficial effects corresponding to the executed method.
Example 4
Fig. 4 is a schematic structural diagram of a computer device according to a fourth embodiment of the present invention. As shown in fig. 4, the computer device includes a processor 70, a memory 71, an input device 72 and an output device 73; the number of processors 70 in the computer device may be one or more, one processor 70 being taken as an example in fig. 4; the processor 70, the memory 71, the input device 72 and the output device 73 in the computer device may be connected by a bus or in other ways, a bus connection being taken as an example in fig. 4.
The memory 71, as a computer-readable storage medium, is used for storing software programs, computer-executable programs and modules, such as the modules corresponding to the charging pile docking method in the embodiments of the present invention (for example, the environment map acquisition module 310, the current pose information acquisition module 320 and the charging pile docking module 330 in the charging pile docking device). The processor 70 runs the software programs, instructions and modules stored in the memory 71 to execute the various functional applications and data processing of the computer device, i.e. to implement the above-described charging pile docking method. The method comprises the following steps:
acquiring an environment map, and generating a travel instruction according to the position information of the charging pile in the environment map;
if it is determined that the robot has traveled to within a preset distance range, acquiring a depth image and a three-channel color image, and calculating the current pose information of the charging pile from the depth image and the three-channel color image;
and determining a docking pose according to the current pose information of the charging pile so as to dock with the charging pile.
The memory 71 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for functions; the storage data area may store data created according to the use of the terminal, etc. In addition, memory 71 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, memory 71 may further include memory remotely located relative to processor 70, which may be connected to the computer device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 72 may be used to receive entered numeric or character information and to generate key signal inputs related to user settings and function control of the computer device. The output means 73 may comprise a display device such as a display screen.
Example 5
A fifth embodiment of the present invention also provides a storage medium containing computer-executable instructions, which when executed by a computer processor, are for performing a charging pile docking method, the method comprising:
acquiring an environment map, and generating a travel instruction according to the position information of the charging pile in the environment map;
if it is determined that the robot has traveled to within a preset distance range, acquiring a depth image and a three-channel color image, and calculating the current pose information of the charging pile from the depth image and the three-channel color image;
and determining a docking pose according to the current pose information of the charging pile so as to dock with the charging pile.
Of course, the storage medium containing the computer executable instructions provided in the embodiments of the present invention is not limited to the above-described method operations, and may also perform the related operations in the charging pile docking method provided in any embodiment of the present invention.
From the above description of embodiments, it will be clear to a person skilled in the art that the present invention may be implemented by means of software and necessary general purpose hardware, but of course also by means of hardware, although in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a FLASH Memory (FLASH), a hard disk, or an optical disk of a computer, etc., and include several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments of the present invention.
It should be noted that, in the embodiment of the charging pile docking device, each unit and module included are only divided according to the functional logic, but not limited to the above-mentioned division, so long as the corresponding functions can be implemented; in addition, the specific names of the functional units are also only for distinguishing from each other, and are not used to limit the protection scope of the present invention.
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.

Claims (6)

1. A charging pile docking method, the method being performed by a robot, the method comprising:
acquiring an environment map, and generating a travel instruction according to the position information of the charging pile in the environment map;
if it is determined that the robot has traveled to within a preset distance range, acquiring a depth image and a three-channel color image, and calculating the current pose information of the charging pile from the depth image and the three-channel color image;
determining a docking pose according to the current pose information of the charging pile so as to dock with the charging pile;
wherein, before the obtaining the environment map, the method further comprises:
acquiring a depth image and a three-channel color image, calculating pose information of feature points according to the depth image and the three-channel color image, and acquiring point cloud data according to the depth image;
Generating an environment map according to the point cloud data and the pose information of the feature points;
if it is determined from the three-channel color image that the charging pile identification exists, determining the charging pile position information according to the three-channel color image and the depth image, and marking the charging pile identification and the charging pile position information in the environment map;
wherein the generating the environment map according to the point cloud data and the pose information of the feature points comprises the following steps: establishing a three-dimensional sparse map according to the pose information of the feature points, the three-dimensional sparse map being used for positioning the robot; and further generating a three-dimensional dense map according to the pose information, the three-dimensional dense map being used as the environment map;
Before the current pose information of the charging pile is calculated according to the depth image and the three-channel color image, the method further comprises the following steps:
if it is determined that the three-channel color image does not comprise the charging pile, acquiring predicted position information of the charging pile in the environment map, and generating an addressing route according to the predicted position information;
advancing along the addressing route, and acquiring a three-channel color image when the preset distance range condition is met, until the three-channel color image comprises the charging pile;
After the charging pile identification and the charging pile position information are marked in the environment map, the method further comprises the following steps:
if the target area of the environment map meets the charging pile placement condition, determining predicted position information of the charging pile according to the target area, and marking the predicted position information in the environment map;
wherein, determining that the target area of the environment map satisfies the charging pile placement condition includes:
if the height of the target area is larger than the height of the charging pile and the difference between the length of the target area and the length of the charging pile is larger than or equal to a preset value, determining that the target area meets the charging pile placement condition;
Determining predicted location information of the charging pile according to the target area, including:
And determining a predicted point in the target area, and taking the position information of the predicted point as the predicted position information of the charging pile.
2. The method of claim 1, wherein determining a docking pose based on current pose information of the charging pile comprises:
acquiring expected pose information of the charging pile, and determining the docking pose according to the current pose information and the expected pose information.
3. The method of claim 1, further comprising, after obtaining the environment map:
acquiring the depth image and the three-channel color image in real time; if the three-channel color image comprises the charging pile identification, determining the charging pile position information according to the depth image and the three-channel color image, and updating the charging pile position information in the environment map.
4. A charging pile docking device, characterized in that the device is provided in a robot, the device comprising:
the environment map acquisition module is used for acquiring an environment map and generating a travel instruction according to the charging pile position information in the environment map;
the current pose information acquisition module is used for acquiring a depth image and a three-channel color image if it is determined that the robot has traveled to within the preset distance range, and calculating the current pose information of the charging pile from the depth image and the three-channel color image;
the charging pile docking module is used for determining a docking pose according to the current pose information of the charging pile so as to dock with the charging pile;
The data acquisition module is used for acquiring a depth image and a three-channel color image, calculating pose information of the feature points according to the depth image and the three-channel color image, and acquiring point cloud data according to the depth image;
the environment map generation module is used for generating an environment map according to the point cloud data and the pose information of the feature points;
the charging pile labeling module is used for determining the charging pile position information according to the three-channel color image and the depth image if it is determined from the three-channel color image that the charging pile identification exists, and labeling the charging pile identification and the charging pile position information in the environment map;
wherein the generating the environment map according to the point cloud data and the pose information of the feature points comprises the following steps: establishing a three-dimensional sparse map according to the pose information of the feature points, the three-dimensional sparse map being used for positioning the robot; and further generating a three-dimensional dense map according to the pose information, the three-dimensional dense map being used as the environment map;
Wherein the apparatus further comprises:
the addressing route generation module is used for acquiring the predicted position information of the charging pile in the environment map and generating an addressing route according to the predicted position information if the charging pile is not included in the three-channel color image;
The charging pile addressing module is used for advancing along the addressing route and acquiring three-channel color images until the three-channel color images comprise the charging pile when the preset distance range condition is met;
Wherein the apparatus further comprises:
The preset information labeling module is used for determining the predicted position information of the charging pile according to the target area if the target area of the environment map meets the charging pile placement condition, and labeling the predicted position information in the environment map;
the preset information labeling module comprises:
the charging pile placement condition judging unit is used for determining that the target area meets the charging pile placement condition if the height of the target area is larger than that of the charging pile and the difference between the length of the target area and the length of the charging pile is larger than or equal to a preset value;
And the predicted position information determining unit is used for determining predicted points in the target area and taking the position information of the predicted points as the predicted position information of the charging pile.
5. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the charging pile docking method according to any one of claims 1-3 when the program is executed.
6. A storage medium containing computer executable instructions, which when executed by a computer processor are for performing the charging pile docking method of any one of claims 1-3.
CN202110695800.XA 2021-06-23 2021-06-23 Charging pile docking method and device, computer equipment and storage medium Active CN113378750B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110695800.XA CN113378750B (en) 2021-06-23 2021-06-23 Charging pile docking method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110695800.XA CN113378750B (en) 2021-06-23 2021-06-23 Charging pile docking method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113378750A CN113378750A (en) 2021-09-10
CN113378750B true CN113378750B (en) 2024-06-07

Family

ID=77578447

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110695800.XA Active CN113378750B (en) 2021-06-23 2021-06-23 Charging pile docking method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113378750B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113780030A (en) * 2021-09-13 2021-12-10 福州符号信息科技有限公司 Regional decoding method and bar code reading equipment
CN113872287A (en) * 2021-09-26 2021-12-31 追觅创新科技(苏州)有限公司 Charging device, self-moving device, charging method, charging system and storage medium
CN114355911B (en) * 2021-12-24 2024-03-29 深圳甲壳虫智能有限公司 Charging method and device for robot, robot and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106125724A (en) * 2016-06-13 2016-11-16 华讯方舟科技有限公司 A kind of method and system of robot autonomous charging
CN108805014A (en) * 2018-04-26 2018-11-13 内蒙古河山新能源技术推广服务有限公司 Localization method, device and the computer readable storage medium of mobile charging stake
CN109460044A (en) * 2019-01-10 2019-03-12 轻客小觅智能科技(北京)有限公司 A kind of robot method for homing, device and robot based on two dimensional code
CN111674278A (en) * 2020-06-19 2020-09-18 福建易动力电子科技股份有限公司 Underground mobile charging pile system and operation method thereof
CN112183133A (en) * 2020-08-28 2021-01-05 同济大学 Aruco code guidance-based mobile robot autonomous charging method
CN112739505A (en) * 2018-08-31 2021-04-30 罗博艾特有限责任公司 Investigation of autonomous mobile robot into robot working area

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6607162B2 (en) * 2016-09-23 2019-11-20 カシオ計算機株式会社 Robot, state determination system, state determination method and program
US10593060B2 (en) * 2017-04-14 2020-03-17 TwoAntz, Inc. Visual positioning and navigation device and method thereof

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106125724A (en) * 2016-06-13 2016-11-16 华讯方舟科技有限公司 A kind of method and system of robot autonomous charging
CN108805014A (en) * 2018-04-26 2018-11-13 内蒙古河山新能源技术推广服务有限公司 Localization method, device and the computer readable storage medium of mobile charging stake
CN112739505A (en) * 2018-08-31 2021-04-30 罗博艾特有限责任公司 Investigation of autonomous mobile robot into robot working area
CN109460044A (en) * 2019-01-10 2019-03-12 轻客小觅智能科技(北京)有限公司 A kind of robot method for homing, device and robot based on two dimensional code
CN111674278A (en) * 2020-06-19 2020-09-18 福建易动力电子科技股份有限公司 Underground mobile charging pile system and operation method thereof
CN112183133A (en) * 2020-08-28 2021-01-05 同济大学 Aruco code guidance-based mobile robot autonomous charging method

Also Published As

Publication number Publication date
CN113378750A (en) 2021-09-10

Similar Documents

Publication Publication Date Title
CN113378750B (en) Charging pile docking method and device, computer equipment and storage medium
CN107990899B (en) Positioning method and system based on SLAM
JP7274674B1 (en) Performing 3D reconstruction with unmanned aerial vehicle
US11238653B2 (en) Information processing device, information processing system, and non-transitory computer-readable storage medium for storing program
CN111338361A (en) Obstacle avoidance method, device, equipment and medium for low-speed unmanned vehicle
EP3633478B1 (en) Method and device for assessing probability of presence of obstacle in unknown position
US11429098B2 (en) Path providing apparatus and path providing method
CN112346453A (en) Automatic robot recharging method and device, robot and storage medium
CA3139625C (en) Generating a 2d-navigation map for collision-free navigation by multiple robots
CN113561963A (en) Parking method and device and vehicle
CN110986920A (en) Positioning navigation method, device, equipment and storage medium
CN108521809A (en) Obstacle information reminding method, system, unit and recording medium
CN111604898A (en) Livestock retrieval method, robot, terminal equipment and storage medium
US20240054895A1 (en) Parking method and apparatus, storage medium, chip and vehicle
CN113587930A (en) Indoor and outdoor navigation method and device of autonomous mobile robot based on multi-sensor fusion
CN112884900A (en) Landing positioning method and device for unmanned aerial vehicle, storage medium and unmanned aerial vehicle nest
CN115981305A (en) Robot path planning and control method and device and robot
CN114593739A (en) Vehicle global positioning method and device based on visual detection and reference line matching
US11619495B2 (en) Position estimating apparatus and position estimating method
CN112220405A (en) Self-moving tool cleaning route updating method, device, computer equipment and medium
CN112540613A (en) Method and device for searching recharging seat position and mobile robot
CN113607166A (en) Indoor and outdoor positioning method and device for autonomous mobile robot based on multi-sensor fusion
CN114623836A (en) Vehicle pose determining method and device and vehicle
JP2021081850A (en) Location estimation device, location estimation method, and program
JP2021099384A (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant