CN113110408B - Robot positioning system and method, robot, and computer-readable storage medium

Robot positioning system and method, robot, and computer-readable storage medium

Info

Publication number
CN113110408B
Authority
CN
China
Prior art keywords
positioning
robot
markers
target
marker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911355279.4A
Other languages
Chinese (zh)
Other versions
CN113110408A (en)
Inventor
王迎春
陈超
郭晓丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jizhijia Technology Co Ltd
Original Assignee
Beijing Jizhijia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jizhijia Technology Co Ltd
Priority to CN201911355279.4A
Publication of CN113110408A
Application granted
Publication of CN113110408B

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0234 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
    • G05D 1/0236 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons in combination with a laser
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 Interpretation of pictures
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D 1/0253 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides a robot positioning method and system, a robot, and a computer-readable storage medium. A plurality of positioning markers are arranged above the two sides of the channel on which the robot travels. While the robot travels along the channel, it captures a target image containing several of these positioning markers and determines its position coordinates and movement direction in the geographic coordinate system based on the position coordinates of at least two target positioning markers in the target image. Because the robot is positioned with markers located above it, the method avoids the positioning failures that ground markers are prone to in the prior art. In addition, the center line of the target image is used to select target positioning markers that are closer to the robot than the other markers in the image, which helps improve the accuracy of the determined position coordinates and movement direction of the robot.

Description

Robot positioning system and method, robot, and computer-readable storage medium
Technical Field
The application relates to the field of computer technology and image processing, in particular to a robot positioning method and system, a robot and a computer readable storage medium.
Background
Devices such as robots are generally positioned by using the position information of markers, such as two-dimensional codes, placed on the ground.
Because ground markers are easily worn or otherwise damaged, or are blocked by other objects, their images may not be captured. The position information of the ground markers then cannot be obtained, and devices such as robots cannot be positioned.
Disclosure of Invention
In view of the above, the present application provides at least a robot positioning method and system, a robot, and a computer readable storage medium.
In a first aspect, the present application provides a robot positioning system comprising: a robot and a plurality of positioning markers symmetrically distributed above the two sides of the channel on which the robot travels, wherein the positioning markers above either side of the channel are uniformly arranged;
when the robot travels along the channel, it captures a target image containing a plurality of the positioning markers and determines its position coordinates and movement direction in the geographic coordinate system based on the position coordinates of at least two target positioning markers in the target image; the target positioning markers are distributed above the two sides of the channel respectively.
In one possible embodiment, the locating markers are distributed on the roof above both sides of the channel.
In a second aspect, the present application provides a robot comprising a first camera, a processor, and a movement mechanism; the processor comprises a positioning and screening module, a first positioning module and a second positioning module;
the first camera is arranged to shoot target images of a plurality of positioning markers above the robot;
the positioning and screening module is used for screening at least two target positioning markers for positioning the robot from the target image based on the central line of the target image; the target positioning markers obtained through screening are respectively distributed above two sides of the channel;
the first positioning module is configured to determine a position coordinate and a movement direction of the robot in an image coordinate system corresponding to the target image based on position coordinates of the at least two target positioning markers in the image coordinate system;
the second positioning module is used for determining the position coordinates and the movement direction of the robot in the geographic coordinate system based on the position coordinates and the movement direction of the robot in the image coordinate system;
The action mechanism is configured to drive the robot to move based on the position coordinates and the movement direction of the robot in the geographic coordinate system.
In one possible implementation, the positioning and screening module is specifically configured to, if two positioning markers lie on the center line of the target image, take as target positioning markers the positioning markers on the center line together with, for each of them, the two positioning markers in the target image that are closest to it.
In one possible embodiment, the location screening module is specifically configured to,
if no positioning marker exists on the midline of the target image, acquiring two candidate positioning markers closest to the midline;
and if the two obtained candidate positioning markers are positioned on the same side of the central line, taking the two positioning markers which are in the target image and are closest to each obtained candidate positioning marker respectively and the two candidate positioning markers as target positioning markers.
In one possible embodiment, the location screening module is specifically configured to,
If a positioning marker exists on the midline of the target image, acquiring a first positioning marker closest to the midline;
screening a second positioning marker closest to the first positioning marker;
screening a third positioning marker from the rest positioning markers in the target image based on one positioning marker, the first positioning marker, the second positioning marker and a preset shape for marker screening on the middle line;
and taking one positioning marker, the first positioning marker, the second positioning marker and the third positioning marker on the middle line as target positioning markers.
In one possible embodiment, the location screening module is specifically configured to,
and screening two target positioning markers which are positioned on the middle line of the target image or are closest to the middle line of the target image from the positioning markers.
In one possible embodiment, the positioning markers are symmetrically distributed over both sides of the channel, and the positioning markers over either side of the channel are uniformly arranged.
In one possible embodiment, the first positioning module is specifically configured to,
Dividing the at least two target positioning markers into at least one positioning marker group, wherein each positioning marker group comprises two target positioning markers which are respectively distributed above two sides of the channel;
for each positioning mark group, determining the midpoint coordinate of the connecting line of the two target positioning marks in the positioning mark group and the angle information of the connecting line of the two target positioning marks in the positioning mark group based on the position coordinates of the two target positioning marks in the positioning mark group in an image coordinate system corresponding to the target image;
determining a position coordinate of the robot corresponding to a first coordinate axis and a position coordinate relative to a second coordinate axis in the image coordinate system based on the midpoint coordinates corresponding to each positioning mark group; the first coordinate axis is perpendicular to the second coordinate axis, and the second coordinate axis is parallel to the advancing direction of the channel;
and determining the movement direction of the robot in the image coordinate system based on the angle information corresponding to each positioning mark group.
In one possible embodiment, the first positioning module is specifically configured to,
determining the position coordinates of the robot corresponding to the second coordinate axis in the image coordinate system based on the coordinate information of each target positioning marker corresponding to the second coordinate axis in the image coordinate system; wherein the second coordinate axis is parallel to the advancing direction of the channel.
In one possible embodiment, the robot may further comprise a second camera, the second camera being configured to capture a plurality of auxiliary positioning markers located under the robot;
the processor further comprises a correction module, wherein the correction module is configured to determine corrected position coordinates for the robot based on the position coordinates of each auxiliary positioning marker in the geographic coordinate system, and to correct the position coordinates of the robot in the geographic coordinate system using the corrected position coordinates derived from the ground auxiliary positioning markers.
In a third aspect, the present application provides a robot positioning method, including:
acquiring target images of a plurality of positioning markers, which are shot by a robot and are positioned above a channel on which the robot runs;
screening at least two target positioning markers for positioning the position of the robot from the target image based on the central line of the target image; the target positioning markers obtained through screening are respectively distributed above two sides of the channel;
determining the position coordinates and the movement direction of the robot in an image coordinate system corresponding to the target image based on the position coordinates of the at least two target positioning markers in the image coordinate system;
And determining the position coordinates and the movement direction of the robot in the geographic coordinate system based on the position coordinates and the movement direction of the robot in the image coordinate system.
In a fourth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the robot positioning method described above.
The application provides a robot positioning method and system, a robot, and a computer-readable storage medium. A plurality of positioning markers are arranged above the two sides of the channel on which the robot travels. While the robot travels along the channel, it captures a target image containing several of these positioning markers and determines its position coordinates and movement direction in the geographic coordinate system based on the position coordinates of at least two target positioning markers in the target image. Because the robot is positioned with markers located above it, the method avoids the positioning failures that ground markers are prone to in the prior art. In addition, the center line of the target image is used to select target positioning markers that are closer to the robot than the other markers in the image, which helps improve the accuracy of the determined position coordinates and movement direction of the robot.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 shows a schematic diagram of a positioning marker setup in a robotic positioning system according to an embodiment of the application;
fig. 2 shows a schematic structural diagram of a robot according to an embodiment of the present application;
FIG. 3 shows a schematic diagram of screening target positioning markers in an embodiment of the present application;
FIG. 4 shows a schematic diagram of another way of screening target positioning markers in an embodiment of the application;
FIG. 5 shows a schematic diagram of yet another way of screening target positioning markers in an embodiment of the present application;
fig. 6 shows a flowchart of a robot positioning method according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described with reference to the accompanying drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for the purpose of illustration and description only and are not intended to limit the scope of the present application. In addition, it should be understood that the schematic drawings are not drawn to scale. A flowchart, as used in this disclosure, illustrates operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be implemented out of order and that steps without logical context may be performed in reverse order or concurrently. Moreover, one or more other operations may be added to or removed from the flow diagrams by those skilled in the art under the direction of the present disclosure.
In addition, the described embodiments are only some, but not all, embodiments of the application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by a person skilled in the art without making any inventive effort, are intended to be within the scope of the present application.
In order to enable a person skilled in the art to use the present disclosure, the following embodiments are presented in connection with a specific application scenario, the positioning of a cargo handling robot in warehouse logistics. It will be apparent to those skilled in the art that the general principles defined herein may be applied to other embodiments and applications requiring robot positioning without departing from the spirit and scope of the application. Although the application is described primarily in terms of positioning a cargo handling robot in warehouse logistics, it should be understood that this is only one exemplary embodiment.
It should be noted that the term "comprising" will be used in embodiments of the application to indicate the presence of the features stated hereafter, but not to exclude the addition of other features.
At present, warehouse logistics robots are generally positioned using two-dimensional codes arranged on the ground. In many cases the ground codes are worn down by being run over by the robots, goods, and other equipment, and this wear can cause missed detections, making the positioning of the robot inaccurate or causing it to fail altogether. Ground codes are also easily blocked by other objects, which likewise leads to missed detections and inaccurate or failed positioning. To address these technical problems, the application provides a robot positioning method and system and a robot in which positioning markers located above the warehouse logistics robot are used to position it, overcoming the positioning failures that ground markers are prone to in the prior art. At the same time, the target positioning markers selected using the center line of the image captured by the robot are closer to the robot than the other markers in the image, which improves the accuracy of the determined position coordinates and movement direction of the robot.
The embodiment of the application provides a robot positioning system which comprises a robot and a plurality of positioning markers symmetrically distributed above two sides of a channel on which the robot runs, wherein the positioning markers above any side of the channel are uniformly distributed.
As shown in fig. 1, in order to realize the positioning of the robot in the channel, positioning markers are respectively arranged at symmetrical positions above two sides of the channel, and the positioning markers above any side of the channel are uniformly distributed, for example, one positioning marker is arranged every 1 m. In a specific implementation, the positioning markers 11 may be set as shown in fig. 1, where the direction indicated by the arrow 12 is the advancing direction of the channel.
In particular embodiments, the positioning markers may be distributed on the roof above the two sides of the channel. The markers do not have to be mounted on the warehouse roof, however; they only need to be placed higher than the robot, in positions where they cannot be blocked by other objects in the warehouse.
When the robot runs on the channel, shooting a target image comprising a plurality of positioning markers, and determining the position coordinates and the movement direction of the robot in a geographic coordinate system based on the position coordinates of at least two target positioning markers in the target image; wherein, the target positioning markers are respectively distributed above two sides of the channel.
This embodiment positions the warehouse logistics robot with positioning markers located above it, overcoming the positioning failures that ground markers are prone to in the prior art.
An embodiment of the present application provides a robot, as shown in fig. 2, including: a first camera 210, a processor 220, and an action mechanism 230; the processor includes a location screening module 2201, a first location module 2202, and a second location module 2203.
The first camera 210 is configured to capture target images of a plurality of positioning markers located above the robot.
To avoid the occlusion, wear, and similar problems affecting ground markers, the positioning markers are placed above the ground, for example on the roof of a warehouse. In a typical application scenario the robot advances or retreats along a set channel and the first camera 210 faces straight up at the roof; because of the limited field of view of the first camera, the markers it captures are the positioning markers above and close to the robot. Positioning the robot with markers that are close to it helps improve positioning accuracy.
The positioning screening module 2201 is configured to screen at least two target positioning markers for positioning the robot from the target image based on a centerline of the target image; the target positioning markers obtained through screening are respectively distributed above two sides of the channel.
Because the first camera is mounted on the robot and faces straight up, the center line of the captured target image corresponds to the robot's own position in the image. In this step the target positioning markers used for positioning are screened with the center line of the target image, so that the markers nearest the robot are selected from those already close to it, which further improves positioning accuracy.
Here, the positioning markers obtained by screening at least comprise two positioning markers symmetrically distributed above two sides of the channel, and the position coordinates of the robot in the image coordinates can be determined based on the symmetrically distributed markers. In addition, based on the connecting lines of the two positioning markers symmetrically distributed above the two sides of the channel, the movement direction information of the robot under the image coordinates can be determined.
The first positioning module 2202 is configured to determine a position coordinate and a movement direction of the robot in an image coordinate system corresponding to the target image based on position coordinates of the at least two target positioning markers in the image coordinate system.
When there are more than two target positioning markers, they can be divided into one or more positioning marker groups, each containing two target positioning markers arranged along a predetermined direction, that is, two markers symmetrically distributed above the two sides of the channel. A position coordinate of the robot in the image coordinate system is then determined from each positioning marker group, and the final position coordinates of the robot in the image coordinate system are determined from the coordinates obtained from all the groups. The predetermined direction is the direction of the cross-section of the channel.
Likewise, a movement direction of the robot can be determined from each positioning marker group, and the movement direction of the robot in the image coordinate system is then determined from the directions obtained from all the groups.
The second positioning module 2203 is configured to determine the position coordinates and the movement direction of the robot in the geographic coordinate system based on the position coordinates and the movement direction of the robot in the image coordinate system.
Here, after the position coordinates and movement direction of the robot in the image coordinate system have been determined, its position coordinates and movement direction in the geographic coordinate system can be determined in combination with a two-dimensional code on the ground. In a specific implementation, the ground two-dimensional code is decoded to obtain its position coordinates in the geographic coordinate system, and the position coordinates and movement direction of the robot in the geographic coordinate system are then determined from the geographic coordinates of the ground code together with the robot's position coordinates and movement direction in the image coordinate system.
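As a concrete illustration of this step, the following sketch converts the robot pose from image coordinates to geographic coordinates using one reference point (for example a decoded ground two-dimensional code) whose coordinates are known in both frames. The function name, the similarity-transform assumption, and the calibration parameters scale and frame_rotation are illustrative assumptions, not taken from the patent.

    import math

    def image_pose_to_geographic(robot_uv, robot_theta, ref_uv, ref_xy, scale, frame_rotation):
        """Convert a pose from the image coordinate system to the geographic one.

        robot_uv, robot_theta: robot position (pixels) and heading (radians) in
            the image coordinate system.
        ref_uv, ref_xy: a reference point (e.g. a ground two-dimensional code)
            given in image coordinates and in geographic coordinates.
        scale: metres per pixel; frame_rotation: angle (radians) from the image
            axes to the geographic axes. Both are assumed to come from calibration.
        """
        du = robot_uv[0] - ref_uv[0]
        dv = robot_uv[1] - ref_uv[1]
        cos_r, sin_r = math.cos(frame_rotation), math.sin(frame_rotation)
        x = ref_xy[0] + scale * (cos_r * du - sin_r * dv)
        y = ref_xy[1] + scale * (sin_r * du + cos_r * dv)
        return x, y, robot_theta + frame_rotation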
The action mechanism 230 is arranged to drive the robot to move based on its position coordinates in the geographical coordinate system and its direction of movement.
After the position coordinates and movement direction of the robot in the geographic coordinate system have been determined, they can be compared with the position and direction of the channel in the geographic coordinate system to correct the robot's position and movement direction and to drive the robot. For example, the robot can be steered to move along the direction of the channel, guided towards the middle of the channel, and its travel distance along the channel can be controlled.
This embodiment positions the robot with positioning markers located above it, overcoming the positioning failures that ground markers are prone to in the prior art, and uses the center line of the target image to select target positioning markers that are closer to the robot than the other markers in the image, which helps improve the accuracy of the determined position coordinates and movement direction of the robot.
In some embodiments, the positioning and screening module 2201 is specifically configured to, if two positioning markers are detected on the center line of the target image, take as target positioning markers the markers on the center line together with, for each of them, the two positioning markers in the target image that are closest to it.
Specifically, as shown in fig. 3, the positioning markers 31 and 32 are symmetrically distributed above two sides of the channel, and the two positioning markers are both located on a center line 37 of the target image, that is, the two positioning markers are located directly above the robot and are closest to the robot, and the two positioning markers can be used as target positioning markers.
In order to improve the positioning accuracy, as shown in fig. 3, other target positioning markers may be selected from other positioning markers in the target image, so as to realize the positioning of the robot together with the positioning markers 31 and 32. Specifically, a positioning marker 33 and a positioning marker 34 closest to the positioning marker 31, and a positioning marker 35 and a positioning marker 36 closest to the positioning marker 32 may be both taken as target positioning markers.
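A minimal sketch of this screening branch is given below. It assumes the markers have already been detected as pixel coordinates, that the v axis of the image runs along the advancing direction of the channel, and that the center line is the line v = center_v; all names are illustrative and are not defined by the patent.

    def screen_two_on_centerline(markers, center_v, tol=2.0):
        """Screen target markers when two markers lie on the image center line.

        markers: list of (u, v) pixel centers of the detected positioning markers,
            where v is the coordinate along the advancing direction of the channel.
        center_v: v coordinate of the center line of the target image.
        Returns the two markers on the center line plus, for each of them, the two
        remaining markers closest to it (markers 31-36 in the example of FIG. 3),
        or None if this branch does not apply.
        """
        on_line = [m for m in markers if abs(m[1] - center_v) <= tol]
        if len(on_line) != 2:
            return None  # fall back to the other screening branches
        rest = [m for m in markers if m not in on_line]
        targets = list(on_line)
        for anchor in on_line:
            # the two remaining markers nearest to this center-line marker
            nearest = sorted(rest, key=lambda m: (m[0] - anchor[0]) ** 2 + (m[1] - anchor[1]) ** 2)[:2]
            targets.extend(nearest)
        return targets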
After the target positioning markers have been obtained by screening, the robot is positioned by the first positioning module 2202, that is, the first positioning module is configured to:
dividing the at least two target positioning markers into at least one positioning marker group, wherein each positioning marker group comprises two target positioning markers which are respectively distributed above two sides of the channel;
for each positioning mark group, determining the midpoint coordinate of the connecting line of the two target positioning marks in the positioning mark group and the angle information of the connecting line of the two target positioning marks in the positioning mark group based on the position coordinates of the two target positioning marks in the positioning mark group in an image coordinate system corresponding to the target image;
Determining a position coordinate of the robot corresponding to a first coordinate axis and a position coordinate relative to a second coordinate axis in the image coordinate system based on the midpoint coordinates corresponding to each positioning mark group; the first coordinate axis is perpendicular to the second coordinate axis, and the second coordinate axis is parallel to the advancing direction of the channel;
and determining the movement direction of the robot in the image coordinate system based on the angle information corresponding to each positioning mark group.
The two positioning markers included in the positioning marker group are symmetrically distributed above two sides of the channel.
For each positioning marker group, the midpoint coordinates of the line connecting its two markers and the angle of that line in the image coordinate system are determined. The midpoint coordinates are the position coordinates of the robot in the image coordinate system as determined by that positioning marker group. The angle information is the movement angle of the robot in the image coordinate system as determined by that group, that is, the movement direction of the robot in the image coordinate system.
The midpoint coordinates include a position coordinate of a midpoint of the connecting line corresponding to the first coordinate axis in the image coordinate system and a position coordinate of a midpoint of the connecting line corresponding to the second coordinate axis in the image coordinate system. In the specific calculation, a first average value of position coordinates of each midpoint relative to a first coordinate axis and a second average value of position coordinates of each midpoint relative to a second coordinate axis can be calculated, wherein the first average value is used as the position coordinates of the robot corresponding to the first coordinate axis in the image coordinate system, and the second average value is used as the position coordinates of the robot corresponding to the second coordinate axis in the image coordinate system.
The average value of the angles corresponding to the angle information can be calculated, and the obtained average value is used as the moving direction of the robot in the image coordinate system.
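The computation described above can be summarised in a short sketch. It assumes each group holds the image coordinates of a symmetric marker pair and that a simple arithmetic mean of the angles is adequate because the angles of the different groups are close to one another; the names are illustrative, not from the patent.

    import math

    def pose_from_marker_groups(groups):
        """Average the midpoints and line angles of the marker groups.

        groups: list of ((u1, v1), (u2, v2)) pairs, each pair being two target
            positioning markers on opposite sides of the channel.
        Returns (u, v, phi): the robot position in the image coordinate system and
        the averaged angle of the connecting lines; since each pair is perpendicular
        to the channel, the movement direction follows from phi (e.g. phi + pi/2).
        """
        mid_u, mid_v, angles = [], [], []
        for (u1, v1), (u2, v2) in groups:
            mid_u.append((u1 + u2) / 2.0)                 # midpoint of the connecting line
            mid_v.append((v1 + v2) / 2.0)
            angles.append(math.atan2(v2 - v1, u2 - u1))   # angle of the connecting line
        n = len(groups)
        return sum(mid_u) / n, sum(mid_v) / n, sum(angles) / n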
In addition, when there is a positioning marker that does not belong to any positioning marker group among the screened target positioning markers, the first positioning module 2202 may determine the position coordinates of the robot in the image coordinate system corresponding to the second coordinate axis by:
determining the position coordinates of the robot corresponding to the second coordinate axis in the image coordinate system based on the coordinate information of each target positioning marker corresponding to the second coordinate axis in the image coordinate system; wherein the second coordinate axis is parallel to the advancing direction of the channel. Specifically, a mean value of coordinates of each target positioning marker corresponding to the second coordinate axis in the image coordinate system may be calculated, and the obtained mean value is used as a position coordinate of the robot corresponding to the second coordinate axis in the image coordinate system.
It should be noted that, when the positioning and screening module 2201 screens the target positioning markers, only two positioning markers located on the middle line of the target image may be selected, and the position coordinates of the robot corresponding to the first coordinate axis in the image coordinate system and the position coordinates of the robot corresponding to the second coordinate axis in the image coordinate system may be determined by using the two target positioning markers on the middle line.
In addition, if more than two target positioning markers are selected, the position coordinates of the robot corresponding to the second coordinate axis in the image coordinate system can be determined by using only the two target positioning markers on the middle line.
In some embodiments, the location filtering module 2201 may be further specifically configured to:
if no positioning marker exists on the midline of the target image, acquiring two candidate positioning markers closest to the midline; and if the two obtained candidate positioning markers are positioned on the same side of the central line, taking the two positioning markers which are in the target image and are closest to each obtained candidate positioning marker respectively and the two candidate positioning markers as target positioning markers.
Specifically, as shown in fig. 4, if there is no positioning marker on the centerline 41 of the target image, the candidate positioning marker 42 and the candidate positioning marker 43 closest to the centerline 41 need to be obtained at this time, and if the candidate positioning marker 42 and the candidate positioning marker 43 are located on the same side of the centerline 41, the candidate positioning marker 42 and the candidate positioning marker 43 may be used as the target positioning markers at this time.
In order to improve the positioning accuracy, as shown in fig. 4, other target positioning markers may be selected from other positioning markers in the target image, so as to realize the positioning of the robot together with the candidate positioning markers 42 and 43. Specifically, a positioning marker 44 and a positioning marker 45 closest to the candidate positioning marker 42, and a positioning marker 46 and a positioning marker 47 closest to the candidate positioning marker 43 may be both set as target positioning markers.
After the target positioning markers are determined, the position coordinates and the movement direction of the robot under the image coordinate system can be determined by the same method as above, and the description thereof will be omitted.
It should be noted that, when the target positioning markers are screened, only two positioning markers closest to the center line of the target image may be selected, and the position coordinates of the robot corresponding to the first coordinate axis in the image coordinate system and the position coordinates of the robot corresponding to the second coordinate axis in the image coordinate system may be determined by using the two selected target positioning markers.
In addition, if more than two target positioning markers are selected, the position coordinates of the robot corresponding to the second coordinate axis in the image coordinate system can be determined by using only the two positioning markers closest to the center line of the target image.
The above methods for screening the target positioning markers are suitable for scenes in which the robot's movement direction differs only slightly from the advancing direction of the channel. If the difference between the robot's movement direction and the channel direction is large, two positioning markers symmetrically distributed above the two sides of the channel will no longer lie on the center line of the target image at the same time, nor on the same side of it at the same time. In this case the positioning and screening module 2201 can screen the target positioning markers from the target image using the following steps:
if one positioning marker exists on the center line of the target image, acquiring a first positioning marker closest to the center line; screening a second positioning marker closest to the first positioning marker; screening a third positioning marker from the remaining positioning markers in the target image based on the positioning marker on the center line, the first positioning marker, the second positioning marker, and a preset shape used for marker screening; and taking the positioning marker on the center line, the first positioning marker, the second positioning marker, and the third positioning marker as target positioning markers.
Specifically, as shown in fig. 5, only the positioning marker 52 lies on the center line 51 of the target image. The positioning marker 53 closest to the center line 51 is acquired as the first positioning marker, and the positioning marker 54 closest to marker 53 is then screened as the second positioning marker. With three markers determined, the positioning marker 55 is screened on the principle that the lines connecting the four markers should form a rectangle as closely as possible, that is, the preset shape, and finally the positioning markers 52, 53, 54, and 55 are taken as the target positioning markers.
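One way to realise the preset-shape screening just described is to predict where the fourth corner of the rectangle should fall and take the remaining marker nearest to that prediction. The sketch below is an illustrative assumption of this idea, not the patent's exact procedure.

    def pick_fourth_marker(on_line, first, second, remaining):
        """Choose the marker that best completes a rectangle with the other three.

        on_line: (u, v) of the marker on the center line (marker 52 in FIG. 5).
        first: marker nearest the center line (marker 53).
        second: marker nearest to `first` (marker 54).
        remaining: the other detected markers as (u, v) coordinates.
        A rectangle places the fourth corner at on_line + (second - first); the
        candidate closest to that prediction (marker 55) is returned.
        """
        pu = on_line[0] + (second[0] - first[0])
        pv = on_line[1] + (second[1] - first[1])
        return min(remaining, key=lambda m: (m[0] - pu) ** 2 + (m[1] - pv) ** 2)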
In addition, the target positioning markers can be screened based on the principle that a rectangle formed by connecting lines of the positioning markers is positioned in the advancing direction of the robot.
After the target positioning markers are determined, the position coordinates and the movement direction of the robot under the image coordinate system can be determined by the same method as above, and the description thereof will be omitted.
In special cases, if lighting or other factors prevent four or more positioning markers from being detected, the robot can be positioned using just two positioning markers symmetrically distributed above the two sides of the channel. Specifically, the positioning and screening module 2201 may screen the target positioning markers from the target image using the following step:
And screening two target positioning markers which are positioned on the middle line of the target image or are closest to the middle line of the target image from the positioning markers.
The two target positioning markers obtained by this screening are not required to both lie on the center line: one may lie on the center line while the other is merely the marker closest to it, or the two may lie on opposite sides of the center line.
The positioning markers in the above embodiments may be set to be circular, rectangular, or the like, and may be set to be red, yellow, or the like in order to avoid interference by other objects. The specific shape and color of the positioning marker can be set according to the specific application scene.
Before the positioning operation, the positioning markers must first be detected, after which positioning is performed using the centers of the markers. For example, when the markers are circular, Sobel edge detection is performed first and a Hough circle transform is then used to determine the center of each marker. Circular markers have the advantage that, even if a marker is partly damaged, it can still be detected from the remaining portion of its edge, so missed detections are avoided; circles are also more robust to interference.
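A rough OpenCV sketch of the Sobel plus Hough-circle detection mentioned above is shown below; the parameter values are assumptions that would need tuning to the actual camera, marker size, and mounting height.

    import cv2

    def detect_marker_centers(image_bgr):
        """Detect circular positioning markers and return their pixel centers."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (5, 5), 1.5)

        # Sobel gradient magnitude highlights marker edges even when the
        # marker outline is partially broken.
        gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
        gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
        edges = cv2.convertScaleAbs(cv2.magnitude(gx, gy))

        # Hough circle transform on the edge image recovers the circle centers.
        circles = cv2.HoughCircles(edges, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                                   param1=120, param2=30, minRadius=8, maxRadius=60)
        if circles is None:
            return []
        return [(float(u), float(v)) for u, v, _r in circles[0]]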
The positioning markers in the above embodiments are uniformly arranged above the two sides of the channel on which the robot travels and are symmetrical in pairs along the cross-section of the channel; the markers therefore form one or more overhead lanes that match the ground channel along which the robot travels.
The first camera 210 shoots directly upward, so only the positioning markers within a certain range above the robot are captured. When more positioning markers are captured, the markers far from the robot can be discarded, which improves the positioning confidence.
In addition, the positioning markers need not be arranged in lanes; they may instead be arranged uniformly over the roof, and the robot is then positioned from the position information of two or more target markers captured by the first camera 210. In this case each positioning marker needs to carry its position in the geographic coordinate system, so that the robot can be positioned from the geographic positions of the target positioning markers.
In some embodiments, the robot may further include a second camera 240, where the second camera 240 is configured to capture a plurality of auxiliary positioning markers located under the robot.
The processor 220 further includes a correction module 2204, where the correction module 2204 is configured to determine corrected position coordinates for the robot based on the position coordinates of each auxiliary positioning marker in the geographic coordinate system, and to correct the position coordinates of the robot in the geographic coordinate system using the corrected position coordinates derived from the ground auxiliary positioning markers.
The auxiliary positioning marker may be a two-dimensional code arranged on the ground. After the ground code is photographed it is decoded to obtain its coordinates in the geographic coordinate system; the corrected position coordinates of the robot are determined from the decoded coordinates, and finally the robot's position coordinates in the geographic coordinate system are corrected using the corrected position coordinates.
The above-mentioned operations performed by the processor may be performed by a peripheral processor, which acquires images acquired by the first camera 210 and the second camera 240, determines the position coordinates and the movement direction of the robot in the geographic coordinate system according to the acquired images, and drives the robot to move by using the action mechanism 230 of the robot.
Corresponding to the robot, the embodiment of the application also provides a robot positioning method which is applied to the robot and can achieve the same or similar beneficial effects. Specifically, as shown in fig. 6, the robot positioning method provided by the application may include the following steps:
s610, acquiring target images of a plurality of positioning markers, which are shot by the robot and are positioned above a channel on which the robot runs.
S620, screening at least two target positioning markers for positioning the position of the robot from the target image based on the central line of the target image; the target positioning markers obtained through screening are respectively distributed above two sides of the channel.
S630, determining the position coordinates and the movement direction of the robot in the image coordinate system based on the position coordinates of the at least two target positioning markers in the image coordinate system corresponding to the target image.
S640, determining the position coordinates and the movement direction of the robot in the geographic coordinate system based on the position coordinates and the movement direction of the robot in the image coordinate system.
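Putting the steps together, the method can be summarised by the following skeleton, in which the four callables are stand-ins for the detection, screening, image-coordinate positioning, and coordinate-conversion steps sketched earlier; they are illustrative assumptions rather than functions defined by the patent.

    def locate_robot(target_image, center_v, detect, screen, locate_in_image, to_geographic):
        """Skeleton of steps S610-S640 of the positioning method (illustrative only)."""
        markers = detect(target_image)          # S610: detect the overhead positioning markers
        targets = screen(markers, center_v)     # S620: screen them with the image center line
        u, v, theta = locate_in_image(targets)  # S630: robot pose in image coordinates
        return to_geographic(u, v, theta)       # S640: robot pose in geographic coordinates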
The embodiments of the application also provide a computer program product corresponding to the above method and system. It comprises a computer-readable storage medium storing program code, and the instructions in the program code can be used to execute the method of the preceding method embodiment; for details, reference may be made to the method embodiment, which is not repeated here.
The description of each embodiment above emphasises its differences from the other embodiments; for the parts that are the same or similar, the embodiments may be referred to one another, and the shared content is not repeated here for brevity.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system and apparatus may refer to corresponding procedures in the method embodiments, and are not repeated in the present disclosure. In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The above-described apparatus embodiments are merely illustrative, and the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, and for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some communication interface, indirect coupling or communication connection of devices or modules, electrical, mechanical, or other form.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical units, may be located in one place, or may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk, etc.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily appreciate variations or alternatives within the scope of the present application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (12)

1. A robot positioning system, comprising: a robot and a plurality of positioning markers symmetrically distributed above the two sides of a channel on which the robot runs, wherein the positioning markers above either side of the channel are uniformly distributed;
when the robot runs on the channel, shooting a target image comprising a plurality of positioning markers, and determining the position coordinates and the movement direction of the robot in a geographic coordinate system based on the position coordinates of at least two target positioning markers in the target image; wherein the target positioning markers are respectively distributed above two sides of the channel;
the target positioning markers comprise at least one pair of positioning markers arranged in pairs, and the line connecting each pair of target positioning markers is perpendicular to the center line of the channel;
when the robot runs on the channel, a plurality of auxiliary positioning markers located below the robot are photographed; corrected position coordinates for the robot are determined based on the position coordinates of each auxiliary positioning marker in the geographic coordinate system; and the position coordinates of the robot in the geographic coordinate system are corrected using the corrected position coordinates derived from the ground auxiliary positioning markers.
2. The robotic positioning system of claim 1, wherein the positioning markers are distributed on the roof above both sides of the channel.
3. A robot, characterized by comprising a first camera, a processor, and an action mechanism; the processor comprises a positioning and screening module, a first positioning module, and a second positioning module;
the first camera is arranged to shoot target images of a plurality of positioning markers above the robot;
the positioning and screening module is used for screening at least two target positioning markers for positioning the robot from the target image based on the central line of the target image; the target positioning markers obtained through screening are respectively distributed above two sides of a channel on which the robot runs; the target positioning markers comprise at least one pair of positioning markers which are arranged in pairs, and the connecting line between the target positioning markers which are arranged in pairs is perpendicular to the central line of the channel;
The first positioning module is configured to determine a position coordinate and a movement direction of the robot in an image coordinate system corresponding to the target image based on position coordinates of the at least two target positioning markers in the image coordinate system;
the second positioning module is used for determining the position coordinates and the movement direction of the robot in the geographic coordinate system based on the position coordinates and the movement direction of the robot in the image coordinate system;
the action mechanism is arranged to drive the robot to move based on the position coordinates and the movement direction of the robot in the geographic coordinate system;
the robot further comprises a second camera, wherein the second camera is arranged to shoot a plurality of auxiliary positioning markers located below the robot;
the processor further comprises a correction module, wherein the correction module is configured to determine corrected position coordinates for the robot based on the position coordinates of each auxiliary positioning marker in the geographic coordinate system, and to correct the position coordinates of the robot in the geographic coordinate system using the corrected position coordinates derived from the ground auxiliary positioning markers.
4. A robot according to claim 3, wherein the positioning and screening module is specifically configured to, if two positioning markers are detected on a midline of the target image, take two positioning markers in the target image, which are closest to each positioning marker on the midline, respectively, and the positioning marker on the midline as target positioning markers.
5. The robot of claim 3, wherein the positioning and screening module is specifically configured to,
if no positioning marker exists on the midline of the target image, acquiring two candidate positioning markers closest to the midline;
and if the two obtained candidate positioning markers are positioned on the same side of the central line, taking the two positioning markers which are in the target image and are closest to each obtained candidate positioning marker respectively and the two candidate positioning markers as target positioning markers.
6. The robot of claim 3, wherein the positioning and screening module is specifically configured to,
if a positioning marker exists on the midline of the target image, acquiring a first positioning marker closest to the midline;
Screening a second positioning marker closest to the first positioning marker;
screening a third positioning marker from the rest positioning markers in the target image based on one positioning marker, the first positioning marker, the second positioning marker and a preset shape for marker screening on the middle line;
and taking one positioning marker, the first positioning marker, the second positioning marker and the third positioning marker on the middle line as target positioning markers.
7. The robot of claim 3, wherein the positioning and screening module is specifically configured to,
and screening two target positioning markers which are positioned on the middle line of the target image or are closest to the middle line of the target image from the positioning markers.
8. A robot according to claim 3, wherein the positioning markers are symmetrically distributed over both sides of the channel and the positioning markers over either side of the channel are evenly arranged.
9. The robot of claim 3, wherein the first positioning module is specifically configured to,
divide the at least two target positioning markers into at least one positioning marker group, wherein each positioning marker group comprises two target positioning markers respectively distributed above the two sides of the channel;
for each positioning marker group, determine, based on the position coordinates of the two target positioning markers in the group in an image coordinate system corresponding to the target image, the midpoint coordinate of the line connecting the two target positioning markers and the angle information of that connecting line;
determine, based on the midpoint coordinates corresponding to each positioning marker group, the position coordinates of the robot corresponding to a first coordinate axis and to a second coordinate axis in the image coordinate system, wherein the first coordinate axis is perpendicular to the second coordinate axis and the second coordinate axis is parallel to the advancing direction of the channel;
and determine the movement direction of the robot in the image coordinate system based on the angle information corresponding to each positioning marker group.
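
A minimal sketch of the per-group computation in claim 9: the midpoint of each marker pair gives the position components, and the angle of the connecting line gives the heading. The naive averaging over groups and the 90-degree offset between the connecting line and the direction of travel are assumptions for illustration only.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]   # marker position in the image coordinate system
Group = Tuple[Point, Point]   # one positioning-marker group (markers above the two channel sides)


def pose_from_groups(groups: List[Group]) -> Tuple[float, float, float]:
    """Return (x, y, heading) of the robot in the image coordinate system.

    x is taken along the first axis (perpendicular to the channel), y along the
    second axis (parallel to the channel's advancing direction), and heading is
    the movement direction in radians.
    """
    midpoints, angles = [], []
    for a, b in groups:
        midpoints.append(((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0))
        angles.append(math.atan2(b[1] - a[1], b[0] - a[0]))  # angle of the connecting line

    # position: average of the per-group midpoints
    x = sum(m[0] for m in midpoints) / len(midpoints)
    y = sum(m[1] for m in midpoints) / len(midpoints)

    # heading: the connecting line is perpendicular to the channel, so the direction
    # of travel is that angle rotated by 90 degrees (sign convention is an assumption)
    heading = sum(angles) / len(angles) + math.pi / 2.0
    return x, y, heading


# Example: two groups of paired markers straddling the channel
# pose_from_groups([((100.0, 40.0), (540.0, 44.0)), ((96.0, 300.0), (538.0, 306.0))])
```
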
10. The robot of claim 3, wherein the first positioning module is specifically configured to,
determine the position coordinate of the robot corresponding to the second coordinate axis in the image coordinate system based on the coordinate information of each target positioning marker corresponding to the second coordinate axis, wherein the second coordinate axis is parallel to the advancing direction of the channel.
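
For claim 10, one plausible (but not claimed) realisation is simply to average the target markers' components along the second axis:

```python
from typing import List, Tuple


def second_axis_coordinate(targets: List[Tuple[float, float]], axis: int = 1) -> float:
    """Robot coordinate along the second (travel-parallel) image axis, taken here
    as the mean of the target markers' components on that axis (an assumption)."""
    return sum(p[axis] for p in targets) / len(targets)
```
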
11. A robot positioning method, comprising:
acquiring a target image, captured by a robot, of a plurality of positioning markers located above a channel on which the robot travels;
screening, from the target image and based on the midline of the target image, at least two target positioning markers for positioning the robot; the screened target positioning markers are respectively distributed above the two sides of the channel; the target positioning markers comprise at least one pair of positioning markers arranged in pairs, and the connecting line between the paired target positioning markers is perpendicular to the center line of the channel;
determining the position coordinates and the movement direction of the robot in an image coordinate system corresponding to the target image based on the position coordinates of the at least two target positioning markers in the image coordinate system;
determining the position coordinates and the movement direction of the robot in a geographic coordinate system based on the position coordinates and the movement direction of the robot in the image coordinate system;
capturing a plurality of auxiliary positioning markers located below the robot;
determining corrected position coordinates corresponding to the robot based on the position coordinates of each auxiliary positioning marker in the geographic coordinate system; and correcting, based on the corrected position coordinates, the position coordinates of the robot in the geographic coordinate system based on ground-auxiliary-positioning-marker navigation.
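
The mapping from the image coordinate system to the geographic coordinate system in claim 11 is not tied to a particular transform by the claim language. The sketch below assumes a similarity transform (scale, rotation, translation) whose parameters would be derived from the known geographic positions of the detected ceiling markers; the parameter names are illustrative.

```python
import math
from typing import Tuple


def image_pose_to_geo(pose_img: Tuple[float, float, float],
                      scale: float,
                      rotation: float,
                      translation: Tuple[float, float]) -> Tuple[float, float, float]:
    """Map (x, y, heading) from the image coordinate system to the geographic
    coordinate system using an assumed similarity transform."""
    x, y, heading = pose_img
    c, s = math.cos(rotation), math.sin(rotation)
    gx = scale * (c * x - s * y) + translation[0]
    gy = scale * (s * x + c * y) + translation[1]
    return gx, gy, heading + rotation
```
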
12. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, performs the robot positioning method according to claim 11.
CN201911355279.4A 2019-12-25 2019-12-25 Robot positioning system and method, robot, and computer-readable storage medium Active CN113110408B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911355279.4A CN113110408B (en) 2019-12-25 2019-12-25 Robot positioning system and method, robot, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911355279.4A CN113110408B (en) 2019-12-25 2019-12-25 Robot positioning system and method, robot, and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN113110408A (en) 2021-07-13
CN113110408B (en) 2023-09-29

Family

ID=76708504

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911355279.4A Active CN113110408B (en) 2019-12-25 2019-12-25 Robot positioning system and method, robot, and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN113110408B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1598610A (en) * 2003-09-16 2005-03-23 三星电子株式会社 Apparatus and method for estimating a position and an orientation of a mobile robot
CN101452292A * 2008-12-29 2009-06-10 天津理工大学 Fisheye-lens omnidirectional vision aiming method based on a sequential dual-color dot-matrix navigation mark
CN103294059A (en) * 2013-05-21 2013-09-11 无锡普智联科高新技术有限公司 Hybrid navigation belt based mobile robot positioning system and method thereof
CN107463173A * 2017-07-31 2017-12-12 广州维绅科技有限公司 Warehouse AGV navigation method and device, computer equipment and storage medium
CN107703940A * 2017-09-25 2018-02-16 芜湖智久机器人有限公司 Navigation method based on ceiling QR codes
CN109920266A * 2019-02-20 2019-06-21 武汉理工大学 Intelligent vehicle localization method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110243360B (en) * 2018-03-08 2022-02-22 深圳市优必选科技有限公司 Method for constructing and positioning map of robot in motion area

Also Published As

Publication number Publication date
CN113110408A (en) 2021-07-13

Similar Documents

Publication Publication Date Title
EP3290592B1 (en) Method and apparatus to track a blade
US10930015B2 (en) Method and system for calibrating multiple cameras
US10198632B2 (en) Survey data processing device, survey data processing method, and survey data processing program
JP6510247B2 (en) Survey data processing apparatus, survey data processing method and program
CN111598952B (en) Multi-scale cooperative target design and online detection identification method and system
US8107687B2 (en) System and method for tracking motion of an object image
US20200101619A1 (en) Ground Mark For Spatial Positioning
CN102622767A (en) Method for positioning binocular non-calibrated space
CN110097498B (en) Multi-flight-zone image splicing and positioning method based on unmanned aerial vehicle flight path constraint
CN112991401B (en) Vehicle running track tracking method and device, electronic equipment and storage medium
CN110779395A (en) Target shooting correction system and method
US20130100281A1 (en) Method, System and Computer Program Product for Detecting an Obstacle with a Camera
CN113110408B (en) Robot positioning system and method, robot, and computer-readable storage medium
CN111596371B (en) Ferromagnetic target detection method, device and system
EP4002243A1 (en) Pickup robot, pickup method, and computer-readable storage medium
CN110738867A (en) parking space detection method, device, equipment and storage medium
CN106934832B (en) A kind of simple straight line automatic positioning method towards vision line walking
CN106023139A (en) Indoor tracking and positioning method based on multiple cameras and system
CN113031582A (en) Robot, positioning method, and computer-readable storage medium
US8620464B1 (en) Visual automated scoring system
KR101844328B1 (en) Occlusion and rotation invariant object recognition system and method in factory automation
CN111738082A (en) Identification method and device for automatically tracking and positioning fire source point based on machine vision
CN110031014B (en) Visual positioning method based on pattern recognition
CN105930801A (en) Track switch indicator image recognition method
CN113033743A (en) Positioning identifier, robot for identifying positioning identifier and positioning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant