CN112847349A - Robot walking control method and device - Google Patents

Robot walking control method and device

Info

Publication number
CN112847349A
Authority
CN
China
Prior art keywords
robot
dimensional code
code pattern
rectangular frame
target image
Prior art date
Legal status
Granted
Application number
CN202011615957.9A
Other languages
Chinese (zh)
Other versions
CN112847349B (en
Inventor
陈海波
张仲璐
Current Assignee
Shenlan Robot Shanghai Co ltd
Original Assignee
Deep Blue Technology Shanghai Co Ltd
Priority date
Filing date
Publication date
Application filed by Deep Blue Technology Shanghai Co Ltd filed Critical Deep Blue Technology Shanghai Co Ltd
Priority to CN202011615957.9A priority Critical patent/CN112847349B/en
Publication of CN112847349A publication Critical patent/CN112847349A/en
Application granted granted Critical
Publication of CN112847349B publication Critical patent/CN112847349B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B25J9/1664: Programme controls characterised by programming, planning systems for manipulators, characterised by motion, path, trajectory planning
    • B25J9/1697: Vision controlled systems
    • G01C21/20: Instruments for performing navigational calculations
    • G06K17/0025: Arrangements or provisions for transferring data to distant stations, the arrangement consisting of a wireless interrogation device in combination with a device for optically marking the record carrier


Abstract

The application relates to the technical field of artificial intelligence and provides a robot walking control method comprising the following steps: collecting a target image through a camera; confirming that a two-dimensional code pattern exists in the target image, and acquiring the two-dimensional code pattern; determining position information and direction information of the robot based on the two-dimensional code pattern; and determining the walking direction of the robot based on a preset track of the robot, the position information and the direction information. In the robot walking control method and device, the camera collects a target image in the field, the position information and direction information of the robot are determined from the two-dimensional code pattern in the target image, and the walking direction of the robot is adjusted accordingly. Positioning the robot with low-cost two-dimensional codes reduces cost, improves the accuracy of the positioning result, and improves efficiency.

Description

Robot walking control method and device
Technical Field
The application relates to the technical field of artificial intelligence, in particular to a robot walking control method and device.
Background
With the development of artificial intelligence technology, robots are increasingly used in a wide variety of scenes. While walking, a robot must adjust its direction in time to avoid deviating from the planned path, so the robot needs to be positioned during walking.
At present, robots are positioned mainly by schemes such as laser radar, inertial navigation and ground Radio Frequency Identification (RFID). However, these schemes impose requirements on the scene, some of their components are costly, and the positioning results are not accurate enough.
Disclosure of Invention
The application provides a robot walking control method and device, so that cost is reduced, accuracy of a positioning result is improved, and efficiency is improved.
The application provides a robot walking control method, which comprises the following steps: collecting a target image through a camera; confirming that a two-dimensional code pattern exists in the target image, and acquiring the two-dimensional code pattern; determining position information and direction information of the robot based on the two-dimensional code pattern; and determining the walking direction of the robot based on the preset track of the robot, the position information and the direction information.
According to the robot walking control method provided by the application, the step of confirming that the two-dimensional code pattern exists in the target image comprises the following steps: preprocessing the target image to obtain a reference image; and extracting contour features from the reference image, and if the number of the contour features is more than or equal to three, determining that the two-dimensional code pattern exists in the reference image.
According to the robot walking control method provided by the application, the preprocessing of the target image comprises the following steps: and carrying out graying processing on the target image.
According to the robot walking control method provided by the application, the step of acquiring the two-dimensional code pattern comprises the following steps: drawing a plurality of rectangular frames based on the outline characteristics and generating a minimum rectangular frame surrounding the rectangular frames; and determining the two-dimensional code pattern based on the minimum rectangular frame.
According to the robot walking control method provided by the application, the drawing a plurality of rectangular frames based on the outline features and generating a minimum rectangular frame surrounding the plurality of rectangular frames comprises the following steps: generating a first rectangular frame, a second rectangular frame and a third rectangular frame based on the outline features; generating a fourth rectangular frame based on the first rectangular frame, the second rectangular frame, and the third rectangular frame; generating a minimum rectangular frame surrounding the first, second, third, and fourth rectangular frames based on vertices of the first, second, third, and fourth rectangular frames.
According to the robot walking control method provided by the application, the robot walking control method further comprises the following steps: and confirming that the two-dimension code pattern does not exist in the target image, and controlling the robot to continue moving along the original walking direction until the two-dimension code pattern is confirmed to exist in the target image.
According to the robot walking control method provided by the application, the robot walking control method further comprises the following steps: and if the robot cannot detect the two-dimensional code pattern after moving the target time threshold along the original walking direction, controlling the robot to return along the original path until the two-dimensional code pattern in the target image is confirmed.
According to the robot walking control method provided by the application, the two-dimensional code pattern is obtained by the camera shooting a two-dimensional code made of visible-light stealth paint.
The application further provides a robot walking control device, which includes: the acquisition module is used for acquiring a target image through the camera; the confirming module is used for confirming that a two-dimensional code pattern exists in the target image and acquiring the two-dimensional code pattern; the first determining module is used for determining the position information and the direction information of the robot based on the two-dimensional code pattern; and the second determining module is used for determining the walking direction of the robot based on the preset track of the robot, the position information and the direction information.
According to the robot walking control device provided by the application, the confirming module includes: the first confirming submodule is used for preprocessing the target image to obtain a reference image; and the second confirming submodule is used for extracting the contour features from the reference image, and if the number of the contour features is greater than or equal to three, confirming that the two-dimensional code pattern exists in the reference image.
According to the robot walking control device provided by the application, the confirming module further includes: the first obtaining submodule is used for drawing a plurality of rectangular frames based on the contour features and generating a minimum rectangular frame surrounding the rectangular frames; and the second obtaining submodule is used for determining the two-dimensional code pattern based on the minimum rectangular frame.
The application also provides an electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein when the processor executes the computer program, the steps of the robot walking control method according to any one of the above are implemented.
The present application also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the robot walking control method as any one of the above.
According to the robot walking control method and device, the camera collects a target image in the field, the position information and direction information of the robot are determined from the two-dimensional code pattern in the target image, and the walking direction of the robot is adjusted accordingly. Positioning the robot with low-cost two-dimensional codes reduces cost, improves the accuracy of the positioning result, and improves efficiency.
Drawings
In order to more clearly illustrate the technical solutions in the present application or the prior art, the drawings needed for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic flow chart of a robot walking control method provided by the present application;
fig. 2 is a schematic flowchart of an embodiment of step 200 in the robot walking control method provided in the present application;
fig. 3 is a second schematic flowchart of an embodiment of step 200 in the robot walking control method provided in the present application;
fig. 4 is a schematic flowchart of an embodiment of step 201 in the robot walking control method provided in the present application;
fig. 5 is a schematic diagram of two-dimensional code arrangement in a field in the robot walking control method provided by the present application;
fig. 6 is a block diagram of a process for confirming whether a two-dimensional code pattern exists in a target image in the robot walking control method provided by the present application;
FIG. 7 is a schematic structural diagram of a robot walking control device provided by the present application;
FIG. 8 is a schematic diagram of a confirmation module of the robot walking control device provided in the present application;
fig. 9 is a second schematic structural diagram of a confirmation module of the robot walking control device provided in the present application;
FIG. 10 is a schematic diagram of a first acquisition submodule of the robot walking control device provided in the present application;
fig. 11 is a schematic structural diagram of an electronic device provided in the present application.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the drawings in the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The robot walking control method and device of the present application are described below with reference to fig. 1 to 11.
As shown in fig. 1, an embodiment of the present application provides a robot walking control method, including: step 100-step 400 as follows.
And step 100, acquiring a target image through a camera.
It can be understood that the robot is provided with a camera and can walk in a field. The camera may face upward or downward, and two-dimensional codes may be arranged at certain positions on the ground or the top surface of the field. Each two-dimensional code contains position information and direction information about the field; that is to say, by identifying a two-dimensional code, the robot can learn the position in the field where that code is located and the direction in which the code itself is set.
The camera photographs the site environment in real time to obtain target images. Because the two-dimensional codes are arranged only at fixed positions, a two-dimensional code pattern is not always present in the target image captured by the camera.
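As a sketch of how such a code's payload might be organized, suppose each code stores its field coordinates and mounting direction as a short key-value string. The `x=...;y=...;theta=...` format, the field names and the units below are illustrative assumptions, not taken from the patent:

```python
def parse_code_payload(payload: str) -> dict:
    """Split a decoded two-dimensional-code string into the position
    where the code is located and the direction the code is set at."""
    fields = dict(item.split("=") for item in payload.split(";"))
    return {
        "x": float(fields["x"]),          # field x-coordinate of the code
        "y": float(fields["y"]),          # field y-coordinate of the code
        "theta": float(fields["theta"]),  # mounting direction, degrees
    }
```

Any encoding that carries the same two pieces of information would serve equally well.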
And 200, confirming that the two-dimensional code pattern exists in the target image, and acquiring the two-dimensional code pattern.
It can be understood that the target image may be detected to determine whether a two-dimensional code pattern exists in it. If a two-dimensional code pattern exists in the target image, the pattern may be obtained; for example, the target image may be cropped using the contour of the two-dimensional code pattern as the boundary, removing the background pattern outside the contour and retaining the two-dimensional code pattern inside it.
And step 300, determining the position information and the direction information of the robot based on the two-dimensional code pattern.
It is understood that the information contained in the two-dimensional code pattern may be read from it, and this information may include the position where the two-dimensional code is located and the direction of the two-dimensional code. Because the two-dimensional code is set on the ground or the top surface of the field, when the robot walks to the position of the two-dimensional code, the position information of the robot coincides with the position where the two-dimensional code is located, so the robot's position can be obtained from the position contained in the two-dimensional code pattern. Meanwhile, because the two-dimensional code pattern captured by the camera itself carries a direction, the orientation of the robot can be determined from the orientation of the two-dimensional code pattern.
For example, if the two-dimensional code pattern photographed by the robot at a certain moment shows that the position coordinate of the two-dimensional code is (4, 5) and the three corners of the two-dimensional code pattern indicate true north, then the position information of the robot is (4, 5). The angle between the robot's travel direction and true north can be obtained from the tilt of the two-dimensional code pattern; for example, if the angle is 30 degrees toward the east, the direction information of the robot is determined to be 30 degrees east of north.
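The worked example above can be expressed as a small pose calculation. The function below is an illustrative sketch, not a formula stated in the patent; it assumes the code payload gives a reference direction in degrees and the image processing yields the pattern's tilt:

```python
def robot_pose(code_x, code_y, code_ref_deg, tilt_deg):
    """Position equals the code's position; heading is the code's
    reference direction plus the observed tilt of the pattern in the
    image.  Angles in degrees, measured from north (an assumption)."""
    heading = (code_ref_deg + tilt_deg) % 360.0
    return (code_x, code_y), heading
```

For the example in the text, a code at (4, 5) whose corners indicate true north, seen tilted 30 degrees toward the east, yields position (4, 5) and heading 30 degrees east of north.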
And step 400, determining the walking direction of the robot based on the preset track of the robot, the position information and the direction information.
It can be understood that a track may be preset for the robot; the track may be set manually or planned by the robot itself according to the task being executed, and the robot needs to travel along the preset track to complete the task. Here, the position information and direction information of the robot are determined from the two-dimensional code pattern, and whether the robot deviates from the preset track, and by how much, can be determined from that information. If there is no deviation, the traveling direction of the robot is left unchanged; if there is a deviation, the original traveling direction is corrected according to the degree of deviation, and the next traveling direction is determined accordingly.
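One plausible way to turn the deviation check into a steering command is to aim at the next waypoint of the preset track. The sketch below is an assumption about how step 400 could be realized, not the patent's stated algorithm:

```python
import math

def heading_correction(position, heading_deg, next_waypoint):
    """Degrees the robot should turn so its walking direction points at
    the next waypoint of the preset track (shortest-turn convention)."""
    dx = next_waypoint[0] - position[0]
    dy = next_waypoint[1] - position[1]
    desired = math.degrees(math.atan2(dy, dx)) % 360.0
    return (desired - heading_deg + 180.0) % 360.0 - 180.0
```

A correction of 0 means the robot is already on course and the walking direction is left unchanged.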
According to the robot walking control method provided by the embodiment of the application, the camera collects a target image in the field, the position information and direction information of the robot are determined from the two-dimensional code pattern in the target image, and the walking direction of the robot is adjusted accordingly. Positioning the robot with low-cost two-dimensional codes reduces cost, improves the accuracy of the positioning result, and improves efficiency.
As shown in fig. 2, in some embodiments, the determining that the two-dimensional code pattern exists in the target image in step 200 includes: step 210-step 220 as follows.
In step 210, the target image is preprocessed to obtain a reference image.
It can be understood that after the target image is collected by the camera, the target image may be preprocessed to obtain a reference image suitable for subsequent feature extraction. The preprocessing may include at least one of denoising, matrixing, color processing, smoothing and image enhancement; preprocessing can improve the distinguishability of key features in the target image and reduce that of irrelevant features.
And step 220, extracting contour features from the reference image, and if the number of the contour features is greater than or equal to three, determining that the two-dimensional code pattern exists in the reference image.
As shown in fig. 5, the two-dimensional codes set in the field may be conventional two-dimensional codes, such as those at points a, b, c and d in fig. 5, which are arranged with reference to point o in the field. A two-dimensional code is generally square, and a rectangular mark is placed in three of its four corner regions, such as corners 1, 2 and 3 in fig. 5. Contour features can therefore be extracted from the reference image, for example with the Canny edge detection algorithm; the contour features correspond to the rectangular marks arranged in the corner regions of the two-dimensional code. If the number of contour features is greater than or equal to three, three rectangular marks are present in the reference image, indicating that a complete two-dimensional code pattern exists in it, so it can be confirmed that the two-dimensional code pattern exists in the reference image.
If the number of contour features is less than three, no complete two-dimensional code pattern exists in the reference image.
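The contour-count test of step 220 can be sketched as follows; the squareness filter and its tolerance are illustrative assumptions layered on top of the patent's "at least three contour features" rule:

```python
def looks_like_finder_mark(w, h, tol=0.2):
    """The corner marks are roughly square, so accept a contour's
    bounding box only if its aspect ratio is near 1 (tol is a tuning
    assumption, not a value from the patent)."""
    return abs(w - h) <= tol * max(w, h)

def code_pattern_present(bounding_boxes):
    """bounding_boxes: (w, h) sizes of contours extracted from the
    reference image.  At least three square marks indicate a complete
    two-dimensional code pattern."""
    marks = [b for b in bounding_boxes if looks_like_finder_mark(*b)]
    return len(marks) >= 3
```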
In some embodiments, the preprocessing the target image in step 210 includes: and carrying out graying processing on the target image.
It can be understood that the target image may be a color image. The color of each pixel in a color image is determined by the three components R, G and B, and each component can take one of 256 values, so a single pixel has a range of roughly 16 million possible colors. By graying the target image, the color image is converted into a grayscale image, which reduces the subsequent computation. Both grayscale and color images reflect the distribution and features of the overall and local chromaticity and luminance levels of the whole image; in the application scene of two-dimensional code identification, a grayscale image does not affect the accuracy of feature extraction and can improve the detection speed.
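The graying step can be sketched with the common BT.601 luminance weights; the patent does not name a particular conversion, so the weights here are an assumption:

```python
def to_gray(r, g, b):
    """Collapse the three 8-bit colour components of one pixel into a
    single grey level using BT.601 luminance weights."""
    return int(round(0.299 * r + 0.587 * g + 0.114 * b))

def image_to_gray(rgb_rows):
    """Apply the conversion pixel by pixel to a row-major RGB image."""
    return [[to_gray(*pixel) for pixel in row] for row in rgb_rows]
```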
As shown in fig. 3 and 6, in some embodiments, the acquiring the two-dimensional code pattern in step 200 includes: step 201-step 202 as follows.
Step 201, drawing a plurality of rectangular frames based on the outline characteristics, and generating a minimum rectangular frame surrounding the plurality of rectangular frames.
It can be understood that after the contour features are identified from the reference image, a corresponding rectangular frame can be drawn for each contour feature, so a plurality of rectangular frames are drawn in the reference image. A minimum rectangular frame surrounding all of these rectangular frames can then be drawn, so that every rectangular frame drawn in the reference image lies inside the minimum rectangular frame.
Step 202, determining a two-dimensional code pattern based on the minimum rectangular frame.
It can be understood that the minimum rectangular frame is the boundary of the two-dimensional code pattern: the area outside it is the background pattern and the area inside it is the two-dimensional code pattern. The reference image can therefore be cropped using the minimum rectangular frame as the boundary, removing the background pattern; the area retained in the reference image is the two-dimensional code pattern.
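Cropping the reference image to the minimum rectangular frame is then straightforward slicing; the (x, y, w, h) rectangle convention below is an assumed representation:

```python
def crop_to_rect(image, rect):
    """Keep only the region inside the minimum rectangular frame.
    rect = (x, y, w, h) in pixels; image is a row-major grid, so the
    row index is y and the column index is x."""
    x, y, w, h = rect
    return [row[x:x + w] for row in image[y:y + h]]
```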
As shown in fig. 4 and fig. 6, in some embodiments, the step 201 of drawing a plurality of rectangular frames based on the outline features and generating a minimum rectangular frame surrounding the plurality of rectangular frames includes: step 2011-step 2013 as follows.
Step 2011 generates a first rectangular frame, a second rectangular frame, and a third rectangular frame based on the contour features.
It can be understood that, since a complete two-dimensional code pattern carries three rectangular marks, a first rectangular frame, a second rectangular frame and a third rectangular frame may be drawn from the contour features identified in the reference image; these three frames are located at three corners of the two-dimensional code pattern.
Step 2012, a fourth rectangular box is generated based on the first rectangular box, the second rectangular box and the third rectangular box.
It is understood that the fourth corner of the two-dimensional code pattern can be located from the first rectangular frame, the second rectangular frame and the third rectangular frame, and the fourth rectangular frame is generated at that corner.
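If the three marks sit at three corners of a square code, the fourth corner can be completed as a parallelogram. The sketch below assumes `a` is the corner diagonally opposite the missing one, which the patent does not spell out:

```python
def fourth_corner(a, b, c):
    """Complete the square's fourth corner from the centres of the three
    finder-mark frames: the corner opposite `a` is b + c - a."""
    return (b[0] + c[0] - a[0], b[1] + c[1] - a[1])
```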
Step 2013, based on the vertexes of the first rectangular frame, the second rectangular frame, the third rectangular frame and the fourth rectangular frame, generating a minimum rectangular frame surrounding the plurality of rectangular frames.
It is understood that the boundary area of the two-dimensional code pattern may be determined from the vertices of the first rectangular frame, the second rectangular frame, the third rectangular frame and the fourth rectangular frame. By connecting some of these vertices, the minimum rectangular frame surrounding the four rectangular frames can be generated, and the two-dimensional code pattern is then obtained from the minimum rectangular frame.
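For an axis-aligned frame, the minimum rectangular frame of step 2013 reduces to taking extremes over all vertices; this axis-aligned simplification is an assumption:

```python
def min_enclosing_rect(vertices):
    """Smallest axis-aligned rectangle covering every vertex of the four
    rectangular frames, returned as (x, y, w, h)."""
    xs = [p[0] for p in vertices]
    ys = [p[1] for p in vertices]
    return (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))
```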
In some embodiments, the robot walking control method further comprises: and confirming that the two-dimension code pattern does not exist in the target image, and controlling the robot to continue moving the target time threshold value along the original walking direction until the two-dimension code pattern exists in the target image.
It can be understood that if no two-dimensional code pattern is detected in the target image, the robot can be controlled to keep its original walking direction and continue moving forward until a two-dimensional code pattern is detected in a target image captured by the camera in real time.
In some embodiments, the robot walking control method further comprises: and if the robot still cannot detect the two-dimensional code pattern after moving the target time threshold along the original walking direction, controlling the robot to return along the original path until the two-dimensional code pattern is confirmed to exist in the target image.
It can be understood that the robot can be controlled to move along the original walking direction for a certain time, with a target time threshold set, for example 10 seconds. If, after the robot has moved for 10 seconds along the original walking direction, still no two-dimensional code pattern appears in the target image captured by the camera in real time, the robot is controlled to return along the original path until a two-dimensional code pattern appears in the target image captured by the camera in real time. This shows that the robot has returned to the position of the most recently photographed two-dimensional code, where it can be positioned again.
In other words, as long as a two-dimensional code pattern exists in a target image captured by the robot while walking, the position information and direction information can be updated from it, so the robot can be positioned accurately.
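The keep-going and back-track behaviour of the last two embodiments can be sketched as a small control loop; `detect`, `step_forward` and `step_back` are hypothetical robot hooks, and the 10-second default mirrors the patent's example:

```python
import time

def walk_until_code(detect, step_forward, step_back, timeout_s=10.0):
    """Hold the original walking direction until a code pattern is seen;
    after timeout_s seconds without one, retrace the original path until
    a code pattern appears again."""
    start = time.monotonic()
    while not detect():
        if time.monotonic() - start > timeout_s:
            while not detect():   # return along the original path
                step_back()
            return "recovered"
        step_forward()
    return "found"
```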
In some embodiments, the two-dimensional code pattern is obtained by shooting a two-dimensional code made of visible light stealth paint by the camera.
It can be understood that the two-dimensional codes set in the field can be made of visible-light stealth paint. Visible-light stealth paint does not reflect visible light, so a product made of it cannot be seen by the naked eye, but it can reflect invisible light of a specific waveband, such as ultraviolet or infrared light. The camera can correspondingly be made sensitive to invisible light of that specific waveband; that is, it can photograph such invisible light.
After the two-dimensional code is made with visible-light stealth paint, it can only be captured by the camera and cannot be seen by the naked eye; that is, the two-dimensional code can only be recognized by the robot and cannot be observed by people. This prevents the two-dimensional codes from affecting the decoration style of the scene and people's impression of the scene where the robot is located.
The following describes the robot walking control device provided in the present application, and the robot walking control device described below and the robot walking control method described above may be referred to in correspondence with each other.
As shown in fig. 7, an embodiment of the present application further provides a robot walking control device, including: an acquisition module 710, a confirmation module 720, a first determination module 730, and a second determination module 740.
The acquisition module 710 is used for acquiring a target image through a camera;
a confirming module 720, configured to confirm that a two-dimensional code pattern exists in the target image, and obtain the two-dimensional code pattern;
the first determining module 730 is configured to determine position information and direction information of the robot based on the two-dimensional code pattern.
And a second determining module 740, configured to determine a walking direction of the robot based on the preset trajectory of the robot and the position information and the direction information.
As shown in fig. 8, in some embodiments, the validation module 720 includes: a first validation submodule 810 and a second validation submodule 820.
The first confirming sub-module 810 is configured to pre-process the target image to obtain a reference image.
And the second confirming submodule 820 is configured to extract the contour features from the reference image, and confirm that the two-dimensional code pattern exists in the reference image if the number of the contour features is greater than or equal to three.
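The "contour features" counted here correspond to the finder patterns in the corners of a QR code, which produce a characteristic dark:light:dark:light:dark run-length ratio of 1:1:3:1:1 along any scan line through their center. A toy one-row detector under that assumption (the patent does not specify the extraction algorithm) might look like:

```python
def run_lengths(row):
    """Collapse a binary row (1 = dark module, 0 = light) into (value, length) runs."""
    runs = []
    for v in row:
        if runs and runs[-1][0] == v:
            runs[-1] = (v, runs[-1][1] + 1)
        else:
            runs.append((v, 1))
    return runs

def finder_candidates(row):
    """Count 1:1:3:1:1 dark/light run patterns in one scan line -- the
    signature of a QR finder pattern. A minimal stand-in for the
    contour-feature extraction described above, not the patent's method."""
    runs = run_lengths(row)
    count = 0
    for i in range(len(runs) - 4):
        window = runs[i:i + 5]
        if window[0][0] != 1:              # a candidate starts on a dark run
            continue
        unit = window[0][1]                # width of one module, estimated
        expected = [1, 1, 3, 1, 1]
        if all(abs(w[1] - e * unit) <= unit // 2
               for w, e in zip(window, expected)):
            count += 1
    return count
```

Finding three such features (one per corner finder pattern) is then the cue, as in the submodule above, that a two-dimensional code is present.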
In some embodiments, the first confirming submodule 810 is further configured to perform graying processing on the target image.
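A minimal sketch of such preprocessing, assuming the common BT.601 luma weights for graying followed by a fixed binarization threshold (the patent leaves both choices open):

```python
def to_gray(pixel):
    # ITU-R BT.601 luma weights, a common choice for graying an RGB pixel
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

def preprocess(image, threshold=128):
    """Gray then binarize an image given as rows of (R, G, B) tuples,
    yielding the reference image the contour step operates on.
    The threshold value is an illustrative default, not from the patent."""
    return [[1 if to_gray(p) >= threshold else 0 for p in row]
            for row in image]
```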
As shown in fig. 9, in some embodiments, the confirming module 720 further includes: a first obtaining submodule 910 and a second obtaining submodule 920.
The first obtaining sub-module 910 is configured to draw a plurality of rectangular frames based on the contour features, and generate a minimum rectangular frame surrounding the plurality of rectangular frames.
And a second obtaining submodule 920, configured to determine the two-dimensional code pattern based on the minimum rectangular frame.
As shown in fig. 10, in some embodiments, the first obtaining sub-module 910 includes: a first generation subunit 101, a second generation subunit 102 and a third generation subunit 103.
A first generating subunit 101, configured to generate a first rectangular frame, a second rectangular frame, and a third rectangular frame based on the contour feature.
A second generating subunit 102, configured to generate a fourth rectangular frame based on the first rectangular frame, the second rectangular frame, and the third rectangular frame.
A third generating subunit 103, configured to generate a minimum rectangular frame surrounding the first rectangular frame, the second rectangular frame, the third rectangular frame, and the fourth rectangular frame based on vertices of the first rectangular frame, the second rectangular frame, the third rectangular frame, and the fourth rectangular frame.
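One plausible reading of this construction, assuming the three drawn frames sit on three corners of the code: the fourth frame's position completes the parallelogram formed by the first three, and the enclosing frame is the axis-aligned bounding box over all vertices. The sketch below illustrates that reading only; the patent does not fix the exact geometry:

```python
def fourth_corner(a, b, c):
    """Complete the parallelogram a-b-c-d, where b is the corner shared
    by the two edges; d = a + c - b. Used to place the fourth frame."""
    return (a[0] + c[0] - b[0], a[1] + c[1] - b[1])

def min_bounding_rect(rects):
    """Axis-aligned rectangle (x0, y0, x1, y1) enclosing all given
    rectangles, each passed as a list of (x, y) vertices."""
    xs = [x for r in rects for (x, y) in r]
    ys = [y for r in rects for (x, y) in r]
    return (min(xs), min(ys), max(xs), max(ys))
```

Cropping the target image to this minimum rectangle then yields the two-dimensional code pattern that the second obtaining submodule passes on for decoding.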
In some embodiments, the confirming module 720 is further configured to, upon confirming that no two-dimensional code pattern exists in the target image, control the robot to continue moving along its original walking direction until a two-dimensional code pattern is confirmed to exist in the target image.
In some embodiments, the confirming module 720 is further configured to, if no two-dimensional code pattern has been detected after the robot has moved along the original walking direction for a target time threshold, control the robot to return along its original path until a two-dimensional code pattern is confirmed to exist in the target image.
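The recovery behavior in these two embodiments amounts to a small decision rule: follow the trajectory while a code is visible, press on for a bounded time after losing it, then backtrack. A sketch, with hypothetical command names:

```python
def recovery_action(code_visible, time_without_code, target_time_threshold):
    """Choose the motion command when the two-dimensional code may be lost,
    per the behavior described above. The three command strings and all
    parameter names are illustrative, not taken from the patent."""
    if code_visible:
        return "follow_trajectory"       # normal operation
    if time_without_code < target_time_threshold:
        return "continue_forward"        # keep the original walking direction
    return "return_along_path"           # backtrack until a code reappears
```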
In some embodiments, the two-dimensional code pattern is obtained by shooting a two-dimensional code made of visible light stealth paint by a camera.
The robot walking control device provided in the embodiment of the present application is used to execute the robot walking control method. Its implementation is consistent with that of the robot walking control method provided in the present application, the same beneficial effects can be achieved, and details are not repeated here.
Fig. 11 illustrates a physical structure diagram of an electronic device. As shown in fig. 11, the electronic device may include: a processor (processor) 111, a communication interface (communication interface) 112, a memory (memory) 113, and a communication bus 114, wherein the processor 111, the communication interface 112, and the memory 113 communicate with one another through the communication bus 114. The processor 111 may call logic instructions in the memory 113 to perform a robot walking control method, the method comprising: collecting a target image through a camera; confirming that a two-dimensional code pattern exists in the target image, and acquiring the two-dimensional code pattern; determining position information and direction information of the robot based on the two-dimensional code pattern; and determining the walking direction of the robot based on the preset track of the robot, the position information, and the direction information.
In addition, the logic instructions in the memory 113 may be implemented in the form of software functional units and, when sold or used as an independent product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, or the portion thereof that contributes to the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The processor 111 in the electronic device provided in the embodiment of the present application may call the logic instruction in the memory 113 to implement the robot walking control method, and an implementation manner of the robot walking control method is consistent with that of the robot walking control method provided in the present application, and the same beneficial effects may be achieved, and details are not described here.
On the other hand, the present application further provides a computer program product, which is described below; the computer program product described below and the robot walking control method described above may be referred to correspondingly.
The computer program product comprises a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform a robot walking control method provided by the above methods, the method comprising: collecting a target image through a camera; confirming that a two-dimensional code pattern exists in the target image, and acquiring the two-dimensional code pattern; determining position information and direction information of the robot based on the two-dimensional code pattern; and determining the walking direction of the robot based on the preset track of the robot, the position information and the direction information.
When the computer program product provided by the embodiment of the present application is executed, the robot walking control method is implemented, and an implementation manner of the method is consistent with that of the robot walking control method provided by the present application, and the same beneficial effects can be achieved, and details are not repeated here.
In yet another aspect, the present application further provides a non-transitory computer-readable storage medium, which is described below; the non-transitory computer-readable storage medium described below and the robot walking control method described above may be referred to correspondingly.
The present application also provides a non-transitory computer-readable storage medium having stored thereon a computer program, which when executed by a processor, is implemented to perform the robot walking control method provided above, the method including: collecting a target image through a camera; confirming that a two-dimensional code pattern exists in the target image, and acquiring the two-dimensional code pattern; determining position information and direction information of the robot based on the two-dimensional code pattern; and determining the walking direction of the robot based on the preset track of the robot, the position information and the direction information.
When the computer program stored on the non-transitory computer readable storage medium provided in the embodiment of the present application is executed, the method for controlling robot walking is implemented, and an implementation manner of the method is consistent with that of the method for controlling robot walking provided in the present application, and the same beneficial effects can be achieved, and details are not repeated herein.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (13)

1. A robot walking control method is characterized by comprising the following steps:
collecting a target image through a camera;
confirming that a two-dimensional code pattern exists in the target image, and acquiring the two-dimensional code pattern;
determining position information and direction information of the robot based on the two-dimensional code pattern;
and determining the walking direction of the robot based on the preset track of the robot, the position information and the direction information.
2. The robot walking control method according to claim 1, wherein the confirming that the two-dimensional code pattern exists in the target image comprises:
preprocessing the target image to obtain a reference image;
and extracting contour features from the reference image, and if the number of the contour features is more than or equal to three, determining that the two-dimensional code pattern exists in the reference image.
3. The robot walking control method according to claim 2, wherein the preprocessing the target image includes:
and carrying out graying processing on the target image.
4. The robot walking control method according to claim 2, wherein the acquiring the two-dimensional code pattern includes:
drawing a plurality of rectangular frames based on the outline characteristics and generating a minimum rectangular frame surrounding the rectangular frames;
and determining the two-dimensional code pattern based on the minimum rectangular frame.
5. The robot walking control method according to claim 4, wherein said drawing a plurality of rectangular frames based on the outline feature and generating a minimum rectangular frame surrounding the plurality of rectangular frames comprises:
generating a first rectangular frame, a second rectangular frame and a third rectangular frame based on the outline features;
generating a fourth rectangular frame based on the first rectangular frame, the second rectangular frame, and the third rectangular frame;
generating a minimum rectangular frame surrounding the first, second, third, and fourth rectangular frames based on vertices of the first, second, third, and fourth rectangular frames.
6. The robot walking control method according to any one of claims 1 to 5, further comprising:
and confirming that the two-dimensional code pattern does not exist in the target image, and controlling the robot to continue moving along the original walking direction until the two-dimensional code pattern is confirmed to exist in the target image.
7. The robot walking control method according to claim 6, further comprising:
and if the robot cannot detect the two-dimensional code pattern after moving along the original walking direction for the target time threshold, controlling the robot to return along the original path until the two-dimensional code pattern is confirmed to exist in the target image.
8. The robot walking control method according to any one of claims 1 to 5, wherein the two-dimensional code pattern is obtained by the camera shooting a two-dimensional code made of a visible light stealth paint.
9. A robot walking control device, comprising:
the acquisition module is used for acquiring a target image through the camera;
the confirming module is used for confirming that a two-dimensional code pattern exists in the target image and acquiring the two-dimensional code pattern;
the first determining module is used for determining the position information and the direction information of the robot based on the two-dimensional code pattern;
and the second determining module is used for determining the walking direction of the robot based on the preset track of the robot, the position information and the direction information.
10. The robotic walking control device of claim 9, wherein the confirmation module comprises:
the first confirming submodule is used for preprocessing the target image to obtain a reference image;
and the second confirming submodule is used for extracting the outline features from the reference image, and if the number of the outline features is more than or equal to three, confirming that the two-dimensional code pattern exists in the reference image.
11. The robotic walking control device of claim 10, wherein the confirmation module further comprises:
the first obtaining submodule is used for drawing a plurality of rectangular frames based on the outline characteristics and generating a minimum rectangular frame surrounding the rectangular frames;
and the second acquisition submodule is used for determining the two-dimensional code pattern based on the minimum rectangular frame.
12. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the robot walking control method according to any one of claims 1 to 8 are implemented when the processor executes the program.
13. A non-transitory computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the robot walking control method according to any one of claims 1 to 8.
CN202011615957.9A 2020-12-30 2020-12-30 Robot walking control method and device Active CN112847349B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011615957.9A CN112847349B (en) 2020-12-30 2020-12-30 Robot walking control method and device


Publications (2)

Publication Number Publication Date
CN112847349A true CN112847349A (en) 2021-05-28
CN112847349B CN112847349B (en) 2022-05-06

Family

ID=75998748

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011615957.9A Active CN112847349B (en) 2020-12-30 2020-12-30 Robot walking control method and device

Country Status (1)

Country Link
CN (1) CN112847349B (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106529637A (en) * 2016-10-28 2017-03-22 深圳大学 Anti-copy realization method and realization system of two-dimensional code
CN106969766A (en) * 2017-03-21 2017-07-21 北京品创智能科技有限公司 A kind of indoor autonomous navigation method based on monocular vision and Quick Response Code road sign
CN107490379A (en) * 2017-08-28 2017-12-19 山东非凡智能科技有限公司 Utilize the method and system of Quick Response Code terrestrial reference positioning AGV operating point locations
CN108594822A (en) * 2018-05-10 2018-09-28 哈工大机器人(昆山)有限公司 Robot localization method, robot charging method based on Quick Response Code and system
WO2018214941A1 (en) * 2017-05-25 2018-11-29 锥能机器人(上海)有限公司 Ground mark for spatial positioning
CN109460044A (en) * 2019-01-10 2019-03-12 轻客小觅智能科技(北京)有限公司 A kind of robot method for homing, device and robot based on two dimensional code
KR20190126607A (en) * 2018-05-02 2019-11-12 주식회사 마로로봇 테크 Autonomous driving logistics robot equipped with multiple cameras for QR code recognition
CN111486849A (en) * 2020-05-29 2020-08-04 北京大学 Mobile visual navigation method and system based on two-dimensional code road sign


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113183141A (en) * 2021-06-09 2021-07-30 乐聚(深圳)机器人技术有限公司 Walking control method, device, equipment and storage medium for biped robot
CN114147769A (en) * 2021-12-21 2022-03-08 乐聚(深圳)机器人技术有限公司 Factory detection method, device, equipment and storage medium for robot
CN114147769B (en) * 2021-12-21 2024-06-11 乐聚(深圳)机器人技术有限公司 Method, device, equipment and storage medium for factory detection of robot

Also Published As

Publication number Publication date
CN112847349B (en) 2022-05-06

Similar Documents

Publication Publication Date Title
Park et al. Pix2pose: Pixel-wise coordinate regression of objects for 6d pose estimation
US9760791B2 (en) Method and system for object tracking
CN112847349B (en) Robot walking control method and device
KR101636370B1 (en) Image processing apparatus and method
EP2874097A2 (en) Automatic scene parsing
CN101770643A (en) Image processing apparatus, image processing method, and image processing program
KR101759188B1 (en) the automatic 3D modeliing method using 2D facial image
CN111950426A (en) Target detection method and device and delivery vehicle
CN109840463B (en) Lane line identification method and device
CN111964680B (en) Real-time positioning method of inspection robot
KR102223484B1 (en) System and method for 3D model generation of cut slopes without vegetation
CN112184793B (en) Depth data processing method and device and readable storage medium
CN111160280B (en) RGBD camera-based target object identification and positioning method and mobile robot
CN110096999B (en) Chessboard recognition method, chessboard recognition device, electronic equipment and storable medium
CN109801309B (en) Obstacle sensing method based on RGB-D camera
Okada et al. Huecode: A meta-marker exposing relative pose and additional information in different colored layers
CN112990101B (en) Facial organ positioning method based on machine vision and related equipment
CN114092668A (en) Virtual-real fusion method, device, equipment and storage medium
CN113240656A (en) Visual positioning method and related device and equipment
CN112396630B (en) Method and device for determining target object state, storage medium and electronic device
CN111380535A (en) Navigation method and device based on visual label, mobile machine and readable medium
US20200364521A1 (en) Trained network for fiducial detection
CN114237280B (en) Method for accurately landing aircraft nest platform of unmanned aerial vehicle
CN115272417A (en) Image data processing method, image processing apparatus, and readable storage medium
JP5051671B2 (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20240514

Address after: Room 6227, No. 999, Changning District, Shanghai 200050

Patentee after: Shenlan robot (Shanghai) Co.,Ltd.

Country or region after: China

Address before: 200336 unit 1001, 369 Weining Road, Changning District, Shanghai

Patentee before: DEEPBLUE TECHNOLOGY (SHANGHAI) Co.,Ltd.

Country or region before: China