CN114147725A - Zero point adjustment method, device, equipment and storage medium for robot - Google Patents

Zero point adjustment method, device, equipment and storage medium for robot

Info

Publication number
CN114147725A
Authority
CN
China
Prior art keywords
target
robot
adjustment
view image
steering engine
Prior art date
Legal status
Granted
Application number
CN202111568369.9A
Other languages
Chinese (zh)
Other versions
CN114147725B (en)
Inventor
冷晓琨
常琳
程鑫
白学林
柯真东
王松
吴雨璁
何治成
Current Assignee
Leju Shenzhen Robotics Co Ltd
Original Assignee
Leju Shenzhen Robotics Co Ltd
Priority date
Filing date
Publication date
Application filed by Leju Shenzhen Robotics Co Ltd filed Critical Leju Shenzhen Robotics Co Ltd
Priority to CN202111568369.9A priority Critical patent/CN114147725B/en
Publication of CN114147725A publication Critical patent/CN114147725A/en
Application granted granted Critical
Publication of CN114147725B publication Critical patent/CN114147725B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/161: Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697: Vision controlled systems

Abstract

The application provides a zero point adjustment method, device, equipment and storage medium for a robot, and belongs to the technical field of robot control. The method comprises the following steps: receiving a target view image of a target robot acquired by an image acquisition device; determining a target adjustment area in the target view image of the target robot, wherein the target adjustment area comprises at least two identification positions and the zero positions of the steering engines in the target robot, and the identification positions are the positions of labels pre-configured on the target robot; determining a target adjustment angle in the target adjustment area based on the identification positions and the zero position of the steering engine to be adjusted; and sending a target adjustment instruction to the target robot, so that the target robot performs zero point adjustment based on the target adjustment instruction, wherein the target adjustment instruction comprises the target adjustment angle. The method and device can adjust the zero point of the robot quickly and accurately.

Description

Zero point adjustment method, device, equipment and storage medium for robot
Technical Field
The present disclosure relates to the field of robot control technologies, and in particular, to a method, an apparatus, a device, and a storage medium for zero adjustment of a robot.
Background
When a steering engine is assembled into a robot, its position usually deviates from the standard position because of part tolerances or manual assembly errors. To ensure that the robot works stably, zero point adjustment needs to be performed on each steering engine of the robot to eliminate these errors.
In the prior art, zero point adjustment is usually performed manually. For a humanoid robot, for example, a worker adjusts the steering engine at each position of the robot one by one and judges whether the error has been eliminated by visual observation.
However, manual zero point adjustment is inefficient and the observation itself may be inaccurate, so the zero point cannot be adjusted quickly and accurately.
Disclosure of Invention
The application aims to provide a zero point adjustment method, device, equipment and storage medium for a robot, so that zero point adjustment of the robot can be performed more quickly and accurately.
The embodiment of the application is realized as follows:
in one aspect of the embodiments of the present application, a zero point adjustment method for a robot is provided, where the method is applied to a computer device, and the computer device is respectively in communication connection with an image acquisition device and a target robot, and the method includes:
receiving a target view image of a target robot acquired by image acquisition equipment;
determining a target adjusting area in a target view image of a target robot, wherein the target adjusting area comprises at least two identification positions and zero positions of all steering engines in the target robot, and the identification positions are positions where labels configured in advance on the target robot are located;
determining a target adjustment angle in the target adjustment area based on the identification position and the zero position of the steering engine to be adjusted;
and sending a target adjusting instruction to the target robot so as to enable the target robot to carry out zero point adjustment based on the target adjusting instruction, wherein the target adjusting instruction comprises a target adjusting angle.
Optionally, determining a target adjustment region in a target view image of the target robot comprises:
determining the identification positions of at least two associated labels in a target view image of a target robot;
generating a target reference line passing through the identification positions of the associated labels based on the identification positions of the at least two associated labels;
determining a steering engine to be adjusted according to the distance between the zero position of each steering engine of the target robot in the target view image and the target reference line;
and determining a target adjustment area based on the target reference line, the identification position of the associated tag on the target reference line and the zero position of the steering engine to be adjusted.
Optionally, determining a target adjustment region based on the target reference line, the identification position of the associated tag on the target reference line, and the zero position of the steering engine to be adjusted, includes:
determining a triangle with the identification position of the associated tag on the target reference line and the zero position of the steering engine to be adjusted as vertexes;
and taking the area containing the triangle and the target reference line as a target adjustment area.
Optionally, determining a target adjustment angle based on the identification position and the zero position of the steering engine to be adjusted in the target adjustment region, including:
generating a correlation line by taking the zero position of the steering engine to be adjusted and the identification position of a target correlation label on a target reference line as vertexes, wherein the target correlation label is the correlation label with the closest distance between the identification position of the correlation label on the target reference line and the zero position of the steering engine to be adjusted;
and taking the included angle between the target reference line and the associated line as a target adjustment angle.
Optionally, the image acquisition device comprises: a front view image acquisition device, a side view image acquisition device, or a top view image acquisition device;
the target view image includes: a front view image, a side view image, or a top view image.
Optionally, before receiving the target view image of the target robot acquired by the image acquisition device, the method further comprises:
and sending a state command to the target robot so as to enable each steering engine of the target robot to move to a preset position.
Optionally, after sending the target adjustment instruction to the target robot, the method further comprises:
receiving a view image acquired by image acquisition equipment after the target robot is adjusted;
judging whether the zero position of the steering engine after adjustment in the adjusted view image is on the same straight line with at least two identification positions;
and if so, determining that the zero point adjustment of the robot is finished.
In another aspect of the embodiments of the present application, a zero point adjusting apparatus for a robot is provided, where the apparatus is applied to a computer device, the computer device is respectively in communication connection with an image capturing device and a target robot, and the apparatus includes: the device comprises a receiving module, an area determining module, an angle determining module and a sending module;
the receiving module is used for receiving a target view image of the target robot acquired by the image acquisition equipment;
the area determining module is used for determining a target adjusting area in a target view image of the target robot, the target adjusting area comprises at least two identification positions and zero positions of all steering engines in the target robot, and the identification positions are positions where labels configured in advance on the target robot are located;
the angle determining module is used for determining a target adjusting angle in the target adjusting area based on the identification position and the zero position of the steering engine to be adjusted;
and the sending module is used for sending a target adjusting instruction to the target robot so as to enable the target robot to carry out zero point adjustment based on the target adjusting instruction, wherein the target adjusting instruction comprises a target adjusting angle.
Optionally, the area determining module is specifically configured to determine the identification positions of at least two associated labels in the target view image of the target robot; generating a target reference line passing through the identification positions of the associated labels based on the identification positions of the at least two associated labels; determining a steering engine to be adjusted according to the distance between the zero position of each steering engine of the target robot in the target view image and the target reference line; and determining a target adjustment area based on the target reference line, the identification position of the associated tag on the target reference line and the zero position of the steering engine to be adjusted.
Optionally, the area determining module is specifically configured to determine a triangle with the identification position of the associated tag on the target reference line and the zero position of the steering engine to be adjusted as a vertex; and taking the area containing the triangle and the target reference line as a target adjustment area.
Optionally, the angle determining module is specifically configured to generate a correlation line by using a zero position of the steering engine to be adjusted and an identification position of a target correlation label on a target reference line as a vertex, where the target correlation label is a correlation label with a closest distance between the identification position of the correlation label on the target reference line and the zero position of the steering engine to be adjusted; and taking the included angle between the target reference line and the associated line as a target adjustment angle.
Optionally, the sending module is further configured to send a state instruction to the target robot, so that each steering engine of the target robot moves to a preset position.
Optionally, the receiving module is further configured to receive the view image acquired by the image acquisition device after the target robot is adjusted; judging whether the zero position of the steering engine after adjustment in the adjusted view image is on the same straight line with at least two identification positions; and if so, determining that the zero point adjustment of the robot is finished.
In another aspect of the embodiments of the present application, there is provided a computer device, including: a memory and a processor, wherein the memory stores a computer program capable of running on the processor, and the processor, when executing the computer program, implements the steps of the zero point adjustment method of the robot.
In another aspect of the embodiments of the present application, a computer-readable storage medium is provided, where a computer program is stored on the storage medium, and when the computer program is executed by a processor, the computer program implements the steps of the zero point adjustment method for a robot.
The beneficial effects of the embodiment of the application include:
in the zero point adjustment method, device, equipment and storage medium for the robot provided by the embodiment of the application, a target view image of a target robot acquired by image acquisition equipment can be received; determining a target adjusting area in a target view image of a target robot, wherein the target adjusting area comprises at least two identification positions and zero positions of all steering engines in the target robot, and the identification positions are positions where labels configured in advance on the target robot are located; determining a target adjustment angle in the target adjustment area based on the identification position and the zero position of the steering engine to be adjusted; and sending a target adjusting instruction to the target robot so that the target robot performs zero point adjustment based on the target adjusting instruction, wherein the target adjusting instruction comprises a target adjusting angle. After the target adjustment angle is determined in the target adjustment area, the zero position of the steering engine of the robot can be adjusted according to the target adjustment angle, so that the zero position of the robot can be adjusted more quickly and accurately.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be regarded as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic view of an application scenario of a zero point adjustment method for a robot according to an embodiment of the present application;
fig. 2 is a first flowchart illustrating a zero point adjustment method for a robot according to an embodiment of the present disclosure;
fig. 3 is a second flowchart illustrating a zero point adjustment method for a robot according to an embodiment of the present disclosure;
fig. 4 is a third schematic flowchart of a zero point adjustment method for a robot according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of a label position of a target robot provided in an embodiment of the present application;
fig. 6 is a schematic diagram of determining a target adjustment angle according to an embodiment of the present application;
fig. 7 is a fourth schematic flowchart of a zero point adjustment method of a robot according to an embodiment of the present disclosure;
fig. 8 is a fifth flowchart illustrating a zero point adjustment method of a robot according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a zero point adjustment device of a robot according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the description of the present application, it is noted that the terms "first", "second", "third", and the like are used merely for distinguishing between descriptions and are not intended to indicate or imply relative importance.
The following specifically explains a practical application scenario of the zero point adjustment method for a robot provided in the embodiment of the present application.
Fig. 1 is a schematic view of an application scenario of a zero point adjustment method for a robot according to an embodiment of the present application, please refer to fig. 1, where the scenario may include: a computer device 110, an image capturing device 120, and a target robot 130, wherein the computer device 110 is communicatively connected to the image capturing device 120 and the target robot 130, respectively.
Alternatively, the computer device 110 may be an electronic device such as a computer, a mobile phone, a tablet computer, and a dedicated device, and is not limited in particular.
There may be a plurality of image capturing devices 120, which may be disposed in the corresponding scene and are configured to obtain views of the target robot 130 from various angles; they may specifically be cameras. The image capturing device 120 may be integrated with the computer device 110 or be a separate device, which is not limited herein.
The target robot 130 may specifically be a humanoid biped robot capable of performing a series of actions, such as carrying objects and dancing, which are not limited herein.
The target robot 130 may include a plurality of steering engines, and each joint of the target robot may be driven by one or more steering engines to perform the above actions. The initial position of each steering engine should be its zero point position; however, various internal and external factors may cause the zero point position to deviate, so the zero point position of each steering engine of the target robot needs to be adjusted.
The following explains a specific implementation procedure of the zero point adjustment method for a robot provided in the embodiment of the present application based on the above application scenario.
Fig. 2 is a first flowchart illustrating a zero point adjustment method for a robot according to an embodiment of the present application, please refer to fig. 2, where the method includes:
s210: and receiving a target view image of the target robot acquired by the image acquisition equipment.
Optionally, the execution subject of the method may be the above computer device. The image capturing device may capture a target view image of the target robot and send it to the computer device, and the computer device receives the target view image. The target view image may be a view image captured by any of the image capturing devices, showing the posture of the robot as seen from a certain angle.
Optionally, if there are a plurality of image capturing devices, a plurality of target view images may be received.
S220: a target adjustment region in a target view image of the target robot is determined.
The target adjusting area comprises at least two identification positions and zero positions of all steering engines in the target robot, and the identification positions are positions where labels are arranged on the target robot in advance.
Alternatively, the target adjustment region may be a region divided in the target view image. After determining the target view image of the target robot, a target adjustment area may be determined based on the at least two identified positions and the zero position of each steering engine in the target robot.
It should be noted that, a label may be set on the target robot, the position of the label may be preset, and the position of the label acquired in the target view image is the above-mentioned identification position; the initial position of each steering engine of the robot is the zero position.
Optionally, a plurality of target adjustment regions may be included in each target view image.
S230: and determining a target adjustment angle in the target adjustment area based on the identification position and the zero position of the steering engine to be adjusted.
Optionally, the steering engine to be adjusted may be a steering engine whose zero position needs to be adjusted among the plurality of steering engines, and it may be determined according to the zero position of each steering engine. After the steering engine to be adjusted is determined, a target adjustment angle can be determined in the target adjustment area based on the identification positions and the zero position of the steering engine to be adjusted.
The target adjustment angle can be an angle to be adjusted of the steering engine to be adjusted.
Alternatively, different target adjustment angles may be determined based on different target adjustment regions.
S240: and sending a target adjusting instruction to the target robot so that the target robot carries out zero point adjustment based on the target adjusting instruction.
The target adjusting instruction comprises a target adjusting angle.
Optionally, after the target adjustment angle is determined, a corresponding target adjustment instruction may be generated based on the target adjustment angle, so that the target adjustment instruction is sent to the target robot, and after the target robot receives the target adjustment instruction, the corresponding steering engine to be adjusted may be adjusted according to the target adjustment angle based on the target adjustment instruction.
In the zero point adjustment method for the robot provided by the embodiment of the application, a target view image of a target robot acquired by image acquisition equipment can be received; determining a target adjusting area in a target view image of a target robot, wherein the target adjusting area comprises at least two identification positions and zero positions of all steering engines in the target robot, and the identification positions are positions where labels configured in advance on the target robot are located; determining a target adjustment angle in the target adjustment area based on the identification position and the zero position of the steering engine to be adjusted; and sending a target adjusting instruction to the target robot so that the target robot performs zero point adjustment based on the target adjusting instruction, wherein the target adjusting instruction comprises a target adjusting angle. After the target adjustment angle is determined in the target adjustment area, the zero position of the steering engine of the robot can be adjusted according to the target adjustment angle, so that the zero position of the robot can be adjusted more quickly and accurately.
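To make the S210-S240 flow concrete, the sketch below chains the four steps for a single view image in Python. It is only an illustrative sketch: the patent does not define a programming interface, so the `camera`, `robot` and `detect_tags` objects, the dictionary-style adjustment instruction and the pixel threshold are all assumed names and values, and the geometry helpers are simplified two-dimensional image-coordinate calculations.

```python
import math

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b (image coordinates)."""
    cross = (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
    return abs(cross) / math.hypot(b[0] - a[0], b[1] - a[1])

def adjustment_angle(zero, tag_near, tag_far):
    """Angle (degrees) between the target reference line and the associated line."""
    ref = (tag_far[0] - tag_near[0], tag_far[1] - tag_near[1])
    assoc = (zero[0] - tag_near[0], zero[1] - tag_near[1])
    dot = ref[0] * assoc[0] + ref[1] * assoc[1]
    cross = ref[0] * assoc[1] - ref[1] * assoc[0]
    return math.degrees(math.atan2(abs(cross), dot))

def adjust_zero_points(camera, robot, detect_tags, zero_positions, threshold_px=40.0):
    """One pass of S210-S240 over a single target view image (illustrative sketch only)."""
    image = camera.capture()                        # S210: receive the target view image
    tag_a, tag_b = detect_tags(image)[:2]           # S220: identification positions of two associated labels
    for engine_id, zero in zero_positions.items():  # zero positions of the steering engines, image coordinates
        if point_line_distance(zero, tag_a, tag_b) < threshold_px:   # steering engine to be adjusted
            # S230: the label nearest to the zero position serves as the vertex of the associated line
            near, far = sorted((tag_a, tag_b), key=lambda t: math.hypot(t[0] - zero[0], t[1] - zero[1]))
            angle = adjustment_angle(zero, near, far)
            # S240: send a target adjustment instruction containing the target adjustment angle
            robot.send({"engine": engine_id, "target_adjustment_angle_deg": angle})
```

In practice, `camera`, `robot` and `detect_tags` would wrap whatever acquisition hardware, communication channel and recognition routine the system actually uses; the patent leaves those choices open.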
Optionally, the image acquisition device comprises: a front view image acquisition device, a side view image acquisition device, or a top view image acquisition device; the target view image includes: a front view image, a side view image, or a top view image.
Optionally, three image capturing devices, namely the above front-view, side-view and top-view image capturing devices, may be provided at different positions; they may be identical devices. The front-view image capturing device captures front-view images, the side-view image capturing device captures side-view images, and the top-view image capturing device captures top-view images. The front-view image is an image of the front side of the target robot, the side-view image is an image of the side face of the target robot, and the top-view image is an image of the top face of the target robot.
The three image acquisition devices can be arranged in the actual test scene to acquire target view images of the target robot from the three viewing angles. A corresponding target adjustment area can then be determined from each target view image, and a target adjustment angle can be determined within that target adjustment area. It should be noted that, because each target view image is a two-dimensional view while the zero point position of a steering engine of the target robot is a position in three-dimensional space, each target adjustment angle only describes the angle to be adjusted under that viewing angle. For example: for the front view image, the target adjustment angle is an angular adjustment in the vertical and horizontal directions; for the side view image, it is an angular adjustment in the vertical and thickness directions; and for the top view image, it is an angular adjustment in the horizontal and thickness directions. The zero point of the steering engine can therefore be adjusted separately based on each view image, so that the zero point position of the steering engine is fully adjusted.
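As a small illustration of this decomposition, a lookup like the following could record which pair of directions each view image constrains; the dictionary and the direction names are illustrative assumptions that simply restate the text above ("thickness" standing for the front-to-back direction of the robot).

```python
# Which pair of directions each target view image can constrain (illustrative only).
VIEW_ADJUSTABLE_DIRECTIONS = {
    "front_view": ("vertical", "horizontal"),
    "side_view": ("vertical", "thickness"),    # "thickness" = front-to-back direction
    "top_view": ("horizontal", "thickness"),
}
```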
Another specific implementation procedure of the zero point adjustment method for a robot provided in the embodiment of the present application is specifically explained below.
Fig. 3 is a second flowchart of a zero point adjustment method for a robot according to an embodiment of the present application, please refer to fig. 3, which determines a target adjustment area in a target view image of a target robot, and includes:
s310: the identification positions of at least two associated labels in the target view image of the target robot are determined.
Optionally, the associated labels may be two or more labels used to identify the same line. For example, one label may be arranged at the corresponding position of each of the left shoulder and the right shoulder of the target robot, and these two labels are associated labels. A label may be a mark with a special color or a special shape, and the identification position of each label can be determined from the target view image.
Specifically, the identification positions of at least two associated tags can be determined from the target view image based on an image recognition mode.
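The patent only states that the identification positions are found by image recognition and that the labels may have a special color or shape, without fixing an algorithm. The sketch below is one plausible implementation using OpenCV color thresholding; the choice of red markers and the HSV range are assumptions, not part of the original disclosure.

```python
import cv2
import numpy as np

def detect_tag_positions(image_bgr, lower_hsv=(0, 120, 120), upper_hsv=(10, 255, 255)):
    """Return the pixel centroids of color-marked labels in a view image (sketch).

    The red HSV range used here is an assumption; any distinctive color or shape
    detector could play the same role.
    """
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    positions = []
    for contour in contours:
        m = cv2.moments(contour)
        if m["m00"] > 0:  # ignore degenerate blobs
            positions.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return positions      # identification positions in image coordinates
```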
S320: and generating a target reference line passing through the identification positions of the associated labels based on the identification positions of the at least two associated labels.
Optionally, a straight line passing through the two positions of the at least two associated tags may be determined according to the identification positions of the at least two associated tags, which is the target reference line.
S330: and determining the steering engine to be adjusted according to the distance between the zero position of each steering engine of the target robot in the target view image and the target reference line.
The positions of the steering engines can also be obtained by image recognition. After the target reference line is determined, the steering engine to be adjusted can be determined according to the distance between the zero position of each steering engine of the target robot in the target view image and the target reference line. For example, if the distance between the zero position of a steering engine and the target reference line is smaller than a preset threshold, that steering engine is determined to be the steering engine to be adjusted.
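A minimal sketch of this selection step, assuming all positions are two-dimensional pixel coordinates in the target view image and that the preset threshold is given in pixels (the concrete value is an assumption):

```python
import math

def distance_to_reference_line(zero_position, tag_a, tag_b):
    """Perpendicular distance from a steering engine's zero position to the target reference line."""
    (px, py), (ax, ay), (bx, by) = zero_position, tag_a, tag_b
    cross = (bx - ax) * (py - ay) - (by - ay) * (px - ax)  # 2-D cross product
    return abs(cross) / math.hypot(bx - ax, by - ay)

def steering_engines_to_adjust(zero_positions, tag_a, tag_b, threshold_px=40.0):
    """Ids of the steering engines whose zero position lies near the target reference line."""
    return [
        engine_id
        for engine_id, zero in zero_positions.items()
        if distance_to_reference_line(zero, tag_a, tag_b) < threshold_px
    ]
```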
S340: and determining a target adjustment area based on the target reference line, the identification position of the associated tag on the target reference line and the zero position of the steering engine to be adjusted.
Optionally, after determining the target reference line, the identification position of the associated tag on the target reference line, and the zero position of the steering engine to be adjusted, a target adjustment region may be determined based on the corresponding positional relationship, and the target adjustment region may include the identification position of the associated tag on the target reference line, and the zero position of the steering engine to be adjusted.
The following specifically explains a specific implementation procedure for determining a target adjustment area in the zero point adjustment method for a robot provided in the embodiment of the present application.
Fig. 4 is a third schematic flow chart of a zero point adjustment method for a robot according to an embodiment of the present application, please refer to fig. 4, where determining a target adjustment area based on a target reference line, an identification position of an associated tag on the target reference line, and a zero point position of a steering engine to be adjusted includes:
s410: and determining a triangle with the identification position of the associated tag on the target reference line and the zero position of the steering engine to be adjusted as vertexes.
Optionally, after the target reference line is determined, the identification positions of at least two associated labels and the zero position of the steering engine to be adjusted may be connected to obtain a triangle whose vertices are the identification positions of the two associated labels and the zero position of the steering engine to be adjusted. (If there are more than two associated labels, the vertices may be the identification positions of the two associated labels at the outermost ends of the target reference line and the zero position of the steering engine to be adjusted, because the remaining associated labels also lie on the target reference line.)
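A sketch of the vertex selection just described, assuming the identification positions are two-dimensional points and using a projection onto the reference-line direction to find the two outermost associated labels (the function and variable names are illustrative):

```python
import numpy as np

def adjustment_region_vertices(tag_positions, zero_position):
    """Vertices of the triangular target adjustment region (sketch).

    tag_positions -- identification positions of the associated labels on the
                     target reference line, as (x, y) points (two or more)
    zero_position -- zero position of the steering engine to be adjusted
    """
    pts = np.asarray(tag_positions, dtype=float)
    direction = pts[-1] - pts[0]                # a direction along the target reference line
    order = np.argsort(pts @ direction)         # project every label onto that direction
    outer_a, outer_b = pts[order[0]], pts[order[-1]]  # the two outermost labels
    return tuple(outer_a), tuple(outer_b), tuple(zero_position)
```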
S420: and taking the area containing the triangle and the target reference line as a target adjustment area.
Optionally, after the triangle is determined, the area where the triangle and the target reference line are located may be used as the target adjustment area. It should be noted that each target adjustment area may be used to adjust some of the steering engines of the target robot. For example, the target robot may be divided into different target adjustment areas by providing corresponding associated labels on its head, neck, shoulders, arms, waist, knees, legs and feet; each area can be determined by the above determination method, and zero point adjustment can then be performed sequentially within each target adjustment area.
Optionally, in the actual determination process, the target adjustment areas in each target view image may be determined in turn. For example, target adjustment areas corresponding to the arms, waist and feet of the target robot can be determined in the front view image; target adjustment areas corresponding to the neck, knees and legs can be determined in the side view image; and a target adjustment area corresponding to the head and shoulders can be determined in the top view image. The above process is only an example; the target adjustment areas may be determined according to the actual form of the robot and the actual requirements of zero point adjustment, which is not limited herein.
A schematic diagram (taking a front view image as an example) of the positions of the respective labels of the target robot in the embodiment of the present application is specifically explained below.
Fig. 5 is a schematic diagram of the label positions of a target robot according to an embodiment of the present application. Referring to fig. 5, which takes a front view image and the waist of the target robot as an example, two labels 510 are disposed at the positions where the waist connects to the two leg joints. The two labels 510 are associated labels, and the connection line between them may be parallel to the horizontal line; this line is the target reference line.
Fig. 5 takes only the waist as an example; in practice, each view image may include a plurality of sets of associated labels, and each set of associated labels may include at least two labels, which is not limited herein.
Next, the specific determination process of the target adjustment angle in the target adjustment area formed by the associated tag is explained by taking the associated tag in fig. 5 as an example.
Fig. 6 is a schematic diagram of determining a target adjustment angle according to an embodiment of the present application. Referring to fig. 6, the three vertices of the triangle are located at the tag position of the first tag 610, the tag position of the second tag 620, and the zero point position 630 of the steering engine to be adjusted, where the first tag 610 and the second tag 620 are associated tags.
Fig. 6 takes two associated tags as an example; if there are more associated tags, they are located between the first tag 610 and the second tag 620, that is, on one side of the triangle. The line on which the first tag 610 and the second tag 620 lie is the target reference line, and the target adjustment angle is θ in fig. 6.
The following explains a specific process of determining a target adjustment angle in the zero point adjustment method for a robot provided in the embodiment of the present application.
Fig. 7 is a fourth flowchart illustrating a zero point adjustment method for a robot according to an embodiment of the present application, please refer to fig. 7, where determining a target adjustment angle based on an identification position and a zero point position of a steering engine to be adjusted in a target adjustment region includes:
s710: and generating a correlation line by taking the zero position of the steering engine to be adjusted and the identification position of the target correlation label on the target reference line as a vertex.
And the target associated label is the associated label with the identification position closest to the zero position of the steering engine to be adjusted in the associated label on the target reference line.
Optionally, after the target adjustment area is determined, the associated line may be determined first. The associated line may be one side of the triangle; specifically, it may be the connection line between the zero position of the steering engine to be adjusted and the associated label on the target reference line whose identification position is closest to that zero position. Reference may be made to the aforementioned fig. 6.
S720: and taking the included angle between the target reference line and the associated line as a target adjustment angle.
Alternatively, after the associated line is determined, the angle between the target reference line and the associated line may be used as the target adjustment angle, which may be θ in fig. 6.
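A sketch of S710-S720 under the same two-dimensional image-coordinate assumption: with more than two associated labels, the one nearest to the zero position is taken as the target associated label, and the result is folded to the acute included angle, which is the natural reading for a small zero-point deviation. The function name and the folding choice are assumptions for illustration.

```python
import math

def target_adjustment_angle(tag_positions, zero_position):
    """Included angle (degrees) between the target reference line and the associated line."""
    zx, zy = zero_position
    # S710: the target associated label is the label closest to the zero position
    tx, ty = min(tag_positions, key=lambda t: math.hypot(t[0] - zx, t[1] - zy))
    (ax, ay), (bx, by) = tag_positions[0], tag_positions[-1]

    ref = (bx - ax, by - ay)        # direction of the target reference line
    assoc = (zx - tx, zy - ty)      # direction of the associated line
    dot = ref[0] * assoc[0] + ref[1] * assoc[1]
    cross = ref[0] * assoc[1] - ref[1] * assoc[0]
    angle = math.degrees(math.atan2(abs(cross), dot))  # angle in [0, 180]
    return min(angle, 180.0 - angle)                   # S720: acute included angle
```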
Optionally, before receiving the target view image of the target robot acquired by the image acquisition device, the method further comprises: and sending a state command to the target robot so as to enable each steering engine of the target robot to move to a preset position.
Before the acquisition is performed, the robot can be controlled to assume a standard attitude in order to ensure the accuracy of the acquired data; that is, each steering engine of the target robot is controlled to move to a preset position, and this set of positions corresponds to the standard attitude of the robot.
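The patent does not specify how the state instruction is transmitted. Purely as an illustration, a simple JSON-over-TCP message could look like the following; the payload fields, port number and framing are all hypothetical.

```python
import json
import socket

def send_state_command(robot_host, robot_port=9000):
    """Ask the target robot to move every steering engine to its preset position (hypothetical sketch)."""
    payload = {"type": "state", "action": "move_to_preset_position"}
    with socket.create_connection((robot_host, robot_port), timeout=2.0) as conn:
        conn.sendall((json.dumps(payload) + "\n").encode("utf-8"))
```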
Next, a further specific implementation process of the zero point adjustment method for a robot provided in the embodiment of the present application is specifically explained.
Fig. 8 is a fifth flowchart illustrating a zero point adjustment method for a robot according to an embodiment of the present application, please refer to fig. 8, where after a target adjustment instruction is sent to a target robot, the method further includes:
s810: and receiving the view image acquired by the image acquisition equipment after the target robot is adjusted.
Alternatively, after the above adjustment of the target robot is performed, the adjusted view image of the target robot acquired by the image acquisition apparatus may be received again.
The method and specific contents for acquiring the adjusted view image are similar to those of the target view image, and are not repeated herein.
S820: and judging whether the zero position of the steering engine after adjustment in the adjusted view image is on the same straight line with at least two identification positions.
Optionally, the identification positions and the zero position are determined in the same recognition manner as described above, and it is judged whether the zero position of the adjusted steering engine is on the same straight line as the at least two identification positions, where the at least two identification positions are the identification positions of the associated labels.
Specifically, whether the positions are on the same straight line can be judged by image recognition. In the actual calculation, an error threshold can be set: when the deviation of the judged positions is within the error threshold, the positions are considered to be on the same straight line. The error threshold can be set according to actual requirements.
S830: and if so, determining that the zero point adjustment of the robot is finished.
Optionally, if the positions are determined to be on the same straight line, it can be confirmed that the zero point adjustment of the target robot is completed. The zero point position of each steering engine of the target robot can be checked and adjusted in turn in the above manner; once all of them lie on the corresponding straight lines, the zero point adjustment of the target robot is finished.
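A minimal sketch of the check in S820-S830, assuming image coordinates and a pixel error threshold (the threshold value is an assumption; the patent only says it is set according to actual requirements):

```python
import math

def zero_point_adjusted(adjusted_zero, tag_a, tag_b, error_threshold_px=3.0):
    """True if the adjusted zero position lies on the straight line through the two identification positions."""
    (px, py), (ax, ay), (bx, by) = adjusted_zero, tag_a, tag_b
    cross = (bx - ax) * (py - ay) - (by - ay) * (px - ax)
    deviation = abs(cross) / math.hypot(bx - ax, by - ay)  # perpendicular distance in pixels
    return deviation <= error_threshold_px
```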
The following describes apparatuses, devices, and storage media for executing the zero point adjustment method for a robot provided in the present application, and specific implementation processes and technical effects thereof are referred to above, and will not be described again below.
Fig. 9 is a schematic structural diagram of a zero point adjustment apparatus of a robot according to an embodiment of the present application, please refer to fig. 9, where the zero point adjustment apparatus of the robot is applied to a computer device, and the computer device is respectively connected to an image capturing device and a target robot in a communication manner, and the apparatus includes: a receiving module 910, an area determining module 920, an angle determining module 930, and a transmitting module 940;
a receiving module 910, configured to receive a target view image of a target robot acquired by an image acquisition device;
the area determination module 920 is configured to determine a target adjustment area in a target view image of the target robot, where the target adjustment area includes at least two identification positions and zero positions of steering engines in the target robot, and the identification positions are positions where labels pre-configured on the target robot are located;
an angle determining module 930, configured to determine a target adjustment angle in the target adjustment region based on the identification position and a zero position of the steering engine to be adjusted;
and a sending module 940, configured to send a target adjustment instruction to the target robot, so that the target robot performs zero point adjustment based on the target adjustment instruction, where the target adjustment instruction includes a target adjustment angle.
Optionally, the area determining module 920 is specifically configured to determine the identification positions of at least two associated labels in the target view image of the target robot; generating a target reference line passing through the identification positions of the associated labels based on the identification positions of the at least two associated labels; determining a steering engine to be adjusted according to the distance between the zero position of each steering engine of the target robot in the target view image and the target reference line; and determining a target adjustment area based on the target reference line, the identification position of the associated tag on the target reference line and the zero position of the steering engine to be adjusted.
Optionally, the area determining module 920 is specifically configured to determine a triangle with the identification position of the associated tag on the target reference line and the zero position of the steering engine to be adjusted as a vertex; and taking the area containing the triangle and the target reference line as a target adjustment area.
Optionally, the angle determining module 930 is specifically configured to generate a correlation line by using a zero position of the steering engine to be adjusted and an identification position of a target correlation tag on the target reference line as a vertex, where the target correlation tag is a correlation tag whose identification position in the correlation tag on the target reference line is closest to the zero position of the steering engine to be adjusted; and taking the included angle between the target reference line and the associated line as a target adjustment angle.
Optionally, the sending module 940 is further configured to send a status instruction to the target robot so that each steering engine of the target robot moves to a preset position.
Optionally, the receiving module 910 is further configured to receive the adjusted view image of the target robot, which is acquired by the image acquisition device; judging whether the zero position of the steering engine after adjustment in the adjusted view image is on the same straight line with at least two identification positions; and if so, determining that the zero point adjustment of the robot is finished.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
The above modules may be one or more integrated circuits configured to implement the above methods, for example one or more Application Specific Integrated Circuits (ASICs), one or more microprocessors, or one or more Field Programmable Gate Arrays (FPGAs). As another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. As yet another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 10 is a schematic structural diagram of a computer device according to an embodiment of the present application. Referring to fig. 10, the computer device includes: a memory 950 and a processor 960, wherein the memory 950 stores a computer program that can be run on the processor 960, and the processor 960 implements the steps of the zero point adjustment method of the robot when executing the computer program.
In another aspect of the embodiments of the present application, there is further provided a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the zero point adjustment method for a robot.
Optionally, the computer device is a computer device in communication connection with the target robot and the image capturing device in the scene.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: a U disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A zero point adjustment method of a robot is applied to a computer device which is respectively connected with an image acquisition device and a target robot in a communication way, and the method comprises the following steps:
receiving a target view image of a target robot acquired by image acquisition equipment;
determining a target adjustment area in a target view image of the target robot, wherein the target adjustment area comprises at least two identification positions and zero positions of all steering engines in the target robot, and the identification positions are positions where labels configured in advance on the target robot are located;
determining a target adjustment angle in the target adjustment area based on the identification position and the zero position of the steering engine to be adjusted;
and sending a target adjusting instruction to the target robot so as to enable the target robot to perform zero point adjustment based on the target adjusting instruction, wherein the target adjusting instruction comprises the target adjusting angle.
2. The method of claim 1, wherein said determining a target adjustment area in a target view image of the target robot comprises:
determining identification positions of at least two associated labels in a target view image of the target robot;
generating a target reference line passing through the identification positions of the at least two associated labels based on the identification positions of the at least two associated labels;
determining a steering engine to be adjusted according to the distance between the zero position of each steering engine of the target robot in the target view image and the target reference line;
and determining the target adjustment area based on the target reference line, the identification position of the associated tag on the target reference line and the zero position of the steering engine to be adjusted.
3. The method of claim 2, wherein the determining the target adjustment region based on the target reference line, the identified position of the associated tag on the target reference line, and the zero position of the steering engine to be adjusted comprises:
determining a triangle with the identification position of the associated tag on the target reference line and the zero position of the steering engine to be adjusted as vertexes;
and taking the area containing the triangle and the target reference line as the target adjustment area.
4. The method of claim 2, wherein determining a target adjustment angle in the target adjustment region based on the identified position and a zero position of the steering engine to be adjusted comprises:
generating an associated line by taking the zero position of the steering engine to be adjusted and the identification position of the target associated tag on the target reference line as vertexes, wherein the target associated tag is the associated tag with the identification position of the associated tag on the target reference line closest to the zero position of the steering engine to be adjusted;
and taking the included angle between the target reference line and the associated line as the target adjustment angle.
5. The method of any of claims 1-4, wherein the image acquisition device comprises: a front view image acquisition device, a side view image acquisition device, or a top view image acquisition device;
the target view image includes: a front view image, a side view image, or a top view image.
6. The method of claim 1, wherein prior to receiving the target view image of the target robot acquired by the image acquisition device, the method further comprises:
and sending a state instruction to the target robot so as to enable each steering engine of the target robot to move to a preset position.
7. The method of claim 1, wherein after sending the target adjustment instruction to the target robot, the method further comprises:
receiving a view image acquired by image acquisition equipment after the target robot is adjusted;
judging whether the zero position of the steering engine after adjustment in the view image after adjustment is on the same straight line with the at least two identification positions;
and if so, determining that the zero point adjustment of the robot is finished.
8. A zero point adjusting device of a robot is applied to a computer device which is respectively in communication connection with an image acquisition device and a target robot, and the device comprises: the device comprises a receiving module, an area determining module, an angle determining module and a sending module;
the receiving module is used for receiving a target view image of the target robot acquired by the image acquisition equipment;
the area determining module is used for determining a target adjusting area in a target view image of the target robot, wherein the target adjusting area comprises at least two identification positions and zero positions of all steering engines in the target robot, and the identification positions are positions where labels configured in advance on the target robot are located;
the angle determining module is used for determining a target adjusting angle in the target adjusting area based on the identification position and the zero position of the steering engine to be adjusted;
the sending module is configured to send a target adjustment instruction to the target robot so that the target robot performs zero point adjustment based on the target adjustment instruction, where the target adjustment instruction includes the target adjustment angle.
9. A computer device, comprising: a memory and a processor, wherein the memory stores a computer program executable on the processor, and the processor, when executing the computer program, carries out the steps of the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202111568369.9A 2021-12-21 2021-12-21 Zero point adjustment method, device and equipment for robot and storage medium Active CN114147725B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111568369.9A CN114147725B (en) 2021-12-21 2021-12-21 Zero point adjustment method, device and equipment for robot and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111568369.9A CN114147725B (en) 2021-12-21 2021-12-21 Zero point adjustment method, device and equipment for robot and storage medium

Publications (2)

Publication Number Publication Date
CN114147725A true CN114147725A (en) 2022-03-08
CN114147725B CN114147725B (en) 2024-04-02

Family

ID=80451713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111568369.9A Active CN114147725B (en) 2021-12-21 2021-12-21 Zero point adjustment method, device and equipment for robot and storage medium

Country Status (1)

Country Link
CN (1) CN114147725B (en)

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110295421A1 (en) * 2010-06-01 2011-12-01 Fanuc Corporation Device and a method for restoring positional information of robot
CN102294694A (en) * 2010-06-01 2011-12-28 发那科株式会社 Device and a method for restoring positional information of robot
JP2012223871A (en) * 2011-04-21 2012-11-15 Kawasaki Heavy Ind Ltd Origin correction method and system of robot joint
DE102012110190A1 (en) * 2012-10-25 2014-04-30 Mis-Robotics Gmbh Manually operated robot control and method for controlling a robot system
CN103092218A (en) * 2013-01-06 2013-05-08 中国航天科技集团公司烽火机械厂 Method and system for steering engine zero adjustment
CN109079778A (en) * 2018-08-07 2018-12-25 珠海格力电器股份有限公司 Robot zero-setting system and method
CN110802593A (en) * 2019-11-07 2020-02-18 北京理工大学 Lower limb joint zero calibration method of humanoid robot
CN111113427A (en) * 2019-12-31 2020-05-08 深圳市优必选科技股份有限公司 Steering engine state control method and device for robot, robot and storage medium
CN111300484A (en) * 2020-03-13 2020-06-19 达闼科技成都有限公司 Method for determining joint positioning error of robot, robot and storage medium
CN112706164A (en) * 2020-12-18 2021-04-27 深圳市大富智慧健康科技有限公司 Automatic correction method, device and equipment for initial pose of mechanical arm and storage medium
CN112720469A (en) * 2020-12-18 2021-04-30 北京工业大学 Zero point calibration method for three-axis translational motion system by microscopic stereo vision
CN112720474A (en) * 2020-12-21 2021-04-30 深圳市越疆科技有限公司 Pose correction method and device for robot, terminal device and storage medium
CN113031635A (en) * 2021-02-26 2021-06-25 上海高仙自动化科技发展有限公司 Attitude adjusting method and device, cleaning robot and storage medium
WO2022188333A1 (en) * 2021-03-09 2022-09-15 美智纵横科技有限责任公司 Walking method and apparatus, and computer storage medium
CN113676387A (en) * 2021-08-11 2021-11-19 追觅创新科技(苏州)有限公司 Zero calibration method and device for multi-legged robot, storage medium and electronic device
CN114516048A (en) * 2022-02-21 2022-05-20 乐聚(深圳)机器人技术有限公司 Zero point debugging method and device for robot, controller and storage medium

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115890690A (en) * 2023-03-09 2023-04-04 广东隆崎机器人有限公司 Robot zero point adjustment method, device, equipment and readable storage medium

Also Published As

Publication number Publication date
CN114147725B (en) 2024-04-02

Similar Documents

Publication Publication Date Title
US9928656B2 (en) Markerless multi-user, multi-object augmented reality on mobile devices
JP6507730B2 (en) Coordinate transformation parameter determination device, coordinate transformation parameter determination method, and computer program for coordinate transformation parameter determination
CN107328420B (en) Positioning method and device
CN108510545B (en) Space positioning method, space positioning apparatus, space positioning system, and computer-readable storage medium
US10802606B2 (en) Method and device for aligning coordinate of controller or headset with coordinate of binocular system
US20140092132A1 (en) Systems and methods for 3d pose estimation
US10438412B2 (en) Techniques to facilitate accurate real and virtual object positioning in displayed scenes
JP2017005532A (en) Camera posture estimation device, camera posture estimation method and camera posture estimation program
CN102122392A (en) Information processing apparatus, information processing system, and information processing method
CN113256718B (en) Positioning method and device, equipment and storage medium
JP6515039B2 (en) Program, apparatus and method for calculating a normal vector of a planar object to be reflected in a continuous captured image
CN112819892B (en) Image processing method and device
US20120162387A1 (en) Imaging parameter acquisition apparatus, imaging parameter acquisition method and storage medium
CN109902675B (en) Object pose acquisition method and scene reconstruction method and device
US11461921B2 (en) Program, system, electronic device, and method for recognizing three-dimensional object
US20230005216A1 (en) Three-dimensional model generation method and three-dimensional model generation device
CN108430032B (en) Method and equipment for realizing position sharing of VR/AR equipment
CN115191113A (en) Wide-view angle stereo camera device and depth image processing method using same
CN114147725A (en) Zero point adjustment method, device, equipment and storage medium for robot
JP2008309595A (en) Object recognizing device and program used for it
US20150178927A1 (en) Method and system for determining a transformation associated with a capturing device
CN114092668A (en) Virtual-real fusion method, device, equipment and storage medium
CN113051953A (en) AR virtual commodity display method based on two-dimensional code positioning
CN111897432A (en) Pose determining method and device and electronic equipment
JP2017162449A (en) Information processing device, and method and program for controlling information processing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant