CN114800499B - Pose adjustment method and device, computer readable storage medium and electronic equipment - Google Patents


Info

Publication number
CN114800499B
CN114800499B (application CN202210421079.XA)
Authority
CN
China
Prior art keywords
pose
robot
representation vector
target position
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210421079.XA
Other languages
Chinese (zh)
Other versions
CN114800499A (en)
Inventor
黄晓康
程波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sankuai Online Technology Co Ltd
Original Assignee
Beijing Sankuai Online Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sankuai Online Technology Co Ltd filed Critical Beijing Sankuai Online Technology Co Ltd
Priority to CN202210421079.XA priority Critical patent/CN114800499B/en
Publication of CN114800499A publication Critical patent/CN114800499A/en
Application granted granted Critical
Publication of CN114800499B publication Critical patent/CN114800499B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The present disclosure relates to a pose adjustment method, a pose adjustment device, a computer readable storage medium and an electronic device. The method comprises: obtaining a first representation vector of a first pose of a robot entering a specified docking area; optimizing the first representation vector according to image feature data of the first pose to obtain a second representation vector; in the case that the first pose is outside a predetermined range of a preset robot target position, calculating, according to the second representation vector, the pose deviation between the first pose and a second pose corresponding to the preset target position; and adjusting the pose of the robot to be within the predetermined range of the target position according to the pose deviation. High-precision meal pickup and docking actions of the delivery robot are thus realized on the basis of existing hardware, the precision requirement of the delivery robot is met without increasing hardware cost, the docking success rate between the delivery robot and the delivery station is improved, and the market competitiveness of the delivery robot is improved.

Description

Pose adjustment method and device, computer readable storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of robots, and in particular, to a pose adjustment method, a pose adjustment device, a computer readable storage medium, and an electronic apparatus.
Background
In some scenarios where conventional takeout meal cabinets are already deployed, such as office buildings, hospitals and schools, a rider places the takeout order in the meal cabinet, but the user may not be able to pick up the meal in time due to temporary circumstances; in this case the user needs a delivery robot to deliver the takeout from the meal cabinet to a designated location. At present, existing takeout meal cabinets on the market cannot achieve high-precision meal pickup and docking actions on the basis of existing hardware, and if a higher-precision lidar were used to achieve such actions, the hardware cost of the delivery robot would increase greatly and its market competitiveness would decrease.
Disclosure of Invention
The purpose of the present disclosure is to provide a pose adjustment method and device, a computer readable storage medium and an electronic device, wherein the pose adjustment method meets the precision requirement of a delivery robot without increasing hardware cost.
To achieve the above object, in a first aspect, the present disclosure provides a pose adjustment method, including: acquiring a first representation vector of a first pose of a robot entering a specified docking area; optimizing the first representation vector according to image feature data of the first pose to obtain a second representation vector; in the case that the first pose is outside a predetermined range of a preset robot target position, calculating, according to the second representation vector, the pose deviation between the first pose and a second pose corresponding to the preset target position; and adjusting the pose of the robot to be within the predetermined range of the target position according to the pose deviation.
In a second aspect, the present disclosure provides a pose adjustment device, comprising: an acquisition module, configured to acquire a first representation vector of a first pose of a robot entering a specified docking area; an optimization module, configured to optimize the first representation vector according to image feature data of the first pose to obtain a second representation vector; a processing module, configured to calculate, according to the second representation vector, the pose deviation between the first pose and a second pose corresponding to a preset robot target position in the case that the first pose is outside a predetermined range of the preset target position; and an adjustment module, configured to adjust the pose of the robot to be within the predetermined range of the target position according to the pose deviation.
In a third aspect, the present disclosure provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processing device, implements the steps of the aforementioned pose adjustment method.
In a fourth aspect, the present disclosure provides an electronic device comprising: a storage device having a computer program stored thereon; and the processing device is used for executing the computer program in the storage device so as to realize the steps of the pose adjustment method.
According to the above technical solution, a first representation vector of a first pose of the robot entering the specified docking area is obtained; the first representation vector is optimized according to image feature data of the first pose to obtain a second representation vector; in the case that the first pose is outside a predetermined range of a preset robot target position, the pose deviation between the first pose and a second pose corresponding to the preset target position is calculated according to the second representation vector; and the pose of the robot is adjusted to be within the predetermined range of the target position according to the pose deviation. After the first pose of the robot entering the specified docking area is obtained, it is optimized according to the image feature data of the first pose, and the pose of the robot is finally adjusted to be within the predetermined range of the target position. High-precision meal pickup and docking actions of the delivery robot are thus realized on the basis of existing hardware, the precision requirement of the delivery robot is met without increasing hardware cost, the docking success rate between the delivery robot and the delivery station is improved, and the market competitiveness of the delivery robot is improved.
Additional features and advantages of the present disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification, illustrate the disclosure and together with the description serve to explain, but do not limit the disclosure. In the drawings:
Fig. 1 is a schematic diagram of the lateral position error of a delivery robot at the meal pickup port according to an exemplary embodiment of the present disclosure.
Fig. 2 is a schematic view of a meal cabinet provided in an exemplary embodiment of the present disclosure.
Fig. 3 is a schematic diagram of feature points near the meal pickup port provided in an exemplary embodiment of the present disclosure.
Fig. 4 is a flowchart of a pose adjustment method provided by an exemplary embodiment of the present disclosure.
Fig. 5 is a flowchart illustrating sub-steps of step S102 according to an exemplary embodiment of the present disclosure.
Fig. 6 is a block diagram of a pose adjustment device provided by an exemplary embodiment of the present disclosure.
Fig. 7 is a block diagram of an electronic device provided by an exemplary embodiment of the present disclosure.
Fig. 8 is a block diagram of an electronic device provided by an exemplary embodiment of the present disclosure.
Description of the reference numerals
100 - meal cabinet; 120 - meal pickup port; 121 - feature point; 140 - item placement compartment; 20 - pose adjustment device; 201 - acquisition module; 203 - optimization module; 205 - processing module; 207 - adjustment module; 700 - electronic device; 701 - processor; 702 - memory; 703 - multimedia component; 704 - I/O interface; 705 - communication component; 1900 - electronic device; 1922 - processor; 1932 - memory; 1926 - power supply component; 1950 - communication component; 1958 - I/O interface.
Detailed Description
Specific embodiments of the present disclosure are described in detail below with reference to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating and illustrating the disclosure, are not intended to limit the disclosure.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The existing docking methods for takeout meal cabinets on the market mainly fall into the following three categories:
The first is a positioning method based on a lidar and highly reflective markers: a highly reflective marker is arranged at a fixed point near the target position, the lidar produces a light-intensity map, and the target position is determined from the intensity map, so that the docking target position of the robot is calculated. This scheme uses a lidar or a camera as the sensor and identifies the position by arranging reflective tags and light-emitting elements at specific locations. However, the accuracy of lidars commonly used on service robots is currently around 2-3 cm, which can hardly meet the docking accuracy required in current applications, and using a higher-precision lidar would greatly increase hardware cost and reduce the market competitiveness of the delivery robot.
The second is a method for docking two robots: robot A is equipped with a monocular camera and robot B with LED lamps, and robot A calculates the position and state of robot B by locating the LED lamps and their combined state, thereby calculating the docking target position of robot B. This monocular camera scheme provides specific feature points and calculates the pose of the docking robot from them, but the accuracy is still difficult to meet practical application requirements.
The third is positioning by scanning tag codes with a photoelectric sensor: tag codes carrying position information are attached to the environment around the target position, and the position information is obtained by scanning the tag codes with the photoelectric sensor, so that the position of the robot in space is determined. In this positioning mode, the robot determines global position information by identifying the position information carried by the tag codes, but this logic cannot be reused in the docking positioning system.
In addition, because existing meal cabinets lack automatic transfer and docking hardware and their forms vary widely, a delivery robot cannot automatically complete the meal pickup action at the meal cabinet. Against this background, a transfer and docking mechanism would have to be added to a conventional meal cabinet so that the delivery robot can automatically complete the task of taking a meal from it, which again increases hardware cost.
During the docking process between the delivery robot and the takeout meal cabinet, the delivery robot needs to navigate to the meal pickup port of the meal cabinet to complete the meal handover task. Existing robot navigation systems use a lidar for mapping and positioning, so the delivery robot can move from a known position to the meal pickup port and initiate a meal docking instruction. However, in actual testing, due to the limited accuracy of the lidar and the mechanical accuracy of the motion chassis, the pose with which the delivery robot reaches the meal pickup port can hardly meet the requirements of the handover task. For example, completing the docking task requires that the lateral position error of the delivery robot at the meal pickup port be less than 1 cm, as shown in Fig. 1.
On this basis, the present disclosure provides a pose adjustment method that optimizes and adjusts the pose of the delivery robot within a specified area near the target position, so that high-precision meal pickup and docking actions are realized without adding hardware and the delivery robot can accurately complete the meal pickup action at the meal cabinet.
Referring to Fig. 2, Fig. 2 illustrates a schematic view of a meal cabinet 100 provided in an exemplary embodiment of the present disclosure. The meal cabinet 100 includes a meal pickup port 120 and a plurality of item placement compartments 140. When the delivery robot moves to the specified docking area near the meal pickup port 120, its pose is adjusted to be within a predetermined range of the target position so that it can take the meal from the meal cabinet 100; the item placement compartments 140 are used to place meals.
A plurality of preset feature points 121 are arranged around the meal pickup port 120, as shown in Fig. 3, so that the lidar or camera of the delivery robot can identify the feature points 121 and the position of the delivery robot can then be determined from the position information of the feature points 121.
Referring to Fig. 4, Fig. 4 is a flowchart of a pose adjustment method according to an exemplary embodiment of the present disclosure. The method is performed by the delivery robot, for example by a pose optimization module built into the delivery robot. The pose adjustment method shown in Fig. 4 includes the following steps.
In step S101, a first representation vector of the first pose of the robot entering the specified docking area is acquired.
Illustratively, the delivery robot is taken as an example of the robot.
The position of the delivery robot can be obtained through the odometer, the inertial measurement unit (IMU) and the motion model built into the delivery robot. When the delivery robot is detected to have moved into the specified docking area near the meal pickup port, the lidar and the camera of the delivery robot are invoked to acquire a feature image containing the feature points, the positions of the feature points in the feature image are identified, and the first pose of the delivery robot is determined from the position information of the feature points. The first pose is the spatial position coordinate corresponding to the delivery robot and is denoted ξ, where ξ = (x, y, θ), x is the coordinate of the delivery robot in the x-axis direction, y is the coordinate in the y-axis direction, and θ is the orientation angle of the delivery robot in the first pose.
In one embodiment, the first representation vector of the first pose may be represented by the representation vectors of the feature points in the feature image acquired by the delivery robot in the first pose. For example, the feature points in the feature image acquired by the delivery robot in the first pose are recorded as a first feature point set, and the first representation vector is obtained from the position information of the first feature point set. The first representation vector may be calculated as follows:

I1^(i) = K·T·p^(i)

where I1^(i) is the representation vector corresponding to feature point i in the feature image; K is the intrinsic matrix of the camera; T is the transformation matrix of the camera corresponding to the first pose, T = exp(ξ^), where the first pose ξ = (x, y, θ), x is the coordinate of the robot in the x-axis direction, y is the coordinate of the robot in the y-axis direction, and θ is the orientation angle of the robot in the first pose; and p^(i) is the spatial position coordinate corresponding to feature point i in the feature image acquired by the robot in the first pose.
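To make the projection above concrete, the following is a minimal Python/NumPy sketch that is not taken from the patent: it computes K·T·p^(i) for a planar pose ξ = (x, y, θ). The helper names, the 4x4 embedding of the planar pose, and the treatment of the feature points p^(i) as 3D coordinates are illustrative assumptions.

```python
import numpy as np

def se2_to_matrix(xi):
    """Illustrative helper (assumed): embed the planar pose xi = (x, y, theta)
    into a 4x4 homogeneous transform (rotation about the z-axis plus translation)."""
    x, y, theta = xi
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[0, 0], T[0, 1] = c, -s
    T[1, 0], T[1, 1] = s, c
    T[0, 3], T[1, 3] = x, y
    return T

def representation_vectors(xi, points_world, K):
    """Compute I1^(i) = K . T . p^(i) for each feature point p^(i);
    points_world is an (N, 3) array of feature-point spatial coordinates."""
    T = se2_to_matrix(xi)
    out = []
    for p in points_world:
        p_h = np.append(p, 1.0)   # homogeneous coordinates of p^(i)
        p_t = (T @ p_h)[:3]       # point after the pose transform T
        out.append(K @ p_t)       # apply the camera intrinsic matrix K
    return np.array(out)          # shape (N, 3)
```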
In step S102, the first representation vector is optimized according to the image feature data of the first pose to obtain a second representation vector.
It should be noted that step S102 includes sub-steps S1021 to S1024, in which the specific manner of obtaining the second representation vector is described in detail. Referring to Fig. 5, Fig. 5 is a flowchart illustrating the sub-steps of step S102 according to an exemplary embodiment of the present disclosure.
In sub-step S1021, the image feature data of the first pose is acquired.
The image feature data includes the spatial position coordinates of the feature points in the feature image and the intrinsic matrix of the camera. The feature image is an image containing the feature points acquired by the camera at the first pose of the robot. As described above, a feature image containing feature points can be acquired by the lidar and the camera, and the spatial position coordinates of the feature points can be extracted from the feature image.
In sub-step S1022, an error function is constructed from the image feature data.
An error function is constructed from the image feature data obtained above. For example, a feature point expression is first constructed from the image feature data; the feature point expression is obtained from the first feature point set and may be of the form K·T·p^(i) described above, where i is the index of a feature point and takes values from 0 to n. A least squares problem function is then constructed from the feature point expressions, which may be, for example,

min over ξ_k of Σ_i || I1^(i) − K·T_k·p^(i) ||^2

where I1^(i) is the first representation vector, i.e., the representation vector corresponding to feature point i in the feature image acquired by the robot in the first pose, and K·T_k·p^(i) is the preset second representation vector. Finally, the error function is constructed from the least squares problem function; the calculation of the error function may include:

e(ξ_k) = I1^(i) − K·T_k·p^(i)

where e(ξ_k) is the error function (the residuals over all feature points i being stacked into one error vector); I1^(i) is the first representation vector and is a known quantity; and K·T_k·p^(i) is the preset second representation vector and is an unknown quantity.
In sub-step S1023, the pose transformation amount is determined according to the error function.
Illustratively, starting from the error function, a perturbation Δξ is introduced using a left-multiplication perturbation model, and the derivative of e with respect to the perturbation is considered, which yields the Jacobian matrix J(ξ) of the error function. The Jacobian matrix J(ξ) and the error function e(ξ_k) are then solved iteratively; after the iteration is completed, the pose transformation amount Δξ is obtained by solving the equation system

Δξ = −( J(ξ)^T · J(ξ) )^(−1) · J(ξ)^T · e(ξ_k)

where ξ is the first pose and Δξ is the pose transformation amount; J(ξ)^T is the transpose of the matrix obtained by differentiating the error function with respect to ξ; the perturbation is applied by left multiplication on the Lie algebra; and p^(i) is the spatial position coordinate corresponding to feature point i in the feature image acquired by the robot in the first pose.
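The update Δξ = −(J(ξ)^T·J(ξ))^(−1)·J(ξ)^T·e(ξ_k) can be sketched as one Gauss-Newton step. In the sketch below the Jacobian is approximated by finite differences on (x, y, θ) instead of the analytic left-perturbation Jacobian of the text; that substitution, and the helper names, are assumptions made for illustration:

```python
def gauss_newton_step(xi_k, first_vectors, points_world, K, eps=1e-6):
    """One step Delta_xi = -(J^T J)^-1 J^T e(xi_k), with J approximated by
    finite differences of the error vector with respect to (x, y, theta)."""
    e = error_vector(xi_k, first_vectors, points_world, K)
    J = np.zeros((e.size, 3))
    for j in range(3):
        d = np.zeros(3)
        d[j] = eps
        J[:, j] = (error_vector(xi_k + d, first_vectors, points_world, K) - e) / eps
    return -np.linalg.solve(J.T @ J, J.T @ e)
```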
In sub-step S1024, the first representation vector is optimized according to the pose transformation amount to obtain the second representation vector.
The transformation matrix of the camera is updated according to the calculated pose transformation amount to obtain an updated transformation matrix, for example

T_k = exp(Δξ^) · T_{k−1}

where T_k is the updated transformation matrix of the camera and T_{k−1} is the transformation matrix of the camera before the update. The first pose is then optimized according to the updated transformation matrix and the logarithmic map, so as to obtain the second representation vector.
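Iterating that step and re-projecting the feature points gives an optimized pose and the corresponding second representation vector. In this simplified continuation of the sketch the increment is applied additively to (x, y, θ) rather than through the left multiplication T_k = exp(Δξ^)·T_{k−1} and logarithmic map described above, which is an assumption made for readability:

```python
def optimize_pose(xi0, first_vectors, points_world, K, max_iters=20, tol=1e-8):
    """Apply the pose increment until it becomes negligible, then return the
    optimized pose and the corresponding second representation vectors."""
    xi = np.asarray(xi0, dtype=float)
    for _ in range(max_iters):
        delta_xi = gauss_newton_step(xi, first_vectors, points_world, K)
        xi = xi + delta_xi                    # simplified additive update
        if np.linalg.norm(delta_xi) < tol:
            break
    second_vectors = representation_vectors(xi, points_world, K)
    return xi, second_vectors
```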
In step S103, in the case that the first pose is outside the predetermined range of the preset robot target position, the pose deviation between the first pose and the second pose corresponding to the preset target position is calculated according to the second representation vector.
After the second representation vector is obtained by optimizing the first pose, it is further judged, on the basis of the second representation vector, whether the robot position corresponding to the second representation vector is within the predetermined range of the preset robot target position. If it is, the subsequent adjustment steps are exited; if it is not, the pose deviation between the optimized first pose and the second pose corresponding to the preset target position is calculated. The calculation may refer to the content of step S102 and is not repeated here.
In step S104, the pose of the robot is adjusted to be within a predetermined range of the target position according to the pose deviation.
The pose of the delivery robot is adjusted according to the calculated pose deviation until the robot is within the predetermined range of the target position, so that the position error between the second pose of the delivery robot and the target position is reduced and the docking accuracy of the delivery robot during meal pickup is improved.
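As an illustration of how steps S103-S104 might be wired together, the sketch below checks the optimized pose against the predetermined range of the target and hands the pose deviation to a motion callback. The tolerances (1 cm in position, echoing the lateral requirement mentioned above, plus an assumed angular tolerance) and the `move_cmd` callback are placeholder assumptions, not part of the patent:

```python
def within_range(xi, xi_target, pos_tol=0.01, ang_tol=0.05):
    """Check whether the pose lies within the predetermined range of the target;
    pos_tol is in metres (e.g. the 1 cm lateral requirement), ang_tol in radians."""
    dx, dy, dtheta = np.asarray(xi_target, dtype=float) - np.asarray(xi, dtype=float)
    dtheta = (dtheta + np.pi) % (2 * np.pi) - np.pi   # wrap the angle difference
    return abs(dx) <= pos_tol and abs(dy) <= pos_tol and abs(dtheta) <= ang_tol

def adjust_to_target(xi, xi_target, move_cmd, max_attempts=5):
    """If the pose is outside the range, pass the pose deviation to a
    platform-specific motion callback and re-check after each adjustment."""
    for _ in range(max_attempts):
        if within_range(xi, xi_target):
            return xi
        deviation = np.asarray(xi_target, dtype=float) - np.asarray(xi, dtype=float)
        xi = move_cmd(deviation)   # assumed to return the re-estimated pose
    return xi
```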
In one embodiment, the pose adjustment method provided by the present disclosure is performed by a pose optimization module built into the delivery robot; for example, when the delivery robot reaches the specified docking area and receives a pose adjustment instruction, the pose adjustment method described in steps S101-S104 is performed. The pose adjustment instruction may be sent by an electronic device other than the delivery robot, or may be triggered automatically when the robot reaches the specified docking area.
In summary, in the pose adjustment method provided by the present disclosure, a first representation vector of a first pose of the robot entering the specified docking area is obtained; the first representation vector is optimized according to image feature data of the first pose to obtain a second representation vector; in the case that the first pose is outside a predetermined range of a preset robot target position, the pose deviation between the first pose and a second pose corresponding to the preset target position is calculated according to the second representation vector; and the pose of the robot is adjusted to be within the predetermined range of the target position according to the pose deviation. After the first pose of the robot entering the specified docking area is obtained, it is optimized according to the image feature data of the first pose, and the pose of the robot is finally adjusted to be within the predetermined range of the target position. High-precision meal pickup and docking actions of the delivery robot are thus realized on the basis of existing hardware, the precision requirement of the delivery robot is met without increasing hardware cost, the docking success rate between the delivery robot and the delivery station is improved, and the market competitiveness of the delivery robot is improved.
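Tying the sketches above together, a hypothetical end-to-end run might look like the following; the intrinsic matrix, feature points and poses are made-up example values, not data from the patent:

```python
if __name__ == "__main__":
    K = np.array([[500.0, 0.0, 320.0],
                  [0.0, 500.0, 240.0],
                  [0.0, 0.0, 1.0]])                 # example intrinsic matrix (assumed)
    points_world = np.array([[1.0, 0.2, 0.5],
                             [1.0, -0.2, 0.5],
                             [1.2, 0.0, 0.8]])      # example feature points p^(i)

    xi_actual = np.array([0.05, -0.03, 0.02])       # pose at which the image was actually taken
    first_vectors = representation_vectors(xi_actual, points_world, K)

    xi_coarse = np.array([0.0, 0.0, 0.0])           # coarse first pose from lidar/odometry
    xi_opt, _ = optimize_pose(xi_coarse, first_vectors, points_world, K)

    xi_target = np.array([0.0, 0.0, 0.0])           # preset target pose at the pickup port
    print("optimized pose:", xi_opt)
    print("within range of target:", within_range(xi_opt, xi_target))
```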
Fig. 6 is a block diagram of a pose adjustment device according to an exemplary embodiment of the present disclosure. Referring to fig. 6, the apparatus 20 includes an acquisition module 201, an optimization module 203, a processing module 205, and an adjustment module 207.
The acquiring module 201 is configured to acquire a first representation vector of a first pose of the robot entering the specified docking area;
the optimizing module 203 is configured to optimize the first representation vector according to the image feature data of the first pose, so as to obtain a second representation vector;
the processing module 205 is configured to calculate, according to the second representation vector, a pose deviation of the first pose from a second pose of the preset robot target position if the first pose is outside a predetermined range of the preset robot target position;
the adjusting module 207 is configured to adjust the pose of the robot to be within a predetermined range of the target position according to the pose deviation.
Optionally, the processing module 205 is further configured to measure, by using the motion model and the lidar device, a first representation vector of the first pose of the robot entering the specified docking area.
Optionally, the acquiring module 201 is further configured to acquire image feature data of the first pose;
the processing module 205 is further configured to construct an error function according to the image feature data;
the processing module 205 is further configured to determine a pose transformation amount according to the error function;
the processing module 205 is further configured to optimize the first representation vector according to the pose transformation amount to obtain a second representation vector.
Optionally, the image feature data includes:
the spatial position coordinates of feature points in a feature image, the feature image being an image containing the feature points acquired by the camera at the first pose of the robot;
and the intrinsic matrix of the camera.
Optionally, the processing module 205 is further configured to construct a feature point expression according to the image feature data;
constructing a least square problem function according to the characteristic point expression;
and constructing an error function according to the least square problem function.
The specific manner in which the various modules perform operations in the apparatus of the above embodiment has been described in detail in the embodiments of the method and will not be repeated here.
Fig. 7 is a block diagram of an electronic device 700, according to an exemplary embodiment. The electronic device 700 may be a pose adjustment module built into the delivery robot. As shown in Fig. 7, the electronic device 700 may include a processor 701 and a memory 702. The electronic device 700 may also include one or more of a multimedia component 703, an input/output (I/O) interface 704, and a communication component 705.
The processor 701 is configured to control the overall operation of the electronic device 700 to perform all or part of the steps of the above pose adjustment method. The memory 702 is used to store various types of data to support the operation of the electronic device 700, such as instructions for any application or method operating on the electronic device 700. The memory 702 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk. The multimedia component 703 may include a screen and an audio component. The screen may be, for example, a touch screen; the audio component is used to output and/or input audio signals. For example, the audio component may include a microphone for receiving external audio signals; the received audio signals may be further stored in the memory 702 or sent through the communication component 705. The audio component further includes at least one speaker for outputting audio signals. The I/O interface 704 provides an interface between the processor 701 and other interface modules such as a keyboard, a mouse or buttons, which may be virtual or physical. The communication component 705 is used for wired or wireless communication between the electronic device 700 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, near field communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, 5G, etc., or a combination thereof, which is not limited here. Accordingly, the communication component 705 may include a Wi-Fi module, a Bluetooth module, an NFC module, etc.
In an exemplary embodiment, the electronic device 700 may be implemented by one or more application specific integrated circuits (Application Specific Integrated Circuit, abbreviated as ASIC), digital signal processors (Digital Signal Processor, abbreviated as DSP), digital signal processing devices (Digital Signal Processing Device, abbreviated as DSPD), programmable logic devices (Programmable Logic Device, abbreviated as PLD), field programmable gate arrays (Field Programmable Gate Array, abbreviated as FPGA), controllers, microcontrollers, microprocessors, or other electronic components for performing the above-described pose adjustment methods.
In another exemplary embodiment, a computer readable storage medium is also provided, comprising program instructions which, when executed by a processor, implement the steps of the above-described pose adjustment method. For example, the computer readable storage medium may be the memory 702 including program instructions described above, which are executable by the processor 701 of the electronic device 700 to perform the pose adjustment method described above.
Fig. 8 is a block diagram illustrating an electronic device 1900 according to an example embodiment. For example, electronic device 1900 may be provided as a server. Referring to fig. 8, an electronic device 1900 includes a processor 1922, which may be one or more in number, and a memory 1932 for storing computer programs executable by the processor 1922. The computer program stored in memory 1932 may include one or more modules each corresponding to a set of instructions. Further, the processor 1922 may be configured to execute the computer program to perform the pose adjustment method described above.
In addition, the electronic device 1900 may further include a power component 1926 and a communication component 1950. The power component 1926 may be configured to perform power management of the electronic device 1900, and the communication component 1950 may be configured to enable wired or wireless communication of the electronic device 1900. The electronic device 1900 may also include an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, etc.
In another exemplary embodiment, a computer readable storage medium is also provided, comprising program instructions which, when executed by a processor, implement the steps of the above-described pose adjustment method. For example, the computer readable storage medium may be the memory 1932 described above including program instructions that are executable by the processor 1922 of the electronic device 1900 to perform the pose adjustment method described above.
In another exemplary embodiment, a computer program product is also provided, the computer program product comprising a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-described pose adjustment method when executed by the programmable apparatus.
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solutions of the present disclosure within the scope of the technical concept of the present disclosure, and all the simple modifications belong to the protection scope of the present disclosure.
In addition, the specific features described in the foregoing embodiments may be combined in any suitable manner, and in order to avoid unnecessary repetition, the present disclosure does not further describe various possible combinations.
Moreover, any combination between the various embodiments of the present disclosure is possible as long as it does not depart from the spirit of the present disclosure, which should also be construed as the disclosure of the present disclosure.

Claims (8)

1. A pose adjustment method, comprising:
initiating a meal docking instruction in response to the robot moving to the meal pickup port of a takeout meal cabinet;
in response to the meal docking instruction, acquiring a first representation vector of a first pose of the robot entering a specified docking area;
acquiring image characteristic data of the first pose;
optimizing the first representation vector according to the pose transformation amount determined by the image characteristic data to obtain a second representation vector;
according to the second representation vector, calculating the pose deviation of the first pose and the second pose of the preset robot target position under the condition that the first pose is out of the preset range of the preset robot target position;
and adjusting the pose of the robot to be within a preset range of the target position according to the pose deviation.
2. The method of claim 1, wherein the step of obtaining a first representative vector of a first pose of the robot into the designated docking area comprises:
a first representative vector of the first pose of the robot into the designated docking area is measured by the motion model and the lidar device.
3. The method of claim 1, wherein the image feature data comprises:
the spatial position coordinates of feature points in a feature image, the feature image being an image containing the feature points acquired by a camera at the first pose of the robot;
and the intrinsic matrix of the camera.
4. The method of claim 1, wherein the step of constructing an error function from the image characteristic data comprises:
constructing a feature point expression according to the image feature data;
constructing a least square problem function according to the characteristic point expression;
and constructing an error function according to the least square problem function.
5. A pose adjustment device, comprising:
an acquisition module, configured to initiate a meal docking instruction in response to the robot moving to the meal pickup port of a takeout meal cabinet, and,
in response to the meal docking instruction, acquire a first representation vector of a first pose of the robot entering a specified docking area;
the optimization module is used for acquiring the image characteristic data of the first pose;
optimizing the first representation vector according to the pose transformation amount determined by the image characteristic data to obtain a second representation vector;
the processing module is used for calculating pose deviation of the first pose and a second pose of the preset robot target position according to the second representation vector under the condition that the first pose is out of a preset range of the preset robot target position;
and the adjusting module is used for adjusting the pose of the robot to be within a preset range of the target position according to the pose deviation.
6. The apparatus of claim 5, wherein:
the processing module is further used for measuring a first representation vector of the first pose of the robot entering the appointed docking area through the motion model and the laser radar equipment.
7. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method according to any of claims 1-4.
8. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method of any one of claims 1-4.
CN202210421079.XA 2022-04-20 2022-04-20 Pose adjustment method and device, computer readable storage medium and electronic equipment Active CN114800499B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210421079.XA CN114800499B (en) 2022-04-20 2022-04-20 Pose adjustment method and device, computer readable storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210421079.XA CN114800499B (en) 2022-04-20 2022-04-20 Pose adjustment method and device, computer readable storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN114800499A CN114800499A (en) 2022-07-29
CN114800499B true CN114800499B (en) 2023-08-25

Family

ID=82504834

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210421079.XA Active CN114800499B (en) 2022-04-20 2022-04-20 Pose adjustment method and device, computer readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN114800499B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105856230A (en) * 2016-05-06 2016-08-17 简燕梅 ORB key frame closed-loop detection SLAM method capable of improving consistency of position and pose of robot
CN109465829A (en) * 2018-12-12 2019-03-15 南京工程学院 A kind of industrial robot geometric parameter discrimination method based on transition matrix error model
CN110047108A (en) * 2019-03-07 2019-07-23 中国科学院深圳先进技术研究院 UAV position and orientation determines method, apparatus, computer equipment and storage medium
CN112097768A (en) * 2020-11-17 2020-12-18 深圳市优必选科技股份有限公司 Robot posture determining method and device, robot and storage medium
CN112444242A (en) * 2019-08-31 2021-03-05 北京地平线机器人技术研发有限公司 Pose optimization method and device
WO2021128787A1 (en) * 2019-12-23 2021-07-01 中国银联股份有限公司 Positioning method and apparatus

Also Published As

Publication number Publication date
CN114800499A (en) 2022-07-29

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant