CN112284395A - Robot positioning method and device and robot

Info

Publication number
CN112284395A
CN112284395A (application CN202011150002.0A)
Authority
CN
China
Prior art keywords
robot, positioning, sub-area, information
Prior art date
2020-10-23
Legal status
Pending
Application number
CN202011150002.0A
Other languages
Chinese (zh)
Inventor
于宗靖
张磊
Current Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Priority date
2020-10-23
Filing date
2020-10-23
Publication date
2021-01-29
Application filed by Beijing Jingdong Qianshi Technology Co Ltd
Priority to CN202011150002.0A
Publication of CN112284395A
Priority to PCT/CN2021/121608 (WO2022083435A1)
Legal status: Pending (current)

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0219 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The disclosure relates to a robot positioning method and device, and to a robot, in the technical field of navigation. The method comprises: receiving broadcast information of the current area where the robot is located, wherein the moving range of the robot is divided into a plurality of sub-areas; determining, according to the received broadcast information, the current area as the sub-area bound to that broadcast information; and loading the map information of the bound sub-area and positioning the robot.

Description

Robot positioning method and device and robot
Technical Field
The present disclosure relates to the field of navigation technologies, and in particular, to a positioning method for a robot, a positioning apparatus for a robot, and a non-volatile computer-readable storage medium.
Background
In recent years, with the development of logistics automation and industrial automation, mobile robots have been applied more and more widely in transfer and transportation. Mobile robots operate with high precision and without supervision, which greatly reduces the manual labor needed in material conveying and handling.
In the related art, the robot is positioned by computing and comparing map information over its entire range of motion.
Disclosure of Invention
The inventors of the present disclosure found that the above related art has the following problems: when the range of motion contains regions with similar map information, the computational load of processing the map information is large, positioning accuracy is low, and the robot's overall positioning performance is poor.
In view of this, the present disclosure provides a robot positioning solution that can improve the positioning performance of the robot.
According to some embodiments of the present disclosure, there is provided a positioning method of a robot, including: receiving broadcast information of the current area where the robot is located, wherein the moving range of the robot is divided into a plurality of sub-areas; determining, according to the received broadcast information, the current area as the sub-area bound to that broadcast information; and loading the map information of the bound sub-area and positioning the robot.
In some embodiments, the broadcast information is identification information of the bound sub-area, or the broadcast information bound to different sub-areas is different audio information.
In some embodiments, the different audio information is audio at different frequencies, or different audio tracks within the same audio.
In some embodiments, the method further comprises: judging, according to the planned path of the robot, whether the robot will enter another sub-area through the current area; loading the map information of the next sub-area if it is judged that the robot will enter the next sub-area; and, in response to the robot entering the next sub-area, positioning the robot according to the map information of the next sub-area.
In some embodiments, positioning the robot comprises: acquiring the contour information and visual feature information of the bound sub-area according to its map information and using them for positioning.
According to other embodiments of the present disclosure, there is provided a positioning apparatus of a robot, including: a receiving unit configured to receive the broadcast information of the current area where the robot is located, wherein the moving range of the robot is divided into a plurality of sub-areas; a determining unit configured to determine, according to the received broadcast information, the current area as the sub-area bound to that broadcast information; and a positioning unit configured to load the map information of the bound sub-area and position the robot.
In some embodiments, the broadcast information is identification information of the bound sub-area, or the broadcast information bound to different sub-areas is different audio information.
In some embodiments, the different audio information is audio at different frequencies, or different audio tracks within the same audio.
In some embodiments, the apparatus further comprises a judging unit configured to judge, according to the planned path of the robot, whether the robot will enter another sub-area through the current area. The positioning unit loads the map information of the next sub-area if it is judged that the robot will enter the next sub-area and, in response to the robot entering the next sub-area, positions the robot according to the map information of the next sub-area.
In some embodiments, the positioning unit acquires the contour information and visual feature information of the bound sub-area according to its map information and uses them for positioning.
According to still further embodiments of the present disclosure, there is provided a positioning apparatus of a robot, including: a memory; and a processor coupled to the memory, the processor being configured to perform the positioning method of the robot in any of the above embodiments based on instructions stored in the memory.
According to still further embodiments of the present disclosure, there is provided a robot including a positioning device configured to perform the positioning method of the robot in any of the above embodiments.
According to still further embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the positioning method of the robot in any of the above embodiments.
In the above embodiments, the range of motion is divided into a plurality of smaller sub-areas, the current sub-area is determined from the broadcast information, and positioning is then performed within that sub-area. This reduces the computational load of processing map information, improves positioning accuracy, and improves the overall positioning performance.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description, serve to explain the principles of the disclosure.
The present disclosure can be more clearly understood from the following detailed description with reference to the accompanying drawings, in which:
fig. 1 illustrates a flow diagram of some embodiments of a positioning method of a robot of the present disclosure;
fig. 2 shows a schematic view of some embodiments of a positioning method of a robot of the present disclosure;
Fig. 3 shows a flow chart of further embodiments of a positioning method of a robot of the present disclosure;
fig. 4 shows a block diagram of some embodiments of a positioning device of a robot of the present disclosure;
Fig. 5 shows a block diagram of further embodiments of a positioning device of a robot of the present disclosure;
Fig. 6 illustrates a block diagram of still further embodiments of a positioning device of a robot of the present disclosure;
fig. 7 illustrates a block diagram of some embodiments of a robot of the present disclosure.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
To address this technical problem, the present disclosure combines area positioning with map-based positioning: the mobile robot operating in real time is first localized to a relatively small sub-area, and its specific position is then computed and determined from the map information of that sub-area.
This resolves the technical problems caused by a large range of motion, such as highly similar map information, heavy computational load, and low positioning accuracy. For example, the technical solution of the present disclosure can be implemented by the following embodiments.
Fig. 1 illustrates a flow diagram of some embodiments of a positioning method of a robot of the present disclosure.
As shown in fig. 1, the method includes: step 110, receiving broadcast information; step 120, determining the sub-area where the robot is located; and step 130, positioning the robot.
In step 110, broadcast information of the current area where the robot is located is received. The range of motion of the robot is divided into a plurality of sub-areas.
In some embodiments, the broadcast information is identification information of the bound sub-area, or the broadcast information bound to different sub-areas is different audio information. For example, the different audio information may be audio at different frequencies, or different audio tracks within the same audio.
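As an illustration only (not part of the original disclosure), the sketch below shows one way a frequency-coded audio broadcast could be resolved to the sub-area bound to it; the frequency table, sample rate, and tolerance are assumed values chosen for the example.

```python
# Hypothetical sketch: resolve a received audio broadcast to the sub-area
# bound to its dominant frequency. All constants below are assumptions
# made for illustration, not values from the disclosure.
from typing import Optional
import numpy as np

FREQ_TO_SUBAREA = {1000.0: "area_01", 2000.0: "area_02", 3000.0: "area_03"}
SAMPLE_RATE = 16000      # assumed microphone sample rate (Hz)
FREQ_TOLERANCE = 50.0    # accept +/- 50 Hz around each bound frequency

def dominant_frequency(samples: np.ndarray) -> float:
    """Strongest frequency component of the received audio snippet."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    return float(freqs[np.argmax(spectrum)])

def subarea_from_broadcast(samples: np.ndarray) -> Optional[str]:
    """Map the received broadcast audio to the sub-area bound to it."""
    f = dominant_frequency(samples)
    for bound_freq, subarea in FREQ_TO_SUBAREA.items():
        if abs(f - bound_freq) <= FREQ_TOLERANCE:
            return subarea
    return None  # unrecognized broadcast: keep the previously known sub-area

# A synthetic 2 kHz tone resolves to the sub-area bound to 2000 Hz.
t = np.arange(0, 0.1, 1.0 / SAMPLE_RATE)
assert subarea_from_broadcast(np.sin(2 * np.pi * 2000.0 * t)) == "area_02"
```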
In step 120, the current area is determined to be the sub-area bound to the received broadcast information.
In step 130, the map information of the bound sub-area is loaded to position the robot.
In some embodiments, the contour information and visual feature information of the bound sub-area are acquired according to its map information and used for positioning.
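The following sketch outlines how steps 110 to 130 could fit together, reusing the subarea_from_broadcast helper from the sketch above; the per-sub-area file layout, the SubMap fields, and the localize_in_submap placeholder are assumptions made for illustration, not an implementation defined by this disclosure.

```python
# Illustrative orchestration of steps 110-130: only the map of the bound
# sub-area is loaded and used for matching. File layout and types are
# assumptions made for this sketch.
import json
from dataclasses import dataclass
from typing import Optional

@dataclass
class SubMap:
    subarea_id: str
    contours: list          # contour information for laser matching
    visual_features: list   # visual feature information for camera matching

_submap_cache: dict = {}

def load_submap(subarea_id: str, map_dir: str = "maps") -> SubMap:
    """Load (and cache) the map information of a single sub-area."""
    if subarea_id not in _submap_cache:
        with open(f"{map_dir}/{subarea_id}.json") as f:
            data = json.load(f)
        _submap_cache[subarea_id] = SubMap(
            subarea_id, data["contours"], data["visual_features"])
    return _submap_cache[subarea_id]

def localize_in_submap(scan, image, submap: SubMap):
    """Placeholder: match the robot's scan/image against one sub-area map."""
    raise NotImplementedError("plug in a scan matcher or SLAM front end here")

def positioning_step(broadcast_samples, scan, image) -> Optional[object]:
    # Step 110: receive the broadcast information of the current area.
    # Step 120: resolve it to the sub-area bound to that broadcast.
    subarea_id = subarea_from_broadcast(broadcast_samples)
    if subarea_id is None:
        return None
    # Step 130: load only that sub-area's map and position within it.
    return localize_in_submap(scan, image, load_submap(subarea_id))
```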
In some embodiments, the positioning may be performed by the embodiment in fig. 2.
Fig. 2 shows a schematic view of some embodiments of a positioning method of a robot of the present disclosure.
As shown in fig. 2, the range of motion of the robot 21 is determined according to its work scene and divided into a plurality of sub-areas: areas #01 to #06. For example, the sub-areas may be partitioned according to at least one of contour information or visual feature information.
Each sub-area is covered by different broadcast information. For example, the broadcast information may be specific content (e.g., audio of different frequencies, different audio tracks, etc.) or encoded identification information of the sub-area.
In some embodiments, the robot 21 may carry a broadcast-information receiving device on its body, and the broadcast information of each sub-area may be calibrated in advance. For example, each sub-area may be bound to its corresponding broadcast information and the bindings stored.
When the robot 21 enters area #05, the current sub-area can be determined from the received broadcast information and the pre-calibrated bindings between sub-areas and broadcast information.
After determining the current sub-area, the robot 21 loads the map information of that sub-area, then scans contour information and visual feature information and matches them against the loaded map, thereby positioning itself. For example, the positioning may be performed with a SLAM (Simultaneous Localization and Mapping) navigation technique.
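For illustration, the fragment below scores a few candidate poses against the contour points of the loaded sub-area map; it is a deliberately simplified stand-in for a real SLAM or scan-matching pipeline, and all names and signatures are assumptions. The search stays tractable precisely because it never leaves the map of one sub-area.

```python
# Illustrative contour matching inside one sub-area map: score candidate
# poses by how closely the transformed laser scan hugs the map contours.
# A production system would use a SLAM / scan-matching library instead.
import math
import numpy as np

def transform_scan(scan_xy: np.ndarray, pose) -> np.ndarray:
    """Transform scan points (N x 2, robot frame) into the map frame."""
    x, y, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    rotation = np.array([[c, -s], [s, c]])
    return scan_xy @ rotation.T + np.array([x, y])

def score_pose(scan_xy: np.ndarray, contour_xy: np.ndarray, pose) -> float:
    """Negative mean distance from scan points to their nearest contour point."""
    pts = transform_scan(scan_xy, pose)
    dists = np.linalg.norm(pts[:, None, :] - contour_xy[None, :, :], axis=2)
    return -float(dists.min(axis=1).mean())

def best_pose_in_submap(scan_xy, contour_xy, candidate_poses):
    """Return the candidate pose (x, y, theta) that best explains the scan."""
    return max(candidate_poses,
               key=lambda p: score_pose(scan_xy, contour_xy, p))
```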
In this way, the construction work of installing magnetic strips, two-dimensional codes, reflectors, and the like when deploying the positioning system can be reduced or eliminated, which improves positioning flexibility and lowers cost.
In some embodiments, robot positioning may be achieved by the embodiment in fig. 3.
Fig. 3 shows a flow chart of further embodiments of the positioning method of the robot of the present disclosure.
As shown in fig. 3, the method may further include: step 310, judging whether the robot will enter another sub-area; step 320, loading the map information of the next sub-area; and step 330, positioning the robot.
In step 310, it is determined, according to the planned path of the robot, whether the robot will pass through the current area and enter another sub-area.
In step 320, if it is determined that the robot will enter the next sub-area, the map information of the next sub-area is loaded.
In step 330, in response to the robot entering the next sub-area, the robot is positioned according to the map information of the next sub-area.
In some embodiments, it is determined, according to the planned path of the robot 21 in fig. 2, whether the robot 21 needs to leave area #05 and enter another sub-area to complete its job. If so, the robot 21 can determine from the planned path which sub-area it will enter and preload the map information of that sub-area.
Thus, the positioning efficiency of the robot can be improved by loading the map information in advance.
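A minimal sketch of this preloading logic is given below, under the assumption (made only for the example) that sub-areas are axis-aligned rectangles and the planned path is a list of (x, y) way-points.

```python
# Illustrative preloading: if the planned path leaves the current sub-area,
# preload the next sub-area's map before the boundary is crossed.
from typing import Optional

# Assumed layout: sub-area ID -> (x_min, y_min, x_max, y_max) in metres.
SUBAREA_BOUNDS = {
    "area_05": (0.0, 0.0, 20.0, 10.0),
    "area_06": (20.0, 0.0, 40.0, 10.0),
}

def subarea_of(point) -> Optional[str]:
    """Sub-area containing a way-point, or None if outside all sub-areas."""
    x, y = point
    for sid, (x0, y0, x1, y1) in SUBAREA_BOUNDS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return sid
    return None

def next_subarea_on_path(planned_path, current_subarea) -> Optional[str]:
    """First sub-area other than the current one that the path enters."""
    for waypoint in planned_path:
        sid = subarea_of(waypoint)
        if sid is not None and sid != current_subarea:
            return sid
    return None

# The path below crosses from area_05 into area_06, so area_06's map
# would be preloaded (e.g. with the load_submap sketch above).
path = [(18.0, 5.0), (21.0, 5.0), (30.0, 5.0)]
assert next_subarea_on_path(path, "area_05") == "area_06"
```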
In the above embodiments, a large work scene is divided into smaller sub-areas, each covered by its own specific broadcast information. The current sub-area is determined from the broadcast information received by the robot, and the map information of that sub-area is loaded.
This positioning method has low deployment cost and determines the sub-area quickly, and it reduces the similarity between map feature information. Because the amount of map information within a sub-area is small, the computational load of map retrieval and matching is also reduced.
Fig. 4 illustrates a block diagram of some embodiments of a positioning device of a robot of the present disclosure.
As shown in fig. 4, the positioning apparatus 4 includes a receiving unit 41, a determining unit 42, and a positioning unit 43.
The receiving unit 41 receives the broadcast information of the current area where the robot is located. The range of motion of the robot is divided into a plurality of sub-areas.
In some embodiments, the broadcast information is identification information of the bound sub-area, or the broadcast information bound to different sub-areas is different audio information. For example, the different audio information may be audio at different frequencies, or different audio tracks within the same audio.
The determining unit 42 determines, according to the received broadcast information, the current area as the sub-area bound to that broadcast information.
The positioning unit 43 loads the map information of the bound sub-area and positions the robot.
In some embodiments, the positioning unit 43 acquires the contour information and visual feature information of the bound sub-area according to its map information and uses them for positioning.
In some embodiments, the positioning apparatus 4 further comprises a judging unit 44 configured to judge, according to the planned path of the robot, whether the robot will enter another sub-area through the current area. The positioning unit 43 loads the map information of the next sub-area if it is judged that the robot will enter the next sub-area, and, in response to the robot entering the next sub-area, positions the robot according to the map information of the next sub-area.
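As a non-limiting illustration, the sketch below shows one way the receiving, determining, and positioning units of fig. 4, together with the judging unit 44, could be composed; the dependencies are injected as callables so the example stays self-contained, and all names are assumptions rather than the disclosed implementation.

```python
# Illustrative composition of the units in fig. 4 (receiving, determining,
# positioning) plus the judging unit 44. Dependencies are injected as
# callables; names and signatures are assumptions for this sketch.
class RobotPositioningDevice:
    def __init__(self, binding_table, load_submap, matcher, path_checker):
        self.binding_table = binding_table  # broadcast ID -> bound sub-area
        self.load_submap = load_submap      # loads one sub-area's map
        self.matcher = matcher              # matches sensor data to a map
        self.path_checker = path_checker    # (path, current) -> next sub-area
        self.current_subarea = None

    def on_broadcast(self, broadcast_id):
        """Receiving unit 41 + determining unit 42: bind broadcast to sub-area."""
        subarea = self.binding_table.get(broadcast_id)
        if subarea is not None:
            self.current_subarea = subarea

    def on_planned_path(self, planned_path):
        """Judging unit 44: preload the next sub-area's map if the path leaves."""
        nxt = self.path_checker(planned_path, self.current_subarea)
        if nxt is not None:
            self.load_submap(nxt)

    def localize(self, scan, image):
        """Positioning unit 43: position within the bound sub-area's map only."""
        submap = self.load_submap(self.current_subarea)
        return self.matcher(scan, image, submap)
```

Because the map loader is injected, the preloading behaviour of fig. 3 reduces to a single call inside the judging step.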
Fig. 5 shows a block diagram of further embodiments of a positioning device of a robot of the present disclosure.
As shown in fig. 5, the positioning device 5 of the robot of this embodiment includes: a memory 51 and a processor 52 coupled to the memory 51, the processor 52 being configured to execute a positioning method of a robot in any one of the embodiments of the present disclosure based on instructions stored in the memory 51.
The memory 51 may include, for example, a system memory, a fixed nonvolatile storage medium, and the like. The system memory stores, for example, an operating system, an application program, a Boot Loader, a database, and other programs.
Fig. 6 shows a block diagram of still further embodiments of a positioning device of a robot of the present disclosure.
As shown in fig. 6, the positioning device 6 of the robot of this embodiment includes: a memory 610 and a processor 620 coupled to the memory 610, the processor 620 being configured to perform the positioning method of the robot in any of the above embodiments based on instructions stored in the memory 610.
The memory 610 may include, for example, system memory, fixed non-volatile storage media, and the like. The system memory stores, for example, an operating system, an application program, a Boot Loader, and other programs.
The positioning device 6 of the robot may further include an input/output interface 630, a network interface 640, a storage interface 650, and the like. These interfaces 630, 640, 650, the memory 610, and the processor 620 may be connected, for example, through a bus 660. The input/output interface 630 provides a connection interface for input/output devices such as a display, a mouse, a keyboard, a touch screen, a microphone, and speakers. The network interface 640 provides a connection interface for various networking devices. The storage interface 650 provides a connection interface for external storage devices such as an SD card or a USB flash drive.
Fig. 7 illustrates a block diagram of some embodiments of a robot of the present disclosure.
As shown in fig. 7, the robot 7 includes: a positioning device 71 for performing the positioning method of the robot in any of the above embodiments.
As will be appreciated by one skilled in the art, embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable non-transitory storage media having computer-usable program code embodied therein, including but not limited to disk storage, CD-ROM, optical storage, and the like.
So far, the positioning method of the robot, the positioning apparatus of the robot, and the non-volatile computer-readable storage medium according to the present disclosure have been described in detail. Some details that are well known in the art have not been described in order to avoid obscuring the concepts of the present disclosure. It will be fully apparent to those skilled in the art from the foregoing description how to practice the presently disclosed embodiments.
The method and system of the present disclosure may be implemented in a number of ways. For example, the methods and systems of the present disclosure may be implemented by software, hardware, firmware, or any combination of software, hardware, and firmware. The above-described order for the steps of the method is for illustration only, and the steps of the method of the present disclosure are not limited to the order specifically described above unless specifically stated otherwise. Further, in some embodiments, the present disclosure may also be embodied as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the methods according to the present disclosure. Thus, the present disclosure also covers a recording medium storing a program for executing the method according to the present disclosure.
Although some specific embodiments of the present disclosure have been described in detail by way of example, it should be understood by those skilled in the art that the foregoing examples are for purposes of illustration only and are not intended to limit the scope of the present disclosure. It will be appreciated by those skilled in the art that modifications may be made to the above embodiments without departing from the scope and spirit of the present disclosure. The scope of the present disclosure is defined by the appended claims.

Claims (10)

1. A method of positioning a robot, comprising:
receiving broadcast information of a current area where a robot is located, wherein the moving range of the robot is divided into a plurality of sub-areas;
determining the current area as a sub-area bound with the broadcast information according to the received broadcast information;
and loading the map information of the bound sub-area, and positioning the robot.
2. The positioning method according to claim 1,
wherein the broadcast information is identification information of the bound sub-area, or the broadcast information bound to different sub-areas is different audio information.
3. The positioning method according to claim 2,
wherein the different audio information is audio of different frequencies, or the different audio information is different audio tracks within the same audio.
4. The positioning method according to claim 1, further comprising:
judging, according to the planned path of the robot, whether the robot will enter another sub-area through the current area;
loading the map information of the next sub-area if it is judged that the robot will enter the next sub-area;
and, in response to the robot entering the next sub-area, positioning the robot according to the map information of the next sub-area.
5. The positioning method according to any one of claims 1-4, wherein said positioning the robot comprises:
acquiring the contour information and the visual feature information of the bound sub-area according to the map information of the bound sub-area, and using them for positioning.
6. A positioning device of a robot, comprising:
a receiving unit configured to receive the broadcast information of the current area where the robot is located, wherein the moving range of the robot is divided into a plurality of sub-areas;
a determining unit configured to determine, according to the received broadcast information, the current area as the sub-area bound to the broadcast information;
and a positioning unit configured to load the map information of the bound sub-area and position the robot.
7. The positioning device of claim 6, further comprising:
a judging unit configured to judge, according to the planned path of the robot, whether the robot will enter another sub-area through the current area;
wherein,
the positioning unit loads the map information of the next sub-area if it is judged that the robot will enter the next sub-area, and, in response to the robot entering the next sub-area, positions the robot according to the map information of the next sub-area.
8. A positioning device of a robot, comprising:
a memory; and
a processor coupled to the memory, the processor configured to perform the method of positioning of a robot of any of claims 1-5 based on instructions stored in the memory.
9. A robot, comprising:
positioning device for performing the positioning method of a robot according to any of claims 1-5.
10. A non-transitory computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the positioning method of the robot of any one of claims 1-5.
Application CN202011150002.0A (priority date 2020-10-23, filing date 2020-10-23): Robot positioning method and device and robot, published as CN112284395A, status pending.

Priority Applications (2)

CN202011150002.0A, published as CN112284395A (priority date 2020-10-23, filing date 2020-10-23): Robot positioning method and device and robot
PCT/CN2021/121608, published as WO2022083435A1 (priority date 2020-10-23, filing date 2021-09-29): Robot localization method and apparatus, and robot

Applications Claiming Priority (1)

CN202011150002.0A (priority date 2020-10-23, filing date 2020-10-23): Robot positioning method and device and robot

Publications (1)

CN112284395A, published 2021-01-29

Family

ID=74424833

Family Applications (1)

CN202011150002.0A, published as CN112284395A, pending (priority date 2020-10-23, filing date 2020-10-23): Robot positioning method and device and robot

Country Status (2)

CN: CN112284395A
WO: WO2022083435A1

Cited By (1)

* Cited by examiner, † Cited by third party

WO2022083435A1 * (priority date 2020-10-23, published 2022-04-28), Beijing Jingdong Qianshi Technology Co Ltd: Robot localization method and apparatus, and robot

Citations (8)

* Cited by examiner, † Cited by third party

US20080284649A1 * (priority date 2003-12-22, published 2008-11-20), ABB Research Ltd.: Method for Positioning and a Positioning System
CN104729502A * (priority date 2015-03-30, published 2015-06-24), 北京云迹科技有限公司: Robot mapping and positioning method and system based on Bluetooth base station and laser sensor
CN104848848A * (priority date 2015-03-30, published 2015-08-19), 北京云迹科技有限公司: Robot mapping and positioning method and system based on wireless base station and laser sensor
CN108981701A * (priority date 2018-06-14, published 2018-12-11), 广东易凌科技股份有限公司: Indoor positioning and navigation method based on laser SLAM
US20190178654A1 * (priority date 2016-08-04, published 2019-06-13), Reification Inc.: Methods for simultaneous localization and mapping (SLAM) and related apparatus and systems
CN110319832A * (priority date 2019-07-05, published 2019-10-11), 北京海益同展信息科技有限公司: Robot localization method, apparatus, electronic equipment and medium
CN110836668A * (priority date 2018-08-16, published 2020-02-25), 科沃斯商用机器人有限公司: Positioning navigation method, device, robot and storage medium
CN111426325A * (priority date 2020-06-12, published 2020-07-17), 北京云迹科技有限公司: Positioning method and device, robot, storage medium and positioning system

Family Cites Families (1)

* Cited by examiner, † Cited by third party

CN112284395A (priority date 2020-10-23, published 2021-01-29), Beijing Jingdong Qianshi Technology Co Ltd: Robot positioning method and device and robot


Also Published As

WO2022083435A1, published 2022-04-28


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 2021-01-29)