CN108549088A - Robot-based localization method, device, system, and storage medium - Google Patents
Robot-based localization method, device, system, and storage medium
- Publication number
- CN108549088A CN108549088A CN201810433354.3A CN201810433354A CN108549088A CN 108549088 A CN108549088 A CN 108549088A CN 201810433354 A CN201810433354 A CN 201810433354A CN 108549088 A CN108549088 A CN 108549088A
- Authority
- CN
- China
- Prior art keywords
- robot
- target object
- laser radar
- depth image
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/12—Target-seeking control
Abstract
Embodiments of the present application provide a robot-based localization method, device, system, and storage medium. A robot is provided that includes a proximity communication module together with a depth camera and/or a laser radar. When the robot localizes a target object, the proximity communication module can be combined with the depth camera and/or the laser radar, making full use of the respective advantages of the proximity communication module and of the depth camera and/or laser radar, and fusing multiple sources of information to position the target object. This helps reduce localization error and improve localization accuracy.
Description
Technical field
This application relates to the field of artificial intelligence, and in particular to a robot-based localization method, device, system, and storage medium.
Background technology
With the development of artificial intelligence technology, mobile robots have advanced rapidly, and some service robots have already entered daily life. Based on robot following technology, a service robot can perform following tasks such as accompanying a walk or welcoming guests. Robot following technology means that the robot keeps a certain distance from a target object while matching its speed, assisting the target object in completing related social and productive activities.
In robot following technology, the primary problem to solve is the relative localization between the robot and the target object. In existing robot following technology, the target object is generally positioned using triangulation algorithms based on WiFi or Bluetooth signals. However, the localization error of such algorithms is large, and the localization accuracy is relatively low.
Summary of the invention
Various aspects of the present application provide a robot-based localization method, device, system, and storage medium, so as to improve the accuracy with which a robot localizes a target object.
An embodiment of the present application provides a robot-based localization method, including:
communicating with a target object using a proximity communication module of a robot, to measure first position information of the target object relative to the robot;
collecting depth images and laser radar data of at least one object around the robot, using a depth camera and a laser radar of the robot;
determining the depth image and laser radar data of the target object from the depth images and laser radar data of the at least one object, according to distance information in the first position information; and
positioning the target object according to the depth image and laser radar data of the target object.
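The determining and positioning steps described above can be sketched in a few lines: given the distance measured by the proximity communication module, keep only the sensed object whose range agrees with it, and take that object's position. This is a minimal illustrative sketch, not the application's implementation; the function name, the candidate data layout, and the 0.5 m tolerance are assumptions.

```python
import math

def locate_target(nfc_distance_m, candidates, tolerance_m=0.5):
    """Pick the sensed object whose range best matches the distance
    measured by the proximity communication module, and return it.

    candidates: objects detected by the depth camera / laser radar,
    each a dict with a 'position' (x, y) in the robot frame.
    (Layout and tolerance are illustrative assumptions.)
    """
    best, best_err = None, tolerance_m
    for obj in candidates:
        x, y = obj["position"]
        err = abs(math.hypot(x, y) - nfc_distance_m)
        if err < best_err:
            best, best_err = obj, err
    return best  # None if no candidate falls within the tolerance
```

A candidate at (2.0, 0.1) m, for example, would be selected when the module reports roughly 2.05 m, while unrelated objects several meters away are rejected.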
An embodiment of the present application also provides a robot-based localization device, including:
a measurement module, configured to communicate with a target object using a proximity communication module of a robot, to measure first position information of the target object relative to the robot;
a collection module, configured to collect depth images and laser radar data of at least one object around the robot, using a depth camera and a laser radar of the robot;
a determination module, configured to determine the depth image and laser radar data of the target object from the depth images and laser radar data of the at least one object, according to distance information in the first position information; and
a localization module, configured to position the target object according to the depth image and laser radar data of the target object.
An embodiment of the present application also provides a robot, including: a mechanical body; the mechanical body is provided with a proximity communication module, one or more processors, and one or more memories storing computer instructions; the mechanical body is further provided with a depth camera and a laser radar;
the one or more processors are configured to execute the computer instructions to:
transmit and receive wireless signals between the robot and a target object using the proximity communication module, to measure first position information of the target object relative to the robot;
collect depth images and laser radar data of at least one object around the robot, using the depth camera and the laser radar;
determine the depth image and laser radar data of the target object from the depth images and laser radar data of the at least one object, according to distance information in the first position information; and
position the target object according to the depth image and laser radar data of the target object.
An embodiment of the present application also provides a computer-readable storage medium storing computer instructions that, when executed by one or more processors, cause the one or more processors to perform actions including:
communicating with a target object using a proximity communication module of a robot, to measure first position information of the target object relative to the robot;
collecting depth images and laser radar data of at least one object around the robot, using a depth camera and a laser radar of the robot;
determining the depth image and laser radar data of the target object from the depth images and laser radar data of the at least one object, according to distance information in the first position information; and
positioning the target object according to the depth image and laser radar data of the target object.
An embodiment of the present application also provides a robot-based localization system, including: a robot and a wireless transceiver device placed on a target object; the robot includes a proximity communication module adapted to the wireless transceiver device, and further includes a depth camera and a laser radar;
the robot is configured to: transmit and receive wireless signals between the robot and the target object using the proximity communication module and the wireless transceiver device, to measure first position information of the target object relative to the robot; collect depth images and laser radar data of at least one object around the robot, using the depth camera and the laser radar; determine the depth image and laser radar data of the target object from the depth images and laser radar data of the at least one object, according to distance information in the first position information; and position the target object according to the depth image and laser radar data of the target object.
An embodiment of the present application also provides a robot-based localization method, including:
communicating with a target object using a proximity communication module of a robot, to measure first position information of the target object relative to the robot;
collecting depth image data of at least one object around the robot, using a depth camera of the robot;
determining the depth image data of the target object from the depth image data of the at least one object, according to distance information in the first position information; and
positioning the target object according to the depth image data of the target object.
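In the depth-camera-only variant, determining an object's range from its depth image data typically involves back-projecting a depth pixel through a pinhole camera model. The sketch below assumes standard pinhole intrinsics (fx, fy, cx, cy); the model and the sample intrinsic values are illustrative assumptions, not taken from the application.

```python
import math

def pixel_to_range(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a depth-image pixel (u, v) with depth depth_m
    to a 3D point in the camera frame (pinhole model), and return
    the point together with its Euclidean range from the camera.
    Intrinsics fx, fy, cx, cy are camera-specific placeholders."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    z = depth_m
    return (x, y, z), math.sqrt(x * x + y * y + z * z)
```

The resulting range per detected object is what gets compared against the distance information in the first position information to pick out the target.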
An embodiment of the present application also provides a robot, including: a mechanical body; the mechanical body is provided with a proximity communication module, one or more processors, and one or more memories storing computer instructions; the mechanical body is further provided with a depth camera;
the one or more processors are configured to execute the computer instructions to:
communicate with a target object using the proximity communication module, to measure first position information of the target object relative to the robot;
collect depth image data of at least one object around the robot, using the depth camera;
determine the depth image data of the target object from the depth image data of the at least one object, according to distance information in the first position information; and
position the target object according to the depth image data of the target object.
An embodiment of the present application also provides a computer-readable storage medium storing computer instructions that, when executed by one or more processors, cause the one or more processors to perform actions including:
communicating with a target object using a proximity communication module of a robot, to measure first position information of the target object relative to the robot;
collecting depth image data of at least one object around the robot, using a depth camera of the robot;
determining the depth image data of the target object from the depth image data of the at least one object, according to distance information in the first position information; and
positioning the target object according to the depth image data of the target object.
An embodiment of the present application also provides a robot-based localization method, including:
communicating with a target object using a proximity communication module of a robot, to measure first position information of the target object relative to the robot;
collecting laser radar data of at least one object around the robot, using a laser radar of the robot;
determining the laser radar data of the target object from the laser radar data of the at least one object, according to distance information in the first position information; and
positioning the target object according to the laser radar data of the target object.
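In the laser-radar-only variant, the determining step amounts to gating the scan by the distance in the first position information: only returns whose range agrees with the measured distance remain as candidate directions for the target. The sketch below assumes a 360-entry scan with one range reading per degree; that layout and the 0.3 m tolerance are assumptions for illustration.

```python
def gate_scan(ranges, nfc_distance_m, tolerance_m=0.3):
    """Keep only the laser radar returns whose range lies within
    nfc_distance_m +/- tolerance_m.

    ranges: 360-entry list, one range (meters) per degree of the
    scan (an assumed layout). Returns (bearing_deg, range) pairs,
    i.e. candidate directions for the target object."""
    hits = []
    for deg, r in enumerate(ranges):
        if abs(r - nfc_distance_m) <= tolerance_m:
            hits.append((deg, r))
    return hits
```

In a scan where everything else is far away, only the couple of beams striking an object near the measured distance survive the gate.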
An embodiment of the present application also provides a robot, including: a mechanical body; the mechanical body is provided with a proximity communication module, one or more processors, and one or more memories storing computer instructions; the mechanical body is further provided with a laser radar;
the one or more processors are configured to execute the computer instructions to:
communicate with a target object using the proximity communication module, to measure first position information of the target object relative to the robot;
collect laser radar data of at least one object around the robot, using the laser radar;
determine the laser radar data of the target object from the laser radar data of the at least one object, according to distance information in the first position information; and
position the target object according to the laser radar data of the target object.
An embodiment of the present application also provides a computer-readable storage medium storing computer instructions that, when executed by one or more processors, cause the one or more processors to perform actions including:
communicating with a target object using a proximity communication module of a robot, to measure first position information of the target object relative to the robot;
collecting laser radar data of at least one object around the robot, using a laser radar of the robot;
determining the laser radar data of the target object from the laser radar data of the at least one object, according to distance information in the first position information; and
positioning the target object according to the laser radar data of the target object.
In the embodiments of the present application, a robot including a proximity communication module is provided, and the robot further includes a depth camera and/or a laser radar. When the robot localizes a target object, the proximity communication module of the robot can be combined with the depth camera and/or laser radar of the robot, making full use of their respective advantages and fusing multiple sources of information to position the target object. This helps reduce localization error and improve localization accuracy.
Description of the drawings
The accompanying drawings described here are provided for further understanding of the present application and constitute a part of this application. The illustrative embodiments of the present application and their descriptions are used to explain the application and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a structural schematic diagram of a robot-based positioning system provided by an exemplary embodiment of the present application;
Fig. 2a is a hardware block diagram of a robot provided by another exemplary embodiment of the present application;
Fig. 2b is an overall structural diagram of a humanoid robot provided by another exemplary embodiment of the present application;
Fig. 2c is an overall structural diagram of a non-humanoid robot provided by another exemplary embodiment of the present application;
Fig. 3a is a flow diagram of a robot-based localization method provided by another exemplary embodiment of the present application;
Fig. 3b is a flow diagram of another robot-based localization method provided by another exemplary embodiment of the present application;
Fig. 3c is a flow diagram of another robot-based localization method provided by another exemplary embodiment of the present application;
Fig. 3d is a flow diagram of another robot-based localization method provided by another exemplary embodiment of the present application;
Fig. 4 is a structural schematic diagram of a robot-based localization device provided by another exemplary embodiment of the present application.
Detailed description of embodiments
To make the purposes, technical solutions, and advantages of the present application clearer, the technical solutions of the present application are described clearly and completely below in conjunction with specific embodiments and the corresponding drawings. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. Based on the embodiments in the present application, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the protection scope of this application.
To address the problems in the prior art of large localization error and low accuracy when a robot localizes a target object, some exemplary embodiments of the present application provide a robot that includes a proximity communication module together with a depth camera and/or a laser radar. When the robot localizes a target object, the proximity communication module of the robot can be combined with the depth camera and/or laser radar of the robot, making full use of their respective advantages and fusing multiple sources of information to position the target object. This helps reduce localization error and improve localization accuracy.
The technical solutions provided by the embodiments of the present application are described in detail below in conjunction with the drawings.
Fig. 1 is a structural schematic diagram of a robot-based positioning system provided by an exemplary embodiment of the present application. As shown in Fig. 1, the positioning system 10 includes: a robot 100 and a wireless transceiver device 201 placed on a target object 200.
In the positioning system 10, the target object 200 is a person or object that needs to be localized by the robot 100. Optionally, the target object 200 may be a person or object capable of moving freely. The "object" here may be an animal, a vehicle, a ship, a children's toy, and so on. In addition, depending on the form of the target object 200, the form of the wireless transceiver device 201 and the way it is placed on the target object 200 will also vary. For example, if the target object 200 is a person, the wireless transceiver device 201 may be implemented as a wearable device worn on the wrist, waist, or neck, or placed in a pocket of the person's clothing; alternatively, it may be implemented as a handheld device held by the person. As another example, if the target object 200 is a vehicle, the wireless transceiver device 201 may be implemented as an on-board device installed at some position on the vehicle; alternatively, it may be implemented as a pendant placed in the vehicle cab, and so on.
In the positioning system 10, the robot 100 includes a proximity communication module. The proximity communication module is adapted to the wireless transceiver device 201 and can communicate with it.
In the positioning system 10, in addition to the proximity communication module, the robot 100 further includes a depth camera and/or a laser radar. The depth camera can capture color images and/or depth images of the scene around the robot within a certain angular range, so as to detect the various objects present around the robot. The laser radar can detect obstacle information in the scene around the robot within a 360° range.
In the positioning system 10, the robot 100 can operate in various application scenarios to complete corresponding tasks. Of course, the tasks the robot 100 needs to complete vary with the application scenario. In some application scenarios, the robot 100 needs to localize a specific object and move with it. For example, in shopping scenarios such as supermarkets and malls, a shopping-cart robot needs to localize a customer and follow the customer as they move, so as to hold the goods the customer selects. As another example, in the warehouse picking scenarios of some companies, a picking robot needs to localize a picker and follow the picker to a shelf picking area, and then begin sorting goods according to orders. In this embodiment, the specific object the robot needs to localize is denoted as the target object 200; the customer and the picker in the above examples are target objects 200 that the robot 100 needs to localize.
In the positioning system 10, when the robot 100 localizes the target object 200, it can combine its own proximity communication module with its own depth camera and/or laser radar, making full use of their respective advantages and fusing multiple sources of information to position the target object 200. The process by which the robot 100 localizes the target object 200 is as follows:
On the one hand, the robot 100 uses its proximity communication module to communicate with the target object 200 (mainly, with the wireless transceiver device 201 placed on the target object 200), to measure the first position information of the target object 200 relative to the robot 100; this operation may be referred to as the positioning operation based on the proximity communication module. On the other hand, the robot 100 uses its depth camera and/or laser radar to collect depth images and/or laser radar data of at least one object around the robot 100. Then, according to the distance information in the first position information, the depth image and/or laser radar data of the target object 200 are determined from the depth images and/or laser radar data of the at least one object, and the target object 200 is positioned according to its depth image and/or laser radar data. The operations other than the positioning operation based on the proximity communication module may be referred to simply as the positioning operation based on the depth camera and/or laser radar.
The robot 100 sends wireless signals to the wireless transceiver device 201 through its proximity communication module, and/or receives wireless signals sent by the wireless transceiver device 201, and can then calculate the position information of the target object 200 relative to the robot 100 from the information of the transmitted/received wireless signals. For ease of description, the position information of the target object 200 relative to the robot 100 obtained based on the proximity communication module is called the first position information. The first position information mainly includes the distance information of the target object 200 relative to the robot 100. Optionally, the first position information may also include the angle information of the target object 200 relative to the robot 100.
This embodiment does not limit the execution order between the positioning operation based on the proximity communication module and the positioning operation based on the depth camera and/or laser radar; the order depends on the specific execution logic. This will be illustrated in subsequent exemplary embodiments.
In this embodiment, the proximity communication module of the robot is combined with the depth camera and/or laser radar of the robot. The proximity communication module has the advantage of higher accuracy in ranging, while the measurement data of the depth camera and/or laser radar have the advantages of being relatively comprehensive and of higher precision. Therefore, using the distance of the target object relative to the robot measured by the proximity communication module, the depth image and/or laser radar data of the target object are determined from the depth images and/or laser radar data of the at least one object collected by the depth camera and/or laser radar, which helps accurately identify the data belonging to the target object. Then, exploiting the higher precision of the depth camera and/or laser radar measurements, the target object is positioned based on its depth image and/or laser radar data. In this way, localization error can be reduced, and localization accuracy and stability can be improved.
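When both the depth camera and the laser radar yield a range for the target, one standard way to fuse the two measurements is inverse-variance weighting, which favors the more precise sensor. This is an illustrative fusion choice consistent with the paragraph above, not a method stated in the application; the sensor noise figures are assumed.

```python
def fuse_ranges(depth_range_m, lidar_range_m,
                depth_sigma_m=0.05, lidar_sigma_m=0.02):
    """Inverse-variance weighted fusion of two range estimates for
    the same target. The sigmas are illustrative per-sensor noise
    standard deviations, not values given in the patent."""
    wd = 1.0 / depth_sigma_m ** 2   # weight of depth-camera estimate
    wl = 1.0 / lidar_sigma_m ** 2   # weight of laser radar estimate
    return (wd * depth_range_m + wl * lidar_range_m) / (wd + wl)
```

With these assumed sigmas the laser radar dominates: if the depth camera reports 2.0 m and the laser radar 3.0 m, the fused range lands near 2.86 m, much closer to the laser radar reading.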
In addition to the robot-based positioning system 10 described above, some exemplary embodiments of the present application also provide a robot 100. As shown in Fig. 2a, the robot 100 includes: a mechanical body 101; the mechanical body 101 is provided with one or more processors 102, one or more memories 103 storing computer instructions, and a proximity communication module 104. In addition, a depth camera 105 and/or a laser radar 106 are provided on the mechanical body 101. For example, some robots 100 may be equipped with a depth camera 105 on the mechanical body 101; other robots 100 may be equipped with a laser radar 106; and still other robots 100 may be equipped with both a depth camera 105 and a laser radar 106.
It is worth noting that the one or more processors 102, the one or more memories 103, the proximity communication module 104, and the depth camera 105 and/or laser radar 106 may be arranged inside the mechanical body 101 or on its surface.
The mechanical body 101 is the actuator through which the robot 100 carries out its tasks, and it can perform the operations specified by the processor 102 in a given environment. The mechanical body 101 reflects, to a certain extent, the appearance of the robot 100. This embodiment does not limit the appearance of the robot 100. For example, the robot 100 may be the humanoid robot shown in Fig. 2b, in which case the mechanical body 101 may include, but is not limited to, mechanical structures such as the head, hands, wrists, arms, waist, and base of the robot. The robot 100 may also be the relatively simpler non-humanoid robot shown in Fig. 2c, in which case the mechanical body 101 mainly refers to the fuselage of the robot 100.
The one or more memories 103 are mainly used to store computer instructions, which can be executed by the one or more processors 102 to cause the one or more processors 102 to control the mechanical body 101 of the robot 100 to complete the corresponding tasks. In addition to computer instructions, the one or more memories 103 may also be configured to store various other data to support operations on the robot. Examples of such data include instructions for any application program or method operating on the robot, map data of the environment/scene where the robot 100 is located, and pictures, videos, and voice data for human-computer interaction.
The one or more memories 103 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The one or more processors 102 can be regarded as the control system of the robot 100 and can execute the computer instructions stored in the one or more memories 103 to control the robot 100 to complete the corresponding tasks. It is worth noting that when the robot 100 works in different application scenarios, the tasks to be completed by the robot 100 will differ; accordingly, the computer instructions stored in the one or more memories 103 will also differ, and by executing different computer instructions the one or more processors 102 can control the robot 100 to complete different tasks.
In some application scenarios, the robot 100 needs to localize a specific object and move with it. For example, in shopping scenarios such as supermarkets and malls, a shopping-cart robot needs to localize a customer and follow the customer as they move, so as to hold the goods the customer selects. As another example, in the warehouse picking scenarios of some companies, a picking robot needs to localize a picker and follow the picker to a shelf picking area, and then begin sorting goods according to orders. In this embodiment, the specific object the robot needs to localize is called the target object; the customer and the picker in the above examples are target objects that the robot 100 needs to localize. In a localize-and-follow scenario, the computer instructions stored in the one or more memories 103, when executed by the one or more processors 102, can cause the processors 102 to localize the target object and control the mechanical body 101 of the robot 100 to complete the following action.
When the one or more processors 102 execute the computer instructions stored in the one or more memories 103 to locate the target object, they can, on the one hand, use the short-range communication module of the robot 100 to communicate with the target object so as to measure first position information of the target object relative to the robot 100; on the other hand, they can use the depth camera and/or laser radar of the robot 100 to collect the depth images and/or laser radar data of at least one object around the robot 100. Then, according to the distance information in the first position information, the depth image and/or laser radar data of the target object is determined from the depth images and/or laser radar data of the at least one object, and the target object is located according to its depth image and/or laser radar data.
For example, the one or more processors 102 can send a wireless signal to a wireless transceiver device 201 through the short-range communication module of the robot 100, receive the wireless signal returned by the wireless transceiver device 201 through the same module, and then calculate the position of the target object relative to the robot 100 from the information about the transmitted/received wireless signals. For ease of description, this position information is called the first position information. The first position information mainly includes distance information of the target object relative to the robot 100. Optionally, the first position information may also include angle information of the target object relative to the robot 100.
In this embodiment, the execution order between the operation of obtaining the first position information with the short-range communication module and the operation of collecting the depth images and/or laser radar data of the at least one object with the depth camera and/or laser radar is likewise not limited; it depends on the specific implementation logic. Examples will be given in subsequent exemplary embodiments.
It is worth noting that the robot 100 may also include, but is not limited to: a display screen, a power supply component, an audio component, and the like. The display screen may include a liquid crystal display (LCD) or a touch panel (TP). If the display screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The power supply component mainly provides electric power for the various components of the robot 100. The audio component can be configured to output and/or input audio signals and may include a microphone (MIC) and a loudspeaker.
In this embodiment, the short-range communication module of the robot is combined with the depth camera and/or laser radar of the robot. The short-range communication module has the advantage of higher precision in ranging, while the depth camera and/or laser radar has the advantage of relatively comprehensive and higher-precision measurement data. Therefore, the distance of the target object relative to the robot measured by the short-range communication module is used to determine the depth image and/or laser radar data of the target object from the depth images and/or laser radar data of the at least one object collected by the depth camera and/or laser radar, which helps to accurately identify the target object. Then, exploiting the higher measurement precision of the depth camera and/or laser radar, the target object is located based on its depth image and/or laser radar data. In this way, the positioning error can be reduced and the positioning accuracy improved.
The above embodiments include the operation of communicating with the target object through the short-range communication module of the robot 100 to measure the first position information of the target object relative to the robot 100. The embodiments of this application do not limit the short-range communication technique used by the short-range communication module; for example, it can be implemented based on technologies such as radio-frequency identification (RFID), WiFi, Bluetooth, infrared, or ultra-wideband (UWB).
Depending on which short-range communication technique the module uses, the implementation forms of the short-range communication module and the wireless transceiver device, as well as the positioning operation based on the short-range communication module, may differ. The following embodiments of this application give examples of several short-range communication modules:
Example A: the short-range communication module is implemented with UWB technology and is referred to as a UWB module; correspondingly, the wireless transceiver device on the target object can be implemented as a UWB tag. For example, one UWB module is arranged on the robot 100. Based on this, one process of locating the target object with the UWB module includes:
The UWB module sends a UWB signal to the UWB tag and records the sending time as T1; the UWB tag receives the UWB signal sent by the UWB module and records the receiving time as T2; the UWB tag returns a UWB signal to the UWB module and records the return time as T3; the UWB module receives the UWB signal returned by the UWB tag and records the receiving time as T4. From these times, the one-way time of flight of the UWB signal in the air can be obtained as T_tof = ((T4 - T1) - (T3 - T2)) / 2; in turn, combining this with the propagation speed of the UWB signal in the air, i.e., the speed of light, the distance between the UWB module and the UWB tag can be calculated, that is, the distance of the target object relative to the robot 100. Optionally, the angle of the target object relative to the robot 100 can also be calculated from this distance and the installation position of the UWB module on the robot 100.
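The two-way ranging described above can be sketched as follows; the timestamp values are hypothetical illustrations rather than measurements from any particular UWB chip:

```python
# Two-way ranging with a single UWB module and tag, as described above.
# Timestamps T1..T4 are in seconds; the values below are hypothetical.
C = 299_792_458.0  # propagation speed of the UWB signal, i.e. the speed of light, m/s

def uwb_distance(t1, t2, t3, t4):
    """One-way time of flight: T_tof = ((T4 - T1) - (T3 - T2)) / 2."""
    t_tof = ((t4 - t1) - (t3 - t2)) / 2.0
    return C * t_tof  # distance in meters

# Example: the tag holds the signal for ~1 ms before replying; the true
# one-way flight time is ~16.7 ns, i.e. a tag about 5 m away.
d = uwb_distance(0.0, 1.67e-8, 1.0000167e-3, 1.0000334e-3)
print(round(d, 1))  # 5.0
```

Note that the ((T4 - T1) - (T3 - T2)) form cancels the tag's reply delay, so the module and tag clocks need not be synchronized.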
It is worth noting that in other embodiments, multiple UWB modules can also be arranged on the robot 100; then, combining the distances between the multiple UWB modules and the UWB tag with the geometric position relationship among the multiple UWB modules, the distance and angle of the target object relative to the robot 100 can be calculated more accurately.
In example A, the target object is located with the UWB module. UWB signals have the advantages of strong penetration, low power consumption, good resistance to multipath effects, and high security, which helps to improve positioning accuracy.
Example B: the short-range communication module is implemented with Bluetooth technology and is referred to as a Bluetooth module; correspondingly, the wireless transceiver device on the target object is also a Bluetooth module. Suppose two Bluetooth modules, denoted the first Bluetooth module and the second Bluetooth module, are arranged on the robot 100, and the Bluetooth module placed on the target object is denoted the third Bluetooth module. Based on this, one process of locating the target object with the Bluetooth modules includes:
First, the first Bluetooth module establishes a connection with the third Bluetooth module and performs Bluetooth signal transmission; according to the signal strength between the first Bluetooth module and the third Bluetooth module, the distance of the third Bluetooth module relative to the first Bluetooth module is determined. Then, the second Bluetooth module establishes a connection with the third Bluetooth module and performs Bluetooth signal transmission; according to the signal strength between the second Bluetooth module and the third Bluetooth module, the distance of the third Bluetooth module relative to the second Bluetooth module is determined. In turn, combining the installation position relationship of the first and second Bluetooth modules on the robot 100 with the distances between the third Bluetooth module and the first and second Bluetooth modules respectively, the coordinate position of the third Bluetooth module can be solved, from which position information such as the distance and angle of the third Bluetooth module relative to the robot 100 can be calculated.
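Solving the tag's coordinates from the two distances is a two-circle intersection; a minimal sketch, assuming the two on-robot modules sit at known coordinates on the robot's X axis and that signal strength has already been converted to distance (the RSSI-to-distance conversion itself requires a path-loss model, not shown here):

```python
import math

# Solve the coordinates of the tag-side (third) Bluetooth module from
# its distances d1, d2 to two on-robot modules at known positions
# (x1, 0) and (x2, 0). This is the intersection of two circles; of the
# two mirror solutions we keep the one with y >= 0 (in front of the
# robot). All distances in meters.
def locate_tag(x1, x2, d1, d2):
    # Subtracting (x-x2)^2 + y^2 = d2^2 from (x-x1)^2 + y^2 = d1^2
    # eliminates y and gives a linear equation in x:
    x = (d1**2 - d2**2 + x2**2 - x1**2) / (2 * (x2 - x1))
    y_sq = d1**2 - (x - x1)**2
    y = math.sqrt(max(y_sq, 0.0))  # clamp against measurement noise
    return x, y

# Modules 0.4 m apart; tag actually at (1.0, 2.0):
x, y = locate_tag(-0.2, 0.2, math.hypot(1.2, 2.0), math.hypot(0.8, 2.0))
print(round(x, 2), round(y, 2))  # 1.0 2.0
```

From (x, y), the distance and angle relative to the robot follow directly as `math.hypot(x, y)` and `math.atan2(y, x)`.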
It is worth noting that, depending on the positioning accuracy requirement, one Bluetooth module or more Bluetooth modules may also be arranged on the robot 100.
In example B, the target object is located with Bluetooth modules. Bluetooth technology is widely available and simple, a Bluetooth-based positioning scheme is relatively easy to implement, and the power consumption and cost of Bluetooth signals are relatively low, which helps to reduce the positioning cost.
Example C: the short-range communication module is implemented with WiFi technology and may simply be called a WiFi module; correspondingly, the wireless transceiver device on the target object is also a WiFi module. The positioning process is similar to the above: one or more WiFi modules can be arranged on the robot 100 to perform signal transmission with the WiFi module configured on the target object, and then position information such as the distance and angle of the target object relative to the robot 100 can be calculated based on signal strength or signal transmission time.
In some exemplary embodiments, when locating the target object, the short-range communication module of the robot may first be used to communicate with the target object to measure the first position information of the target object relative to the robot, i.e., to perform initial positioning of the target object. Here, the first position information includes the distance information and the angle information of the target object relative to the robot 100.
A short-range communication module generally has a certain angle confidence interval; simply put, it has a certain tracking range. If the angle measured by the short-range communication module falls within its angle confidence interval, the measured angle has high confidence and can be used to locate the target object; if the measured angle falls outside the angle confidence interval, the measured angle has low confidence, and using it to locate the target object would reduce the positioning accuracy.
Based on the above, after obtaining the first position information, it can be judged whether the angle information in the first position information is credible. If the angle information in the first position information is credible, the target object can be located directly from the distance information and angle information in the first position information; no subsequent operation is needed, which reduces computational complexity and improves positioning efficiency. Here, locating the target object mainly refers to calculating the position coordinates of the target object.
The angle confidence interval of the short-range communication module is generally related to the installation position of the module on the robot. For example, taking a UWB module as an example, suppose the UWB module is installed at the center of the robot's front; then the angle confidence interval of the UWB module is the ±60° range directly in front of the robot.
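The in-interval check can be sketched as follows; the ±60° half-width is the value assumed in the example above, with 0° taken as straight ahead:

```python
# Check whether a measured angle falls inside the module's angle
# confidence interval (the assumed ±60° range in front of the robot).
CONF_HALF_WIDTH_DEG = 60.0

def angle_credible(angle_deg, half_width=CONF_HALF_WIDTH_DEG):
    # Normalize to (-180, 180] so wrapped readings compare correctly.
    a = (angle_deg + 180.0) % 360.0 - 180.0
    return abs(a) <= half_width

print(angle_credible(45.0))   # True
print(angle_credible(-75.0))  # False
```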
If the angle information in the first position information is not credible, then to ensure positioning accuracy, the depth camera and/or laser radar of the robot can be used to perform secondary positioning of the target object. The secondary positioning process includes: using the depth camera and/or laser radar to collect the depth images and/or laser radar data of at least one object around the robot; next, exploiting the higher precision of the distance information in the first position information, determining the depth image and/or laser radar data of the target object from the depth images and/or laser radar data of the at least one object according to that distance information; and locating the target object according to its depth image and/or laser radar data. In this way, based on the distance of the target object relative to the robot obtained by the short-range communication module, the depth image and/or laser radar data of the target object can be determined accurately; then, exploiting the higher measurement precision of the depth camera and/or laser radar, the target object is located based on its depth image and/or laser radar data, which reduces the positioning error and improves the positioning accuracy.
The above process requires judging whether the angle information in the first position information is credible. The judging method can be set adaptively according to the application scenario and the positioning accuracy requirement; several determination examples are given below:
One example: if the angle information in the first position information is not within the angle confidence interval corresponding to the short-range communication module, the angle information in the first position information is determined to be not credible; conversely, if it is within the angle confidence interval corresponding to the short-range communication module, it is determined to be credible.
Another example: if the difference between the angle information in the first position information and the credible angle information from the last positioning exceeds a set angle threshold, the angle information in the first position information is determined to be not credible; conversely, if the difference is less than or equal to the set angle threshold, the angle information in the first position information is determined to be credible.
Yet another example: if the angle information in multiple pieces of position information of the target object relative to the robot fluctuates abnormally, the angle information in the first position information is determined to be not credible; conversely, if the fluctuation is normal, the angle information in the first position information is determined to be credible. Here, the multiple pieces of position information are obtained with the short-range communication module between the start of this positioning and the end of the last positioning, and include the first position information.
What counts as normal and abnormal fluctuation can be set adaptively according to demand. For example, if the overall fluctuation range of the angle information in the multiple pieces of position information is less than a set fluctuation-range threshold, the fluctuation is considered normal; otherwise, it is considered abnormal. As another example, if the angle information in adjacent pieces of position information differs by more than a set difference threshold, the fluctuation is considered abnormal; conversely, if no such case occurs among the multiple pieces of position information, the fluctuation is considered normal. Here, the multiple pieces of position information refer to multiple pieces of position information of the target object relative to the robot, measured within a period of time by communicating with the target object through the short-range communication module.
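The two fluctuation criteria above can be sketched together; both threshold values are hypothetical and would be tuned per scenario:

```python
# Fluctuation check over a window of recent angle readings (degrees):
# normal means the overall spread stays under one threshold AND no
# adjacent pair of readings jumps by more than another threshold.
RANGE_THRESHOLD_DEG = 30.0  # hypothetical overall-spread threshold
STEP_THRESHOLD_DEG = 15.0   # hypothetical adjacent-jump threshold

def fluctuation_normal(angles):
    if max(angles) - min(angles) >= RANGE_THRESHOLD_DEG:
        return False
    return all(abs(b - a) <= STEP_THRESHOLD_DEG
               for a, b in zip(angles, angles[1:]))

print(fluctuation_normal([10.0, 12.0, 9.0, 11.0]))   # True
print(fluctuation_normal([10.0, 48.0, 12.0, 11.0]))  # False
```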
In each of the above embodiments, the short-range communication module can be combined with the depth camera and/or laser radar. The following embodiments of this application describe the different combination modes:
In optional embodiment 1, the short-range communication module is combined with the laser radar: the robot is provided with a short-range communication module and a laser radar, and may optionally also be provided with a depth camera. Then, after communicating with the target object through the short-range communication module and measuring the first position information, the laser radar of the robot collects the laser radar data of at least one object around the robot; in turn, the laser radar data of the target object can be determined from the laser radar data of the at least one object according to the distance information in the first position information, and the target object is located according to its laser radar data. Optionally, the short-range communication module and the laser radar can be combined when the angle information in the first position information is not credible. Here, "the angle information in the first position information is not credible" is one optional condition for combining the short-range communication module with the laser radar, not a necessary condition.
In optional embodiment 1, the short-range communication module is combined with the laser radar. The advantage that the laser radar can detect a 360° range makes up for the relatively small angle confidence interval of the short-range communication module, which can stabilize the angle information to some extent and solve the positioning failures caused by incredible angle information; combined with the higher precision of laser radar measurement data, this helps improve positioning accuracy, and a continuous and stable signal can be obtained even in complex environments.
In optional embodiment 2, the short-range communication module is combined with the depth camera: the robot is provided with a short-range communication module and a depth camera, and may optionally also be provided with a laser radar. Based on this, after communicating with the target object through the short-range communication module and measuring the first position information, the depth camera of the robot collects the depth image data of at least one object around the robot; in turn, the depth image data of the target object can be determined from the depth image data of the at least one object according to the distance information in the first position information, and the target object is located according to its depth image data. Optionally, the short-range communication module and the depth camera can be combined when the angle information in the first position information is not credible. Here, "the angle information in the first position information is not credible" is one optional condition for combining the short-range communication module with the depth camera, not a necessary condition.
In optional embodiment 2, the short-range communication module is combined with the depth camera. When the angle information is not credible, the depth camera can perform secondary positioning of the target object, and combined with the higher precision of depth camera measurement data, this helps improve the probability of successful positioning and the positioning accuracy.
In optional embodiment 3, the short-range communication module, the depth camera, and the laser radar are combined: the robot is provided with all three. Based on this, after communicating with the target object through the short-range communication module and measuring the first position information, the laser radar collects the laser radar data of at least one object around the robot, and the laser radar data of the target object is determined from the laser radar data of the at least one object according to the distance information in the first position information. When the laser radar data of the target object cannot be determined from the laser radar data of the at least one object, the depth camera collects the depth image data of at least one object around the robot; in turn, the depth image data of the target object is determined from the depth image data of the at least one object according to the distance information in the first position information, and the target object is located according to its depth image data.
Optionally, when the laser radar data of the target object can be determined from the laser radar data of the at least one object, the target object can be located directly according to its laser radar data, without further collecting depth image data with the depth camera.
Optionally, the short-range communication module can be combined with the laser radar and the depth camera when the angle information in the first position information is not credible. Here, "the angle information in the first position information is not credible" is one optional condition for combining the short-range communication module with the laser radar and the depth camera, not a necessary condition.
In optional embodiment 3, when positioning with the short-range communication module fails because the angle information is not credible, the advantage that the laser radar can detect a 360° range can be used to make up, to some extent, for the relatively small angle confidence interval of the short-range communication module, stabilize the angle information, and solve the positioning failures caused by incredible angle information. Furthermore, when the laser radar is blocked by an obstacle and cannot successfully detect the target object, the depth camera can be used, with image recognition technology ensuring stable and reliable data, to solve the problem that the laser radar cannot detect the target object. This not only improves the probability of successful positioning but also helps improve positioning accuracy.
In optional embodiment 4, the short-range communication module, the depth camera, and the laser radar are combined: the robot is provided with all three. Based on this, after communicating with the target object through the short-range communication module and measuring the first position information, the depth camera can collect the depth image data of at least one object around the robot, and the depth image data of the target object is determined from the depth image data of the at least one object according to the distance information in the first position information. When the depth image data of the target object cannot be determined from the depth image data of the at least one object, the laser radar collects the laser radar data of at least one object around the robot; in turn, the laser radar data of the target object is determined from the laser radar data of the at least one object according to the distance information in the first position information, and the target object is located according to its laser radar data.
Optionally, when the depth image data of the target object can be determined from the depth image data of the at least one object, the target object can be located directly according to its depth image data, without further collecting laser radar data with the laser radar.
Optionally, the short-range communication module can be combined with the laser radar and the depth camera when the angle information in the first position information is not credible. Here, "the angle information in the first position information is not credible" is one optional condition for combining the short-range communication module with the laser radar and the depth camera, not a necessary condition.
In optional embodiment 4, when positioning with the short-range communication module fails because the angle information is not credible, the depth camera can be used, with image recognition technology ensuring stable and reliable data, to improve the probability of successful positioning. Further, when positioning with the depth camera fails because of the angle problem, the laser radar can be brought in, using its 360° detection range to stabilize the angle information and solve the positioning failures caused by an incredible angle. Furthermore, combining the higher precision of the depth camera and laser radar measurement data helps improve positioning accuracy.
Some of the above embodiments include the operation of collecting the depth image data of at least one object around the robot with the depth camera of the robot. One optional implementation of this operation includes:
First, the depth camera shoots the scene around the robot to obtain a color image and a depth image. The color image is mainly used for object detection and tracking; each pixel in the depth image carries distance information along the three axial directions X, Y, and Z, i.e., pixel coordinates. Here, the X axis, Y axis, and Z axis are the coordinate axes of the three-dimensional coordinate system where the robot is located: the X and Y axes are two directions of the robot in the horizontal plane, and the Z axis is the vertical direction of the robot. The range of the surrounding scene can be determined by the visual range of the depth camera.
Next, based on image recognition technology, the at least one object contained in the color image is identified, and the image region of the at least one object in the color image is marked. Afterwards, according to the coordinate transformation relationship between the color image and the depth image, the image region of the at least one object in the color image can be mapped into the depth image, thereby determining the image region corresponding to the at least one object in the depth image. In turn, the coordinate offset of the at least one object relative to the robot can be calculated from the pixel coordinates in the corresponding image region in the depth image. In this implementation, the coordinate offset of each object relative to the robot is the depth image data of that object.
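The last step can be sketched as follows — a simplified illustration assuming the color and depth images are already pixel-aligned and each depth pixel stores the (X, Y) offsets in meters in the robot's horizontal plane; the detector that produces the bounding box is not shown:

```python
import statistics

# Compute an object's coordinate offset relative to the robot from its
# detected image region. depth_xy[v][u] holds the (x_off, y_off) of the
# scene point seen at pixel (u, v); box = (u0, v0, u1, v1) is the image
# region mapped from the color image into the depth image.
def object_offset(depth_xy, box):
    u0, v0, u1, v1 = box
    xs, ys = [], []
    for v in range(v0, v1):
        for u in range(u0, u1):
            x, y = depth_xy[v][u]
            xs.append(x)
            ys.append(y)
    # Median over the region is robust to background pixels in the box.
    return statistics.median(xs), statistics.median(ys)

depth = [[(1.5, 0.2)] * 4 for _ in range(4)]  # toy 4x4 depth map
print(object_offset(depth, (0, 0, 4, 4)))     # (1.5, 0.2)
```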
In some of the above embodiments, after the depth image data of each object is determined, the depth image data of the target object can be determined from the depth image data of the at least one object according to the distance information in the first position information. In the case where the depth image data of each object includes the coordinate offset of that object relative to the robot, one optional implementation of this operation includes:
For the at least one object, whether each object is the target object can be determined one by one according to the coordinate offset of the object relative to the robot. The process of determining whether an object is the target object according to its coordinate offset is the same or similar for each object; for ease of description and understanding, the description below takes any object among the at least one object, called the first object, as an example.
For example, for the first object, the distance of the first object relative to the robot can be calculated from the coordinate offset of the first object relative to the robot, and this distance is compared with the distance information in the first position information. If the difference between the two is less than a distance-difference threshold, the coordinate offset of the first object relative to the robot is determined to be the coordinate offset of the target object relative to the robot, and accordingly the first object is the target object; conversely, if the difference is greater than or equal to the distance-difference threshold, the first object is determined not to be the target object.
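The per-object comparison above can be sketched as follows; the 0.5 m threshold is a hypothetical value:

```python
import math

# Match objects to the target by comparing each object's distance
# (derived from its coordinate offset) with the distance measured by
# the short-range communication module.
DIST_DIFF_THRESHOLD = 0.5  # hypothetical distance-difference threshold, m

def find_target(offsets, measured_distance):
    """offsets: one (x_off, y_off) per object; returns the offset of
    the first object whose distance matches, else None."""
    for off in offsets:
        d = math.hypot(off[0], off[1])
        if abs(d - measured_distance) < DIST_DIFF_THRESHOLD:
            return off
    return None

objs = [(3.0, 4.0), (2.0, 0.0), (0.6, 0.8)]
print(find_target(objs, 1.1))  # (0.6, 0.8): its distance 1.0 is within 0.5 m
```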
In some of the above embodiments, after the depth image data of the target object is determined, the target object can be located using its depth image data. In the case where the depth image data of the target object includes the coordinate offset of the target object relative to the robot, one optional implementation of this operation includes:
The position coordinates of the target object are calculated according to the formulas X_d = X0 + X1 + offset_x and Y_d = Y0 + Y1 + offset_y, where X_d and Y_d are the X-axis and Y-axis coordinates of the target object, X0 and Y0 are the X-axis and Y-axis coordinates of the robot, X1 and Y1 are the installation deviations of the depth camera along the X axis and Y axis, and offset_x and offset_y are the coordinate offsets of the target object relative to the robot along the X axis and Y axis.
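The formulas above amount to robot pose plus camera installation deviation plus measured offset; the numeric values below are hypothetical:

```python
# X_d = X0 + X1 + offset_x, Y_d = Y0 + Y1 + offset_y, as in the text.
def target_position(x0, y0, x1, y1, offset_x, offset_y):
    return x0 + x1 + offset_x, y0 + y1 + offset_y

# Robot at (2.0, 3.0), camera mounted 0.1 m forward along X, target
# offset (1.5, -0.4) as measured from the depth image:
x, y = target_position(2.0, 3.0, 0.1, 0.0, 1.5, -0.4)
print(round(x, 2), round(y, 2))  # 3.6 2.6
```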
Still further, after the X-axis and Y-axis coordinates of the target object are calculated, they can optionally be compared with the credible X-axis and Y-axis coordinates of the target object obtained in the last positioning. If the error between the two is within a set error range, this positioning result is determined to be accurate and can be used subsequently; conversely, if the error is not within the set error range, this positioning result is not accurate enough, and positioning can be repeated or this result discarded.
Some of the above embodiments include the operation of collecting the laser radar data of at least one object around the robot with the laser radar of the robot. One optional implementation of this operation includes:
The laser radar emits a laser signal within the 360° range of the robot and records the emission time t1; the laser signal propagates in the air and is reflected when it encounters the first object; the laser radar receives the reflected laser signal and records the receiving time t2. From t2 and t1, the one-way flight time of the laser signal in the air can be calculated as t_tof = (t2 - t1) / 2; combining the speed of light with the one-way flight time t_tof, the distance between the laser radar and the object, i.e., the distance of the object relative to the robot, can be calculated, and the angle at which the laser signal was emitted is taken as the angle of the object relative to the robot. In this optional implementation, the laser radar data of each object contains the distance information and angle information of that object relative to the robot.
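A single lidar return can be reduced to a (distance, angle) pair exactly as described; the echo timing below is a hypothetical illustration:

```python
# Range from one lidar return: t_tof = (t2 - t1) / 2, distance = c * t_tof.
# The beam's emission angle is taken as the object's angle.
C = 299_792_458.0  # speed of light, m/s

def lidar_return(t1, t2, beam_angle_deg):
    t_tof = (t2 - t1) / 2.0
    return C * t_tof, beam_angle_deg  # (distance in m, angle in degrees)

# A hypothetical echo received ~66.7 ns after emission, i.e. an object
# about 10 m away on the 30-degree beam:
d, ang = lidar_return(0.0, 66.7e-9, 30.0)
print(round(d, 1), ang)  # 10.0 30.0
```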
In some of the above embodiments, after the laser radar data of each object is determined, the laser radar data of the target object can be determined from the laser radar data of the at least one object according to the distance information in the first position information. In the case where the laser radar data of each object includes the distance information and angle information of that object relative to the robot, one optional implementation of this operation includes:
From the laser radar data of the at least one object, obtain the laser radar data whose distance information and angle information differ, respectively, from the distance information in the first position information and from the credible angle information from the last positioning by less than the respective thresholds, and take it as the laser radar data of the target object.
Optionally, the distance information of the at least one object relative to the robot can be compared with the distance information in the first position information, and the objects whose distance information differs from the distance information in the first position information by less than the distance-difference threshold are obtained; then the angle information of the obtained objects relative to the robot is compared with the credible angle in the last positioning, and the object whose angle information differs from the credible angle information in the last positioning by less than the angle-difference threshold is obtained; finally, the obtained object and its laser radar data are taken as the target object and the laser radar data of the target object, respectively. Alternatively, the angle information of the at least one object relative to the robot can first be compared with the credible angle in the last positioning, obtaining the objects whose angle information differs from the credible angle information in the last positioning by less than the angle-difference threshold; then the distance information of the obtained objects relative to the robot is compared with the distance information in the first position information, obtaining the object whose distance information differs from the distance information in the first position information by less than the distance-difference threshold; finally, the obtained object and its laser radar data are taken as the target object and the laser radar data of the target object, respectively.
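The distance-then-angle filtering described above can be sketched as follows (the `LidarReturn` tuple and the threshold values are illustrative assumptions; the embodiment does not fix concrete values):

```python
from collections import namedtuple

LidarReturn = namedtuple("LidarReturn", "distance angle")  # metres, degrees

def select_target(returns, ref_distance, ref_angle,
                  dist_threshold=0.5, angle_threshold=10.0):
    """Filter lidar returns first by distance against the first position
    information, then by angle against the credible angle of the last
    positioning. Returns a matching LidarReturn, or None when no object
    passes both checks (e.g. the target is occluded)."""
    by_distance = [r for r in returns
                   if abs(r.distance - ref_distance) < dist_threshold]
    by_angle = [r for r in by_distance
                if abs(r.angle - ref_angle) < angle_threshold]
    return by_angle[0] if by_angle else None
```

The alternative angle-first ordering described above would simply swap the two list comprehensions.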
It is worth noting that in some special circumstances, for example when the target object is occluded, the laser radar may be unable to collect the laser radar data of the target object, in which case the laser radar data of the target object cannot be determined from the laser radar data of the at least one object.
In some of the above embodiments, after the laser radar data of the target object is determined, the target object can be positioned according to its laser radar data. In the case where the laser radar data of each object includes the distance information and angle information of that object relative to the robot, an optional embodiment of this operation is as follows:
The position coordinates of the target object are calculated according to the formulas X_d = X0 + d*cos(θ0 + θ) and Y_d = Y0 + d*sin(θ0 + θ), where X_d and Y_d are the X-axis and Y-axis coordinates of the target object, X0 and Y0 are the X-axis and Y-axis coordinates of the robot, θ0 is the heading of the robot, d is the distance information in the laser radar data of the target object, and θ is the angle information in the laser radar data of the target object.
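As an illustrative, non-limiting sketch, the formula above can be written out directly (Python and degree-valued angles are assumptions of the example; the embodiment does not fix units):

```python
import math

def locate_target(x0, y0, heading_deg, d, angle_deg):
    """Target position per X_d = X0 + d*cos(θ0 + θ) and
    Y_d = Y0 + d*sin(θ0 + θ): (x0, y0) is the robot position,
    heading_deg its direction θ0, and (d, angle_deg) the target's
    lidar distance and angle."""
    total = math.radians(heading_deg + angle_deg)
    return x0 + d * math.cos(total), y0 + d * math.sin(total)
```

For example, a robot at the origin heading along the X axis, with a target detected 2 m away at 90°, places the target at approximately (0, 2).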
The positioning system or robot provided by the above embodiments of the present application can be applied to various scenarios. Some application scenarios are described below:
Application scenario 1:
To improve convenience and lighten the user's load when shopping in places such as supermarkets and shopping malls, follower-type shopping cart robots can be deployed in supermarkets and malls. These shopping cart robots can follow behind a user while shopping, so that the user neither has to push a shopping cart nor carry a shopping basket, making it more convenient to pick up goods.
To help these shopping cart robots follow users better, each shopping cart robot can be paired with a human-body follower device. Optionally, the human-body follower devices can be kept by the administrative staff of the supermarket or mall, or can be configured directly on the shopping cart robots.
When a user goes shopping in a supermarket or mall and chooses a shopping cart robot, the user can obtain the human-body follower device corresponding to that shopping cart robot from the administrative staff, or remove the device from the shopping cart robot, and then wear it somewhere on the body or put it in a pocket. The shopping cart robot can then be woken from sleep or standby mode by voice, touch, a physical button, or similar means. When not in use, the shopping cart robot can enter sleep or standby mode, which helps save its battery power.
At this point, the user only needs to select goods freely in the supermarket or mall according to his or her own shopping needs. The shopping cart robot can send wireless signals to its matched human-body follower device through its near-field communication module, such as a UWB module, to perform primary positioning of the user.
Generally, a UWB module has a certain angle confidence interval. When the user selects goods freely, the user's movement range may change abruptly or significantly; when the user suddenly moves outside the angle confidence interval of the UWB module, the UWB module may be unable to position accurately. To solve this problem, the shopping cart robot is provided with a laser radar, which can detect the nearest objects within a 360° range and thus resolves untrustworthy results such as angle jumps.
If the angle of the user relative to the robot positioned by the UWB module lies within the angle confidence interval, the shopping cart robot can, according to the UWB module's positioning result for the user and in combination with a preset supermarket or mall map, plan a path from the robot's current position to the user, so as to follow the user.
If the angle of the user relative to the robot positioned by the UWB module is not credible, the laser radar of the shopping cart robot can be started to perform secondary positioning of the user. If the laser radar locates the user that the shopping cart robot needs to follow, the robot can, according to the laser radar's positioning result for the user and in combination with the preset supermarket or mall map, plan a path from its current position to the user, so as to follow the user.
Further, optionally, in environments such as supermarkets and malls, people, goods, and shelves are relatively dense, and the user that the shopping cart robot needs to follow may be occluded by other users or by shelves. In this case, the laser radar may be unable to locate the user to follow, and laser radar positioning fails. At this point, the depth camera of the shopping cart robot can be started to perform tertiary positioning of the user. If the depth camera captures the user that the shopping cart robot needs to follow, the robot can, according to the depth camera's positioning result for the user and in combination with the preset supermarket or mall map, plan a path from its current position to the user, so as to follow the user.
Application scenario 2:
In a warehouse sorting system, sorting robots can be deployed. When sorting is needed, a worker can guide a sorting robot to the shelf picking area, and the sorting robot replaces sorting personnel in sorting goods, which can improve sorting efficiency and save personnel cost.
To help these sorting robots follow workers to the shelf picking area better, each sorting robot can be paired with a human-body follower device. Optionally, the human-body follower devices can be kept by the workers, or can be configured directly on the sorting robots.
When a worker needs to guide a sorting robot to the shelf picking area, the worker can retrieve the human-body follower device from its keeper or remove it from the sorting robot, wear it somewhere on the body, or put it in a pocket. The sorting robot can then be woken from sleep or standby mode by voice, touch, a physical button, or similar means. When not in use, the sorting robot can enter sleep or standby mode, which helps save its battery power.
At this point, the worker only needs to walk to the shelf picking area. The sorting robot can send wireless signals to its matched human-body follower device through its near-field communication module, such as a UWB module, to perform primary positioning of the worker.
Generally, a UWB module has a certain angle confidence interval. The path to the shelf picking area may be relatively complex, and the worker's movement range may change abruptly or significantly; when the worker suddenly moves outside the angle confidence interval of the UWB module, the UWB module may be unable to position accurately. To solve this problem, the sorting robot is provided with a laser radar, which can detect the nearest objects within a 360° range and thus resolves untrustworthy results such as angle jumps.
If the angle of the worker relative to the robot positioned by the UWB module lies within the angle confidence interval, the sorting robot can, according to the UWB module's positioning result for the worker and in combination with a preset warehouse map, plan a path from its current position to the worker, so as to follow the worker.
If the angle of the worker relative to the robot positioned by the UWB module is not credible, the laser radar of the sorting robot can be started to perform secondary positioning of the worker. If the laser radar locates the worker, the robot can, according to the laser radar's positioning result for the worker and in combination with the preset warehouse map, plan a path from its current position to the worker, so as to follow the worker.
It is worth explaining that, depending on application demand and cost, the sorting robot can also be configured with a depth camera, so that further auxiliary positioning can be performed when laser radar positioning fails.
Application scenario 3:
In home scenarios, a home companion robot can be deployed. The home companion robot can care for the elderly or children in place of adults, freeing the adults. It can accompany the elderly or children in playing games, reading, and chatting, and remind the elderly to take medicine. To provide better companion services, the home companion robot can also, according to user demand, accompany the elderly or children outdoors. In outdoor companion scenarios, the home companion robot needs to successfully position and follow the elderly person or child.
To help these home companion robots follow the elderly or children better, each home companion robot can be paired with a human-body follower device. When the home companion robot needs to follow an elderly person or child outdoors, the elderly person or child can wear the human-body follower device. Then the elderly person, the child, or the child's parent wakes the home companion robot from sleep or standby mode by voice, touch, a physical button, or similar means. When not in use, the home companion robot can enter sleep or standby mode, which helps save its battery power.
At this point, the elderly person or child only needs to stroll or walk freely according to his or her own needs. The home companion robot can send wireless signals to its matched human-body follower device through its near-field communication module, such as a UWB module, to perform primary positioning of the elderly person or child.
When the angle of the elderly person or child relative to the home companion robot positioned by the UWB module is credible, the home companion robot can, according to the UWB module's positioning result for the elderly person or child and in combination with the real-time scene it captures, plan a path from its current position to the elderly person or child, so as to follow them.
When the angle of the elderly person or child relative to the home companion robot positioned by the UWB module is not credible, the laser radar of the home companion robot can be started to perform secondary positioning of the elderly person or child. If the laser radar locates the elderly person or child that the home companion robot needs to follow, the robot can, according to the laser radar's positioning result for the elderly person or child and in combination with the real-time scene it captures, plan a path from its current position to the elderly person or child, so as to follow them.
Further, if the home companion robot has relatively powerful functions and the home companion demands are more complicated, the home companion robot can also be configured with a depth camera. In a following scenario, the depth camera can be started according to application demand to track the elderly person or child, so as to follow successfully and achieve companionship in the true sense. For example, if the elderly person or child goes to a crowded place, laser radar positioning may fail; at this point the depth camera can be started, and image recognition technology can be used to successfully position the elderly person or child to follow.
In addition to the above robot-based positioning system and robot, embodiments of the present application also provide some robot-based localization methods. These methods mainly describe, from the perspective of the robot, how the robot positions a target object. In these methods, the near-field communication module of the robot is combined with the depth camera and/or laser radar of the robot, making full use of their respective advantages and fusing multiple kinds of information to position the target object.
The main positioning idea of these methods is: on the one hand, the near-field communication module of the robot communicates with the target object to measure first position information of the target object relative to the robot; on the other hand, the depth camera and/or laser radar of the robot collects the depth images and/or laser radar data of at least one object around the robot; then, according to the distance information in the first position information, the depth image and/or laser radar data of the target object is determined from the depth images and/or laser radar data of the at least one object, and the target object is positioned according to its depth image and/or laser radar data.
It is worth noting that the above positioning idea can have many different embodiments. For example, in some optional embodiments, the near-field communication module can be combined with the depth camera and/or laser radar directly, without any precondition; in other optional embodiments, combining the near-field communication module with the depth camera and/or laser radar must satisfy certain conditions, for example, the depth camera and/or laser radar is combined to position the target object only when the near-field communication module cannot accurately position it.
In the following embodiments of the application, the process of positioning the target object by combining the depth camera and/or laser radar when the near-field communication module cannot accurately position the target object is described.
Fig. 3a is a schematic flowchart of a robot-based localization method provided by another exemplary embodiment of the present application. As shown in Fig. 3a, the method includes:
S301: communicate with the target object using the near-field communication module of the robot, to measure first position information of the target object relative to the robot, the first position information including distance information and angle information of the target object relative to the robot.
S302: judge whether the angle information in the first position information is credible; if the judgment result is no, execute step S303; if the judgment result is yes, execute step S306.
S303: collect the laser radar data of at least one object around the robot using the laser radar of the robot, then execute step S304.
S304: according to the distance information in the first position information, determine the laser radar data of the target object from the laser radar data of the at least one object, then execute step S305.
S305: position the target object according to the laser radar data of the target object, and end this positioning operation.
S306: position the target object according to the distance information and angle information in the first position information, and end this positioning operation.
In this embodiment, the near-field communication module and the laser radar are combined. The advantage of the laser radar's detectable 360° range can be used to make up for the relatively small angle confidence interval of the near-field communication module, stabilizing the angle information to a certain extent and solving positioning failures caused by untrustworthy angle information; combined with the high precision of laser radar measurement data, this helps improve positioning accuracy, and a continuous, stable signal can be obtained even in complex environments.
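The S301-S306 flow can be sketched as follows (the `uwb_measure`/`lidar_scan` callables and the distance-matching rule used for S304 are illustrative assumptions, not interfaces defined by the embodiment):

```python
def localize_3a(uwb_measure, lidar_scan, confidence_interval,
                dist_threshold=0.5):
    """Minimal sketch of the Fig. 3a flow. `uwb_measure()` is assumed to
    return (distance, angle) of the target relative to the robot, and
    `lidar_scan()` a list of (distance, angle) pairs."""
    distance, angle = uwb_measure()                        # S301
    lo, hi = confidence_interval
    if lo <= angle <= hi:                                  # S302: angle credible
        return ("uwb", distance, angle)                    # S306
    scan = lidar_scan()                                    # S303
    # S304: pick the return whose distance best matches the UWB distance
    candidates = [r for r in scan if abs(r[0] - distance) < dist_threshold]
    if not candidates:
        return None
    best = min(candidates, key=lambda r: abs(r[0] - distance))
    return ("lidar", best[0], best[1])                     # S305
```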
Fig. 3b is a schematic flowchart of another robot-based localization method provided by another exemplary embodiment of the present application. As shown in Fig. 3b, the method includes:
S311: communicate with the target object using the near-field communication module of the robot, to measure first position information of the target object relative to the robot, the first position information including distance information and angle information of the target object relative to the robot.
S312: judge whether the angle information in the first position information is credible; if the judgment result is no, execute step S313; if the judgment result is yes, execute step S316.
S313: collect the depth image data of at least one object around the robot using the depth camera of the robot, then execute step S314.
S314: according to the distance information in the first position information, determine the depth image data of the target object from the depth image data of the at least one object, then execute step S315.
S315: position the target object according to the depth image data of the target object, and end this positioning operation.
S316: position the target object according to the distance information and angle information in the first position information, and end this positioning operation.
In this embodiment, the near-field communication module and the depth camera are combined. When the angle information is not credible, the depth camera can perform secondary positioning of the target object; combined with the high precision of depth camera measurement data, this helps improve the probability of successful positioning and improves positioning accuracy.
Fig. 3c is a schematic flowchart of another robot-based localization method provided by another exemplary embodiment of the present application. As shown in Fig. 3c, the method includes:
S3201: communicate with the target object using the near-field communication module of the robot, to measure first position information of the target object relative to the robot, the first position information including distance information and angle information of the target object relative to the robot.
S3202: judge whether the angle information in the first position information is credible; if the judgment result is no, execute step S3203; if the judgment result is yes, execute step S3212.
S3203: collect the laser radar data of at least one object around the robot using the laser radar of the robot, then execute step S3204.
S3204: according to the distance information in the first position information, determine the laser radar data of the target object from the laser radar data of the at least one object, then execute step S3205.
S3205: judge whether the laser radar data of the target object has been determined from the laser radar data of the at least one object; if the judgment result is yes, execute step S3206; if the judgment result is no, execute step S3207.
S3206: position the target object according to the laser radar data of the target object, and end this positioning operation.
S3207: collect the depth image data of at least one object around the robot using the depth camera of the robot, then execute step S3208.
S3208: according to the distance information in the first position information, determine the depth image data of the target object from the depth image data of the at least one object, then execute step S3209.
S3209: judge whether the depth image data of the target object has been determined from the depth image data of the at least one object; if the judgment result is yes, execute step S3210; if the judgment result is no, execute step S3211.
S3210: position the target object according to the depth image data of the target object, and end this positioning operation.
S3211: return a positioning failure result, and end this positioning operation.
S3212: position the target object according to the distance information and angle information in the first position information.
In this embodiment, when the near-field communication module fails to position because the angle information is not credible, the advantage of the laser radar's detectable 360° range can be used to make up, to a certain extent, for the relatively small angle confidence interval of the near-field communication module, stabilizing the angle information and solving positioning failures caused by untrustworthy angle information. Further, when the laser radar is blocked by an obstacle and cannot successfully detect the target object, the depth camera can be used together with image recognition technology to ensure the stability and reliability of the data and to solve the problem that the laser radar cannot detect the target object. This not only improves the probability of successful positioning, but also helps improve positioning accuracy.
Fig. 3d is a schematic flowchart of another robot-based localization method provided by another exemplary embodiment of the present application. As shown in Fig. 3d, the method includes:
S3301: communicate with the target object using the near-field communication module of the robot, to measure first position information of the target object relative to the robot, the first position information including distance information and angle information of the target object relative to the robot.
S3302: judge whether the angle information in the first position information is credible; if the judgment result is no, execute step S3303; if the judgment result is yes, execute step S3212.
S3303: collect the depth image data of at least one object around the robot using the depth camera of the robot, then execute step S3304.
S3304: according to the distance information in the first position information, determine the depth image data of the target object from the depth image data of the at least one object, then execute step S3305.
S3305: judge whether the depth image data of the target object has been determined from the depth image data of the at least one object; if the judgment result is yes, execute step S3306; if the judgment result is no, execute step S3307.
S3306: position the target object according to the depth image data of the target object, and end this positioning operation.
S3307: collect the laser radar data of at least one object around the robot using the laser radar of the robot, then execute step S3308.
S3308: according to the distance information in the first position information, determine the laser radar data of the target object from the laser radar data of the at least one object, then execute step S3309.
S3309: judge whether the laser radar data of the target object has been determined from the laser radar data of the at least one object; if the judgment result is yes, execute step S3210; if the judgment result is no, execute step S3211.
S3210: position the target object according to the laser radar data of the target object, and end this positioning operation.
S3211: return a positioning failure result, and end this positioning operation.
S3212: position the target object directly according to the distance information and angle information in the first position information.
In this embodiment, when the near-field communication module fails to position because the angle information is not credible, the depth camera can be used together with image recognition technology to ensure the stability and reliability of the data and improve the probability of successful positioning. Further, when the depth camera fails to position because of an angle problem, the laser radar can be combined, and the advantage of the laser radar's detectable 360° range can be used to stabilize the angle information and solve positioning failures caused by an untrustworthy angle. Furthermore, combining the high measurement precision of the depth camera and the laser radar helps improve positioning accuracy.
In the above embodiments, it is necessary to judge whether the angle information in the first position information is credible and, when it is not, to combine the depth camera and/or laser radar; but this is not limiting. In other embodiments of the application, whether the angle information in the first position information is credible can be disregarded; that is, the near-field communication module is combined with the depth camera and/or laser radar regardless of whether the angle information in the first position information is credible. For example, in the embodiment shown in Fig. 3a, step S302 may be skipped: after step S301, step S303 and the subsequent related steps are executed directly. Likewise, in the embodiment shown in Fig. 3b, step S312 may be skipped: after step S311, step S313 and the subsequent related steps are executed directly. In the embodiment shown in Fig. 3c, step S3202 may be skipped: after step S3201, step S3203 and the subsequent related steps are executed directly. In the embodiment shown in Fig. 3d, step S3302 may be skipped: after step S3301, step S3303 and the subsequent related steps are executed directly. For these variant embodiments of the embodiments shown in Fig. 3a to Fig. 3d, the related description can be found in the embodiments shown in Fig. 3a to Fig. 3d and is not detailed here.
The above method embodiments include a step of judging whether the angle information in the first position information is credible. Optionally, this step may be, but is not limited to being, realized in the following manner:
If the angle information in the first position information is not in the angle confidence interval corresponding to the near-field communication module, determine that the angle information in the first position information is not credible; and/or
If the difference between the angle information in the first position information and the credible angle information in the last positioning exceeds a set angle threshold, determine that the angle information in the first position information is not credible; and/or
If the angle information in multiple pieces of position information of the target object relative to the robot fluctuates abnormally, determine that the angle information in the first position information is not credible, where the multiple pieces of position information were obtained by the near-field communication module between the start of this positioning and the end of the last positioning, and include the first position information.
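The three optional credibility criteria can be combined as in the following sketch (the threshold values and the use of standard deviation to detect "abnormal fluctuation" are illustrative assumptions; the embodiment only names the criteria):

```python
import statistics

def angle_credible(angle, confidence_interval, last_credible_angle,
                   recent_angles, angle_threshold=15.0, std_threshold=20.0):
    """Return False if any of the three checks fails: angle outside the
    module's confidence interval, a large jump versus the last credible
    angle, or abnormal fluctuation across recent measurements."""
    lo, hi = confidence_interval
    if not (lo <= angle <= hi):                             # criterion 1
        return False
    if abs(angle - last_credible_angle) > angle_threshold:  # criterion 2
        return False
    if len(recent_angles) >= 2 and statistics.stdev(recent_angles) > std_threshold:
        return False                                        # criterion 3
    return True
```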
Some of the above method embodiments include a step of collecting the depth image data of at least one object around the robot using the depth camera of the robot. This step may be, but is not limited to being, realized in the following manner:
Shoot the scene around the robot with the depth camera, to obtain a color image and a depth image; based on image recognition technology, mark the image regions of the at least one object contained in the color image; based on the coordinate conversion relationship between the color image and the depth image, map the image regions of the at least one object in the color image into the depth image; according to the pixel coordinates of the image regions of the at least one object in the depth image, calculate the coordinate offsets of the at least one object relative to the robot, as the depth image data of the at least one object.
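A minimal sketch of the last step, estimating an object's coordinate offset from its region in the depth image, might look as follows (the pinhole-camera geometry, the field of view, and the use of the median depth are assumptions; the embodiment only specifies that coordinate offsets are calculated from pixel coordinates):

```python
import math
import numpy as np

def region_offset(depth_image, region, hfov_deg=60.0):
    """Estimate the (forward, lateral) coordinate offset of an object from
    its region in the depth image. `region` is (row0, row1, col0, col1);
    depth values are assumed to be metres."""
    r0, r1, c0, c1 = region
    patch = depth_image[r0:r1, c0:c1]
    depth = float(np.median(patch))            # robust depth of the object
    h, w = depth_image.shape
    cx = (c0 + c1) / 2.0                       # region centre column
    # bearing of the region centre relative to the optical axis
    bearing = math.radians(((cx - w / 2.0) / (w / 2.0)) * (hfov_deg / 2.0))
    return depth * math.cos(bearing), depth * math.sin(bearing)
```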
Some of the above method embodiments include a step of determining the depth image data of the target object from the depth image data of at least one object according to the distance information in the first position information. This step may be, but is not limited to being, realized in the following manner:
For the at least one object, whether each object is the target object can be determined one by one according to the coordinate offset of that object relative to the robot. The process of determining whether an object is the target object according to its coordinate offset relative to the robot is identical or similar for each object. For ease of description and understanding, the following takes any object of the at least one object, referred to as the first object, as an example.
For the first object, the distance information of the first object relative to the robot can be calculated according to the coordinate offset of the first object relative to the robot; the distance information of the first object relative to the robot is then compared with the distance information in the first position information. If the difference between the distance information of the first object relative to the robot and the distance information in the first position information is less than the distance-difference threshold, the coordinate offset of the first object relative to the robot is determined to be the coordinate offset of the target object relative to the robot, and accordingly the first object is the target object; conversely, if the difference between the distance information of the first object relative to the robot and the distance information in the first position information is greater than or equal to the distance-difference threshold, it is determined that the first object is not the target object.
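The per-object check on the first object can be sketched as follows (deriving the distance from the coordinate offset with the Euclidean norm, and the 0.5 m threshold, are illustrative assumptions):

```python
import math

def is_target(offset_x, offset_y, ref_distance, dist_threshold=0.5):
    """Check whether an object's coordinate offset relative to the robot
    matches the distance in the first position information."""
    d = math.hypot(offset_x, offset_y)   # distance implied by the offset
    return abs(d - ref_distance) < dist_threshold
```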
Some of the above method embodiments include the step of positioning the target object according to the depth image of the target object. This step may be, but is not limited to being, implemented as follows:
The position coordinates of the target object are calculated according to the formulas X_d = X0 + X1 + offset_x and Y_d = Y0 + Y1 + offset_y, where X_d and Y_d are the X-axis and Y-axis coordinates of the target object; X0 and Y0 are the X-axis and Y-axis coordinates of the robot; X1 and Y1 are the installation deviations of the depth camera on the X-axis and Y-axis; and offset_x and offset_y are the coordinate offsets of the target object relative to the robot on the X-axis and Y-axis.
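The formulas above translate directly into code (a minimal sketch; the function and parameter names are hypothetical):

```python
def locate_from_depth(robot_xy, cam_deviation, obj_offset):
    """Position of the target from its depth image data:
    X_d = X0 + X1 + offset_x,  Y_d = Y0 + Y1 + offset_y."""
    x0, y0 = robot_xy          # robot coordinates (X0, Y0)
    x1, y1 = cam_deviation     # installation deviation of the depth camera (X1, Y1)
    off_x, off_y = obj_offset  # coordinate offset of the target relative to the robot
    return x0 + x1 + off_x, y0 + y1 + off_y
```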
In some of the above method embodiments, a laser radar is used to collect the laser radar data of the at least one object. The laser radar data of each object includes the distance information and angle information of that object relative to the robot. In those method embodiments, determining the laser radar data of the target object from the laser radar data of the at least one object according to the distance information in the first position information may be, but is not limited to being, implemented as follows:
From the laser radar data of the at least one object, obtain the laser radar data whose distance information differs from the distance information in the first position information by less than one threshold and whose angle information differs from the trusted angle information of the previous positioning by less than another threshold, and take it as the laser radar data of the target object.
Optionally, the distance information of the at least one object relative to the robot can first be compared with the distance information in the first position information to obtain the objects whose distance difference is less than the distance-difference threshold; then, the angle information of the obtained objects relative to the robot is compared with the trusted angle of the previous positioning to obtain the object whose angle difference is less than the angle-difference threshold, and the finally obtained object and its laser radar data are taken as the target object and the laser radar data of the target object, respectively.
Alternatively, the angle information of the at least one object relative to the robot can first be compared with the trusted angle of the previous positioning to obtain the objects whose angle difference is less than the angle-difference threshold; then, the distance information of the obtained objects relative to the robot is compared with the distance information in the first position information to obtain the object whose distance difference is less than the distance-difference threshold, and the finally obtained object and its laser radar data are taken as the target object and the laser radar data of the target object, respectively.
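Either ordering yields the same accepted candidates, since both checks must pass; a joint-filter sketch (names and default thresholds are illustrative assumptions):

```python
def find_target_by_lidar(scans, measured_distance, trusted_angle,
                         dist_threshold=0.5, angle_threshold=10.0):
    """scans: list of (distance, angle) pairs, one per detected object,
    both measured by the laser radar relative to the robot.
    An object qualifies as the target only if its distance is close to the
    distance in the first position information AND its angle is close to
    the trusted angle from the previous positioning."""
    for d, theta in scans:
        if (abs(d - measured_distance) < dist_threshold
                and abs(theta - trusted_angle) < angle_threshold):
            return d, theta
    return None  # e.g. the target is occluded and was never scanned
```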
It is worth noting that in some special circumstances, for example when the target object is occluded, the laser radar may fail to collect the laser radar data of the target object, in which case the laser radar data of the target object cannot be determined from the laser radar data of the at least one object.
Some of the above method embodiments include the step of positioning the target object according to the laser radar data of the target object. This step may be, but is not limited to being, implemented as follows:
The position coordinates of the target object are calculated according to the formulas X_d = X0 + d*cos(θ0 + θ) and Y_d = Y0 + d*sin(θ0 + θ), where X_d and Y_d are the X-axis and Y-axis coordinates of the target object; X0 and Y0 are the X-axis and Y-axis coordinates of the robot; θ0 is the heading of the robot; d is the distance information in the laser radar data of the target object; and θ is the angle information in the laser radar data of the target object.
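The polar-to-Cartesian computation can be written out directly (a minimal sketch; names are hypothetical, and θ0 and θ are taken in radians):

```python
import math

def locate_from_lidar(robot_pose, d, theta):
    """Position of the target from its laser radar data:
    X_d = X0 + d*cos(θ0 + θ),  Y_d = Y0 + d*sin(θ0 + θ).
    robot_pose: (X0, Y0, θ0), with θ0 the robot heading in radians;
    d, theta: the target's lidar distance and angle relative to the robot."""
    x0, y0, heading = robot_pose
    return (x0 + d * math.cos(heading + theta),
            y0 + d * math.sin(heading + theta))
```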
In some application scenarios where the robot needs to follow the target object, after the robot has positioned the target object, it can further be controlled to follow the target object according to the positioning result. For example, a path from the robot to the target object can be planned according to the positioning result of the target object, and the robot can then be controlled to move toward the target object along that path, so as to follow the target object.
In other application scenarios, after the robot has positioned the target object, the positioning result of the target object can also be output to a control platform corresponding to the robot. The control platform can perform other operations according to the positioning result of the target object, such as recommending merchants to the target object, dispatching a courier delivery to the target object, or providing a navigation service for the target object.
It should be noted that the execution subject of each step of the methods provided by the above embodiments may be the same device, or the methods may be executed by different devices. For example, the execution subject of steps S301 to S306 may be device A; alternatively, the execution subject of steps S301 and S304 may be device A while the execution subject of steps S305 to S306 is device B; and so on.
In addition, some of the flows described in the above embodiments and drawings contain multiple operations that appear in a particular order, but it should be clearly understood that these operations may be executed out of the order in which they appear herein, or in parallel. Serial numbers of operations, such as S301 and S304, are only used to distinguish the different operations; the serial numbers themselves do not represent any execution order. Moreover, these flows may include more or fewer operations, and these operations may be executed sequentially or in parallel. It should also be noted that descriptions such as "first" and "second" herein are used to distinguish different messages, devices, modules, and the like; they do not represent a sequential order, nor do they require "first" and "second" to be of different types.
Fig. 4 is a schematic structural diagram of a robot-based positioning device provided by another exemplary embodiment of the present application. The device can be implemented as an internal module of a robot; alternatively, it can be independent of the robot, connected to the robot in communication, and able to control the robot. As shown in Fig. 4, the device includes:
a measurement module 41, configured to communicate with a target object by using a proximity communication module of the robot, so as to measure first position information of the target object relative to the robot;
an acquisition module 42, configured to collect the depth image and/or laser radar data of at least one object around the robot by using a depth camera and/or laser radar of the robot;
a determining module 43, configured to determine the depth image and/or laser radar data of the target object from the depth image and/or laser radar data of the at least one object according to the distance information in the first position information; and
a locating module 44, configured to position the target object according to the depth image and/or laser radar data of the target object.
In an optional embodiment, the acquisition module 42 is specifically configured to: when the angle information in the first position information is untrustworthy, collect the depth image and/or laser radar data of at least one object around the robot by using the depth camera and/or laser radar of the robot.
In an optional embodiment, the acquisition module 42 is further configured to:
determine that the angle information in the first position information is untrustworthy when that angle information is not within the angle confidence interval corresponding to the proximity communication module; and/or
determine that the angle information in the first position information is untrustworthy when the difference between that angle information and the trusted angle information of the previous positioning is greater than a set angle threshold; and/or
determine that the angle information in the first position information is untrustworthy when the angle information in multiple pieces of position information of the target object relative to the robot fluctuates abnormally, the multiple pieces of position information being obtained by using the proximity communication module between the start of the current positioning and the end of the previous positioning, and including the first position information.
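The three untrust conditions can be sketched together as one check (all names and thresholds, and the use of standard deviation as the "abnormal fluctuation" test, are illustrative assumptions):

```python
def angle_is_credible(angle, confidence_interval, prev_trusted_angle,
                      recent_angles, angle_threshold=15.0, max_std=20.0):
    """Returns False if any of the three untrust conditions holds:
    1) the angle falls outside the module's angle confidence interval;
    2) it deviates too far from the trusted angle of the previous positioning;
    3) the angles of the recent position readings fluctuate abnormally."""
    lo, hi = confidence_interval
    if not (lo <= angle <= hi):
        return False
    if abs(angle - prev_trusted_angle) > angle_threshold:
        return False
    mean = sum(recent_angles) / len(recent_angles)
    variance = sum((a - mean) ** 2 for a in recent_angles) / len(recent_angles)
    if variance ** 0.5 > max_std:  # abnormal fluctuation across readings
        return False
    return True
```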
In an optional embodiment, the acquisition module 42 is specifically configured to: collect the laser radar data of at least one object around the robot by using the laser radar and, when the laser radar data of the target object cannot be determined from the laser radar data of the at least one object, collect the depth image data of at least one object around the robot by using the depth camera; or, collect the depth image data of at least one object around the robot by using the depth camera and, when the depth image data of the target object cannot be determined from the depth image data of the at least one object, collect the laser radar data of at least one object around the robot by using the laser radar.
In an optional embodiment, when collecting the depth image data of at least one object around the robot by using the depth camera of the robot, the acquisition module 42 is specifically configured to: shoot the scene around the robot with the depth camera, to obtain a color image and a depth image; mark, based on image recognition technology, the image regions of the at least one object contained in the color image; map the image regions of the at least one object in the color image into the depth image based on the coordinate conversion relation between the color image and the depth image; and calculate, according to the pixel coordinates of the image regions of the at least one object in the depth image, the coordinate offset of each of the at least one object relative to the robot, as the depth image data of the at least one object.
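The last step, turning a mapped depth-image region into a coordinate offset, can be sketched with a pinhole back-projection. The application does not specify this computation; the intrinsics fx and cx, the region format, and the averaging strategy are all assumptions:

```python
def region_offset(depth_image, region, fx, cx):
    """Estimate an object's (lateral, forward) offset relative to the camera
    by averaging over the pixels of its region mapped into the depth image.
    depth_image: 2-D list indexed as depth_image[v][u], depth in meters;
    region: iterable of (u, v) pixel coordinates in the depth image;
    fx, cx: focal length and principal point of the depth camera, in pixels
    (assumed known from calibration)."""
    lateral, forward, n = 0.0, 0.0, 0
    for u, v in region:
        z = depth_image[v][u]
        if z <= 0:                    # skip invalid depth readings
            continue
        lateral += (u - cx) * z / fx  # pinhole back-projection along the X axis
        forward += z                  # the depth itself is the forward offset
        n += 1
    if n == 0:
        return None                   # no valid depth inside the region
    return lateral / n, forward / n
```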
In an optional embodiment, when determining the depth image data of the target object, the determining module 43 is specifically configured to: for a first object of the at least one object, calculate the distance of the first object relative to the robot according to the coordinate offset of the first object relative to the robot; and, if the difference between that distance and the distance information in the first position information is less than a set distance-difference threshold, determine that the coordinate offset of the first object relative to the robot is the coordinate offset of the target object relative to the robot. The first object is any object of the at least one object.
In an optional embodiment, when positioning the target object according to the depth image of the target object, the locating module 44 is specifically configured to calculate the position coordinates of the target object according to the formulas X_d = X0 + X1 + offset_x and Y_d = Y0 + Y1 + offset_y;
where X_d and Y_d are the X-axis and Y-axis coordinates of the target object; X0 and Y0 are the X-axis and Y-axis coordinates of the robot; X1 and Y1 are the installation deviations of the depth camera on the X-axis and Y-axis; and offset_x and offset_y are the coordinate offsets of the target object relative to the robot on the X-axis and Y-axis.
In an optional embodiment, the laser radar data of each object includes the distance information and angle information of that object relative to the robot. Based on this, when determining the laser radar data of the target object, the determining module 43 is specifically configured to: from the laser radar data of the at least one object, obtain the laser radar data whose distance information and angle information differ from the distance information in the first position information and the trusted angle information of the previous positioning, respectively, by less than the corresponding thresholds, as the laser radar data of the target object.
In an optional embodiment, when positioning the target object according to the laser radar data of the target object, the locating module 44 is specifically configured to calculate the position coordinates of the target object according to the formulas X_d = X0 + d*cos(θ0 + θ) and Y_d = Y0 + d*sin(θ0 + θ);
where X_d and Y_d are the X-axis and Y-axis coordinates of the target object; X0 and Y0 are the X-axis and Y-axis coordinates of the robot; θ0 is the heading of the robot; d is the distance information in the laser radar data of the target object; and θ is the angle information in the laser radar data of the target object.
In an optional embodiment, the device further includes a following control module and/or an output module.
The following control module is configured to control the robot to follow the target object according to the positioning result of the target object.
The output module is configured to output the positioning result of the target object to the control platform corresponding to the robot.
The robot-based positioning device provided by this embodiment can combine the proximity communication module of the robot with the depth camera and/or laser radar of the robot, making full use of their respective advantages and fusing multiple kinds of information to position the target object, which helps reduce the positioning error and improve the positioning accuracy.
Correspondingly, an embodiment of the present application further provides a computer-readable storage medium storing computer instructions. When the computer instructions are executed by one or more processors, the one or more processors are caused to perform actions including:
communicating with a target object by using the proximity communication module of a robot, to measure first position information of the target object relative to the robot;
collecting the depth image and laser radar data of at least one object around the robot by using the depth camera and laser radar of the robot;
determining the depth image and laser radar data of the target object from the depth image and laser radar data of the at least one object according to the distance information in the first position information; and
positioning the target object according to the depth image and laser radar data of the target object.
For the detailed implementation of the above actions, reference can be made to the description in the foregoing embodiments, which is not repeated here.
Correspondingly, an embodiment of the present application further provides a computer-readable storage medium storing computer instructions. When the computer instructions are executed by one or more processors, the one or more processors are caused to perform actions including:
communicating with a target object by using the proximity communication module of a robot, to measure first position information of the target object relative to the robot;
collecting the depth image data of at least one object around the robot by using the depth camera of the robot;
determining the depth image data of the target object from the depth image data of the at least one object according to the distance information in the first position information; and
positioning the target object according to the depth image data of the target object.
For the detailed implementation of the above actions, reference can be made to the description in the foregoing embodiments, which is not repeated here.
Correspondingly, an embodiment of the present application further provides a computer-readable storage medium storing computer instructions. When the computer instructions are executed by one or more processors, the one or more processors are caused to perform actions including:
communicating with a target object by using the proximity communication module of a robot, to measure first position information of the target object relative to the robot;
collecting the laser radar data of at least one object around the robot by using the laser radar of the robot;
determining the laser radar data of the target object from the laser radar data of the at least one object according to the distance information in the first position information; and
positioning the target object according to the laser radar data of the target object.
For the detailed implementation of the above actions, reference can be made to the description in the foregoing embodiments, which is not repeated here.
The above computer-readable storage medium may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disc.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to magnetic disk storage, CD-ROM, optical memory, and the like) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be realized by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce a device for realizing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, the instruction device realizing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operation steps are executed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
In a typical configuration, a computing device includes one or more processors (CPUs), an input/output interface, a network interface, and memory.
The memory may include a computer-readable medium in the form of volatile memory, random-access memory (RAM), and/or non-volatile memory such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and can store information by any method or technology. The information can be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassette, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, commodity, or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, commodity, or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, commodity, or device that includes the element.
The above are only embodiments of the present application and are not intended to limit the present application. For those skilled in the art, the present application may have various modifications and variations. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall be included within the scope of the claims of the present application.
Claims (22)
1. A robot-based localization method, characterized by comprising:
communicating with a target object by using a proximity communication module of a robot, to measure first position information of the target object relative to the robot;
collecting the depth image and laser radar data of at least one object around the robot by using a depth camera and a laser radar of the robot;
determining the depth image and laser radar data of the target object from the depth image and laser radar data of the at least one object according to the distance information in the first position information; and
positioning the target object according to the depth image and laser radar data of the target object.
2. The method according to claim 1, characterized in that the collecting the depth image and laser radar data of at least one object around the robot by using the depth camera and laser radar of the robot comprises:
when the angle information in the first position information is untrustworthy, collecting the depth image and laser radar data of at least one object around the robot by using the depth camera and laser radar of the robot.
3. The method according to claim 2, characterized by further comprising:
if the angle information in the first position information is not within the angle confidence interval corresponding to the proximity communication module, determining that the angle information in the first position information is untrustworthy; and/or
if the difference between the angle information in the first position information and the trusted angle information of the previous positioning is greater than a set angle threshold, determining that the angle information in the first position information is untrustworthy; and/or
if the angle information in multiple pieces of position information of the target object relative to the robot fluctuates abnormally, determining that the angle information in the first position information is untrustworthy, the multiple pieces of position information being obtained by using the proximity communication module between the start of the current positioning and the end of the previous positioning, and including the first position information.
4. The method according to claim 1, characterized in that the collecting the depth image and laser radar data of at least one object around the robot by using the depth camera and laser radar of the robot comprises:
collecting the laser radar data of at least one object around the robot by using the laser radar and, when the laser radar data of the target object cannot be determined from the laser radar data of the at least one object, collecting the depth image data of at least one object around the robot by using the depth camera; or
collecting the depth image data of at least one object around the robot by using the depth camera and, when the depth image data of the target object cannot be determined from the depth image data of the at least one object, collecting the laser radar data of at least one object around the robot by using the laser radar.
5. The method according to any one of claims 1-4, characterized in that the collecting the depth image data of at least one object around the robot by using the depth camera of the robot comprises:
shooting the scene around the robot with the depth camera, to obtain a color image and a depth image;
marking, based on image recognition technology, the image regions of the at least one object contained in the color image;
mapping the image regions of the at least one object in the color image into the depth image based on the coordinate conversion relation between the color image and the depth image; and
calculating, according to the pixel coordinates of the image regions of the at least one object in the depth image, the coordinate offset of each of the at least one object relative to the robot, as the depth image data of the at least one object.
6. The method according to claim 5, characterized in that the determining the depth image data of the target object from the depth image data of the at least one object according to the distance information in the first position information comprises:
for a first object of the at least one object, calculating the distance of the first object relative to the robot according to the coordinate offset of the first object relative to the robot; and
if the difference between the distance of the first object relative to the robot and the distance information in the first position information is less than a set distance-difference threshold, determining that the coordinate offset of the first object relative to the robot is the coordinate offset of the target object relative to the robot; the first object being any object of the at least one object.
7. The method according to claim 6, characterized in that the positioning the target object according to the depth image of the target object comprises:
calculating the position coordinates of the target object according to the formulas X_d = X0 + X1 + offset_x and Y_d = Y0 + Y1 + offset_y;
wherein X_d and Y_d are the X-axis and Y-axis coordinates of the target object; X0 and Y0 are the X-axis and Y-axis coordinates of the robot; X1 and Y1 are the installation deviations of the depth camera on the X-axis and Y-axis; and offset_x and offset_y are the coordinate offsets of the target object relative to the robot on the X-axis and Y-axis.
8. The method according to claim 1, characterized in that the laser radar data of each object includes the distance information and angle information of that object relative to the robot; and
the determining the laser radar data of the target object from the laser radar data of the at least one object according to the distance information in the first position information comprises:
obtaining, from the laser radar data of the at least one object, the laser radar data whose distance information and angle information differ from the distance information in the first position information and the trusted angle information of the previous positioning, respectively, by less than the corresponding thresholds, as the laser radar data of the target object.
9. The method according to claim 8, characterized in that the positioning the target object according to the laser radar data of the target object comprises:
calculating the position coordinates of the target object according to the formulas X_d = X0 + d*cos(θ0 + θ) and Y_d = Y0 + d*sin(θ0 + θ);
wherein X_d and Y_d are the X-axis and Y-axis coordinates of the target object; X0 and Y0 are the X-axis and Y-axis coordinates of the robot; θ0 is the heading of the robot; d is the distance information in the laser radar data of the target object; and θ is the angle information in the laser radar data of the target object.
10. The method according to any one of claims 1-4, characterized in that, after the positioning the target object according to the depth image and laser radar data of the target object, the method further comprises:
controlling the robot to follow the target object according to the positioning result of the target object; and/or
outputting the positioning result of the target object to a control platform corresponding to the robot.
11. A robot-based positioning device, characterized by comprising:
a measurement module for communicating with a target object by using a proximity communication module of a robot, to measure first position information of the target object relative to the robot;
an acquisition module for collecting the depth image and laser radar data of at least one object around the robot by using a depth camera and a laser radar of the robot;
a determining module for determining the depth image and laser radar data of the target object from the depth image and laser radar data of the at least one object according to the distance information in the first position information; and
a locating module for positioning the target object according to the depth image and laser radar data of the target object.
12. A robot, characterized by comprising: a machine body, on which are provided a proximity communication module, one or more processors, and one or more memories for storing computer instructions; a depth camera and a laser radar also being provided on the machine body;
the one or more processors being configured to execute the computer instructions so as to:
communicate with a target object by using the proximity communication module, to measure first position information of the target object relative to the robot;
collect the depth image and laser radar data of at least one object around the robot by using the depth camera and laser radar;
determine the depth image and laser radar data of the target object from the depth image and laser radar data of the at least one object according to the distance information in the first position information; and
position the target object according to the depth image and laser radar data of the target object.
13. The robot according to claim 12, characterized in that the processor is specifically configured to:
when the angle information in the first position information is untrustworthy, collect the depth image and laser radar data of at least one object around the robot by using the depth camera and laser radar of the robot.
14. The robot according to claim 13, characterized in that the processor is specifically configured to:
collect the laser radar data of at least one object around the robot by using the laser radar and, when the laser radar data of the target object cannot be determined from the laser radar data of the at least one object, collect the depth image data of at least one object around the robot by using the depth camera; or
collect the depth image data of at least one object around the robot by using the depth camera and, when the depth image data of the target object cannot be determined from the depth image data of the at least one object, collect the laser radar data of at least one object around the robot by using the laser radar.
15. A computer-readable storage medium storing computer instructions, characterized in that, when the computer instructions are executed by one or more processors, the one or more processors are caused to perform actions including:
communicating with a target object by using a proximity communication module of a robot, to measure first position information of the target object relative to the robot;
collecting the depth image and laser radar data of at least one object around the robot by using a depth camera and a laser radar of the robot;
determining the depth image and laser radar data of the target object from the depth image and laser radar data of the at least one object according to the distance information in the first position information; and
positioning the target object according to the depth image and laser radar data of the target object.
16. A robot-based positioning system, comprising a robot and a wireless transceiver placed on a target object; the robot comprises a short-range communication module adapted to the wireless transceiver, and further comprises a depth camera and a laser radar;
the robot is configured to:
transmit and receive wireless signals between the robot and the target object using the short-range communication module and the wireless transceiver, so as to measure first position information of the target object relative to the robot;
acquire depth images and laser radar data of at least one object around the robot using the depth camera and the laser radar;
determine the depth image and laser radar data of the target object from the depth images and laser radar data of the at least one object, according to distance information in the first position information;
locate the target object according to the depth image and laser radar data of the target object.
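Once the target's laser radar return has been isolated, the final "locate the target object" step amounts to transforming its range and bearing into a common frame. A minimal sketch, assuming the robot's own pose (x, y, heading) is already known from its localization stack; the claims do not specify this computation:

```python
import math

def target_position(range_m, bearing_rad, robot_pose):
    """Convert a target's lidar return (range in metres, bearing in the
    robot frame) into world-frame coordinates, given the robot's pose
    as (x, y, heading in radians)."""
    x_r, y_r, theta = robot_pose
    x = x_r + range_m * math.cos(theta + bearing_rad)
    y = y_r + range_m * math.sin(theta + bearing_rad)
    return x, y
```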
17. A robot-based localization method, comprising:
communicating with a target object using a short-range communication module of a robot, so as to measure first position information of the target object relative to the robot;
acquiring depth image data of at least one object around the robot using a depth camera of the robot;
determining the depth image data of the target object from the depth image data of the at least one object, according to distance information in the first position information;
locating the target object according to the depth image data of the target object.
18. A robot, comprising a machine body; the machine body is provided with a short-range communication module, one or more processors, and one or more memories storing computer instructions; the machine body is further provided with a depth camera;
the one or more processors are configured to execute the computer instructions to:
communicate with a target object using the short-range communication module, so as to measure first position information of the target object relative to the robot;
acquire depth image data of at least one object around the robot using the depth camera;
determine the depth image data of the target object from the depth image data of the at least one object, according to distance information in the first position information;
locate the target object according to the depth image data of the target object.
19. A computer-readable storage medium storing computer instructions, wherein, when executed by one or more processors, the computer instructions cause the one or more processors to perform actions comprising:
communicating with a target object using a short-range communication module of a robot, so as to measure first position information of the target object relative to the robot;
acquiring depth image data of at least one object around the robot using a depth camera of the robot;
determining the depth image data of the target object from the depth image data of the at least one object, according to distance information in the first position information;
locating the target object according to the depth image data of the target object.
20. A robot-based localization method, comprising:
communicating with a target object using a short-range communication module of a robot, so as to measure first position information of the target object relative to the robot;
acquiring laser radar data of at least one object around the robot using a laser radar of the robot;
determining the laser radar data of the target object from the laser radar data of the at least one object, according to distance information in the first position information;
locating the target object according to the laser radar data of the target object.
21. A robot, comprising a machine body; the machine body is provided with a short-range communication module, one or more processors, and one or more memories storing computer instructions; the machine body is further provided with a laser radar;
the one or more processors are configured to execute the computer instructions to:
communicate with a target object using the short-range communication module, so as to measure first position information of the target object relative to the robot;
acquire laser radar data of at least one object around the robot using the laser radar;
determine the laser radar data of the target object from the laser radar data of the at least one object, according to distance information in the first position information;
locate the target object according to the laser radar data of the target object.
22. A computer-readable storage medium storing computer instructions, wherein, when executed by one or more processors, the computer instructions cause the one or more processors to perform actions comprising:
communicating with a target object using a short-range communication module of a robot, so as to measure first position information of the target object relative to the robot;
acquiring laser radar data of at least one object around the robot using a laser radar of the robot;
determining the laser radar data of the target object from the laser radar data of the at least one object, according to distance information in the first position information;
locating the target object according to the laser radar data of the target object.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810395460 | 2018-04-27 | ||
CN2018103954607 | 2018-04-27 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108549088A true CN108549088A (en) | 2018-09-18 |
CN108549088B CN108549088B (en) | 2020-10-02 |
Family
ID=63513661
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810433354.3A Active CN108549088B (en) | 2018-04-27 | 2018-05-08 | Positioning method, device and system based on robot and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108549088B (en) |
- 2018-05-08: CN application CN201810433354.3A granted as CN108549088B (Active)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105910599A (en) * | 2016-04-15 | 2016-08-31 | 深圳乐行天下科技有限公司 | Robot device and method for locating target |
CN105807775A (en) * | 2016-05-17 | 2016-07-27 | 上海酷哇机器人有限公司 | Movable robot with autonomous following and obstacle-avoidance function |
CN105979478A (en) * | 2016-07-26 | 2016-09-28 | 上海仙知机器人科技有限公司 | Positioning method and device |
US20180052466A1 (en) * | 2016-08-22 | 2018-02-22 | Kinpo Electronics, Inc. | Real-time navigating method for mobile robot |
CN106643801A (en) * | 2016-12-27 | 2017-05-10 | 纳恩博(北京)科技有限公司 | Detection method of poisoning accuracy and electronic equipment |
CN106931963A (en) * | 2017-04-13 | 2017-07-07 | 高域(北京)智能科技研究院有限公司 | Environmental data shared platform, unmanned vehicle, localization method and alignment system |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109360124A (en) * | 2018-10-22 | 2019-02-19 | 深圳和呈睿国际技术有限公司 | Accompany robot control method, system and mobile terminal |
CN111381587A (en) * | 2018-12-11 | 2020-07-07 | 北京京东尚科信息技术有限公司 | Following method and device for following robot |
CN111381587B (en) * | 2018-12-11 | 2023-11-03 | 北京京东乾石科技有限公司 | Following method and device for following robot |
CN109740443A (en) * | 2018-12-12 | 2019-05-10 | 歌尔股份有限公司 | Detect the method, apparatus and sports equipment of barrier |
CN110163448A (en) * | 2018-12-28 | 2019-08-23 | 山东浪潮商用系统有限公司 | A kind of tax intelligent robot based on indoor positioning leads tax method |
CN111679663A (en) * | 2019-02-25 | 2020-09-18 | 北京奇虎科技有限公司 | Three-dimensional map construction method, sweeping robot and electronic equipment |
CN110320523A (en) * | 2019-07-05 | 2019-10-11 | 齐鲁工业大学 | Follow the target locating set and method of robot |
CN110320523B (en) * | 2019-07-05 | 2020-12-11 | 齐鲁工业大学 | Target positioning device and method for following robot |
CN110717003A (en) * | 2019-09-27 | 2020-01-21 | 四川长虹电器股份有限公司 | Intelligent shopping cart autonomous navigation and automatic following method based on path planning |
CN111141289A (en) * | 2019-12-17 | 2020-05-12 | 佛山科学技术学院 | Early warning navigation method and system for bath chair |
CN111148022A (en) * | 2019-12-31 | 2020-05-12 | 深圳市优必选科技股份有限公司 | Mobile equipment and positioning method and device thereof |
CN111252082A (en) * | 2020-01-20 | 2020-06-09 | 浙江吉利汽车研究院有限公司 | Driving early warning method and device and storage medium |
CN111220148A (en) * | 2020-01-21 | 2020-06-02 | 珊口(深圳)智能科技有限公司 | Mobile robot positioning method, system and device and mobile robot |
CN111552315B (en) * | 2020-05-11 | 2023-07-18 | 中国商用飞机有限责任公司北京民用飞机技术研究中心 | Flight driving method, device, equipment and storage medium |
CN111552315A (en) * | 2020-05-11 | 2020-08-18 | 中国商用飞机有限责任公司北京民用飞机技术研究中心 | Flight driving method, device, equipment and storage medium |
CN112312113A (en) * | 2020-10-29 | 2021-02-02 | 贝壳技术有限公司 | Method, device and system for generating three-dimensional model |
CN115428470A (en) * | 2020-12-23 | 2022-12-02 | 松下知识产权经营株式会社 | Robot control method, robot, program, and recording medium |
CN115428470B (en) * | 2020-12-23 | 2023-12-26 | 松下知识产权经营株式会社 | Robot control method, robot, program, and recording medium |
CN113030857B (en) * | 2021-05-27 | 2021-08-31 | 北京国电通网络技术有限公司 | Open storage yard material positioning method and equipment |
CN113030857A (en) * | 2021-05-27 | 2021-06-25 | 北京国电通网络技术有限公司 | Open storage yard material positioning method and equipment |
CN114136306A (en) * | 2021-12-01 | 2022-03-04 | 浙江大学湖州研究院 | Expandable UWB and camera-based relative positioning device and method |
CN114136306B (en) * | 2021-12-01 | 2024-05-07 | 浙江大学湖州研究院 | Expandable device and method based on relative positioning of UWB and camera |
CN114434453A (en) * | 2021-12-31 | 2022-05-06 | 上海擎朗智能科技有限公司 | Ladder taking method and system for robot, robot and storage medium |
CN114434453B (en) * | 2021-12-31 | 2024-06-07 | 上海擎朗智能科技有限公司 | Robot ladder taking method, system, robot and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN108549088B (en) | 2020-10-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108549088A (en) | Localization method, equipment, system based on robot and storage medium | |
US10726387B2 (en) | AGV traffic management system | |
US11797910B2 (en) | Hands-free augmented reality system for picking and/or sorting assets | |
EP3771228B1 (en) | Intelligent pet monitoring method of robot | |
CN103493106B (en) | Come hand is optionally covered to the method and apparatus on the virtual projection on physical surface using bone tracking | |
EP2068275B1 (en) | Communication robot | |
CN106228302A (en) | A kind of method and apparatus for carrying out task scheduling in target area | |
US20200110934A1 (en) | Augmented reality system for asset tracking and visualization using indoor positioning system | |
US20180314346A1 (en) | Tracking of position and orientation of objects in virtual reality systems | |
CN108369419A (en) | Generating a spatiotemporal object manifest using object observations of a mobile robot and using the manifest to determine monitoring parameters for the mobile robot | |
CN113116224B (en) | Robot and control method thereof | |
US20180202819A1 (en) | Automatic routing to event endpoints | |
JP2023541619A (en) | Warehouse storage robot positioning and map creation method, robot and storage medium | |
Klinker et al. | Distributed user tracking concepts for augmented reality applications | |
CN110039535A (en) | Robot interactive method and robot | |
CN109445466A (en) | Robot follow-up control method, system, equipment and computer readable storage medium | |
CN107984474A (en) | A kind of humanoid intelligent robot of half body and its control system | |
WO2019212875A1 (en) | Representation of user position, movement, and gaze in mixed reality space | |
JP2018128640A (en) | Information processing apparatus, information processing system, and program | |
US11769110B1 (en) | Systems and methods for operator motion management | |
Galetto et al. | A wireless sensor network-based approach to large-scale dimensional metrology | |
CN108008401A (en) | Portable laser rangefinder | |
Balaji et al. | RetroSphere: Self-Contained Passive 3D Controller Tracking for Augmented Reality | |
CN208133009U (en) | robot | |
Roy et al. | Route planning for automatic indoor driving of smart cars |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||