CN107608345A - A robot and its following method and system - Google Patents
A robot and its following method and system
- Publication number
- CN107608345A (application CN201710745643.2A)
- Authority
- CN
- China
- Prior art keywords
- target
- robot
- orientation
- bluetooth
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/60—Intended control result
- G05D1/656—Interaction with payloads or external entities
- G05D1/686—Maintaining a relative position with respect to moving targets, e.g. following animals or humans
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/24—Arrangements for determining position or orientation
- G05D1/243—Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/24—Arrangements for determining position or orientation
- G05D1/247—Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2101/00—Details of software or hardware architectures used for the control of position
- G05D2101/20—Details of software or hardware architectures used for the control of position using external object recognition
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2109/00—Types of controlled vehicles
- G05D2109/10—Land vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2111/00—Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
- G05D2111/10—Optical signals
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2111/00—Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
- G05D2111/30—Radio signals
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Image Analysis (AREA)
- Manipulator (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention discloses a robot and its following method and system. The following method includes: an image sensor captures images in front of the robot in real time; the images are recognized and detected with a deep-learning framework model, and the target to be followed is detected and then tracked; after the target is lost, the robot detects whether a UWB signal is present; if so, it performs UWB positioning, determines the bearing of the target, and turns toward it; if not, it performs Bluetooth positioning, determines the bearing of the target, and turns toward it. By combining multiple tracking and positioning means, the present invention ensures continuous tracking of the target to be followed, which is therefore not easily lost.
Description
Technical field
The present invention relates to the field of robot artificial intelligence, and in particular to a robot and its following method and system.
Background technology
Most following robots and tracking systems currently on the market rely solely on a vision system, solely on UWB sensor positioning and tracking, or on skeleton-based human-body tracking.
Skeleton-based approaches identify the human skeleton with a Kinect or another binocular (stereo) sensor. Their recognition rate is low: the target generally has to walk slowly to be tracked, and when several people appear in the target area the resulting interference causes tracking errors.
UWB (ultra-wideband, impulse radio) sensor tracking works over radio waves, which are strongly affected by obstructions. When an obstacle sits between a UWB base station and the module being ranged, the measurement is significantly degraded; the tracking becomes unstable and the target is easily lost.
Vision-based tracking identifies the designated target with a vision sensor, but the target can be lost under occlusion, and reacquiring it during tracking is difficult. Some products require the person being tracked to wear a vision sensor that transmits wirelessly to the robot, which compares the received data with its own captured images for navigation and positioning; having to wear a sensor is a considerable inconvenience for the user. Visual SLAM, for its part, is computationally expensive and suffers from cumulative error.
Summary of the invention
In view of the above technical problems, the embodiments of the present invention provide a robot and its following method and system, to solve the problem that the target cannot be continuously tracked after it disappears.
A first aspect of the embodiments of the present invention provides a robot following method comprising the following steps:
An image acquisition step: an image sensor captures images in front of the robot in real time.
A vision tracking step: the images obtained by the image sensor are recognized and detected with a deep-learning framework model; the target to be followed is detected and then tracked.
A positioning step: after the target is lost, detect whether a UWB signal is present. If so, perform UWB positioning, determine the bearing of the target, turn the robot toward it, and return to the vision tracking step; if not, perform Bluetooth positioning, determine the bearing of the target, turn the robot toward it, and return to the vision tracking step.
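The three claimed steps form a simple fallback hierarchy: vision while the target is in view, UWB when it is lost and a UWB signal exists, Bluetooth otherwise. A minimal Python sketch of that decision logic — the function name and sensor inputs are illustrative stand-ins, not part of the patent:

```python
def follow_step(vision_bearing, uwb_available, uwb_bearing, bt_bearing):
    """Return the bearing (in degrees) the robot should turn toward.

    vision_bearing -- bearing from the deep-learning tracker, or None when lost
    uwb_available  -- True when a UWB signal is detected (typically indoors)
    uwb_bearing    -- bearing of the target from UWB positioning
    bt_bearing     -- coarse bearing of the target from Bluetooth positioning
    """
    if vision_bearing is not None:   # vision tracking step: target still in view
        return vision_bearing
    if uwb_available:                # positioning step, UWB branch (indoors)
        return uwb_bearing
    return bt_bearing                # positioning step, Bluetooth branch (outdoors)
```

After either positioning branch the method returns to the vision tracking step, so the radio bearing is only used long enough to reacquire the target visually.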
In the robot following method, the vision tracking step specifically includes:
training a deep-learning framework model in advance on a data set of the target to be followed;
the deep-learning framework model performing target recognition on the images captured by the image sensor and passing the recognized target image to a vision tracking algorithm module;
the vision tracking algorithm module extracting features from the target image, taking the features extracted from the first target frame as the reference for subsequent matching, then extracting and matching features in the second target frame to locate the target to be followed, and controlling the robot to track it.
In the robot following method, the UWB positioning includes:
presetting at least three UWB base stations; a first positioning tag carried by the target to be followed ranges against each base station to obtain the target's position, and a second positioning tag carried by the robot ranges against each base station to obtain the robot's position;
deriving the bearing of the target relative to the robot from the target's position and the robot's position.
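With at least three base stations, each tag's position follows from the measured ranges by trilateration, and the bearing is the angle of the vector from the robot to the target. A sketch under 2-D, noiseless-range assumptions; the linearized least-squares form below is one standard way to solve it, not necessarily the one used in the patent:

```python
import math

def trilaterate(bases, dists):
    """2-D position from >= 3 base stations via linearized least squares.

    bases -- list of (x, y) base-station coordinates (must not be collinear)
    dists -- measured range from each base station to the tag
    """
    (x1, y1), d1 = bases[0], dists[0]
    # Subtracting the first circle equation from the others gives linear rows.
    A, b = [], []
    for (xi, yi), di in zip(bases[1:], dists[1:]):
        A.append((2 * (xi - x1), 2 * (yi - y1)))
        b.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    # Solve A p = b in the least-squares sense via the 2x2 normal equations.
    a11 = sum(r[0] * r[0] for r in A)
    a12 = sum(r[0] * r[1] for r in A)
    a22 = sum(r[1] * r[1] for r in A)
    b1 = sum(r[0] * v for r, v in zip(A, b))
    b2 = sum(r[1] * v for r, v in zip(A, b))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

def bearing(robot_xy, target_xy):
    """Bearing from robot to target in degrees (east = 0, counterclockwise)."""
    return math.degrees(math.atan2(target_xy[1] - robot_xy[1],
                                   target_xy[0] - robot_xy[0]))
```

With base stations at (0, 0), (10, 0) and (0, 10) and ranges measured to a tag at (3, 4), `trilaterate` recovers (3, 4); feeding the robot's and target's solved positions to `bearing` gives the direction to turn.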
In the robot following method, the Bluetooth positioning includes:
four Bluetooth modules, mounted at four different bearings on the robot, receiving the Bluetooth signal sent by the target to be followed;
comparing the strengths of the signals received by the four modules and taking the bearing of the module receiving the strongest signal as the bearing of the target.
A second aspect of the embodiments of the present invention provides a robot following system comprising:
an image sensor for capturing images in front of the robot in real time;
a vision tracking module for recognizing and detecting, with a deep-learning framework model, the images obtained by the image sensor, detecting the target to be followed, and tracking it;
a UWB positioning module for detecting, after the target is lost, whether a UWB signal is present; if so, performing UWB positioning, determining the bearing of the target, and turning the robot toward it; if not, starting the Bluetooth positioning module;
the Bluetooth positioning module, for performing Bluetooth positioning, determining the bearing of the target, and turning the robot toward it.
In the robot following system, the vision tracking module includes:
a deep-learning framework model trained on a data set of the target to be followed, which performs target recognition on the images captured by the image sensor and passes the recognized target image to a vision tracking algorithm module;
the vision tracking algorithm module, which extracts features from the target image, takes the features of the first target frame as the reference for subsequent matching, extracts and matches features in the second target frame to locate the target, and controls the robot to track it.
In the robot following system, the UWB positioning module includes:
a second positioning tag for receiving and transmitting UWB signals;
a first positioning processing unit, which ranges against at least three preset UWB base stations through the second positioning tag and computes the robot's position from the ranging results; obtains, through the second positioning tag, the ranging information between each base station and the first positioning tag (carried by the target to be followed) and computes the first tag's position from it; derives the bearing of the target relative to the robot from the two positions; and turns the robot toward that bearing.
In the robot following system, the Bluetooth positioning module includes:
four Bluetooth modules, mounted at four different bearings on the robot, for receiving the Bluetooth signal sent by the target to be followed;
a second positioning processing unit, which compares the strengths of the signals received by the four modules, takes the bearing of the module receiving the strongest signal as the bearing of the target, and turns the robot toward it.
A third aspect of the embodiments of the present invention provides a robot comprising:
a memory for storing a program;
a processor for implementing the method described above by executing the program stored in the memory.
A fourth aspect of the embodiments of the present invention provides a robot comprising:
an image sensor for capturing images in front of the robot in real time;
a vision tracking module for recognizing and detecting, with a deep-learning framework model, the images obtained by the image sensor, detecting the target to be followed, and tracking it;
a UWB positioning module for detecting, after the target is lost, whether a UWB signal is present; if so, performing UWB positioning, determining the bearing of the target, and turning the robot toward it; if not, starting the Bluetooth positioning module;
the Bluetooth positioning module, for performing Bluetooth positioning, determining the bearing of the target, and turning the robot toward it.
In the technical scheme provided by the embodiments of the present invention, the robot following method includes: an image sensor captures images in front of the robot in real time; the images are recognized and detected with a deep-learning framework model, and the target to be followed is detected and tracked; after the target is lost, the robot detects whether a UWB signal is present; if so, it performs UWB positioning, determines the bearing of the target, and turns toward it; if not, it performs Bluetooth positioning, determines the bearing of the target, and turns toward it. By combining multiple tracking and positioning means, the present invention ensures continuous tracking of the target to be followed, which is therefore not easily lost.
Brief description of the drawings
Fig. 1 is a structural block diagram of the robot following system in an embodiment of the present invention;
Fig. 2 is a structural block diagram of the vision tracking module in the robot following system of the present invention;
Fig. 3 is a schematic diagram of UWB positioning in the robot following system of the present invention;
Fig. 4 is a structural block diagram of the UWB positioning module in the robot following system of the present invention;
Fig. 5 is a structural block diagram of the Bluetooth positioning module in the robot following system of the present invention;
Fig. 6 is a schematic diagram of Bluetooth positioning in the robot following system of the present invention;
Fig. 7 is a flowchart of the robot following method in an embodiment of the present invention;
Fig. 8 is a detailed flowchart of the robot following method in an embodiment of the present invention.
Embodiments
The technical schemes in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. The described embodiments are evidently only some of the embodiments of the present invention, not all of them. All other embodiments obtained by those skilled in the art from the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
Referring to Fig. 1, the present invention provides a robot following system based on multi-sensor fusion. It includes a robot 10 and a target 20 to be followed; the robot 10 tracks the target 20.
The robot 10 includes a camera, a vision tracking module 120, a UWB positioning module 130 and a Bluetooth positioning module 140. The camera is mounted on the front of the robot 10, and its image sensor 110 captures images in front of the robot 10 in real time. The camera's field of view is 180°. The image sensor 110 may be a CCD or a CMOS sensor.
The vision tracking module 120 recognizes and detects, with a deep-learning framework model, the images obtained by the image sensor 110, detects the target to be followed, and tracks it, improving the stability and accuracy of the tracking.
Specifically, referring to Fig. 2, the vision tracking module 120 includes a deep-learning framework model 121 and a vision tracking algorithm module 122.
The deep-learning framework model 121 is trained in advance on a data set of the target to be followed. It performs target recognition on the images captured by the image sensor 110 and passes the recognized target image to the vision tracking algorithm module 122. In this embodiment the target to be followed is a person, and the deep-learning framework model is a yolo model.
The vision tracking algorithm module 122 extracts features from the target image, takes the features of the first target frame as the reference for subsequent matching, extracts and matches features in the second target frame to locate the target, and controls the robot to track it. In this embodiment the vision tracking algorithm is KCF, which can track images in real time. The user annotates the target 20 to be followed, and the KCF tracker extracts features from the target selected by the yolo model. The features extracted from the current frame's target image (HOG features in this embodiment) serve as the reference for matching the next target frame: features are extracted from the current-frame target image identified by the yolo model and matched against the previous frame's target image as the reference, locating the target 20 and continuing the tracking.
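A real KCF tracker learns a correlation filter over HOG features in the Fourier domain; the sketch below only mirrors the frame-to-frame structure described here — extract a gradient-orientation feature from the current target image, then find the best-matching window in the next frame. The crude histogram feature and all names are illustrative, not the patent's implementation:

```python
import math

def grad_hist(patch, bins=8):
    """Crude orientation histogram over a 2-D grayscale patch (list of rows)."""
    h = [0.0] * bins
    for y in range(1, len(patch) - 1):
        for x in range(1, len(patch[0]) - 1):
            gx = patch[y][x + 1] - patch[y][x - 1]
            gy = patch[y + 1][x] - patch[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.atan2(gy, gx) % math.pi        # unsigned orientation
            h[min(int(ang / math.pi * bins), bins - 1)] += mag
    return h

def cosine(a, b):
    """Cosine similarity between two feature vectors; 0.0 for a zero vector."""
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den if den else 0.0

def match(template_feat, frame, win):
    """Slide a win x win window over the frame; return (best score, (row, col))."""
    best = (-1.0, (0, 0))
    for r in range(len(frame) - win + 1):
        for c in range(len(frame[0]) - win + 1):
            patch = [row[c:c + win] for row in frame[r:r + win]]
            best = max(best, (cosine(template_feat, grad_hist(patch)), (r, c)))
    return best
```

Here the matched window's feature would become the template for the following frame, echoing the update step described above.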
The UWB positioning module 130 detects, after the target is lost, whether a UWB (ultra-wideband, impulse radio) signal is present. If so, it performs UWB positioning, determines the bearing of the target, and turns the robot 10 toward it; if not, it starts the Bluetooth positioning module 140. UWB base stations are generally installed indoors, so the present invention uses UWB positioning indoors and Bluetooth positioning outdoors, where there is no UWB signal.
The following system also includes at least three UWB base stations 30; this embodiment uses four, as shown in Fig. 3, placed in advance at the four corners of the area to be covered (e.g. a room) so that at least three base stations 30 remain within line of sight.
The target 20 to be followed carries a first positioning tag 210 and a Bluetooth module 220. The first positioning tag 210 receives and transmits UWB signals for ranging against the base stations 30. The Bluetooth module of the target 20 sends the Bluetooth signal used for Bluetooth positioning.
Referring to Fig. 4, the UWB positioning module 130 includes a first positioning processing unit 131 and a second positioning tag 132.
The second positioning tag 132 receives and transmits UWB signals for ranging against the base stations 30: it sends ranging signals to the base stations 30 and receives the UWB signals of the first positioning tag 210 relayed (fed back) by them.
The first positioning processing unit 131 ranges against the base stations 30 through the second positioning tag 132 and computes the position of the robot 10 from the ranging results by triangulation. It obtains, through the second positioning tag, the ranging information between each base station 30 and the first positioning tag 210, and computes the position of the first positioning tag 210 (i.e. the position of the target 20) from it by triangulation. It then derives the bearing of the target 20 relative to the robot 10 from the two positions and turns the robot 10 toward it, so that the lost target 20 is found and the deep-learning-based vision tracking resumes.
The Bluetooth positioning module 140 performs Bluetooth positioning, determines the bearing of the target, and turns the robot toward it. Referring to Fig. 5, it includes four Bluetooth modules 141 and a second positioning processing unit 142.
The four Bluetooth modules 141 receive the Bluetooth signal sent by the Bluetooth module 220 of the target; they are mounted at four different bearings on the robot 10. In this embodiment, as shown in Fig. 6, they sit at the four corners of the robot 10, each module covering one quadrant (the plane being divided into four quadrants). The target 20 may carry a mobile phone as the source of the Bluetooth signal.
The second positioning processing unit 142 compares the strengths of the signals received by the four Bluetooth modules 141. On the principle that, at equal transmit power, the signal from the Bluetooth module 220 weakens with distance, the bearing (quadrant) of the module receiving the strongest signal is taken as the bearing of the target 20, and the robot 10 is turned toward it.
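The quadrant decision reduces to an argmax over the four RSSI readings. A sketch, with corner bearings chosen arbitrarily for illustration — the patent does not specify numeric headings for the four corners:

```python
# Module index -> heading of that corner in degrees (illustrative values only).
QUADRANT_BEARING = {
    0: 45,    # front-right corner
    1: 135,   # front-left corner
    2: 225,   # rear-left corner
    3: 315,   # rear-right corner
}

def strongest_quadrant(rssi):
    """rssi: dBm readings from the four corner modules, index 0..3.

    Stronger (less negative) RSSI means the target is closer to that corner,
    per the equal-transmit-power assumption stated above.
    """
    idx = max(range(len(rssi)), key=lambda i: rssi[i])
    return idx, QUADRANT_BEARING[idx]
```

For example, readings of [-70, -55, -80, -90] dBm would place the target in the quadrant of module 1, so the robot would turn toward 135°.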
By combining Bluetooth, UWB, vision and deep learning, the present invention effectively solves the problems that tracking cannot resume after the target disappears and that outdoor target tracking accumulates error; the system tracks more stably. Indoors, the system uses UWB plus vision/deep learning: the deep-learning framework model recognizes and detects the image sensor's data, identifies the target type to be tracked, and then tracks it, improving the stability and accuracy of the tracking. When the target disappears at a corner or elsewhere, UWB positioning determines its position so the robot can find it and continue tracking. When the robot receives no UWB signal, it is judged to be outdoors; since the target's range of movement is then unknown, UWB base stations cannot be preset, so the Bluetooth-plus-vision scheme is used instead. The Bluetooth positioning module is extremely cheap — the user only needs to install the corresponding APP to obtain coarse tracking and positioning. Vision tracking uses vision plus the deep-learning framework; when the target is lost, the Bluetooth positioning module coarsely locates it so the robot can search for it, effectively solving the target-loss problem of purely visual tracking.
Further, whether the target 20 has been lost can be judged by comparing the bearing of the target 20 detected by the vision tracking module 120 with the bearing detected by the UWB positioning module 130: if they agree, the target 20 tracked by the vision tracking module 120 is valid; if not, it has been lost. And/or, the vision bearing can be compared with the bearing of the target 20 detected by the Bluetooth positioning module 140, with the same interpretation. The judgment can be made by one or more of the vision tracking module 120, the UWB positioning module 130 and the Bluetooth positioning module 140.
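One concrete way to implement this agreement test is to compare bearings modulo 360° against a tolerance; the 30° threshold below is an assumption for illustration, not a figure from the patent:

```python
def angle_diff(a, b):
    """Smallest absolute difference between two bearings, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def target_lost(vision_bearing, radio_bearing, tol=30.0):
    """Treat the vision track as lost when it disagrees with the UWB or
    Bluetooth bearing by more than the tolerance (assumed 30 degrees)."""
    return angle_diff(vision_bearing, radio_bearing) > tol
```

The wrap-around handling matters: a vision bearing of 350° and a radio bearing of 10° differ by only 20°, so they still count as agreement.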
Based on the robot following system of the above embodiment, the present invention also provides a robot following method. Referring to Fig. 7, the method comprises the following steps:
S10, an image acquisition step: the image sensor 110 captures images in front of the robot in real time.
S20, a vision tracking step: the images obtained by the image sensor are recognized and detected with a deep-learning framework model; the target to be followed is detected and then tracked.
Specifically, the vision tracking step comprises the following steps:
A deep-learning framework model is trained in advance on a data set of the target to be followed.
The deep-learning framework model 121 performs target recognition on the images captured by the image sensor and passes the recognized target image to the vision tracking algorithm module. In this embodiment the target to be followed is a person, and the deep-learning framework model is a yolo model.
The vision tracking algorithm module extracts features from the target image, takes the features extracted from the first target frame as the reference for subsequent matching, then extracts and matches features in the second target frame to locate the target to be followed, and controls the robot to track it. In this embodiment the vision tracking algorithm is KCF, which can track images in real time. The user annotates the target 20 to be followed, and the KCF tracker extracts features from the target selected by the yolo model. The features extracted from the current frame's target image (HOG features in this embodiment) serve as the reference for matching the next target frame: features are extracted from the current-frame target image identified by the yolo model and matched against the previous frame's target image as the reference, locating the target 20 and continuing the tracking.
S30, a positioning step: after the target is lost, detect whether a UWB signal is present. If so, perform UWB positioning, determine the bearing of the target, turn the robot toward it, and return to the vision tracking step; if not, perform Bluetooth positioning, determine the bearing of the target, turn the robot toward it, and return to the vision tracking step.
Further, the UWB positioning includes:
presetting at least three UWB base stations; the first positioning tag carried by the target to be followed ranges against each base station to obtain the target's position, and the second positioning tag carried by the robot ranges against each base station to obtain the robot's position;
deriving the bearing of the target relative to the robot from the target's position and the robot's position.
The Bluetooth positioning includes:
the four Bluetooth modules, mounted at four different bearings on the robot, receiving the Bluetooth signal sent by the target to be followed;
comparing the strengths of the signals received by the four modules and taking the bearing of the module receiving the strongest signal as the bearing of the target.
This specific embodiment is as shown in figure 8, specifically include:
The image that S201, imaging sensor 110 gather is input in yolo models.
The image that S202, yolo model gather to imaging sensor carries out target identification.
S203, KCF Vision Tracking carry out the tracking of target on the basis of identification.
S204, robot 10 are tracked to target 20.
S205: compare whether the orientation of the target 20 detected by the vision tracking module 120 is consistent with the orientation of the target 20 detected by the UWB positioning module 130, and/or with the orientation of the target 20 detected by the Bluetooth positioning module 140. If consistent, the target 20 tracked by the vision tracking module 120 is valid; if inconsistent, the target 20 tracked by the vision tracking module 120 has been lost.
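The S205 consistency test can be illustrated as an azimuth comparison. The 30° tolerance is an assumption; the patent does not specify a threshold:

```python
def angle_diff(a, b):
    """Smallest absolute difference between two azimuths, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def tracking_valid(vision_az, uwb_az=None, bt_az=None, tol_deg=30.0):
    """The visually tracked target counts as valid only if its azimuth
    agrees, within tol_deg, with every available radio estimate."""
    radio = [az for az in (uwb_az, bt_az) if az is not None]
    return all(angle_diff(vision_az, az) <= tol_deg for az in radio)
```

With no radio estimate available the check passes vacuously; a real system might instead treat that case as inconclusive and defer to the S207 image check.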
S206: if consistent, the target position is updated, i.e. the target image of the current frame is taken as the reference image, and the method returns to step S202 to perform target recognition on the next frame captured by the image sensor.
S207: if inconsistent, the current target image is compared with the initial target image to judge whether the two are consistent, thereby performing target verification, i.e. confirming whether the tracked target is wrong. The initial target image may be a target image confirmed by the user in an image obtained from the image sensor, or the first-frame target image after tracking starts.
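One way to realize the S207 comparison is a grey-level histogram intersection between the current target patch and the initial template. The patent does not fix a similarity measure, so the metric and the 0.8 threshold here are assumptions:

```python
def histogram(img, bins=16):
    """Normalized grey-level histogram of a flat list of 0-255 pixels."""
    h = [0] * bins
    for p in img:
        h[p * bins // 256] += 1
    n = len(img)
    return [c / n for c in h]

def same_target(current, initial, threshold=0.8):
    """Histogram intersection score in [0, 1]; patches whose score
    reaches the threshold are judged to show the same target."""
    score = sum(min(a, b) for a, b in zip(histogram(current),
                                          histogram(initial)))
    return score >= threshold

template = [10, 200, 40, 90] * 64  # assumed 256-pixel initial patch
```

A learned embedding or keypoint matching would be more robust, but a histogram check is cheap enough to run on every disagreement between vision and radio.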
S301: if the current target image and the initial target image are inconsistent, detect whether a UWB signal is present. If so, perform UWB positioning and control the robot to turn toward the determined orientation of the target; if not, perform Bluetooth positioning and control the robot to turn toward the determined orientation of the target. Once the robot has turned, target search can proceed: the image sensor reacquires an image and performs recognition and detection, and at this point the target to be followed can be detected with high probability.
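Putting S301 together, the recovery path (UWB preferred, Bluetooth as fallback, then turn and re-detect) can be sketched as below; every callable is a hypothetical stand-in for the corresponding module:

```python
def recover_target(uwb_signal_present, uwb_bearing, bt_bearing,
                   turn, redetect):
    """Choose a bearing source, turn the robot toward it, then rerun
    visual detection; returns whatever the detector reports."""
    if uwb_signal_present():
        az = uwb_bearing()   # UWB positioning module
    else:
        az = bt_bearing()    # Bluetooth positioning module (fallback)
    turn(az)                 # robot turns toward the estimated bearing
    return redetect()        # image sensor reacquires and recognizes
```

Usage with stubbed modules: when no UWB signal is present, the Bluetooth bearing drives the turn command and detection runs afterwards.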
In the embodiments provided by the present invention, it should be understood that the disclosed system and method may be implemented in other ways. For example, the system embodiments described above are merely illustrative; the division of the modules is only a division by logical function, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform some of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that modifications may still be made to the technical solutions described in the foregoing embodiments, or equivalent substitutions may be made to some of their technical features; such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (10)
1. A robot following method, characterized by comprising the following steps:
an image acquisition step, in which an image sensor captures images in front of the robot in real time;
a vision tracking step, in which the images obtained by the image sensor are recognized and detected using a deep learning framework model, the target to be followed is detected, and the target is tracked; and
a positioning step, in which, after the target is lost, it is detected whether a UWB signal is present; if so, UWB positioning is performed, the robot is controlled to turn toward the determined orientation of the target, and the method returns to the vision tracking step; if not, Bluetooth positioning is performed, the robot is controlled to turn toward the determined orientation of the target, and the method returns to the vision tracking step.
2. The robot following method according to claim 1, characterized in that the vision tracking step specifically comprises:
performing deep learning training in advance using a data set of the target to be followed to obtain the trained deep learning framework model;
performing, by the deep learning framework model, target recognition on the images captured by the image sensor, and transmitting the recognized target image to a vision tracking algorithm module; and
performing, by the vision tracking algorithm module, feature extraction on the target image, taking the features extracted from the first-frame target image as the reference for subsequent matching; performing feature extraction and matching on the second-frame target image, obtaining the position of the target to be followed by matching, and controlling the robot to track the target.
3. The robot following method according to claim 1, characterized in that the UWB positioning comprises:
presetting at least three UWB base stations; performing ranging between a first positioning tag carried by the target to be followed and each UWB base station to obtain the position of the target; performing ranging between a second positioning tag carried by the robot and each UWB base station to obtain the position of the robot; and
obtaining the orientation of the target relative to the robot from the position of the target and the position of the robot.
4. The robot following method according to claim 1, characterized in that the Bluetooth positioning comprises:
receiving, by four Bluetooth modules, the Bluetooth signal transmitted by the target to be followed, the four Bluetooth modules being arranged at four different orientations of the robot; and
comparing the strengths of the Bluetooth signals received by the four Bluetooth modules, the orientation corresponding to the Bluetooth module receiving the strongest signal being the orientation of the target.
5. A robot following system, characterized by comprising:
an image sensor for capturing images in front of the robot in real time;
a vision tracking module for recognizing and detecting, using a deep learning framework model, the images obtained by the image sensor, detecting the target to be followed, and tracking it;
a UWB positioning module for detecting, after the target is lost, whether a UWB signal is present; if so, performing UWB positioning and controlling the robot to turn toward the determined orientation of the target; if not, starting a Bluetooth positioning module; and
the Bluetooth positioning module, for performing Bluetooth positioning and controlling the robot to turn toward the determined orientation of the target.
6. The robot following system according to claim 5, characterized in that the vision tracking module comprises:
a deep learning framework model obtained by deep learning training on a data set of the target to be followed, the deep learning framework model being used to perform target recognition on the images captured by the image sensor and to transmit the recognized target image to a vision tracking algorithm module; and
the vision tracking algorithm module, for performing feature extraction on the target image, taking the features extracted from the first-frame target image as the reference for subsequent matching, performing feature extraction and matching on the second-frame target image, obtaining the position of the target to be followed by matching, and controlling the robot to track the target.
7. The robot following system according to claim 5, characterized in that the UWB positioning module comprises:
a second positioning tag for receiving and transmitting UWB signals; and
a first positioning processing unit for performing ranging with at least three preset UWB base stations via the second positioning tag and obtaining the position of the robot from the ranging results; obtaining, via the second positioning tag, the ranging information between each UWB base station and a first positioning tag as fed back by that base station, and obtaining the position of the first positioning tag from the ranging results, the first positioning tag being carried by the target to be followed; and obtaining the orientation of the target relative to the robot from the position of the target and the position of the robot, and controlling the robot to turn toward that orientation.
8. The robot following system according to claim 5, characterized in that the Bluetooth positioning module comprises:
four Bluetooth modules for receiving the Bluetooth signal transmitted by the target to be followed, the four Bluetooth modules being arranged at four different orientations of the robot; and
a second positioning processing unit for comparing the strengths of the Bluetooth signals received by the four Bluetooth modules, the orientation corresponding to the Bluetooth module receiving the strongest signal being the orientation of the target, and controlling the robot to turn toward that orientation.
9. A robot, characterized by comprising:
a memory for storing a program; and
a processor for implementing the method according to any one of claims 1-4 by executing the program stored in the memory.
10. A robot, characterized by comprising:
an image sensor for capturing images in front of the robot in real time;
a vision tracking module for recognizing and detecting, using a deep learning framework model, the images obtained by the image sensor, detecting the target to be followed, and tracking it;
a UWB positioning module for detecting, after the target is lost, whether a UWB signal is present; if so, performing UWB positioning and controlling the robot to turn toward the determined orientation of the target; if not, starting a Bluetooth positioning module; and
the Bluetooth positioning module, for performing Bluetooth positioning and controlling the robot to turn toward the determined orientation of the target.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710745643.2A CN107608345A (en) | 2017-08-26 | 2017-08-26 | A kind of robot and its follower method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107608345A true CN107608345A (en) | 2018-01-19 |
Family
ID=61055837
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710745643.2A Pending CN107608345A (en) | 2017-08-26 | 2017-08-26 | A kind of robot and its follower method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107608345A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102411371A (en) * | 2011-11-18 | 2012-04-11 | 浙江大学 | Multi-sensor service-based robot following system and method |
WO2013059418A1 (en) * | 2011-10-21 | 2013-04-25 | Ftr Systems, Inc. | Tracking foldable cart |
CN103608741A (en) * | 2011-06-13 | 2014-02-26 | 微软公司 | Tracking and following of moving objects by a mobile robot |
CN204181134U (en) * | 2014-07-31 | 2015-03-04 | 清华大学 | Intelligent luggage carrier |
CN105761245A (en) * | 2016-01-29 | 2016-07-13 | 速感科技(北京)有限公司 | Automatic tracking method and device based on visual feature points |
CN105807775A (en) * | 2016-05-17 | 2016-07-27 | 上海酷哇机器人有限公司 | Movable robot with autonomous following and obstacle-avoidance function |
CN205844897U (en) * | 2016-07-27 | 2016-12-28 | 深圳市大疆创新科技有限公司 | Intelligent shopping trolley |
CN106683123A (en) * | 2016-10-31 | 2017-05-17 | 纳恩博(北京)科技有限公司 | Method and device for tracking targets |
CN106940562A (en) * | 2017-03-09 | 2017-07-11 | 华南理工大学 | A kind of mobile robot wireless clustered system and neutral net vision navigation method |
2017-08-26: application CN201710745643.2A filed in China; published as CN107608345A; status: Pending
Non-Patent Citations (1)
Title |
---|
LIU Guocheng (刘国成): "A Target Tracking Algorithm Based on Multi-Sensor Data Fusion", Journal of System Simulation (《系统仿真学报》) * |
Cited By (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108734082A (en) * | 2018-03-21 | 2018-11-02 | 北京猎户星空科技有限公司 | Method for building up, device, equipment and the storage medium of correspondence |
CN108572649A (en) * | 2018-04-28 | 2018-09-25 | 杭州电子科技大学 | Shared intelligence based on UWB labels follows and handling system and method |
CN108646736A (en) * | 2018-05-02 | 2018-10-12 | 北京京东尚科信息技术有限公司 | Method for tracking target and device for tracking robot |
US12001223B2 (en) | 2018-05-04 | 2024-06-04 | Lg Electronics Inc. | Plurality of autonomous mobile robots and controlling method for the same |
TWI732211B (en) * | 2018-05-04 | 2021-07-01 | 南韓商Lg電子股份有限公司 | A plurality of autonomous mobile robots and controlling method for the same |
US11169539B2 (en) | 2018-05-04 | 2021-11-09 | Lg Electronics Inc. | Plurality of autonomous mobile robots and controlling method for the same |
CN108810616A (en) * | 2018-05-31 | 2018-11-13 | 广州虎牙信息科技有限公司 | Object localization method, image display method, device, equipment and storage medium |
CN108955687A (en) * | 2018-05-31 | 2018-12-07 | 湖南万为智能机器人技术有限公司 | The synthesized positioning method of mobile robot |
US11284128B2 (en) | 2018-05-31 | 2022-03-22 | Guangzhou Huya Information Technology Co., Ltd. | Object positioning method, video display method, apparatus, device, and storage medium |
CN110166571A (en) * | 2018-06-08 | 2019-08-23 | 深圳勇艺达机器人有限公司 | A kind of automatic follower method and device based on mobile robot |
CN108436922A (en) * | 2018-06-15 | 2018-08-24 | 成都精位科技有限公司 | Associated movement robot and its control method, device, system |
CN109062215A (en) * | 2018-08-24 | 2018-12-21 | 北京京东尚科信息技术有限公司 | Robot and barrier-avoiding method, system, equipment and medium are followed based on its target |
US11906979B2 (en) | 2018-09-06 | 2024-02-20 | Lg Electronics Inc. | Plurality of autonomous mobile robots and controlling method for the same |
CN112654471B (en) * | 2018-09-06 | 2023-11-28 | Lg电子株式会社 | Multiple autonomous mobile robots and control method thereof |
CN112654471A (en) * | 2018-09-06 | 2021-04-13 | Lg电子株式会社 | Multiple autonomous mobile robots and control method thereof |
CN110377020A (en) * | 2018-09-19 | 2019-10-25 | 北京京东尚科信息技术有限公司 | The running method of unmanned equipment, device and system |
CN110377020B (en) * | 2018-09-19 | 2023-05-30 | 北京京东乾石科技有限公司 | Driving method, device and system of unmanned equipment |
CN109816688A (en) * | 2018-12-03 | 2019-05-28 | 安徽酷哇机器人有限公司 | Article follower method and luggage case |
CN111381587B (en) * | 2018-12-11 | 2023-11-03 | 北京京东乾石科技有限公司 | Following method and device for following robot |
CN111381587A (en) * | 2018-12-11 | 2020-07-07 | 北京京东尚科信息技术有限公司 | Following method and device for following robot |
CN111367320A (en) * | 2018-12-26 | 2020-07-03 | 沈阳新松机器人自动化股份有限公司 | Management method and management system for indoor mobile robot |
CN109725642A (en) * | 2019-01-25 | 2019-05-07 | 深圳普思英察科技有限公司 | Self-service machine system and its air navigation aid and device |
CN110060295B (en) * | 2019-04-24 | 2022-05-31 | 达闼科技(北京)有限公司 | Target positioning method and device, control device, following equipment and storage medium |
CN110060295A (en) * | 2019-04-24 | 2019-07-26 | 达闼科技(北京)有限公司 | Object localization method and device, control device follow equipment and storage medium |
CN110362093B (en) * | 2019-08-06 | 2024-05-07 | 苏州红树林智能科技有限公司 | Intelligent wheelchair based on vision and three-point positioning and following method of control system of intelligent wheelchair |
CN110362093A (en) * | 2019-08-06 | 2019-10-22 | 苏州红树林智能科技有限公司 | The follower method of the intelligent wheel chair and its control system of view-based access control model and three-point fix |
CN110488874A (en) * | 2019-08-29 | 2019-11-22 | 五邑大学 | A kind of education auxiliary robot and its control method |
CN112742038A (en) * | 2019-10-29 | 2021-05-04 | 珠海市一微半导体有限公司 | Toy robot and moving method and chip thereof |
CN110979499B (en) * | 2019-11-19 | 2024-04-09 | 贵州电网有限责任公司 | Automatic following system and following method for spherical robot group |
CN110979499A (en) * | 2019-11-19 | 2020-04-10 | 贵州电网有限责任公司 | Automatic following system and following method for spherical robot group |
CN110750098A (en) * | 2019-11-27 | 2020-02-04 | 广东博智林机器人有限公司 | Robot navigation system |
CN111462226A (en) * | 2020-01-19 | 2020-07-28 | 杭州海康威视系统技术有限公司 | Positioning method, system, device, electronic equipment and storage medium |
TWI742644B (en) * | 2020-05-06 | 2021-10-11 | 東元電機股份有限公司 | Following mobile platform and method thereof |
CN113741550B (en) * | 2020-05-15 | 2024-02-02 | 北京机械设备研究所 | Mobile robot following method and system |
CN113741550A (en) * | 2020-05-15 | 2021-12-03 | 北京机械设备研究所 | Mobile robot following method and system |
CN114326732A (en) * | 2021-12-28 | 2022-04-12 | 无锡笠泽智能科技有限公司 | Robot autonomous following system and autonomous following control method |
CN115177931A (en) * | 2022-07-11 | 2022-10-14 | 深圳市格睿尔科技有限公司 | Visual identification following system and method, walking device and method and golf bag |
CN115177931B (en) * | 2022-07-11 | 2024-04-19 | 深圳市格睿尔科技有限公司 | Visual recognition following system and method, walking device and method and golf bag |
WO2024066308A1 (en) * | 2022-09-28 | 2024-04-04 | 厦门兴联智控科技有限公司 | Golf cart following method and apparatus, and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107608345A (en) | A kind of robot and its follower method and system | |
CN109890573B (en) | Control method and device for mobile robot, mobile robot and storage medium | |
CN101295016B (en) | Sound source independent searching and locating method | |
CN105956586B (en) | A kind of intelligent tracking system based on TOF 3D video camera | |
WO2018214909A1 (en) | Target tracking method, target tracking device, and computer storage medium | |
US20170045337A1 (en) | Smart wearable mine detector | |
EP3724747B1 (en) | Detecting the pose of an out-of-range controller | |
US20180048482A1 (en) | Control system and control processing method and apparatus | |
US20200327681A1 (en) | Target tracking method and apparatus | |
CN110134117B (en) | Mobile robot repositioning method, mobile robot and electronic equipment | |
KR20150082379A (en) | Fast initialization for monocular visual slam | |
CN106291523A (en) | Hand-held device, object localization method and computer-readable record medium | |
CN108062098A (en) | Map construction method and system for intelligent robot | |
CN103679742B (en) | Method for tracing object and device | |
WO2019019819A1 (en) | Mobile electronic device and method for processing tasks in task region | |
CN110188749A (en) | Designated vehicle Vehicle License Plate Recognition System and method under a kind of more vehicles | |
CN107610157A (en) | A kind of unmanned plane target method for tracing and system | |
CN105364915A (en) | Intelligent home service robot based on three-dimensional machine vision | |
CN106933227A (en) | The method and electronic equipment of a kind of guiding intelligent robot | |
CN105373130A (en) | Special device accident on-site information detection system based on stereo modeling | |
CN105100952B (en) | Screen-picture method of adjustment, device and equipment | |
CN107647828A (en) | The sweeping robot of fish-eye camera is installed | |
CN102968615A (en) | Three-dimensional somatic data identification method with anti-interference function in intensive people flow | |
CN206724970U (en) | A kind of Indoor Robot hybrid location system | |
CN107305385A (en) | The docking calculation and automatic running device of automatic running device |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20180119 |