CN106094875B - Target follow-up control method for a mobile robot - Google Patents
Target follow-up control method for a mobile robot
- Publication number
- CN106094875B (application CN201610482865.5A)
- Authority
- CN
- China
- Prior art keywords
- target
- mobile robot
- follow
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/12—Target-seeking control
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/02—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
- G01S15/06—Systems determining the position data of a target
- G01S15/08—Systems for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
- G01S15/931—Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0255—Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic signals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Abstract
The invention discloses a target follow-up control method for a mobile robot, comprising the steps of: arranging a triangular camera unit on the mobile robot, and assigning each camera a corresponding ID number and view-angle range; acquiring the identity features of the target to be followed, and uploading and storing them; detecting the identity features of a candidate target and uploading them to a cloud server, which performs feature matching; on a successful match, confirming and locking the candidate as the followed target; tracking the locked target in real time and obtaining the direction of the target relative to the mobile robot; detecting the locked target in real time and computing the relative distance; and determining a motion route from the relative direction and relative distance, and controlling the mobile robot to move toward the followed target along that route. The invention offers strong feature-detection capability and a high degree of automatic control, and is well suited to robot following applications.
Description
Technical field
The present invention relates to a target follow-up control method for a mobile robot, and belongs to the field of robot application technology.
Background technique
With the development of artificial intelligence, robots are being applied in more and more scenarios: entertainment robots, self-driving cars, robot dogs, highly balanced humanoid robots, and so on. Increasingly, robots are entering everyday life to provide services for people, such as meal-delivery robots in restaurants.
In many service scenarios, a robot must be able to locate both its own position and the position of the object it serves. In a museum, for example, a robot that provides guided commentary needs a way to follow a visitor in real time. Current tracking techniques require positioning first, commonly base-station positioning or inertial positioning, but these depend on a variety of peripheral hardware, are costly to deploy, and require laying out a network across the site; the wiring is complex and signal transmission unstable.
Existing mobile robots therefore cannot exploit their cameras during following to extract and detect features of the followed target; they must rely on complex wired network infrastructure, which hinders fast and accurate target following.
Summary of the invention
The technical problem addressed by the present invention is to overcome the deficiencies of the prior art by providing a target follow-up control method for a mobile robot, solving the problem that existing mobile robots rely on complex wired network infrastructure and cannot use their cameras to extract and detect features of the followed target, which hinders target following.
The present invention adopts the following technical scheme to solve the above technical problem:
A target follow-up control method for a mobile robot, based on a mobile robot and a cloud server, comprising the steps of:
arranging a triangular camera unit on the mobile robot, and assigning each camera a corresponding ID number and view-angle range;
acquiring the identity features of the target to be followed, and uploading and storing them on the cloud server;
the mobile robot detecting, with the triangular camera unit, the identity features of a candidate target and uploading them to the cloud server; the cloud server matching the uploaded identity features against the stored identity features of the followed target; on a successful match, confirming and locking the candidate as the followed target;
the mobile robot tracking the locked target in real time, extracting the ID number and view-angle range of the camera in which the target appears; partitioning the extracted view-angle range into regions and determining the region in which the target lies; obtaining the direction of the target relative to the mobile robot from the camera's ID number and the target's region;
the mobile robot detecting the locked target in real time and computing the relative distance between the target and the mobile robot;
determining a motion route from the obtained relative direction and relative distance, and controlling the mobile robot to move toward the target along that route, thereby following the target.
Further, as a preferred technical solution of the present invention: the method tracks the face of the target to be followed using a window-adaptive CamShift kernel density estimation algorithm.
Further, as a preferred technical solution of the present invention: the method detects the face of the target to be followed using an Adaboost face detection algorithm based on Haar features.
Further, as a preferred technical solution of the present invention: the method further comprises issuing a warning prompt from the mobile robot when identity feature matching fails.
Further, as a preferred technical solution of the present invention: the method further comprises judging whether the target's position deviates within the extracted view-angle range, and, on deviation, controlling the mobile robot to rotate toward the direction of deviation.
Further, as a preferred technical solution of the present invention: the method further comprises judging, on deviation, whether the target appears in the view-angle range of another camera, and handing over the following task when it does.
Further, as a preferred technical solution of the present invention: the mobile robot obtains the relative distance to the locked target using a radio-frequency distance detection method.
Further, as a preferred technical solution of the present invention: the method further comprises setting a threshold, the mobile robot controlling its movement according to the threshold and the relative distance.
Further, as a preferred technical solution of the present invention: the method controls the mobile robot to move toward the target along the motion route using a PID control algorithm.
Further, as a preferred technical solution of the present invention: the method further comprises judging, using an ultrasonic distance detection method, whether an obstacle lies on the mobile robot's motion route, and recomputing the motion route when an obstacle is present.
By adopting the above technical scheme, the present invention achieves the following technical effects:
The target follow-up control method provided by the present invention improves the robot's camera group: by arranging three cameras as a triangular camera unit, it widens the view-angle range and simplifies computing the direction of the followed target relative to the robot, while the relative distance is obtained by detection. From these the robot's motion route is determined and the robot is controlled to move along it, so that the robot can follow the desired target in real time and provide services. The method strengthens the robot's target-following capability, offers strong feature detection and a high degree of automatic control, and is widely applicable in many fields. It effectively solves the problem that existing mobile robots rely on complex wired network infrastructure and cannot use their cameras to extract and detect features of the followed target.
Detailed description of the invention
Fig. 1 is a flow diagram of the target follow-up control method of the present invention.
Fig. 2 is a structural diagram of the triangular camera unit arranged on the mobile robot.
Fig. 3 is a diagram of the view-angle range of a single camera.
Fig. 4 is a diagram of the view-angle ranges of the triangular camera unit.
Specific embodiment
Embodiments of the present invention are described below with reference to the accompanying drawings.
As shown in Fig. 1, the present invention provides a target follow-up control method for a mobile robot. The method is based on a mobile robot and a cloud server, which communicate over a wireless transmission link. The method comprises the following steps:
Step 1: Arrange a triangular camera unit on the mobile robot, and assign each camera a corresponding ID number and view-angle range. As shown in Fig. 2, the circle represents the robot body, and the three rectangles in different directions represent the three cameras forming the triangular camera unit; the three cameras are mounted facing different directions on the robot. The view-angle range of a single camera, shown in Fig. 3, can be partitioned into several regions. For the three cameras, as shown in Fig. 4, regions A, B and C are their respective view-angle ranges; these ranges may or may not overlap, and the overlaps can be regarded as regions CA and AB. The three cameras may be wide-angle cameras that in combination monitor more than 120 degrees, although the present invention is not limited to this range; other view-angle ranges are equally applicable.
Step 2: Acquire the identity features of the target to be followed, and upload and store them on the cloud server. Specifically, any one camera of the triangular camera unit arranged on the robot may be used to acquire identity information of the target, including face, clothing and gender; this information is uploaded to the cloud server, which models the target's identity features. The target is then set as the robot's followed target, establishing the follower/followed relationship. The present invention is not limited to using a camera on the robot; other cameras may equally be used to acquire the target's identity features.
Step 3: The mobile robot detects the identity features of a candidate target using the triangular camera unit and uploads them to the cloud server; the cloud server matches the uploaded features against the stored identity features of the followed target, and on a successful match the candidate is confirmed and locked as the followed target.
When extracting the identity features of the candidate target, the present invention preferably tracks the face of the target using a window-adaptive CamShift kernel density estimation algorithm. The face-feature detection proceeds as follows:
Step 21: Convert the RGB image of the candidate target, collected by the robot's camera unit, into an HSV image.
Step 22: Compute the color histogram of the face region from the HSV image.
Step 23: From the color histogram, compute the probability distribution of the face color over the HSV image, obtaining a color-probability histogram.
Step 24: Choose the size and initial position of the search window from the color histogram, and track the face with the window-adaptive CamShift kernel density estimation algorithm. The basic principle of CamShift is to run a MeanShift operation on every frame of the video sequence, using the previous frame's result — the search window's center position and size — as the initial search window for MeanShift in the next frame. Iterating in this way yields the center and size of the target window in every frame, realizing face tracking of the target to be followed.
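A minimal NumPy sketch of the MeanShift/window-adaptation core of step 24, under the assumption that a color back-projection image `prob` (higher values = more face-like, scaled 0-255) has already been computed from the HSV color histogram of steps 21-23:

```python
import numpy as np

def mean_shift_step(prob, window):
    """One MeanShift iteration: recenter the search window on the centroid
    of the color-probability mass inside it."""
    x, y, w, h = window
    roi = prob[y:y + h, x:x + w]
    m00 = roi.sum()                      # zeroth moment of the window
    if m00 == 0:
        return window
    rh, rw = roi.shape
    ys, xs = np.mgrid[0:rh, 0:rw]
    cx = int((xs * roi).sum() / m00)     # centroid inside the ROI
    cy = int((ys * roi).sum() / m00)
    return (x + cx - w // 2, y + cy - h // 2, w, h)

def camshift(prob, window, iters=10):
    """Window-adaptive CamShift: run MeanShift, then rescale the window from
    the zeroth moment, feeding each result in as the next initial window."""
    H, W = prob.shape
    for _ in range(iters):
        x, y, w, h = mean_shift_step(prob, window)
        x = int(np.clip(x, 0, W - w)); y = int(np.clip(y, 0, H - h))
        m00 = prob[y:y + h, x:x + w].sum()
        s = max(4, int(2.0 * np.sqrt(m00 / 255.0)))  # adaptive window size
        s = min(s, W, H)
        x = int(np.clip(x + (w - s) // 2, 0, W - s))
        y = int(np.clip(y + (h - s) // 2, 0, H - s))
        window = (x, y, s, s)
    return window
```

In practice a library implementation (e.g. an OpenCV-style `CamShift` on the back-projected histogram) would replace this sketch; the loop above only illustrates the iterate-then-resize principle described in step 24.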
On this basis, the identity features of the candidate target are obtained and uploaded to the cloud server; the cloud server matches the uploaded features against the stored identity features of the followed target, and on a successful match the candidate is confirmed and locked as the followed target. If the match fails, the mobile robot issues a warning prompt, prompting the followed target to guide the robot to lock on again.
If the followed target cannot be detected during the process, i.e. the target cannot be determined or is lost, step 3 is restarted: the above algorithm tracks and detects the features of candidate targets within the cameras' view-angle ranges, scanning all moving targets in the field of view so as to relock the followed target.
Step 4: The mobile robot tracks the locked target in real time and extracts the ID number and view-angle range of the camera in which the target appears; the extracted view-angle range is partitioned into regions and the region containing the target is determined; the direction of the target relative to the robot is obtained from the camera's ID number and the target's region.
Specifically, as shown in Fig. 3, the view-angle range of a single camera is partitioned into six regions: one frame of the camera image is divided into six regions, each representing the target's relative position within that camera's view-angle range. Combining the ID number of the camera currently seeing the target with the region inside that camera, denote the target's position relative to the robot as (id, n), where id is the camera's number and n is the region of the current camera's view-angle range in which the target's head lies, 1 ≤ n ≤ 6. After partitioning, each view-angle range in Fig. 4 is divided into 6 regions, denoted n1 through n6 from left to right. Taking the circles in Fig. 4 to represent the target's region within a view-angle range, the rightmost circle falls in region n4, i.e. n = 4, so its position is expressed as (B, 4).
Preferably, during locking and tracking of the target, the method further comprises judging whether the target's position deviates within the extracted view-angle range, and, on deviation, controlling the mobile robot to rotate toward the direction of deviation. When the target is about to leave the extracted view-angle range, the robot rotates by a certain angle in the direction the target is leaving, until the target enters the view-angle center of the central camera of the triangular unit, ensuring the target always remains within the view-angle range.
Further, the method may also comprise judging, on deviation, whether the target appears in the view-angle range of another camera, and handing over the following task when it does. The view-angle ranges of the three cameras each overlap slightly, which eases handing the target over. As shown in Fig. 4, regions A, B and C are the view-angle ranges of the three cameras, the overlaps are regions CA and AB, and the circles represent the target's position. Suppose the followed target O is currently in region A of the central camera's view-angle range. If the target keeps moving to the right, enters region AB, and then crosses the right boundary of AB, it is handed over: the target is now judged to be in region B of the right camera's view-angle range. Other cases are analogous. The advantage is that the robot need not keep the followed target at the center of the central camera's image during following, which reduces back-and-forth turning of the robot caused by the target weaving as it walks.
Also, during following, once the target is locked the robot uploads an image of the followed target at regular intervals, and the cloud server verifies that the followed target is correct.
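The handover rule — pass the following task to the neighboring camera when the target crosses out of the current camera's outermost region — can be sketched as below. The neighbor table matches the A/B/C layout of Fig. 4; treating region indices outside 1..6 as "crossed the boundary" is an assumption of this sketch:

```python
# Left/right neighbors for the central (A), right (B) and left (C) cameras.
NEIGHBORS = {"A": {"left": "C", "right": "B"},
             "B": {"left": "A", "right": None},
             "C": {"left": None, "right": "A"}}

def handover(cam_id: str, n: int, n_regions: int = 6) -> tuple[str, int]:
    """Hand the following task over to the neighboring camera when the target
    crosses out of the current camera's outermost region; otherwise keep the
    current (id, n), clamped into the valid 1..n_regions range."""
    if n < 1 and NEIGHBORS[cam_id]["left"]:
        return (NEIGHBORS[cam_id]["left"], n_regions)  # re-enter from the right
    if n > n_regions and NEIGHBORS[cam_id]["right"]:
        return (NEIGHBORS[cam_id]["right"], 1)         # re-enter from the left
    return (cam_id, max(1, min(n, n_regions)))
```

For instance, a target crossing the right boundary of camera A's range (n > 6) is reassigned to region 1 of camera B, matching the A-to-B handover described above.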
Step 5: The mobile robot detects the locked target in real time and computes the relative distance between the target and the robot.
To obtain the relative distance, the present invention preferably detects the locked target in real time using a radio-frequency distance detection method. In this embodiment a Zigbee module performs the radio transmission and distance detection, as follows:
A Zigbee module is fitted on the mobile robot and set to coordinator mode, so that it can receive information from Zigbee terminals. The followed target carries a Zigbee terminal device. When the relative distance between the robot and the target is needed, the robot's Zigbee module actively sends a signal to the Zigbee terminal device, which replies after reception; the robot measures the parameter RSSI from the feedback signal, and the relative distance d between the target and the robot is obtained from the following formula (the standard log-distance path-loss model, reconstructed here as the formula itself is missing from the source text):
d = 10^((|RSSI| − A) / (10 × c))
where d is the measured distance in m; A is the signal strength measured at the receiver one meter from the transmitter, in dBm; RSSI is the received signal strength, provided by the Zigbee module; and c is the path-loss exponent of the environment. Measurements show the optimum range of A is 45-49 and the optimum range of c is 3.25-4.5.
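The distance formula can be written directly; the default A and c below are mid-range picks from the patent's stated optimum ranges, not prescribed values:

```python
def rssi_distance(rssi_dbm: float, A: float = 47.0, c: float = 3.8) -> float:
    """Estimate distance (m) from RSSI with the log-distance path-loss model
    d = 10^((|RSSI| - A) / (10 * c)), where A is the 1-meter reference signal
    strength magnitude (patent optimum 45-49) and c the path-loss exponent
    (patent optimum 3.25-4.5)."""
    return 10.0 ** ((abs(rssi_dbm) - A) / (10.0 * c))
```

At RSSI = −47 dBm with A = 47 the model returns exactly 1 m (the calibration point), and weaker signals map to larger distances.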
Step 6: Determine the motion route from the obtained relative direction and relative distance, and control the robot to move toward the target along that route, thereby following the target. Specifically, given the relative direction (id, n) and relative distance d computed in the preceding steps, the motion route is determined as: from the robot's current heading, move toward the direction corresponding to (id, n) by the distance d. During this process a PID control algorithm is preferably used to control the robot's motion along the route.
The method may also include setting a threshold, the robot controlling its movement according to the threshold and the relative distance. After the followed target is locked and its relative direction determined, the robot begins detecting the relative distance to the target; if the relative distance exceeds the threshold, the robot's motion algorithm starts, e.g. a PID control algorithm regulates the robot's speed. Further, the method also comprises judging, with an ultrasonic distance detection method, whether an obstacle lies on the robot's motion route, and recomputing the motion route when an obstacle is present.
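The threshold-gated PID speed loop just described might look like the sketch below; the gains and threshold are illustrative assumptions, not values from the patent:

```python
class PID:
    """Minimal PID controller for the robot's forward speed."""
    def __init__(self, kp=0.8, ki=0.05, kd=0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def update(self, err, dt=0.1):
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def speed_command(pid: PID, rel_dist: float, threshold: float = 1.0) -> float:
    """Threshold gate from the method: at or below the threshold the robot
    does not act; above it, PID regulates speed on the distance error."""
    if rel_dist <= threshold:
        return 0.0                    # close enough: robot does not act
    return max(0.0, pid.update(rel_dist - threshold))
```

Each control cycle, the relative distance from step 5 is fed in; the command goes to zero once the robot is within the threshold range of the target.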
That is, during motion the robot can also start an ultrasonic anti-collision subsystem. The anti-collision subsystem used in the present invention installs one ultrasonic module in each of eight directions around the robot; the modules work in a time-multiplexed fashion and provide the robot with real-time obstacle information about its surroundings. The invention is not limited to this kind of ultrasonic subsystem; detection systems of other structures are equally applicable. One round of information acquisition by the eight ultrasonic modules takes roughly 500 ms, and the ranging range of every module is set within 2 m; therefore, to raise the working efficiency of the eight modules, the present invention acquires over four channels, each acquisition channel operating in a time-multiplexed manner.
After the ultrasonic anti-collision subsystem starts, the heading is redetermined in real time to ensure the robot can avoid obstacles. The calculation proceeds as follows: given the relative direction (id, n) and relative distance d computed above, the next move is decided from the obstacle situation in the direction of (id, n). If the direction (id, n) is unobstructed, the robot moves directly toward (id, n), updating (id, n) in real time; if the current (id, n) is obstructed, the robot rotates until its current forward direction is clear, then decides its next action from the current (id, n). This process repeats until the robot has moved within the threshold range of the followed target. If the relative distance between the robot and the target is below the threshold, the robot does not act.
Thus the target follow-up control method of the present invention improves the robot's camera group: three cameras arranged as a triangular camera unit widen the view-angle range and simplify computing the direction of the followed target relative to the robot, while the relative distance is obtained by detection. The robot's motion route is determined accordingly and the robot moves along it, following the desired target in real time; during motion, ultrasonic distance detection performs obstacle detection, further improving the robot's recognition and motion-control capability.
In summary, the present invention strengthens the robot's target-following capability, with strong feature detection and a high degree of automatic control, and the method can be widely applied in many fields.
Embodiments of the present invention have been explained in detail above with reference to the accompanying drawings, but the present invention is not limited to the above embodiments; within the knowledge of a person skilled in the art, various changes can also be made without departing from the purpose of the present invention.
Claims (8)
1. A target follow-up control method of a mobile robot, characterized by comprising the steps of:
setting a triangular camera unit on the mobile robot, and assigning each camera a corresponding ID number and angular field of view;
acquiring the identity features of the target to be followed, and uploading and storing them to a cloud server;
the mobile robot detecting the identity features of a candidate target using the triangular camera unit and uploading them to the cloud server, the cloud server matching the uploaded identity features against the stored identity features of the target to be followed, and, upon a successful match, determining and locking the candidate target as the followed target;
the mobile robot tracking the locked followed target in real time, extracting the ID number and angular field of view of the camera in which the followed target appears, partitioning the extracted angular field of view into subregions, determining the subregion in which the followed target is located, and obtaining the relative direction between the followed target and the mobile robot from the camera's ID number and the subregion in which the followed target is located;
the mobile robot detecting the locked followed target in real time, and calculating the relative distance between the followed target and the mobile robot;
determining a moving route according to the obtained relative direction and relative distance between the followed target and the mobile robot, and controlling the mobile robot to move toward the followed target along the moving route, thereby following the followed target;
and further comprising: judging whether the position of the followed target deviates from the extracted angular field of view, and, upon deviation, controlling the mobile robot to rotate toward the direction in which the followed target deviated; upon deviation, judging whether the followed target appears in the angular field of view of another camera, and, when it does, handing over the follow task to that camera.
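The direction-from-camera-ID-and-subregion step of claim 1 can be sketched as follows. The three-camera 120° layout and the number of subregions per field of view are illustrative assumptions, not values fixed by the claim.

```python
def relative_direction(camera_id, zone, fov_deg=120.0, n_zones=3):
    """Map (camera ID, subregion index) to a bearing in degrees relative to
    the robot's forward axis. Assumes three cameras spaced 120 degrees apart
    (camera 0 facing forward) and each angular field of view split into
    n_zones equal subregions -- illustrative numbers, not from the patent."""
    camera_center = camera_id * fov_deg             # optical axis: 0, 120, 240
    zone_width = fov_deg / n_zones
    # offset of the subregion centre from the camera's optical axis
    zone_offset = (zone + 0.5) * zone_width - fov_deg / 2
    return (camera_center + zone_offset) % 360.0
```

With this layout, camera 0 / middle zone gives bearing 0° (dead ahead), while outer zones of the side cameras cover the remainder of the 360° surround, which is the claimed benefit of the triangular unit.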
2. The target follow-up control method of a mobile robot according to claim 1, characterized in that: the method uses a window-adaptive CamShift kernel density estimation algorithm to track the face of the target to be followed.
3. The target follow-up control method of a mobile robot according to claim 1, characterized in that: the method uses an Adaboost face detection algorithm based on Haar features to detect the face of the target to be followed.
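The CamShift tracking of claim 2 is built on kernel-density mean shift. A toy pure-Python version of that mean-shift recentring step is sketched below; a production system would instead use OpenCV's `cv2.CamShift` on a hue back-projection, together with a Haar-cascade detector such as `cv2.CascadeClassifier` for the initial face detection of claim 3. The adaptive window-size step of CamShift is omitted here.

```python
def mean_shift(weights, window, max_iter=20):
    """Toy mean-shift core of CamShift: repeatedly recentre a fixed-size
    search window on the centroid of the target-probability weights inside
    it. weights: 2D list of probabilities; window: (x, y, w, h), w, h odd."""
    x, y, w, h = window
    width, height = len(weights[0]), len(weights)
    for _ in range(max_iter):
        m00 = m10 = m01 = 0.0            # zeroth and first image moments
        for j in range(y, y + h):
            for i in range(x, x + w):
                p = weights[j][i]
                m00 += p
                m10 += p * i
                m01 += p * j
        if m00 == 0.0:
            break                        # no target mass inside the window
        cx, cy = m10 / m00, m01 / m00    # centroid of the weights
        nx = min(max(int(round(cx)) - (w - 1) // 2, 0), width - w)
        ny = min(max(int(round(cy)) - (h - 1) // 2, 0), height - h)
        if (nx, ny) == (x, y):
            break                        # window stopped moving: converged
        x, y = nx, ny
    return (x, y, w, h)
```

Starting the window near a bright blob of target probability walks it onto the blob in a few iterations, which is exactly how CamShift keeps a face centred between frames.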
4. The target follow-up control method of a mobile robot according to claim 1, characterized in that: the method further comprises: when the identity feature matching fails, the mobile robot issues a prompt warning.
5. The target follow-up control method of a mobile robot according to claim 1, characterized in that: in the method, the mobile robot obtains the relative distance to the locked followed target using a radio ranging detection method.
6. The target follow-up control method of a mobile robot according to claim 1, characterized in that: the method further comprises setting a threshold, the mobile robot controlling its movement according to the threshold and the relative distance.
7. The target follow-up control method of a mobile robot according to claim 1, characterized in that: the method uses a PID control algorithm to control the mobile robot to move toward the followed target along the moving route.
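Claim 7's PID control can be illustrated with a minimal discrete PID controller. The gains and the choice of error signal (e.g. the heading error toward the followed target) are assumptions for illustration, not values given in the patent.

```python
class PID:
    """Minimal discrete PID controller of the kind claim 7 refers to."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return the control output for the current error sample.
        error: e.g. heading error toward the followed target, in radians.
        dt: time step in seconds."""
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```

In the following context, the output would typically drive the robot's angular velocity so the followed target stays centred in the camera's angular field of view.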
8. The target follow-up control method of a mobile robot according to claim 1, characterized in that: the method further comprises judging, by an ultrasonic distance detection method, whether an obstacle exists in the moving route of the mobile robot, and recalculating the moving route when an obstacle exists.
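The ultrasonic ranging behind claim 8 rests on the round-trip time of an echo: distance = speed of sound × time / 2. A minimal sketch, with an assumed safety distance for triggering the route recalculation:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C


def echo_to_distance(echo_time_s):
    """Convert an ultrasonic echo round-trip time (seconds) to a one-way
    distance in metres -- the measurement principle of claim 8."""
    return SPEED_OF_SOUND * echo_time_s / 2.0


def needs_replan(echo_time_s, safety_m=0.5):
    """True when the detected obstacle lies inside the safety distance, i.e.
    the moving route must be recalculated; safety_m is an assumed value."""
    return echo_to_distance(echo_time_s) < safety_m
```

A 2 ms echo therefore corresponds to an obstacle about 0.34 m ahead, inside the assumed 0.5 m safety distance, so the route would be recalculated.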
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610482865.5A CN106094875B (en) | 2016-06-27 | 2016-06-27 | A kind of target follow-up control method of mobile robot |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610482865.5A CN106094875B (en) | 2016-06-27 | 2016-06-27 | A kind of target follow-up control method of mobile robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106094875A CN106094875A (en) | 2016-11-09 |
CN106094875B true CN106094875B (en) | 2019-01-22 |
Family
ID=57213687
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610482865.5A Active CN106094875B (en) | 2016-06-27 | 2016-06-27 | A kind of target follow-up control method of mobile robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106094875B (en) |
Families Citing this family (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6931994B2 (en) * | 2016-12-22 | 2021-09-08 | Panasonic Intellectual Property Corporation of America | Autonomous mobile body, mobile control method and mobile control program
CN106737751A (en) * | 2017-01-15 | 2017-05-31 | 禾思凯尔智能科技(东莞)有限公司 | A kind of service robot grasp system and its control method based on cloud information bank |
CN106843280B (en) * | 2017-02-17 | 2021-03-16 | 深圳市卓兴半导体科技有限公司 | Intelligent robot following system |
CN107116555A (en) * | 2017-05-27 | 2017-09-01 | 芜湖星途机器人科技有限公司 | Robot guiding movement system based on wireless ZIGBEE indoor positioning |
CN107272744A (en) * | 2017-05-27 | 2017-10-20 | 芜湖星途机器人科技有限公司 | The robot active system for tracking being engaged with the number of taking machine |
CN108459594A (en) * | 2017-06-12 | 2018-08-28 | 炬大科技有限公司 | A kind of method in mobile electronic device and the mobile electronic device |
CN107273850B (en) * | 2017-06-15 | 2021-06-11 | 上海工程技术大学 | Autonomous following method based on mobile robot |
CN107390721B (en) * | 2017-07-26 | 2021-05-18 | 歌尔科技有限公司 | Robot following control method and device and robot |
CN107659918B (en) * | 2017-08-11 | 2020-08-04 | 东北电力大学 | Intelligent following method and system |
CN107682879B (en) * | 2017-08-30 | 2021-04-02 | 深圳市盛路物联通讯技术有限公司 | Frequency adjusting method based on antenna received signal strength and mobile terminal |
CN107450565A (en) * | 2017-09-18 | 2017-12-08 | 天津工业大学 | Intelligent movable tracks car |
CN107544506B (en) * | 2017-09-27 | 2021-05-18 | 上海有个机器人有限公司 | Robot following method, robot, and storage medium |
CN108737362B (en) * | 2018-03-21 | 2021-09-14 | 北京猎户星空科技有限公司 | Registration method, device, equipment and storage medium |
CN108734082A (en) * | 2018-03-21 | 2018-11-02 | 北京猎户星空科技有限公司 | Method for building up, device, equipment and the storage medium of correspondence |
CN108536145A (en) * | 2018-04-10 | 2018-09-14 | 深圳市开心橙子科技有限公司 | A kind of robot system intelligently followed using machine vision and operation method |
CN108931979B (en) * | 2018-06-22 | 2020-12-15 | 中国矿业大学 | Visual tracking mobile robot based on ultrasonic auxiliary positioning and control method |
CN109333535B (en) * | 2018-10-25 | 2022-05-20 | 同济大学 | Guiding method of autonomous mobile robot |
CN109740461B (en) * | 2018-12-21 | 2020-12-25 | 北京智行者科技有限公司 | Object and subsequent processing method |
CN109686031B (en) * | 2018-12-21 | 2020-10-27 | 北京智行者科技有限公司 | Identification following method based on security |
CN109739267A (en) * | 2018-12-21 | 2019-05-10 | 北京智行者科技有限公司 | Follow the determination method in path |
CN109633719A (en) * | 2018-12-21 | 2019-04-16 | 北京智行者科技有限公司 | The target trajectory recognition methods that vehicle follows |
CN109709953A (en) * | 2018-12-21 | 2019-05-03 | 北京智行者科技有限公司 | Vehicle follower method in road cleaning operation |
CN110162102A (en) * | 2019-05-17 | 2019-08-23 | 广东技术师范大学 | Unmanned plane automatic identification tracking and system based on cloud platform and machine vision |
CN110244772B (en) * | 2019-06-18 | 2021-12-03 | 中国科学院上海微系统与信息技术研究所 | Navigation following system and navigation following control method of mobile robot |
CN111053564B (en) * | 2019-12-26 | 2023-08-18 | 上海联影医疗科技股份有限公司 | Medical equipment movement control method and medical equipment |
CN111476195A (en) * | 2020-04-20 | 2020-07-31 | 安徽中科首脑智能医疗研究院有限公司 | Face detection method, face detection device, robot and computer-readable storage medium |
CN113311826A (en) * | 2021-05-06 | 2021-08-27 | 南通大学 | Automatic following system based on annular infrared array and working method thereof |
CN113933871B (en) * | 2021-10-15 | 2023-01-24 | 贵州师范学院 | Flood disaster detection system based on unmanned aerial vehicle and Beidou positioning |
CN113959432B (en) * | 2021-10-20 | 2024-05-17 | 上海擎朗智能科技有限公司 | Method, device and storage medium for determining following path of mobile equipment |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102411368A (en) * | 2011-07-22 | 2012-04-11 | 北京大学 | Active vision human face tracking method and tracking system of robot |
CN105425795A (en) * | 2015-11-26 | 2016-03-23 | 纳恩博(北京)科技有限公司 | Method for planning optimal following path and apparatus |
CN105654512A (en) * | 2015-12-29 | 2016-06-08 | 深圳羚羊微服机器人科技有限公司 | Target tracking method and device |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4871160B2 (en) * | 2007-02-16 | 2012-02-08 | 株式会社東芝 | Robot and control method thereof |
TWI388205B (en) * | 2008-12-19 | 2013-03-01 | Ind Tech Res Inst | Method and apparatus for tracking objects |
-
2016
- 2016-06-27 CN CN201610482865.5A patent/CN106094875B/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102411368A (en) * | 2011-07-22 | 2012-04-11 | 北京大学 | Active vision human face tracking method and tracking system of robot |
CN105425795A (en) * | 2015-11-26 | 2016-03-23 | 纳恩博(北京)科技有限公司 | Method for planning optimal following path and apparatus |
CN105654512A (en) * | 2015-12-29 | 2016-06-08 | 深圳羚羊微服机器人科技有限公司 | Target tracking method and device |
Non-Patent Citations (1)
Title |
---|
Research on a target tracking method based on panoramic vision; Yu Yanyan et al.; Journal of Hefei University of Technology (Natural Science Edition); 2015-01-31; Vol. 38, No. 1; abstract, pp. 56-58 |
Also Published As
Publication number | Publication date |
---|---|
CN106094875A (en) | 2016-11-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106094875B (en) | A kind of target follow-up control method of mobile robot | |
US8115814B2 (en) | Mobile tracking system, camera and photographing method | |
CN106303442B (en) | Tracking path topological structure establishing method, target object tracking method and target object tracking equipment | |
US11863857B2 (en) | Photographing control method, apparatus, and control device | |
US9398231B2 (en) | Surveillance camera terminal | |
CN105841687B (en) | indoor positioning method and system | |
KR101077967B1 (en) | Apparatus and method for surveillance and tracking | |
CN104820998B (en) | A kind of human testing based on unmanned motor platform and tracking and device | |
CN109901590B (en) | Recharging control method of desktop robot | |
CN108038415B (en) | Unmanned aerial vehicle automatic detection and tracking method based on machine vision | |
CN103310442B (en) | Based on intelligent positioning system and the localization method thereof of multifrequency information fusion | |
CN112050810B (en) | Indoor positioning navigation method and system based on computer vision | |
CN104811667A (en) | Unmanned aerial vehicle target tracking method and system | |
TW201911118A (en) | Method for tracing track of target in cross regions, and data processing method, apparatus and system | |
CN110514212A (en) | A kind of intelligent vehicle map terrestrial reference localization method merging monocular vision and difference GNSS | |
JP2018061114A (en) | Monitoring device and monitoring method | |
Zachariadis et al. | 2D visual tracking for sports UAV cinematography applications | |
CN106911916B (en) | Image acquisition system, apparatus and method | |
KR20170058612A (en) | Indoor positioning method based on images and system thereof | |
Sakai et al. | Large-scale 3D outdoor mapping and on-line localization using 3D-2D matching | |
Chen et al. | Robust autonomous landing of UAVs in non-cooperative environments based on comprehensive terrain understanding | |
CN111157008A (en) | Local autonomous navigation system and method based on multidimensional environment information perception | |
Bahadori et al. | Real-time people localization and tracking through fixed stereo vision | |
CN102497668A (en) | Wireless sensor network (WSN) node APIT positioning method | |
CN110515086A (en) | A kind of naval target search simulation system and method applied to unmanned boat |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
TR01 | Transfer of patent right |
Effective date of registration: 2020-01-16

Address after: Room 507, No. 6-3, Xingzhi Road, Nanjing Economic and Technological Development Zone, Nanjing, Jiangsu Province

Patentee after: Nanjing Nanyou Information Industry Technology Research Institute Co., Ltd.

Address before: No. 66 Xinmofan Road, Gulou District, Nanjing, Jiangsu, 210000

Patentee before: Nanjing University of Posts and Telecommunications