CN105182983A - Face real-time tracking method and face real-time tracking system based on mobile robot - Google Patents


Info

Publication number
CN105182983A
Authority
CN
China
Prior art keywords: target, information, module, robot, mobile robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510690644.2A
Other languages
Chinese (zh)
Inventor
余刚
林天麟
庄礼填
杨帆
唐志海
肖杰
斯美樑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Chuangxiangweilai Robot Co Ltd
Original Assignee
Shenzhen Chuangxiangweilai Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Chuangxiangweilai Robot Co Ltd filed Critical Shenzhen Chuangxiangweilai Robot Co Ltd
Priority to CN201510690644.2A
Publication of CN105182983A


Abstract

The invention relates to a real-time face tracking method based on a mobile robot, comprising the following steps: using a depth camera to photograph a target and collect its facial image information and distance information; using an infrared temperature sensor to collect body temperature information; using a face detection module to determine whether the face information belongs to a living person, and a feature comparison and recognition module to determine the identity of the target; and using a control driver module to control a mechanical motion module, either moving the robot away from an incorrect target or tracking a target whose identity has been confirmed.

Description

Real-time face tracking method and tracking system based on a mobile robot
Technical field
The present invention relates to the field of mobile robots, and in particular to a real-time face tracking method and tracking system based on a mobile robot.
Background technology
Compared with the robot boom of the 1970s, present robot research has two features. First, the intelligence expected of robots is now defined by more realistic standards: a robot is not required to possess human-level intelligence, but only the ability to solve problems autonomously to a certain extent. Second, many new technologies and control methods (for example, neural networks, sensor fusion, virtual reality, computer vision, and multi-agent systems) have been introduced into robot research. This shift in research emphasis has moved robot research onto a path of healthy and steady development, and new research results continue to be obtained.
With the development of sensor, control, drive, and new-material technologies, research on mobile robots has made great progress in many respects in recent years. Mobile robots with higher levels of intelligence can be used in the service industry and other fields, greatly expanding the application range and capabilities of robots. According to scholarly forecasts, the number of intelligent robots will eventually exceed that of industrial robots. At present, many countries are engaged in research on intelligent robots. Japan, the world's leading robot producer, has always paid comparatively more attention to developing civilian intelligent robots, mainly in three areas: first, housework and environmental applications, including service robots and related home automation; second, life-support applications, offering help to the elderly and people with physical disabilities; and third, applications in daily life and education. Worldwide, countries with stronger economies and technical capability, such as the United States, France, Japan, and Italy, have all formulated their own short-term and long-term robot development plans.
Path tracking is one of the important modules in a mobile-robot system. Mobile robots employ multiple path tracking techniques, which differ according to factors such as the degree of integration of environmental information, the type of path-tracking indicator signal, and the path-tracking region. They mainly include map-based path tracking, landmark-based path tracking, vision-based path tracking, and sensor-based path tracking.
Machine vision replaces the human eye and brain with cameras and computers to perceive, interpret, and understand the surrounding environment. Target tracking, as an important research direction in machine vision, has a wide range of applications in many fields, and with the rapid development of robotics in recent years, vision-based robot target tracking has become a research focus.
Summary of the invention
In view of this, the present invention provides a real-time face tracking method and tracking system based on a mobile robot.
A real-time face tracking method based on a mobile robot comprises the following steps:
sending an instruction to the robot, requiring the robot to interact with a predetermined face in an identity information library;
finding a target, using a depth camera to photograph the target and collect its facial image information and distance information, while using an infrared temperature sensor to collect the target's body temperature information;
sending the facial image information, distance information, and body temperature information to a face detection module to judge whether the detected target is a face and whether it is living; when the detected target is not a face, or is not a living body, starting a drive tracking module to move the robot away from the detected target; when the detected target is a living face, continuing with subsequent actions;
after face detection, sending the living face information to a feature extraction module to extract feature information;
sending the extracted feature information to a feature comparison and recognition module, which compares it with the information in the identity information library to identify whether the target is the predetermined face in the identity information library; when the target is confirmed not to be the predetermined face, starting the drive tracking module to move the robot away from the detected target; when the target is confirmed to be the predetermined face, carrying out the interaction;
after the target is identified as the predetermined face, sending the target's distance information to the drive tracking module, which adjusts the robot's movement so that the distance between the depth camera and infrared temperature sensor and the target remains within a suitable range, ensuring that the interaction between the robot and the target proceeds normally.
Preferably, the detection process of the face detection module comprises the following steps:
adjusting the photographing angle of the depth camera so that the target face appears at the center of the video image obtained by the depth camera, where D denotes the distance, measured by the depth camera, of the target at the image center;
while the depth camera is photographing, using the infrared temperature sensor to obtain the target's temperature, where T denotes the temperature measured by the infrared temperature sensor;
letting T_d denote the face surface temperature after distance correction, where T_d, T, and D satisfy the expression T_d = f(T, D), and Tmin and Tmax are the minimum and maximum temperature thresholds respectively; when Tmin < T_d < Tmax, the detected target is determined to be a living face.
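The temperature-based liveness check above can be sketched in a few lines. Note that the patent does not specify the correction function f(T, D); the linear distance-compensation model and its gain below are illustrative assumptions only.

```python
# Hypothetical sketch of the liveness check. The correction function f(T, D)
# is NOT specified by the patent; a simple linear model (apparent temperature
# drops by k deg C per metre of distance) is assumed here for illustration.

T_MIN = 34.0  # minimum plausible face surface temperature, deg C (from the description)
T_MAX = 38.0  # maximum plausible face surface temperature, deg C

def corrected_temperature(t_measured: float, distance_m: float,
                          k: float = 0.8) -> float:
    """Assumed f(T, D): compensate the measured temperature for distance."""
    return t_measured + k * distance_m

def is_living_face(t_measured: float, distance_m: float) -> bool:
    """True when the distance-corrected temperature T_d falls in (T_MIN, T_MAX)."""
    t_d = corrected_temperature(t_measured, distance_m)
    return T_MIN < t_d < T_MAX
```

With these assumed values, a reading of 35.8 °C at 1 m corrects to 36.6 °C and passes, while a household appliance at room temperature or a hot surface above 38 °C is rejected.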
Preferably, during the interaction between the robot and the target, a position feedback module feeds back the target's position information in real time to the control driver module, which controls the mechanical motion module in real time to move the robot and track the target, ensuring that the target always remains at the center of the video image of the depth camera and within the sensing range of the infrared temperature sensor.
In addition, the present invention provides a real-time face tracking system based on a mobile robot, comprising: an information acquisition module, a face detection module, a feature extraction module, an identity information library, a feature comparison and recognition module, a control driver module, and a mechanical motion module. The information acquisition module comprises a depth camera and an infrared temperature sensor; the depth camera collects the target's facial image information and position information, and the infrared temperature sensor collects the target's body temperature information. The information acquisition module sends the facial image information, position information, and body temperature information to the face detection module, which judges whether the detected target is a living face. When the detection result is not a living face, the target information is sent to the control driver module, which controls the mechanical motion module to move the robot away from the current target. When the detection result is a living face, the target information is sent to the feature extraction module, which extracts the target's feature information and sends it to the feature comparison and recognition module. The feature comparison and recognition module compares the target's feature information with the feature information in the identity information library. When the recognition result is the predetermined target face in the identity information library, the robot carries out human-machine interaction; when the recognition result is not the predetermined target face, the target information is sent to the control driver module, which controls the mechanical motion module to move the robot away from the current target.
Preferably, the real-time face tracking system based on a mobile robot further comprises a position feedback module for feeding back the target's position information in real time to the control driver module.
Preferably, in the real-time face tracking system based on a mobile robot, the mechanical motion module comprises chassis driving wheels and a head rotation part; the chassis driving wheels move the robot across the floor, and the head rotation part rotates the robot's head, thereby adjusting the shooting range of the depth camera.
Preferably, the head rotation part of the mechanical motion module comprises a first head rotation part and a second head rotation part; the first head rotation part rotates the robot's head in the vertical plane, and the second head rotation part rotates the robot's head in the horizontal plane.
Compared with the prior art, the present invention has the following advantages:
First, in the information acquisition module, the depth camera and infrared temperature sensor are used in combination: facial image information and body temperature information jointly serve as detection criteria, so that face detection is performed from two aspects.
Second, through the coordinated operation of the control driver module, the mechanical motion module, and the position feedback module, the movement of the chassis and the rotation of the robot's head are controlled, achieving real-time tracking of the target during interaction.
Brief description of the drawings
Fig. 1 is a schematic diagram of the real-time face tracking system based on a mobile robot according to the present invention.
Fig. 2 is a schematic workflow diagram of the face detection module in Fig. 1.
Fig. 3 is a schematic diagram of an embodiment in which the real-time face tracking system based on a mobile robot of the present invention is applied to a robot.
Fig. 4 is a flow chart of the real-time face tracking method based on a mobile robot according to the present invention.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in detail below with reference to specific embodiments. It should be understood that the specific embodiments described here only serve to explain the present invention and are not intended to limit it.
The invention provides a real-time face tracking method and tracking system based on a mobile robot.
As shown in Fig. 1, the real-time face tracking system 100 based on a mobile robot comprises an information acquisition module 110, a face detection module 120, a feature extraction module 130, an identity information library 140, a feature comparison and recognition module 150, a control driver module 160, a mechanical motion module 170, and a position feedback module 180.
The information acquisition module 110 comprises a depth camera 111 and an infrared temperature sensor 112. The depth camera 111 photographs objects in the surrounding environment to obtain facial image information and the distance (or position) information of the face. The infrared temperature sensor 112 obtains the target's body temperature so as to determine whether the detected target is a living human body.
The face detection module 120 distinguishes the features of people in the surrounding environment from those of other objects. In the present invention, on the one hand, faces can be distinguished from animals, furniture, household appliances, and other objects by detecting facial image features; on the other hand, by detecting body temperature information, it can be judged whether the detected target is a living body. Through the combined detection of these two kinds of information, it can be determined whether the detected target is a living person.
The workflow of the face detection module 120 is shown in Fig. 2. The face detection algorithm used can be a conventional one, for example a cascade classifier based on Haar features or a cascade classifier based on LBP features. The detection process of the face detection module 120 comprises the following steps: the robot moves at random to find a target face; a face is detected through the depth camera 111 of the information acquisition module 110; the robot head angle is adjusted so that the detected target face appears at the center of the video image obtained by the robot; the infrared temperature sensor 112 obtains the target's temperature, where T denotes the temperature measured by the infrared temperature sensor, D denotes the distance, measured by the depth camera, of the target (for example, a face) at the image center, and T_d is the real face surface temperature after distance correction. The relationship between T_d, T, and D is T_d = f(T, D), and Tmin and Tmax are the minimum and maximum temperature thresholds respectively, for example Tmin = 34 °C and Tmax = 38 °C. If T_d lies between Tmin and Tmax, the detection result confirms that the target is a living face.
For example, when the detected target is a household appliance, because its temperature differs greatly from human body temperature, T_d will not lie between Tmin and Tmax; the detected target is judged to be non-human, and after the face detection module 120 the robot is driven to dodge it. When the detected target is a human body, subsequent actions proceed.
The feature extraction module 130 selects suitable features so that the accuracy of person recognition is higher.
In the present invention, the extracted features comprise facial image features and body temperature features. Facial image features are usually divided into geometric features, algebraic features, and so on. Geometric features are based on the shape of the facial organs and the geometric relationships between them, and include geometric curvature features and facial geometric feature points. Geometric curvature refers to the curvature of the face's outline. Facial geometric feature points cover organs such as the eyes, nose, mouth, and chin, together with the relative positions and distances between them. These features are invariant to position, viewpoint, size, and so on. The chosen geometric features are required to have a certain uniqueness, reflecting the differences between different faces, while also having enough elasticity to eliminate the influence of factors such as time span and illumination. Algebraic features are formed by projecting the facial image into a reduced-order subspace through a specific transform, for example singular-value features obtained through singular value decomposition, eigenface features obtained through the Karhunen-Loeve transform, or wavelet features obtained through the wavelet transform.
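One of the algebraic features mentioned above, eigenface-style features via the Karhunen-Loeve transform, can be sketched with an SVD. The tiny random matrix standing in for registered face images and the component count are illustrative assumptions.

```python
# Hedged sketch of eigenface-style algebraic features: project mean-centered,
# flattened face images onto their top principal components (computed here
# with an SVD, which realizes the Karhunen-Loeve transform of the data).
import numpy as np

def eigenface_features(images, n_components=4):
    """images: (n_samples, n_pixels) array, one flattened face per row.
    Returns (features, mean, basis); features is (n_samples, n_components)."""
    mean = images.mean(axis=0)
    centered = images - mean
    # Rows of vt are the principal directions ("eigenfaces") of the data.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]
    return centered @ basis.T, mean, basis
```

A new face would be centered with the stored mean and projected onto the same basis before comparison.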
Face feature extraction is carried out on particular features of the face; for example, it can be realized according to the facial geometric features.
The identity information library 140 stores the feature information of people registered in the robot system in advance. For example, when the present system is used in a household service robot, the identity information of the family members can be recorded in advance: facial images are photographed, and the facial feature information is recorded in the identity information library 140.
The feature comparison and recognition module 150 compares, by algorithm, the target feature information extracted by the feature extraction module 130 with the feature information in the identity information library 140, to judge whether the target face belongs to a certain person in the identity information library 140. For example, the features of the target face are compared with the stored features of a certain person by computing the Mahalanobis distance between them; if the Mahalanobis distance is less than a specified threshold, the target is taken to be that person in the identity information library 140.
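The Mahalanobis-distance comparison just described can be sketched as follows. The feature vectors, covariance estimate, and threshold are illustrative assumptions; the patent only requires that the distance fall below a specified threshold.

```python
# Minimal sketch of the Mahalanobis-distance comparison. The covariance
# matrix would in practice be estimated from the enrolled feature data;
# here it is simply passed in, and the threshold is an assumed value.
import numpy as np

def mahalanobis(x, y, cov):
    """Mahalanobis distance between feature vectors x and y under covariance cov."""
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

def is_same_person(target_feat, stored_feat, cov, threshold=3.0):
    """True when the distance is below the specified threshold."""
    return mahalanobis(target_feat, stored_feat, cov) < threshold
```

With an identity covariance the Mahalanobis distance reduces to the Euclidean distance, which makes the threshold easy to sanity-check.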
For example, family member 1 sends an instruction to the robot, requiring the robot to find family member 2 and tell a story. After receiving the instruction, the robot starts to move around the home looking for family member 2. When the robot moves in front of a person, it starts photographing to collect facial image information and distance information, while collecting body temperature information. After analysis by the face detection module 120, the detected target is found to be a living person; the feature information of the detected face is then extracted and compared with the feature information of family member 2 pre-stored in the identity information library. The feature comparison and recognition module 150 computes the Mahalanobis distance between the feature data of the target face and the feature data of family member 2 stored in the identity information library 140; if the Mahalanobis distance is less than the specified threshold, the target is taken to be family member 2. Next, the robot starts to interact with the target (i.e. family member 2), that is, to tell family member 2 a story.
The control driver module 160 controls the motion behaviour of the mechanical motion module 170 during the interaction between the robot and the target. First, it can act on instructions from the feature comparison and recognition module 150: for example, when the recognition result is not the target face to be tracked, it instructs the mechanical motion module 170 to move the robot and search for the next target. Second, it can act on instructions from the face detection module 120: for example, when the target detected by the face detection module 120 is not a face, such as a household appliance, it instructs the mechanical motion module 170 to move the robot away from the appliance and search for the next target. Third, it can act on the face position information (or distance information) fed back by the position feedback module 180 to control the mechanical motion module 170: it can drive the chassis driving wheels to move the robot as a whole, or drive the head rotation part to rotate the robot's head, changing the shooting angle of the depth camera so that the target face is photographed in real time. For example, while the robot is telling family member 2 a story, family member 2 may not stay still; as family member 2 walks about, the robot tracks in real time.
The mechanical motion module 170 comprises chassis driving wheels 171 and a head rotation part 172. The number of chassis driving wheels 171 is not limited; in the present embodiment there are two, installed at the bottom of the robot to move the robot as a whole and adjust the distance between the robot and the target. The head rotation part 172 is arranged at the head or neck of the robot and allows the robot's head to rotate freely, thereby adjusting the shooting range of the depth camera 111. The head rotation part 172 comprises a first head rotation part 172a and a second head rotation part 172b: the first head rotation part 172a rotates the robot's head in the vertical plane, and the second head rotation part 172b rotates the robot's head in the horizontal plane, so that the head can adjust the shooting range of the depth camera 111 according to the position of the detected target face and obtain accurate facial image information.
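How the two head rotation parts (172a vertical, 172b horizontal) might keep a detected face centered can be sketched as a proportional correction from the pixel offset. The field-of-view angles and gain below are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of the head-centering correction: map the face's pixel
# offset from the image center to pan (172b, horizontal) and tilt (172a,
# vertical) angle commands. FOV angles and gain are assumed values.

FOV_H_DEG = 60.0   # assumed horizontal field of view of the depth camera
FOV_V_DEG = 45.0   # assumed vertical field of view

def head_adjustment(face_cx, face_cy, img_w, img_h, gain=0.5):
    """Return (pan_deg, tilt_deg) corrections moving the face center
    (face_cx, face_cy) toward the image center. Positive pan turns the
    head right; positive tilt turns it up."""
    dx = face_cx / img_w - 0.5          # normalized offsets in [-0.5, 0.5]
    dy = face_cy / img_h - 0.5
    pan = gain * dx * FOV_H_DEG         # horizontal part (172b)
    tilt = -gain * dy * FOV_V_DEG       # vertical part (172a); image y grows downward
    return pan, tilt
```

A face already at the image center yields zero corrections; a face at the right edge yields a positive pan, turning the head toward it.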
Fig. 3 is a schematic diagram of the robot structure when the real-time face tracking system based on a mobile robot of the present invention is applied to a household service robot. The robot structure shown comprises a head 11, a body 12, and a chassis 13. The depth camera 111 and infrared temperature sensor 112 are arranged on the robot head 11. The face detection module 120, feature extraction module 130, identity information library 140, feature comparison and recognition module 150, and control driver module 160 are all arranged inside the body 12. The chassis driving wheels 171 of the mechanical motion module 170 are arranged on the chassis 13, and the head rotation part 172 of the mechanical motion module 170 is arranged on the robot's neck. The mechanical motion module 170 is electrically connected to the control driver module 160 and moves or rotates under its control. One end of the position feedback module 180 is electrically connected to the mechanical motion module 170, and the other end is electrically connected to the control driver module 160.
As shown in Fig. 4, the real-time face tracking method based on a mobile robot of the present invention comprises the following steps:
S201: start the robot and send it an instruction, requiring the robot to interact with the intended target in the identity information library.
Before carrying out human-machine interaction, the robot establishes the identity information library as required. For example, the feature information of all family members is entered into the identity information library 140 in one-to-one correspondence, each family member corresponding to his or her own distinctive feature information. In the input process, the depth camera 111 photographs each family member, collects their facial feature information, and stores and records it.
In the present embodiment, for example, an instruction is sent to the robot requiring it to find family member 2 and communicate, for example by telling family member 2 a story.
S202: find the target; use the depth camera 111 to photograph the target and collect its facial image information and distance information, while using the infrared temperature sensor 112 to collect the target's body temperature information.
After receiving the instruction, the robot starts to move around the home looking for the target. When the robot moves in front of a target, it uses the depth camera 111 to photograph the target in front of it and collect the target's facial image information and distance information, while using the infrared temperature sensor 112 to collect body temperature information.
S203: send the facial image information, distance information, and body temperature information obtained in step S202 to the face detection module 120, which judges whether the detected target is a face and whether it is living. When the detected target is not a face, or is a photograph of a person that the body temperature information shows is not a living person, this information is sent to the control driver module 160, which controls the mechanical motion module 170 to move the robot away from the detected target and find another target. When the detected target is a living face, subsequent actions continue.
For example, when the target is a pet dog, the face detection module 120 judges that the target is not a face; this information is sent to the control driver module 160, which controls the mechanical motion module 170 to move the robot away from the pet dog and find another target. When the target is a photograph of a person, the face detection module 120 detects a facial image, but combined with the body temperature information it judges that the target is not a living body; this information is sent to the control driver module 160, which controls the mechanical motion module 170 to move the robot away from the photograph and find another target. When the body temperature of the detected target is within the range of a living body and the facial image information is a face, the next step is entered.
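The S203 decision reduces to a two-condition check, combining the face result and the liveness result; a minimal sketch, where the action names "AVOID" and "PROCEED" are illustrative rather than patent terms:

```python
# Hedged sketch of the S203 decision: both the face check and the body
# temperature (liveness) check must pass before the robot proceeds.

def s203_decision(is_face: bool, temp_alive: bool) -> str:
    """Pet dog -> not a face -> AVOID; photograph -> face but not living
    -> AVOID; living person -> face and living -> PROCEED."""
    if is_face and temp_alive:
        return "PROCEED"
    return "AVOID"
```

This makes explicit why both sensors are needed: a photograph defeats the image check alone, and a warm animal defeats the temperature check alone.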
S204: after detection by the face detection module 120, the information confirmed to be a living face is sent to the feature extraction module 130, which extracts the feature information.
S205: send the extracted feature information to the feature comparison and recognition module 150, which compares it with the information in the identity information library 140 to identify whether the target matches the feature information of family member 2 in the identity information library 140. When the target is confirmed not to be family member 2, this information is sent to the control driver module 160, which controls the mechanical motion module 170 to move the robot away from the target and find another target. When the target is confirmed after recognition to be family member 2, the interaction is carried out.
S206: when the target is confirmed to be family member 2, the target's distance information is sent to the control driver module 160, which controls the mechanical motion module 170 to move the robot so that the distance between the depth camera 111 and infrared temperature sensor 112 and the target remains within a suitable range; that is, the target stays in real time within the range in which the depth camera 111 can accurately photograph the face and within the range in which the infrared temperature sensor 112 can measure temperature, thus ensuring that the interaction between the robot and the target proceeds normally.
While the robot is telling family member 2 a story, family member 2 may not stay still. As family member 2 walks about, the position feedback module 180 feeds back the position information (i.e. distance information) of the target (i.e. family member 2) in real time to the control driver module 160, so that the control driver module 160 controls the mechanical motion module 170 in real time to move the robot and track family member 2, ensuring that family member 2 always stays within the shooting range of the depth camera 111 and within the sensing range of the infrared temperature sensor 112, allowing the interaction to continue. This prevents family member 2 from moving out of the robot's shooting range or body-temperature sensing range during the interaction, which would otherwise interrupt it.
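The S206 distance-keeping behaviour can be sketched as a simple proportional controller on the depth camera's distance reading. The target distance, deadband, and gain below are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of the S206 distance-keeping loop: a proportional controller
# driving the chassis wheels from the depth reading. TARGET_D, DEADBAND, and
# the gain are assumed values chosen only for illustration.

TARGET_D = 1.2   # assumed comfortable interaction distance, metres
DEADBAND = 0.15  # no motion while within this band of the target distance

def chassis_speed(measured_d: float, gain: float = 0.6) -> float:
    """Forward speed command (m/s) for the chassis driving wheels.
    Positive drives toward the target, negative backs away; zero inside
    the deadband so the robot does not jitter around the setpoint."""
    error = measured_d - TARGET_D
    if abs(error) <= DEADBAND:
        return 0.0
    return gain * error
```

The deadband reflects the description's "suitable distance range": the robot only moves once the person has drifted noticeably closer or farther.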
Compared with the prior art, the present invention has the following advantages:
First, in the information acquisition module, the depth camera and infrared temperature sensor are used in combination: facial image information and body temperature information jointly serve as detection criteria, so that face detection is performed from two aspects.
Second, through the coordinated operation of the control driver module 160, the mechanical motion module 170, and the position feedback module, the movement of the chassis and the rotation of the robot's head are controlled, achieving real-time tracking of the target during interaction.
The above is only a preferred embodiment of the present invention and does not otherwise restrict the present invention. Any person skilled in the art may use the technical content disclosed above to make changes or modifications into equivalent embodiments of equivalent variation. However, any simple modification, equivalent change, or adaptation made to the above embodiments according to the technical spirit of the present invention, without departing from the content of the technical solution of the present invention, still falls within the protection scope of the technical solution of the present invention.

Claims (7)

1. A real-time face tracking method based on a mobile robot, comprising the following steps:
sending an instruction to the robot, requiring the robot to interact with a predetermined face in an identity information library;
finding a target, using a depth camera to photograph the target and collect its facial image information and distance information, while using an infrared temperature sensor to collect the target's body temperature information;
sending the facial image information, distance information, and body temperature information to a face detection module to judge whether the detected target is a face and whether it is living; when the detected target is not a face, or is not a living body, starting a drive tracking module to move the robot away from the detected target; when the detected target is a living face, continuing with subsequent actions;
after face detection, sending the living face information to a feature extraction module to extract feature information;
sending the extracted feature information to a feature comparison and recognition module, which compares it with the information in the identity information library to identify whether the target is the predetermined face in the identity information library; when the target is confirmed not to be the predetermined face, starting the drive tracking module to move the robot away from the detected target; when the target is confirmed to be the predetermined face, carrying out the interaction;
after the target is identified as the predetermined face, sending the target's distance information to the drive tracking module, which adjusts the robot's movement so that the distance between the depth camera and infrared temperature sensor and the target remains within a suitable range, ensuring that the interaction between the robot and the target proceeds normally.
2. as right wants the method as described in 1, it is characterized in that, the testing process of described face detection module comprises the following steps:
adjusting the shooting angle of the depth camera so that the target face appears at the center of the video image obtained by the depth camera, where D denotes the distance, obtained by the depth camera, to the target at the center of the image;
while said depth camera photographs the target, using the infrared temperature sensor to obtain the temperature of the target, where T denotes the temperature measured by the infrared temperature sensor;
letting T_d denote the face surface temperature after distance correction, where T_d, T and D satisfy the expression T_d = f(T, D), and Tmin and Tmax are the minimum and maximum temperature thresholds respectively; when Tmin &lt; T_d &lt; Tmax, the detected target is judged to be a living face.
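Claim 2 leaves the correction function f(T, D) unspecified. One plausible form, purely as an assumption for illustration, is a linear compensation for the lower reading an infrared sensor produces at greater distance; the gain k and the thresholds are likewise assumed values:

```python
def corrected_temperature(t_measured, d_meters, k=0.5):
    """Distance-corrected face surface temperature T_d = f(T, D).

    ASSUMPTION: a linear model T_d = T + k * D, i.e. the raw IR
    reading is assumed to drop roughly k degrees C per metre of
    distance. The patent only states that some correction f(T, D)
    exists, not its form.
    """
    return t_measured + k * d_meters

def is_living_face_temp(t_measured, d_meters, t_min=34.0, t_max=40.0):
    """Liveness test from claim 2: Tmin < T_d < Tmax.

    Thresholds bracket normal human skin temperature (assumed values).
    """
    t_d = corrected_temperature(t_measured, d_meters)
    return t_min < t_d < t_max
```

A printed photograph of a face would pass the image-based face detector but read near ambient temperature, so this check rejects it.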
3. The method as claimed in claim 1, wherein, during the interaction between said robot and the target, a position feedback module feeds back the position information of the target in real time to said control drive module; this control drive module controls said mechanical motion module to move the robot in real time so as to track the target, ensuring that the target always remains at the center of the video image of said depth camera and within the sensing range of said infrared temperature sensor.
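The centering behaviour of claim 3 can be sketched as a proportional controller that converts the pixel offset of the detected face from the image centre into a rotation command. The image size, gains and sign conventions below are illustrative assumptions, not values from the patent:

```python
def centering_command(face_x, face_y, img_w=640, img_h=480,
                      k_pan=0.002, k_tilt=0.002):
    """Map the face position in the image to pan/tilt corrections.

    face_x, face_y: pixel coordinates of the detected face centre.
    Returns (pan, tilt) corrections in radians; positive pan turns
    the robot right, positive tilt looks up. Gains k_pan and k_tilt
    are assumed proportional constants.
    """
    err_x = face_x - img_w / 2   # > 0: face is right of centre
    err_y = img_h / 2 - face_y   # > 0: face is above centre
    return k_pan * err_x, k_tilt * err_y
```

Run once per frame, this drives the error toward zero, which is exactly the claim's requirement that the target stay at the centre of the depth camera's video image.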
4. A real-time face tracking system based on a mobile robot, comprising: an information collection module, a face detection module, a feature extraction module, an identity information library, a feature comparison and recognition module, a control drive module and a mechanical motion module; said information collection module comprises a depth camera and an infrared temperature sensor; said depth camera collects the facial image information and position information of a target, and said infrared temperature sensor collects the body temperature information of the target; said information collection module sends said facial image information, position information and body temperature information to said face detection module; said face detection module judges whether the measured target is a living face; when the detection result is not a living face, the target information is sent to the control drive module, which controls said mechanical motion module to move the robot away from the current target; when the detection result is a living face, the target information is sent to said feature extraction module; said feature extraction module extracts the feature information of the target and sends said feature information to said feature comparison and recognition module; said feature comparison and recognition module compares the feature information of the target with the feature information in said identity information library; when the recognition result is the predetermined target face in the identity information library, the robot carries out human-machine interaction; when the recognition result is not the predetermined target face in the identity information library, the target information is sent to the control drive module, which controls said mechanical motion module to move the robot away from the current target.
5. The real-time face tracking system based on a mobile robot as claimed in claim 4, further comprising a position feedback module for feeding back the position information of the target in real time to said control drive module.
6. The real-time face tracking system based on a mobile robot as claimed in claim 4, wherein said mechanical motion module comprises a chassis driving wheel and a head rotating part; said chassis driving wheel moves the robot along the ground, and said head rotating part rotates the head of the robot, thereby adjusting the coverage of said depth camera.
7. The real-time face tracking system based on a mobile robot as claimed in claim 6, wherein said head rotating part comprises a first head rotating part and a second head rotating part; said first head rotating part rotates the head of the robot in the vertical plane, and said second head rotating part rotates the head of the robot in the horizontal plane.
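Claim 7 splits head motion into a vertical-plane axis (tilt) and a horizontal-plane axis (pan). A sketch of how such a two-axis head might track commanded angles while respecting joint limits; the limit values and class interface are assumptions, not taken from the patent:

```python
def clamp(value, low, high):
    """Constrain value to the closed interval [low, high]."""
    return max(low, min(high, value))

class PanTiltHead:
    """Two-axis robot head per claim 7: one part rotates the head in
    the vertical plane (tilt), the other in the horizontal plane (pan).
    Angle limits in degrees are illustrative assumptions."""

    def __init__(self, pan_limit=120.0, tilt_limit=45.0):
        self.pan = 0.0          # current horizontal angle, degrees
        self.tilt = 0.0         # current vertical angle, degrees
        self.pan_limit = pan_limit
        self.tilt_limit = tilt_limit

    def move(self, d_pan, d_tilt):
        """Apply incremental pan/tilt commands, clamped to limits."""
        self.pan = clamp(self.pan + d_pan, -self.pan_limit, self.pan_limit)
        self.tilt = clamp(self.tilt + d_tilt, -self.tilt_limit, self.tilt_limit)
        return self.pan, self.tilt
```

Separating the two axes lets the chassis driving wheel handle gross repositioning while the head alone keeps the camera's field of view on the face.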
CN201510690644.2A 2015-10-22 2015-10-22 Face real-time tracking method and face real-time tracking system based on mobile robot Pending CN105182983A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510690644.2A CN105182983A (en) 2015-10-22 2015-10-22 Face real-time tracking method and face real-time tracking system based on mobile robot

Publications (1)

Publication Number Publication Date
CN105182983A true CN105182983A (en) 2015-12-23

Family

ID=54905122

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510690644.2A Pending CN105182983A (en) 2015-10-22 2015-10-22 Face real-time tracking method and face real-time tracking system based on mobile robot

Country Status (1)

Country Link
CN (1) CN105182983A (en)


Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102135455A (en) * 2010-11-18 2011-07-27 杭州自动化技术研究院有限公司 Non-contact temperature measurement method, point temperature instrument and application thereof
CN102411368A (en) * 2011-07-22 2012-04-11 北京大学 Active vision human face tracking method and tracking system of robot
CN102566474A (en) * 2012-03-12 2012-07-11 上海大学 Interaction system and method for robot with humanoid facial expressions, and face detection and tracking method
CN103093199A (en) * 2013-01-15 2013-05-08 中国科学院自动化研究所 Certain face tracking method based on online recognition
CN103106393A (en) * 2012-12-12 2013-05-15 袁培江 Embedded type face recognition intelligent identity authentication system based on robot platform
CN103344358A (en) * 2013-05-06 2013-10-09 华中科技大学 Low-temperature fine metal wire non-contact temperature measuring method
US20130342652A1 (en) * 2012-06-22 2013-12-26 Microsoft Corporation Tracking and following people with a mobile robotic device
CN103514438A (en) * 2012-06-25 2014-01-15 盈泰安股份有限公司 System and method for identifying human face
CN103984315A (en) * 2014-05-15 2014-08-13 成都百威讯科技有限责任公司 Domestic multifunctional intelligent robot
CN103995747A (en) * 2014-05-12 2014-08-20 上海大学 Distributed pedestrian detection system and method based on mobile robot platform
CN104248422A (en) * 2014-07-29 2014-12-31 西安三威安防科技有限公司 Infrared temperature measuring instrument

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Liu Ping et al.: "Introduction to Automatic Identification Technology", 31 August 2013, Tsinghua University Press *
Lei Hongle et al.: "Design and Implementation of a Face Recognition Robot", Computer Knowledge and Technology *

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106919246A (en) * 2015-12-24 2017-07-04 北京奇虎科技有限公司 The display methods and device of a kind of application interface
WO2017143948A1 (en) * 2016-02-23 2017-08-31 芋头科技(杭州)有限公司 Method for awakening intelligent robot, and intelligent robot
CN107102540A (en) * 2016-02-23 2017-08-29 芋头科技(杭州)有限公司 A kind of method and intelligent robot for waking up intelligent robot
CN105759650A (en) * 2016-03-18 2016-07-13 北京光年无限科技有限公司 Method used for intelligent robot system to achieve real-time face tracking
CN109154977A (en) * 2016-03-28 2019-01-04 亚马逊科技公司 Combined depth and thermal information are to be used for object detection and evacuation
CN105785948A (en) * 2016-04-11 2016-07-20 马泽泠 Intelligent green-ecological-landscape-based health-preserving household unit and operating method thereof
CN105912120A (en) * 2016-04-14 2016-08-31 中南大学 Face recognition based man-machine interaction control method of mobile robot
CN105912120B (en) * 2016-04-14 2018-12-21 中南大学 Mobile robot man-machine interaction control method based on recognition of face
CN105841675A (en) * 2016-05-03 2016-08-10 北京光年无限科技有限公司 Range finding method and system for intelligent robot
CN106096573A (en) * 2016-06-23 2016-11-09 乐视控股(北京)有限公司 Method for tracking target, device, system and long distance control system
WO2018001245A1 (en) * 2016-06-30 2018-01-04 Beijing Airlango Technology Co., Ltd. Robot control using gestures
US10710244B2 (en) 2016-06-30 2020-07-14 Beijing Airlango Technology Co., Ltd. Robot control using gestures
CN106407882A (en) * 2016-07-26 2017-02-15 河源市勇艺达科技股份有限公司 Method and apparatus for realizing head rotation of robot by face detection
WO2018058557A1 (en) * 2016-09-30 2018-04-05 Intel Corporation Human search and identification in complex scenarios
US10607070B2 (en) 2016-09-30 2020-03-31 Intel Corporation Human search and identification in complex scenarios
CN107018310A (en) * 2016-10-08 2017-08-04 罗云富 Possess the self-timer method and self-timer of face function
CN106426180A (en) * 2016-11-24 2017-02-22 深圳市旗瀚云技术有限公司 Robot capable of carrying out intelligent following based on face tracking
CN106406323A (en) * 2016-12-14 2017-02-15 智易行科技(武汉)有限公司 Adaptive precision motion control method for mobile platform based on Beidou-GPS navigation
CN106774318A (en) * 2016-12-14 2017-05-31 智易行科技(武汉)有限公司 Multiple agent interactive environment is perceived and path planning kinematic system
CN106791565A (en) * 2016-12-15 2017-05-31 北京奇虎科技有限公司 Robot video calling control method, device and terminal
CN106625711A (en) * 2016-12-30 2017-05-10 华南智能机器人创新研究院 Method for positioning intelligent interaction of robot
CN106886216A (en) * 2017-01-16 2017-06-23 深圳前海勇艺达机器人有限公司 Robot automatic tracking method and system based on RGBD Face datections
CN106843280A (en) * 2017-02-17 2017-06-13 深圳市踏路科技有限公司 A kind of intelligent robot system for tracking
CN107019498A (en) * 2017-03-09 2017-08-08 深圳市奥芯博电子科技有限公司 Nurse robot
CN107330368A (en) * 2017-05-27 2017-11-07 芜湖星途机器人科技有限公司 The robot recognition of face device of tilting multi-cam
CN107330366A (en) * 2017-05-27 2017-11-07 芜湖星途机器人科技有限公司 Tilting can adjust the robot recognition of face device of multi-cam
CN107390721A (en) * 2017-07-26 2017-11-24 歌尔科技有限公司 Robot retinue control method, device and robot
CN107454335A (en) * 2017-08-31 2017-12-08 广东欧珀移动通信有限公司 Image processing method, device, computer-readable recording medium and mobile terminal
CN108304799A (en) * 2018-01-30 2018-07-20 广州市君望机器人自动化有限公司 A kind of face tracking methods
CN108171219A (en) * 2018-01-30 2018-06-15 广州市君望机器人自动化有限公司 Face method is tracked by a kind of robot
CN109658570A (en) * 2018-12-19 2019-04-19 中新智擎科技有限公司 A kind of server, client, mobile robot, door access control system and method
CN109686031A (en) * 2018-12-21 2019-04-26 北京智行者科技有限公司 Identification follower method based on security protection
CN109686031B (en) * 2018-12-21 2020-10-27 北京智行者科技有限公司 Identification following method based on security
CN109976335A (en) * 2019-02-27 2019-07-05 武汉大学 A kind of traceable Portable stereoscopic live streaming intelligent robot and its control method
CN109946703A (en) * 2019-04-10 2019-06-28 北京小马智行科技有限公司 A kind of sensor attitude method of adjustment and device

Similar Documents

Publication Publication Date Title
CN105182983A (en) Face real-time tracking method and face real-time tracking system based on mobile robot
Kristan et al. The seventh visual object tracking vot2019 challenge results
Munaro et al. Tracking people within groups with RGB-D data
CN102682302B (en) Human body posture identification method based on multi-characteristic fusion of key frame
Luber et al. People tracking in rgb-d data with on-line boosted target models
Basso et al. Fast and robust multi-people tracking from RGB-D data for a mobile robot
US10939791B2 (en) Mobile robot and mobile robot control method
Zhou et al. Self‐supervised learning to visually detect terrain surfaces for autonomous robots operating in forested terrain
Gritti et al. Kinect-based people detection and tracking from small-footprint ground robots
CN103353935A (en) 3D dynamic gesture identification method for intelligent home system
WO2019179441A1 (en) Focus tracking method and device of smart apparatus, smart apparatus, and storage medium
CN105931276B (en) A kind of long-time face tracking method based on patrol robot intelligence cloud platform
CN107398900A (en) Active system for tracking after robot identification human body
Maier et al. Vision-based humanoid navigation using self-supervised obstacle detection
Su et al. Global localization of a mobile robot using lidar and visual features
Cosgun et al. Context-aware robot navigation using interactively built semantic maps
Wang et al. Immersive human–computer interactive virtual environment using large-scale display system
Chaabane et al. Deft: Detection embeddings for tracking
Luber et al. Learning to detect and track people in rgbd data
Wang et al. Review on kernel based target tracking for autonomous driving
Perera et al. Human motion analysis from UAV video
Königs et al. Fast visual people tracking using a feature-based people detector
Taddei et al. Detecting ambiguity in localization problems using depth sensors
Chen "FOLO": A vision-based human-following robot
Liu et al. Visual attention servo control for task-specific robotic applications

Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination
C10 Entry into substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20151223)