CN106355242A - Interactive robot on basis of human face detection - Google Patents

Interactive robot on basis of human face detection

Info

Publication number
CN106355242A
Authority
CN
China
Prior art keywords
face
module
user
head
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610848652.XA
Other languages
Chinese (zh)
Inventor
俞艳青
陈关峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Xiaolu Robot Co Ltd
Original Assignee
Suzhou Xiaolu Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Xiaolu Robot Co Ltd
Priority to CN201610848652.XA
Publication of CN106355242A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/004 Artificial life, i.e. computing arrangements simulating life
    • G06N3/008 Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Robotics (AREA)
  • Manipulator (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to an interactive robot based on human face detection. By tracking the user's face, the robot can adjust the angle of its head in real time so that the head directly faces the user, which provides a reliable data source for subsequent, more detailed face recognition and gives the robot a strong sense of technology and a good user experience. The user's category can also be judged by means of face recognition, which provides solid support for offering different types of service to different users.

Description

An interactive robot based on face detection
Technical field
The present invention relates to an intelligent robot, and more particularly to an interactive robot based on face detection.
Background technology
As computer technology continues to develop toward intelligence and the fields of robot application continue to broaden and deepen, robotics is rapidly extending from industrial manufacturing into fields such as medical services, education and entertainment, and home services.
Research on face recognition systems began in the 1960s, improved with the development of computer technology and optical imaging technology after the 1980s, and actually entered a preliminary application stage in the late 1990s. The development of artificial intelligence has pushed face recognition toward face tracking, making the intelligent user experience smoother.
Face recognition is a biometric technology that identifies a person based on facial feature information. A video camera or camera collects images or video streams containing faces, automatically detects and tracks the faces in the images, and then applies a series of related techniques to the detected faces; this is also commonly called image recognition or facial recognition. Face tracking mainly adds a coordinated motion mechanism on top of face recognition technology so that a face can be followed intelligently.
At present, face recognition is mainly used for built-in camera functions such as auto-focus and is not linked to external equipment such as a movable robot head. When a robot interacts with a user, the user must constantly adjust his or her position, or the camera must be adjusted, before recognition can take place. This is inconvenient to use and cannot achieve real-time tracking.
In view of the above defects, the inventors have actively researched and innovated to create an interactive robot based on face detection with a new structure, so that it has more value in industry.
Content of the invention
To solve the above technical problems, an object of the present invention is to provide an interactive robot based on face detection, which can track a face using face recognition technology and rotate the robot's head up, down, left, and right in real time according to the user's movements, so that the robot's head directly faces the user for question-and-answer interaction.
The interactive robot based on face detection of the present invention comprises:
- a camera, located in the middle of the robot's head, for collecting images or video streams containing a face;
- a face recognition module, connected to the camera, for recognizing the face in the image or video stream;
- a distance sensor, for detecting the distance of the user relative to the robot;
- a control host, connected to the face recognition module and the distance sensor, which judges whether the user has entered the detection range according to the distance of the user relative to the robot detected by the distance sensor, and controls whether the face recognition module performs face detection;
- a face position calculation module, connected to the control host, which calculates, from the position of the face in the image, whether the user is offset from the camera and the offset position;
- a head controller, connected to the face position calculation module and to a motor rotation angle calculation module, wherein the motor rotation angle calculation module converts the offset data calculated by the face position calculation module, via the head controller, into the angle through which the head motor must rotate so that the robot directly faces the user;
- a head motor driver, connected to the motor rotation angle calculation module, which drives the head motor to rotate by the corresponding angle calculated by the motor rotation angle calculation module, so that the robot's head directly faces the user (a code sketch of this detection-to-rotation chain follows the list).
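A minimal sketch of this detection-to-rotation chain is given below. It assumes an OpenCV Haar-cascade face detector and a simple field-of-view mapping from pixel offset to pan and tilt angles; the camera parameters and function names are illustrative, since the patent does not specify how the face position calculation module or the motor rotation angle calculation module perform their computations.

    import cv2

    # Assumed camera parameters; the real values would come from the hardware.
    FRAME_W, FRAME_H = 640, 480
    FOV_H_DEG, FOV_V_DEG = 60.0, 45.0  # horizontal / vertical field of view

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def detect_face(frame):
        # Face recognition module: return the largest detected face box or None.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            return None
        return max(faces, key=lambda f: f[2] * f[3])  # (x, y, w, h)

    def face_offset(face):
        # Face position calculation module: pixel offset of the face centre
        # from the image centre.
        x, y, w, h = face
        return (x + w / 2.0) - FRAME_W / 2.0, (y + h / 2.0) - FRAME_H / 2.0

    def head_angles(dx, dy):
        # Motor rotation angle calculation module: convert the pixel offset
        # into pan/tilt angles in degrees (sign conventions are assumptions).
        pan = dx / FRAME_W * FOV_H_DEG
        tilt = -dy / FRAME_H * FOV_V_DEG
        return pan, tilt

The head motor driver would then rotate the head motors by the returned pan and tilt angles so that the camera re-centres on the user's face.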
Further, the robot also includes a cloud computing platform and an action-and-expression interaction module connected to the control host. The cloud computing platform has a face database, and the control host queries the face database with the face detected by the face recognition module and compares them, controlling, according to the comparison result, whether the action-and-expression interaction module makes a corresponding action and expression.
Further, the control host is also connected to a voice acquisition module and a voice output module; the cloud computing platform also has a speech recognition module, a semantic analysis module, and an intelligent answering module; according to the voice collected by the voice acquisition module, the control host controls the voice output module to give an answer through the speech recognition module, the semantic analysis module, and the intelligent answering module.
Further, the cloud computing platform is wirelessly connected to the control host.
Further, the robot interacts with the user by the following method, which includes the steps of:
(1) judging whether the user is within the detection range of the robot; if so, waking the robot from sleep mode and executing the next step; if not, continuing detection;
(2) recognizing the face by face recognition technology;
(3) determining, from the position of the face in the whole frame, whether the user is offset from the camera and the offset position;
(4) if the user is not in the middle of the camera view, adjusting the up-down and left-right angles of the head so that the camera directly faces the user;
(5) when the face is no longer captured, returning the robot's head to its neutral position (a code sketch of these steps follows the list).
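A sketch of this five-step flow, reusing the detect_face, face_offset, and head_angles helpers from the earlier sketch, is shown below; the distance-sensor and motor-driver calls (read_distance, wake_up, rotate_head, center_head) and the threshold values are hypothetical placeholders, as the patent does not define these interfaces.

    DETECTION_RANGE_M = 1.5      # assumed wake-up distance in metres
    CENTER_TOLERANCE_PX = 20     # assumed dead zone around the image centre

    def interaction_loop(camera, robot):
        asleep = True
        while True:
            # Step (1): wake the robot only when a user is within detection range.
            if robot.read_distance() > DETECTION_RANGE_M:
                continue
            if asleep:
                robot.wake_up()
                asleep = False

            frame = camera.read()
            face = detect_face(frame)          # step (2): recognise the face
            if face is None:
                robot.center_head()            # step (5): return head to neutral
                continue

            dx, dy = face_offset(face)         # step (3): offset from centre
            if abs(dx) > CENTER_TOLERANCE_PX or abs(dy) > CENTER_TOLERANCE_PX:
                pan, tilt = head_angles(dx, dy)
                robot.rotate_head(pan, tilt)   # step (4): re-centre the camera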
Further, the method can also perform a question-answering operation while face tracking is in progress.
Further, during the question-answering operation, the method can also compare the detected face against the face database to determine the user's category and provide differentiated services for different categories of users.
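The patent does not state how a detected face is compared with the face database, so the embedding-based comparison, the database layout, and the similarity threshold in the sketch below are illustrative assumptions.

    import numpy as np

    # Hypothetical face database: user category -> list of reference embeddings.
    FACE_DB = {
        "vip":     [np.zeros(128)],   # placeholder reference vectors
        "regular": [np.ones(128)],
    }
    MATCH_THRESHOLD = 0.8  # assumed cosine-similarity threshold

    def cosine_similarity(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

    def classify_user(face_embedding):
        # Compare the detected face's embedding against every reference in the
        # database and return the best-matching category, or "unknown".
        best_category, best_similarity = "unknown", MATCH_THRESHOLD
        for category, references in FACE_DB.items():
            for ref in references:
                similarity = cosine_similarity(face_embedding, ref)
                if similarity > best_similarity:
                    best_category, best_similarity = category, similarity
        return best_category

The returned category would then decide which tier of service the robot offers to the user.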
By means of the above scheme, the present invention can adjust the head angle in real time through face tracking so that the head directly faces the user, which provides a more reliable data source for subsequent, more detailed face recognition and also brings a stronger sense of technology and a better user experience. At the same time, the user's category can be judged through face recognition, which provides solid support for offering different services to different users.
The above is only an overview of the technical solution of the present invention. In order to understand the technical means of the present invention more clearly and to implement it according to the content of the description, preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
Fig. 1 is a system framework diagram of the robot of the present invention;
Fig. 2 is an interaction flow chart of the robot of the present invention.
Specific embodiment
Specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings and examples. The following examples are intended to illustrate the present invention but do not limit its scope.
Referring to Fig. 1, an interactive robot based on face detection according to a preferred embodiment of the present invention comprises:
- a camera, located in the middle of the robot's head, for collecting images or video streams containing a face;
- a face recognition module, connected to the camera, for recognizing the face in the image or video stream;
- a distance sensor, for detecting the distance of the user relative to the robot;
- a control host, connected to the face recognition module and the distance sensor, which judges whether the user has entered the detection range according to the distance of the user relative to the robot detected by the distance sensor, and controls whether the face recognition module performs face detection;
- a face position calculation module, connected to the control host, which calculates, from the position of the face in the image, whether the user is offset from the camera and the offset position;
- a head controller, connected to the face position calculation module and to a motor rotation angle calculation module, wherein the motor rotation angle calculation module converts the offset data calculated by the face position calculation module, via the head controller, into the angle through which the head motor must rotate so that the robot directly faces the user;
- a head motor driver, connected to the motor rotation angle calculation module, which drives the head motor to rotate by the corresponding angle calculated by the motor rotation angle calculation module, so that the robot's head directly faces the user.
The present invention can track a face by face recognition technology and rotate the robot's head up, down, left, and right in real time according to the user's movements, so that the robot's head directly faces the user, which greatly improves the user experience. The user does not need to constantly adjust the robot's position; the robot tracks the user automatically.
To improve the interaction effect, the present invention further comprises a cloud computing platform and an action-and-expression interaction module connected to the control host. The cloud computing platform has a face database; the control host queries the face database with the face detected by the face recognition module and compares them, and controls, according to the comparison result, whether the action-and-expression interaction module makes a corresponding action and expression. The control host is also connected to a voice acquisition module and a voice output module; the cloud computing platform also has a speech recognition module, a semantic analysis module, and an intelligent answering module; according to the voice collected by the voice acquisition module, the control host controls the voice output module to give an answer through the speech recognition module, the semantic analysis module, and the intelligent answering module.
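As an illustrative sketch of this voice path: each call below is a placeholder for the corresponding module named above (speech recognition, semantic analysis, intelligent answering, voice output), since the patent does not identify the underlying speech services.

    def handle_utterance(audio, cloud, voice_output):
        # Control host: route one voice capture through the cloud modules
        # and have the voice output module speak the answer.
        text = cloud.recognize_speech(audio)       # speech recognition module
        intent = cloud.parse_semantics(text)       # semantic analysis module
        reply = cloud.intelligent_answer(intent)   # intelligent answering module
        voice_output.speak(reply)                  # voice output module
        return reply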
By combining the robot with a cloud computing platform, the present invention enables the robot to converse freely and smoothly with the user. The robot can also communicate according to the different identities it recognizes, or provide specific priority services, and it has rich facial expressions and actions, bringing a stronger sense of technology and a better user experience.
To enable one cloud computing platform to serve multiple robots, the cloud computing platform and the control host are connected wirelessly in the present invention. The face database can thus be provided to multiple robots, and the speech recognition module, the semantic analysis module, and the intelligent answering module can provide question answering to multiple robots in sequence.
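One way this sequential, multi-robot answering could be realised is with a simple FIFO request queue on the cloud platform; the queue, the worker loop, and the stub module methods in the following sketch are assumptions rather than details given in the patent.

    import queue

    class CloudPlatform:
        def __init__(self):
            self.requests = queue.Queue()  # FIFO: requests are answered in sequence

        def submit(self, robot_id, audio, on_reply):
            # Called by each robot over its wireless link to the platform.
            self.requests.put((robot_id, audio, on_reply))

        def serve_forever(self):
            while True:
                robot_id, audio, on_reply = self.requests.get()
                text = self.recognize_speech(audio)       # speech recognition module
                intent = self.parse_semantics(text)       # semantic analysis module
                reply = self.intelligent_answer(intent)   # intelligent answering module
                on_reply(robot_id, reply)
                self.requests.task_done()

        # Stubs standing in for the cloud-side modules.
        def recognize_speech(self, audio): ...
        def parse_semantics(self, text): ...
        def intelligent_answer(self, intent): ...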
As shown in Fig. 2 the robot of the present invention carries out interaction with user by the following method, interactive approach includes step:
(1) judge user whether in the detection range of robot, if waking up the robot execution being in park mode Next step, if not existing, continues detection;
(2) face is identified by face recognition technology;
(3) position in whole map sheet according to face, determines whether user offsets and deviation post to photographic head;
(4) if user is not in photographic head middle, the angle up and down adjusting head makes photographic head just to user;
(5) when no longer capturing face, return positive robot head.
Of course, a question-answering operation can also be performed while face tracking is in progress. During the question-answering operation, the detected face can also be compared against the face database to determine the user's category and provide differentiated services for different categories of users.
The operating principle of the present invention is as follows: the user walks up to the robot, and the robot recognizes the face by face recognition technology; according to the position of the face in the whole frame, the robot determines whether the user is offset from the camera and the offset position; if the user is not in the middle of the camera view, the robot adjusts the up-down and left-right angles of its head so that its head directly faces the user; by continuously tracking the face and adjusting its head in real time, the robot keeps its head directly facing the user for face-to-face communication; after the face is recognized, it is compared against the face database to determine the user's category and provide differentiated services for different categories of users; when the face is no longer captured, the robot's head returns to its neutral position.
The above is only a preferred embodiment of the present invention and does not limit the present invention. It should be noted that those of ordinary skill in the art can make several further improvements and modifications without departing from the technical principles of the present invention, and these improvements and modifications should also be regarded as falling within the scope of protection of the present invention.

Claims (7)

1. An interactive robot based on face detection, characterized in that it comprises:
- a camera, located in the middle of the robot's head, for collecting images or video streams containing a face;
- a face recognition module, connected to the camera, for recognizing the face in the image or video stream;
- a distance sensor, for detecting the distance of the user relative to the robot;
- a control host, connected to the face recognition module and the distance sensor, which judges whether the user has entered the detection range according to the distance of the user relative to the robot detected by the distance sensor, and controls whether the face recognition module performs face detection;
- a face position calculation module, connected to the control host, which calculates, from the position of the face in the image, whether the user is offset from the camera and the offset position;
- a head controller, connected to the face position calculation module and to a motor rotation angle calculation module, wherein the motor rotation angle calculation module converts the offset data calculated by the face position calculation module, via the head controller, into the angle through which the head motor must rotate so that the robot directly faces the user;
- a head motor driver, connected to the motor rotation angle calculation module, which drives the head motor to rotate by the corresponding angle calculated by the motor rotation angle calculation module, so that the robot's head directly faces the user.
2. The interactive robot based on face detection according to claim 1, characterized in that it further comprises a cloud computing platform and an action-and-expression interaction module connected to the control host, wherein the cloud computing platform has a face database, and the control host queries the face database with the face detected by the face recognition module and compares them, controlling, according to the comparison result, whether the action-and-expression interaction module makes a corresponding action and expression.
3. The interactive robot based on face detection according to claim 2, characterized in that the control host is also connected to a voice acquisition module and a voice output module; the cloud computing platform also has a speech recognition module, a semantic analysis module, and an intelligent answering module; and, according to the voice collected by the voice acquisition module, the control host controls the voice output module to give an answer through the speech recognition module, the semantic analysis module, and the intelligent answering module.
4. The interactive robot based on face detection according to claim 3, characterized in that the cloud computing platform is wirelessly connected to the control host.
5. The interactive robot based on face detection according to claim 4, characterized in that the robot interacts with the user by the following method, which includes the steps of:
(1) judging whether the user is within the detection range of the robot; if so, waking the robot from sleep mode and executing the next step; if not, continuing detection;
(2) recognizing the face by face recognition technology;
(3) determining, from the position of the face in the whole frame, whether the user is offset from the camera and the offset position;
(4) if the user is not in the middle of the camera view, adjusting the up-down and left-right angles of the head so that the camera directly faces the user;
(5) when the face is no longer captured, returning the robot's head to its neutral position.
6. The interactive robot based on face detection according to claim 5, characterized in that the method can also perform a question-answering operation while face tracking is in progress.
7. The interactive robot based on face detection according to claim 6, characterized in that, during the question-answering operation, the method can also compare the detected face against the face database to determine the user's category and provide differentiated services for different categories of users.
CN201610848652.XA 2016-09-26 2016-09-26 Interactive robot on basis of human face detection Pending CN106355242A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610848652.XA CN106355242A (en) 2016-09-26 2016-09-26 Interactive robot on basis of human face detection

Publications (1)

Publication Number Publication Date
CN106355242A true CN106355242A (en) 2017-01-25

Family

ID=57859403

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610848652.XA Pending CN106355242A (en) 2016-09-26 2016-09-26 Interactive robot on basis of human face detection

Country Status (1)

Country Link
CN (1) CN106355242A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105590084A * 2014-11-03 2016-05-18 贵州亿丰升华科技机器人有限公司 Robot face detection, tracking and emotion detection system
CN105116994A * 2015-07-07 2015-12-02 百度在线网络技术(北京)有限公司 Intelligent robot tracking method and tracking device based on artificial intelligence
CN105459126A * 2015-12-30 2016-04-06 深圳亿家智宝电子科技有限公司 Robot communication device and implementation method thereof
CN206331472U * 2016-09-26 2017-07-14 苏州小璐机器人有限公司 Interactive robot based on face detection

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110326273A * 2017-02-24 2019-10-11 夏普株式会社 Control device, terminal device, holder, notification system, control method and control program
CN106826867A * 2017-03-31 2017-06-13 上海思依暄机器人科技股份有限公司 Robot and method for controlling rotation of a robot head
CN107133609A * 2017-06-02 2017-09-05 王永安 Human-machine communication robot and control method thereof
CN107133609B * 2017-06-02 2020-01-14 王永安 Human-machine communication robot and control method thereof
CN109955257A * 2017-12-22 2019-07-02 深圳市优必选科技有限公司 Robot wake-up method and apparatus, terminal device and storage medium
CN107908008A * 2017-12-28 2018-04-13 许峰 Self-moving AR display screen
CN109992091A * 2017-12-29 2019-07-09 深圳市优必选科技有限公司 Human-computer interaction method and apparatus, robot and storage medium
CN108733084A * 2018-03-21 2018-11-02 北京猎户星空科技有限公司 Robot rotation control method and apparatus, robot and storage medium
CN109461169A * 2018-10-22 2019-03-12 同济大学 System and method for face tracking and human body positioning
CN110861107A (en) * 2019-12-03 2020-03-06 北京海益同展信息科技有限公司 Service robot, display control method thereof, controller, and storage medium
CN110861107B (en) * 2019-12-03 2020-12-22 北京海益同展信息科技有限公司 Service robot, display control method thereof, controller, and storage medium
WO2021109806A1 (en) * 2019-12-03 2021-06-10 京东数科海益信息科技有限公司 Service robot and display control method therefor, controller, and storage medium
US20220402142A1 (en) * 2019-12-03 2022-12-22 Jingdong Technology Information Technology Co., Ltd. Service robot and display control method thereof, controller and storage medium
TWI739339B (en) * 2020-03-11 2021-09-11 國立陽明交通大學 System for indoor positioning of personnel and tracking interactions with specific personnel by mobile robot and method thereof

Similar Documents

Publication Publication Date Title
CN106355242A (en) Interactive robot on basis of human face detection
CN109034013B (en) Face image recognition method, device and storage medium
Luber et al. People tracking in rgb-d data with on-line boosted target models
CN106096662B (en) Human motion state identification based on acceleration transducer
CN206331472U (en) A kind of interactive robot based on Face datection
CN102023703B (en) Combined lip reading and voice recognition multimodal interface system
WO2016112630A1 (en) Image recognition system and method
CN204480251U (en) The self-service detection system of a kind of driver's physical qualification
CN106682603B (en) Real-time driver fatigue early warning system based on multi-source information fusion
CN109571499A (en) A kind of intelligent navigation leads robot and its implementation
WO2019148491A1 (en) Human-computer interaction method and device, robot, and computer readable storage medium
WO2020029444A1 (en) Method and system for detecting attention of driver while driving
TWI621999B (en) Method for face detection
CN105022999A (en) Man code company real-time acquisition system
CN109583505A (en) A kind of object correlating method, device, equipment and the medium of multisensor
CN111931869B (en) Method and system for detecting user attention through man-machine natural interaction
CN107483813B (en) Method and device for tracking recording and broadcasting according to gestures and storage device
CN112581015B (en) Consultant quality assessment system and assessment method based on AI (advanced technology attachment) test
CN103279188A (en) Method for operating and controlling PPT in non-contact mode based on Kinect
Ponce-López et al. Multi-modal social signal analysis for predicting agreement in conversation settings
CN114255508A (en) OpenPose-based student posture detection analysis and efficiency evaluation method
CN113936340B (en) AI model training method and device based on training data acquisition
KR101817773B1 (en) An Advertisement Providing System By Image Processing of Depth Information
CN102759953A (en) Automatic camera
Ong et al. Sensor fusion based human detection and tracking system for human-robot interaction

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20170125)