CN102955565A - Man-machine interaction system and method - Google Patents

Man-machine interaction system and method

Info

Publication number
CN102955565A
CN102955565A CN2011102547029A CN201110254702A
Authority
CN
China
Prior art keywords
facial expression
module
information
human
computer interaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011102547029A
Other languages
Chinese (zh)
Inventor
吴冠廷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DEXIN INTERACTION TECHNOLOGY (BEIJING) Co Ltd
Original Assignee
DEXIN INTERACTION TECHNOLOGY (BEIJING) Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DEXIN INTERACTION TECHNOLOGY (BEIJING) Co Ltd filed Critical DEXIN INTERACTION TECHNOLOGY (BEIJING) Co Ltd
Priority to CN2011102547029A priority Critical patent/CN102955565A/en
Publication of CN102955565A publication Critical patent/CN102955565A/en
Pending legal-status Critical Current

Links

Abstract

The invention relates to a man-machine interaction system and method. The system comprises a video capture device and a control device. The video capture device captures and outputs images in real time. The control device comprises a receiving module, a facial expression extraction module, a storage module, a control module and an execution module. The receiving module receives the images; the facial expression extraction module extracts hot-spot region information of the user's face from the images and determines the user's facial expression information from that hot-spot region information; the storage module pre-stores correspondence information between facial expression information and control commands; the control module searches the stored correspondence information for a match to the determined facial expression information, so as to determine the control command corresponding to it; and the execution module executes the control command determined by the control module. With the technical scheme provided by the invention, man-machine interaction can be achieved on the basis of the user's facial expression, diversifying the ways in which man-machine interaction is implemented.

Description

Human-computer interaction system and method
Technical field
The present invention relates to human-computer interaction technology, and in particular to a human-computer interaction system and method.
Background art
Human-computer interaction technology is already widely used in daily life and work, for example in motion-sensing games and the control of electrical equipment. Motion-sensing games in particular are greatly enjoyed because they combine exercise with entertainment.
Existing human-computer interaction technology is normally realized through a control device. For example, a motion-sensing game is normally realized by a computer together with a motion-sensing controller, or by a television, a set-top box and a motion-sensing controller. A motion-sensing controller, such as a game pad, is usually held in one or both of the user's hands to carry out the control operations.
In the course of making the present invention, the inventor found that the control device used in human-computer interaction is normally a physical device, usually composed of elements such as buttons, joysticks, light sources, gravity accelerometers and small screens. Human-computer interaction need not, however, be limited to physical control devices, and the ways of implementing it await further enrichment.
In view of this need of existing human-computer interaction technology, the inventor drew on many years of practical experience and professional knowledge in the design and manufacture of such products, applied scientific principles, and actively researched and innovated in order to create a new human-computer interaction system and method that satisfy this need and are highly practical. After continuous research and design, and after repeated prototyping and improvement, the present invention, which has practical value, was finally created.
Summary of the invention
The object of the present invention is to satisfy the above need of human-computer interaction technology by providing a new human-computer interaction system and method. The technical problem to be solved is to diversify the ways in which human-computer interaction technology is implemented, making it highly suitable for practical use.
The object of the present invention and the solution of its technical problem can be realized by the following technical scheme.
The human-computer interaction system proposed by the present invention comprises a video capture device and a control device. The video capture device captures images in real time and outputs them. The control device comprises: a receiving module, used to receive the images transmitted by the video capture device; a facial expression extraction module, used to extract hot-spot region information of the user's face from the images received by the receiving module and to determine the user's facial expression information from that hot-spot region information; a storage module, used to pre-store correspondence information between facial expression information and control commands; a control module, used to search the correspondence information stored in the storage module for a match to the facial expression information determined by the facial expression extraction module, so as to determine the corresponding control command; and an execution module, used to execute the control command determined by the control module.
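Purely as an illustration of the division of labor just described (a sketch, not anything the patent prescribes: the class names, the string-keyed expression information and the command strings are all my own choices), the storage, control and execution modules might be wired together like this:

```python
class StorageModule:
    """Pre-stores correspondences between facial expression information
    and control commands."""
    def __init__(self, correspondences):
        self._correspondences = dict(correspondences)

    def lookup(self, expression_info):
        # Return the stored command, or None when no record matches.
        return self._correspondences.get(expression_info)


class ControlModule:
    """Converts extracted expression information into a control command
    by searching the storage module's correspondence records."""
    def __init__(self, storage):
        self._storage = storage

    def to_command(self, expression_info):
        return self._storage.lookup(expression_info)


class ExecutionModule:
    """Executes the control command chosen by the control module."""
    def execute(self, command):
        return f"executing: {command}"


storage = StorageModule({"smile": "page_forward", "frown": "page_back"})
control = ControlModule(storage)
executor = ExecutionModule()
print(executor.execute(control.to_command("smile")))  # executing: page_forward
```

An unmatched expression simply yields no command, which mirrors the matched-search behavior the specification describes later.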
The object of the present invention and the solution of its technical problem can be further realized by the following technical measures.
Preferably, in the above human-computer interaction system, the video capture device and the control device are integrated into the same electronic device.
Preferably, in the above human-computer interaction system, the electronic device comprises: a computer, game console, mobile phone, tablet computer, set-top box, television/set-top-box all-in-one unit or television set.
Preferably, in the above human-computer interaction system, the video capture device and the control device are separate, independent units, connected to each other by wired or wireless means.
Preferably, in the above human-computer interaction system, the control device is arranged in a computer, game console, mobile phone, tablet computer, set-top box, television/set-top-box all-in-one unit or television set.
The human-computer interaction method proposed by the present invention comprises: capturing images in real time; extracting hot-spot region information of the user's face from the captured images; determining the user's facial expression information from that hot-spot region information; performing a matched search for the determined facial expression information in pre-stored correspondence information between facial expression information and control commands, so as to determine the control command corresponding to the user's facial expression information; and executing the determined control command.
Through the above technical scheme, the human-computer interaction system and method of the present invention have at least the following advantages and beneficial effects: by capturing images with the video capture device, and by having the facial expression extraction module supply the control module with the facial expression information corresponding to the hot-spot regions of the user's face in those images, the control module can output control commands according to the correspondences stored in the storage module. Human-computer interaction based on facial expression is thereby realized, diversifying the ways in which human-computer interaction is implemented and making it highly suitable for practical use.
In summary, the present invention represents notable technical progress with clearly beneficial effects, and is truly a novel, progressive and practical new design.
The above description is only an outline of the technical scheme of the present invention. So that the technical means of the present invention may be understood more clearly and implemented according to the contents of the specification, and so that the above and other objects, features and advantages of the present invention may become more apparent, preferred embodiments are set out below and described in detail with reference to the accompanying drawings.
Description of drawings
Fig. 1 is a schematic diagram of the human-computer interaction system of the present invention;
Fig. 2 is a flow diagram of the human-computer interaction method of the present invention.
Embodiment
To further explain the technical means and effects adopted by the present invention to achieve the intended objects, the embodiments, structures, features, flows and effects of the human-computer interaction system and method proposed according to the present invention are described in detail below with reference to the accompanying drawings and preferred embodiments.
Embodiment one: a human-computer interaction system. The system is shown in Fig. 1.
The human-computer interaction system shown in Fig. 1 comprises a video capture device 1 and a control device 2. The control device 2 comprises a receiving module 21, a facial expression extraction module 22, a storage module 23, a control module 24 and an execution module 25. The receiving module 21 is connected to the facial expression extraction module 22, and the control module 24 is connected to the facial expression extraction module 22, the storage module 23 and the execution module 25.
The video capture device 1 is mainly used to capture images in real time and to transmit the captured images to the control device 2. Real-time capture here means, for example, that the video capture device 1 samples images at a predetermined sampling frequency. The video capture device 1 may be integrated with the control device 2, or the two may be separate, independent units. In the latter case, the video capture device 1 may be connected to the control device 2 by wired or wireless means; that is, the video capture device transmits the images it captures to the control device 2 over a wired or wireless link.
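Capture at a "predetermined sampling frequency" can be sketched as a simple timed loop. This is only an illustration: `capture_frame` is a hypothetical stand-in for whatever frame-grabbing call the actual video capture device provides.

```python
import time

def sample_frames(capture_frame, hz, n_frames):
    """Grab n_frames images at roughly `hz` samples per second."""
    period = 1.0 / hz
    frames = []
    for _ in range(n_frames):
        start = time.monotonic()
        frames.append(capture_frame())
        # Sleep off the remainder of the sampling period, if any is left.
        elapsed = time.monotonic() - start
        if elapsed < period:
            time.sleep(period - elapsed)
    return frames

# Stand-in device that "captures" a constant placeholder frame.
frames = sample_frames(lambda: "frame", hz=100, n_frames=3)
print(len(frames))  # 3
```

With a real camera, `capture_frame` would read from the device driver; the loop structure stays the same.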
The video capture device 1 may be an existing imaging device such as a camera or video camera; the present invention does not limit its particular type.
The control device 2 is mainly used to determine, from the images captured in real time by the video capture device 1 and from its pre-stored correspondence information between the user's facial expression information and control commands, the control command corresponding to the user's captured facial expression. By executing the control command it has determined, the control device 2 realizes human-computer interaction based on facial expression.
Because the human-computer interaction system provided by the present invention realizes human-computer interaction solely on the basis of the user's facial expression, the user can sit or stand at a table, or take up some other nearby position, while carrying out human-computer interaction such as a motion-sensing game. Close-range motion-sensing games are thereby made possible, the user exercises facial muscles during play, and the user's interest in this approach to wellness is strengthened.
The receiving module 21 in the control device 2 is mainly used to receive the image sequence transmitted by the video capture device 1. Where the video capture device 1 and the control device 2 are separate, independent units, the receiving module 21 can receive the transmitted images by wired or wireless means. As a concrete example, the receiving module 21 can receive the transmitted images via Bluetooth, 2.4 GHz radio, WiFi, infrared or USB; that is, the receiving module 21 can be a Bluetooth module, 2.4 GHz module, WiFi module, infrared module or USB module. Where the video capture device 1 and the control device 2 are integrated into the same electronic device, the receiving module 21 can be a buffer storage medium. The present invention does not limit the specific implementation of the receiving module 21.
The facial expression extraction module 22 in the control device 2 is mainly used to extract hot-spot region information of the user's face from the images received by the receiving module 21 and to determine the facial expression information corresponding to that hot-spot region information; it then outputs the facial expression information it has determined to the control module 24. Hot-spot regions here are, for example, the eyebrows, eyes, mouth and cheeks. The facial expression information here may be, for example, key values for one or more of these parts, or an index value of a facial expression. The present invention does not limit the concrete form of the facial expression information output by the facial expression extraction module 22. In addition, the facial expression extraction module 22 in the present invention can use existing techniques (such as smile-detection techniques) to extract the hot-spot region information of particular parts of the user's face.
It should be noted that, after receiving an image transmitted by the video capture device 1, the facial expression extraction module 22 may first apply optimization processing to the image before extracting hot-spot region information and determining the facial expression. This optimization processing may comprise one or more of: removing invalid information, removing interfering information, removing red-eye, correcting lens distortion, and enhancing valid information.
In addition, in the process of extracting hot-spot region information, the facial expression extraction module 22 may first convert the image transmitted by the video capture device 1 into a black-and-white image, so that it can extract the hot-spot region information of the user's face from the gray values of the pixels in the black-and-white image.
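As a sketch of that conversion step (using the common ITU-R BT.601 luma weights, which the patent does not specify — any grayscale conversion would serve):

```python
def to_gray(rgb_image):
    """Convert an image given as rows of (r, g, b) tuples into rows of
    gray values, using the BT.601 luma weights 0.299/0.587/0.114."""
    return [[int(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_image]

img = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (10, 10, 10)]]
gray = to_gray(img)
```

Hot-spot extraction would then operate on these gray values, e.g. thresholding dark regions such as the eyes and eyebrows.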
The storage module 23 in the control device 2 is mainly used to store correspondence information between facial expression information and control commands. For example, the storage module 23 may store correspondences between facial expression index numbers and control commands; or it may store correspondences between sets of facial expression key values and control commands. Key values are, for example, the degree to which the eyes are open, the curvature of the eyebrows, the degree to which the mouth is open, the bending direction of the mouth, and the degree to which the cheeks are puffed out or drawn in. A control command can be a call to a concrete control command of a device application, for example changing the television channel, turning the page while browsing photos, closing the web page being browsed, or a game command in a motion-sensing game.
In the present invention, the correspondence information stored in the storage module 23 can be set dynamically. As a concrete example: first, the video capture device 1 is switched on and begins capturing; the picture of the user's face captured by the video capture device 1 is shown on the display screen of the device housing the control device 2, and the control device 2 can display a facial region outline on that screen; the user adjusts his or her sitting or standing position so that the captured face lies within this outline, and then presses the button corresponding to taking the picture (such as the Enter key on a computer keyboard, a registration button shown on the screen, or the OK key on a remote control). After detecting the picture-taking command produced by this button, the control device 2 obtains the picture containing the user's facial region (which may be black-and-white or color), extracts the hot-spot region image information from it, determines a facial expression index value by analyzing that information, and stores the index value together with the control command currently being set as a corresponding record in a table in the storage module 23.
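The registration flow above reduces to recording an (index value, command) pair. In this minimal sketch, `expression_index_of` and the index string it returns are hypothetical stand-ins for the analysis step, which the patent leaves to existing expression techniques:

```python
def register(mapping, picture, command, expression_index_of):
    """Derive an expression index value from a reference picture and
    record the (index, command) correspondence, as one table record."""
    index = expression_index_of(picture)
    mapping[index] = command
    return index

mapping = {}
register(mapping,
         picture="smiling_reference_shot",        # captured registration picture
         command="tv_next_channel",               # command being set
         expression_index_of=lambda pic: "expr_smile")  # stand-in analysis
print(mapping)  # {'expr_smile': 'tv_next_channel'}
```

Registering another expression with the same command, or re-registering an index, simply overwrites the corresponding record.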
Of course, the control device 2 in the present invention can also use other procedures to store the correspondence information between facial expression information and control commands in the storage module 23; the present invention does not limit the specific way in which this correspondence information is stored.
The control module 24 in the control device 2 is mainly used to convert the facial expression information transmitted by the facial expression extraction module 22 into the corresponding control command on the basis of the correspondence information stored in the storage module 23, and to supply that control command to the execution module in the control device 2, so as to realize human-computer interaction such as a motion-sensing game. As a concrete example: the control module 24 can use the facial expression information output by the facial expression extraction module 22 to perform a matched search in the correspondence information stored in the storage module 23; the control command in the matching record is then the control command corresponding to the received facial expression information.
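The matched-search step can be written out directly. Here the stored correspondence information is modeled as a list of records; the record layout and field names are my own illustrative choices, not mandated by the patent:

```python
stored_records = [
    {"expression": "smile", "command": "page_forward"},
    {"expression": "eyes_closed", "command": "close_webpage"},
]

def match_command(records, expression_info):
    """Return the command of the first record whose expression matches
    the extracted expression information, or None when nothing matches
    (in which case no command is issued)."""
    for record in records:
        if record["expression"] == expression_info:
            return record["command"]
    return None

print(match_command(stored_records, "smile"))  # page_forward
```

A dictionary keyed by expression would serve equally well; the list-of-records form simply mirrors the "record in a table" wording of the specification.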
The control device 2 can also comprise elements such as a display screen and a power module. The display screen is mainly used to display pictures (such as game pictures and the images captured by the video capture device 1), and can be the display screen of a computer, mobile phone, television, tablet computer or game console. The power module is mainly used to supply electric power to the powered elements of the system (such as the modules of the control device 2, or the video capture device 1 together with the modules of the control device 2). It can be the power module that the electronic device itself (computer, mobile phone, television, tablet computer or game console) already carries, or a power module provided specifically for the human-computer interaction system of the present invention.
The human-computer interaction system described in the above embodiment can be arranged in either of two forms.
In the first form, the whole human-computer interaction system is integrated into a single electronic device, so that the whole system becomes part of that device; the electronic device can specifically be a computer, mobile phone, television, tablet computer, set-top box, television/set-top-box all-in-one unit or game console. As a concrete example: where the system is integrated into a computer, the video capture device 1 is integrated into the upper-left corner of the computer display, and the control device 2 is integrated into the internal circuitry of the computer's main unit.
In the second form, the video capture device 1 and the control device 2 of the human-computer interaction system are separate, independent units, connected by wired or wireless means. Specifically, the video capture device 1 can be an independent device such as a camera or video camera, while the control device 2 is integrated into an electronic device such as a computer, mobile phone, television, tablet computer (PAD), set-top box, television/set-top-box all-in-one unit or game console.
Embodiment two: a human-computer interaction method. The flow of the method is shown in Fig. 2.
The human-computer interaction method shown in Fig. 2 comprises the following steps:
S200: capture images in real time. Specifically, images can be sampled at a predetermined sampling frequency.
S210: extract hot-spot region information of the user's face from the images captured in real time.
Specifically, the hot-spot regions are, for example, the eyebrows, eyes, mouth and cheeks. The present invention can use existing techniques (such as smile-detection techniques) to extract the hot-spot region information of particular parts of the user's face. The captured image may first be given optimization processing before the hot-spot region information is extracted; this optimization processing may comprise one or more of: removing invalid information, removing interfering information, removing red-eye, correcting lens distortion, and enhancing valid information. In addition, in the process of extracting hot-spot region information, the captured image may first be converted into a black-and-white image, so that the hot-spot region information of the user's face can be extracted from the gray values of its pixels.
S220: determine the user's facial expression information from the hot-spot region information.
Specifically, the facial expression information may be, for example, key values for one or more of the eyebrows, eyes, mouth and cheeks, or an index value of a facial expression. The present invention does not limit the concrete form of the facial expression information.
S230: perform a matched search for the facial expression information determined above in pre-stored correspondence information between facial expression information and control commands, so as to determine the control command corresponding to the user's facial expression information.
Specifically, correspondence information between facial expression information and control commands is pre-stored in the present invention. For example, correspondences between facial expression index numbers and control commands may be pre-stored; or correspondences between sets of facial expression key values and control commands. Key values are, for example, the degree to which the eyes are open, the curvature of the eyebrows, the degree to which the mouth is open, the bending direction of the mouth, and the degree to which the cheeks are puffed out or drawn in. A control command can be a call to a concrete control command of a device application, for example changing the television channel, turning forward or back through pictures being browsed on a computer, closing the web page being browsed, or a game command in a motion-sensing game.
The pre-stored correspondence information of the present invention can be set dynamically. For example: first, image capture begins and the captured picture of the user's face is displayed; the present invention can display a facial region outline, and the user adjusts his or her sitting or standing position so that the captured face lies within it; the user then presses the button corresponding to taking the picture (such as the Enter key on a computer keyboard, a registration button shown on the screen, or the OK key on a remote control), whereupon a picture containing the user's facial region is obtained (black-and-white or color); the hot-spot region image information is extracted from this picture, a facial expression index value is determined by analyzing that information, and the index value is stored together with the control command currently being set as a corresponding record in a table.
Of course, the present invention can also use other procedures to pre-store the correspondence information between facial expression information and control commands, and does not limit the specific way in which this is done.
A concrete example of determining the control command: the facial expression information obtained in the preceding steps is used to perform a matched search in the pre-stored correspondence information; the control command in the matching record is the control command corresponding to the facial expression information determined above.
S240: execute the control command determined above.
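Steps S200 to S240 chain naturally into one pass of the method. In this sketch, every stage is a hypothetical stand-in function; the patent deliberately leaves the concrete algorithms open:

```python
def interact_once(capture, extract_hotspots, to_expression, mapping, execute):
    image = capture()                      # S200: capture an image
    hotspots = extract_hotspots(image)     # S210: extract hot-spot info
    expression = to_expression(hotspots)   # S220: determine expression info
    command = mapping.get(expression)      # S230: matched search
    if command is not None:
        return execute(command)            # S240: execute the command
    return None                            # no match: no command issued

result = interact_once(
    capture=lambda: "img",
    extract_hotspots=lambda img: ("mouth_open",),
    to_expression=lambda hotspots: hotspots[0],
    mapping={"mouth_open": "jump"},
    execute=lambda cmd: f"did:{cmd}",
)
print(result)  # did:jump
```

Running this function once per sampled frame reproduces the real-time behavior of the system embodiment.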
The above are only preferred embodiments of the present invention and do not limit the present invention in any form. Although the present invention has been disclosed above by way of preferred embodiments, these are not intended to limit it. Any person skilled in the art may, without departing from the scope of the technical scheme of the present invention, use the technical content disclosed above to make minor changes or modifications amounting to equivalent embodiments; provided they do not depart from the content of the technical scheme of the present invention, any simple modification, equivalent variation or modification of the above embodiments made according to the technical spirit of the present invention still falls within the scope of the technical scheme of the present invention.

Claims (6)

1. A human-computer interaction system, characterized in that the system comprises a video capture device and a control device;
the video capture device is used to capture images in real time and to output them;
the control device comprises:
a receiving module, used to receive the images transmitted by the video capture device;
a facial expression extraction module, used to extract hot-spot region information of the user's face from the images received by the receiving module and to determine the user's facial expression information from the hot-spot region information;
a storage module, used to pre-store correspondence information between facial expression information and control commands;
a control module, used to perform a matched search in the correspondence information stored in the storage module for the facial expression information determined by the facial expression extraction module, so as to determine the control command corresponding to that facial expression information; and
an execution module, used to execute the control command determined by the control module.
2. The human-computer interaction system according to claim 1, characterized in that the video capture device and the control device are integrated into the same electronic device.
3. The human-computer interaction system according to claim 2, characterized in that the electronic device comprises: a computer, game console, mobile phone, tablet computer, set-top box, television/set-top-box all-in-one unit or television set.
4. The human-computer interaction system according to claim 1, characterized in that the video capture device and the control device are separate, independent units, connected to each other by wired or wireless means.
5. The human-computer interaction system according to claim 4, characterized in that the control device is arranged in a computer, game console, mobile phone, tablet computer, set-top box, television/set-top-box all-in-one unit or television set.
6. A human-computer interaction method, characterized in that the method comprises:
capturing images in real time;
extracting hot-spot region information of the user's face from the captured images;
determining the user's facial expression information from the hot-spot region information;
performing a matched search for the facial expression information in pre-stored correspondence information between facial expression information and control commands, so as to determine the control command corresponding to the user's facial expression information; and
executing the determined control command.
CN2011102547029A 2011-08-31 2011-08-31 Man-machine interaction system and method Pending CN102955565A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011102547029A CN102955565A (en) 2011-08-31 2011-08-31 Man-machine interaction system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011102547029A CN102955565A (en) 2011-08-31 2011-08-31 Man-machine interaction system and method

Publications (1)

Publication Number Publication Date
CN102955565A true CN102955565A (en) 2013-03-06

Family

ID=47764447

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011102547029A Pending CN102955565A (en) 2011-08-31 2011-08-31 Man-machine interaction system and method

Country Status (1)

Country Link
CN (1) CN102955565A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201289739Y (en) * 2008-11-18 2009-08-12 天津三星电子有限公司 Remote control video player capable of automatically recognizing expression
CN101825947A (en) * 2010-05-04 2010-09-08 中兴通讯股份有限公司 Method and device for intelligently controlling mobile terminal and mobile terminal thereof
CN102033696A (en) * 2009-09-24 2011-04-27 株式会社泛泰 Apparatus and method for controlling picture using image recognition
CN102058983A (en) * 2010-11-10 2011-05-18 无锡中星微电子有限公司 Intelligent toy based on video analysis
CN102098567A (en) * 2010-11-30 2011-06-15 深圳创维-Rgb电子有限公司 Interactive television system and control method thereof
US20110158546A1 (en) * 2009-12-25 2011-06-30 Primax Electronics Ltd. System and method for generating control instruction by using image pickup device to recognize user's posture
CN102164541A (en) * 2008-12-17 2011-08-24 爱信精机株式会社 Opened/closed eye recognizing apparatus and program

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103309450A (en) * 2013-06-09 2013-09-18 张家港市鸿嘉数字科技有限公司 Method for identifying facial expression of user to operate tablet personal computer
CN105307737A (en) * 2013-06-14 2016-02-03 洲际大品牌有限责任公司 Interactive video games
CN103399630A (en) * 2013-07-05 2013-11-20 北京百纳威尔科技有限公司 Method and device for recording facial expressions
CN106162052A (en) * 2015-03-30 2016-11-23 联想(北京)有限公司 Method and device for transmitting video images, and method and device for displaying images
CN107020637A (en) * 2016-01-29 2017-08-08 深圳光启合众科技有限公司 Emotion expression method for a pet robot, and pet robot
CN106325501A (en) * 2016-08-10 2017-01-11 合肥泰壤信息科技有限公司 Game control method and system based on facial expression recognition technology
CN106325524A (en) * 2016-09-14 2017-01-11 珠海市魅族科技有限公司 Method and device for acquiring instruction
CN107527033A (en) * 2017-08-25 2017-12-29 歌尔科技有限公司 Camera module and social intercourse system
WO2019037217A1 (en) * 2017-08-25 2019-02-28 歌尔科技有限公司 Camera assembly and social networking system
CN108009411A (en) * 2017-12-21 2018-05-08 江西爱驰亿维实业有限公司 Method, apparatus and computing device for controlling an automobile based on face recognition

Similar Documents

Publication Publication Date Title
CN102955565A (en) Man-machine interaction system and method
US11838650B2 (en) Photographing using night shot mode processing and user interface
CN103034323A (en) Man-machine interaction system and man-machine interaction method
CN112717370B (en) Control method and electronic equipment
CN103034322A (en) Man-machine interaction system and man-machine interaction method
CN110020140B (en) Recommended content display method, device and system
CN106453962B (en) Camera shooting control method of double-screen intelligent terminal
EP2897028B1 (en) Display device and method for controlling the same
US9319632B2 (en) Display apparatus and method for video calling thereof
CN105260093B (en) Method, device and mobile terminal for intelligently setting screen rotation
WO2021093583A1 (en) Video stream processing method and apparatus, terminal device, and computer readable storage medium
CN105100609A (en) Mobile terminal and shooting parameter adjusting method
CN110427151A (en) Method and electronic device for controlling a user interface
US20140125757A1 (en) Method of providing information-of-users' interest when video call is made, and electronic apparatus thereof
CN108777766B (en) Multi-person photographing method, terminal and storage medium
CN106534667B (en) Distributed collaborative rendering method and terminal
CN103379278A (en) Display control device and device control method
CN109788268A (en) Terminal and its white balance correction control method and computer readable storage medium
CN103347211A (en) Method for sending screenshot in mobile phone television playing process
KR20220158101A (en) Image taking methods and electronic equipment
US20230316529A1 (en) Image processing method and apparatus, device and storage medium
CN109842723A (en) Terminal and its screen brightness control method and computer readable storage medium
CN202093528U (en) Character recognition system and translation system based on gestures
CN202275357U (en) Human-computer interaction system
CN102760198A (en) Close-range somatosensory interaction device and method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 2013-03-06