CN202275357U - Human-computer interaction system - Google Patents


Info

Publication number
CN202275357U
CN202275357U (application CN201120323511U)
Authority
CN
China
Prior art keywords
module
control device
human-computer interaction
facial expression
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 201120323511
Other languages
Chinese (zh)
Inventor
吴冠廷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DEXIN INTERACTION TECHNOLOGY (BEIJING) Co Ltd
Original Assignee
DEXIN INTERACTION TECHNOLOGY (BEIJING) Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DEXIN INTERACTION TECHNOLOGY (BEIJING) Co Ltd filed Critical DEXIN INTERACTION TECHNOLOGY (BEIJING) Co Ltd
Priority to CN 201120323511U
Application granted
Publication of CN202275357U
Anticipated expiration
Legal status: Expired - Fee Related (current)

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The utility model relates to a human-computer interaction system comprising a video capture device and a control device. The video capture device transmits images captured in real time to the control device. The control device comprises a receiving module, an extraction circuit module, a storage module and a control module. The receiving module is connected to the video capture device and receives the images from it; the extraction circuit module is connected to the receiving module and outputs facial expression information corresponding to the hotspot region information of the user's face in the images received by the receiving module; the storage module stores correspondence information between facial expression information and control commands; and the control module, connected to the extraction circuit module and the storage module respectively, receives the facial expression information output by the extraction circuit module and outputs to the control device the control command that the storage module associates with the received facial expression information. With this technical scheme, human-computer interaction can be realized on the basis of the user's facial expressions, diversifying the ways in which human-computer interaction is implemented.

Description

Human-computer interaction system
Technical field
The utility model relates to human-computer interaction technology, and in particular to a human-computer interaction system.
Background technology
Human-computer interaction technology is widely used in daily life and work, for example in motion-sensing (somatosensory) games and in the control of electrical appliances. Motion-sensing games in particular are greatly liked because they combine fitness with entertainment.
Existing human-computer interaction technology is usually realized through control equipment. For example, a motion-sensing game is typically realized through a computer and a motion-sensing controller, or through a television, a set-top box and a motion-sensing controller. A motion-sensing controller, such as a gamepad, is usually held in one or both hands of the user, who performs the control operations with it.
In the course of realizing the utility model, the inventor found that such control equipment is normally a physical device, usually made up of elements such as buttons, joysticks, light sources, gravity acceleration sensors and a small screen. Human-computer interaction, however, need not be limited to physical devices, and the ways of implementing it remain to be enriched.
In view of this need of existing human-computer interaction technology, the inventor, drawing on many years of practical experience and professional knowledge in the design and manufacture of such products, together with scientific study and active innovation, sought to create a human-computer interaction system of new structure that satisfies this need and is more practical. After continuous research and design, and after repeated prototyping and improvement, the utility model of real practical value was finally created.
Summary of the invention
The purpose of the utility model is to satisfy the need left by existing human-computer interaction technology and to provide a human-computer interaction system of new structure. The technical problem to be solved is to diversify the ways in which human-computer interaction technology can be implemented, making it well suited for practical use.
The purpose of the utility model and the solution of its technical problem are achieved by the following technical scheme.
A human-computer interaction system proposed according to the utility model comprises a video capture device and a control device. The video capture device transmits images captured in real time to the control device. The control device comprises: a receiving module connected to the video capture device, which receives the images transmitted by the video capture device; an extraction circuit module connected to the receiving module, which outputs facial expression information corresponding to the hotspot region information of the user's face in the images received by the receiving module; a storage module storing correspondence information between facial expression information and control commands; and a control module connected to the extraction circuit module and the storage module respectively, which receives the facial expression information output by the extraction circuit module and outputs to the control device the control command that the storage module associates with the received facial expression information.
The purpose of the utility model and the solution of its technical problem can be further achieved by the following technical measures.
Preferably, in the aforesaid human-computer interaction system, the video capture device and the control device are integrated in the same electronic device.
Preferably, in the aforesaid human-computer interaction system, the electronic device comprises: a computer, game console, mobile phone, tablet computer, set-top box, television set-top-box all-in-one or television set.
Preferably, in the aforesaid human-computer interaction system, the video capture device and the control device are separate, independent units, and the video capture device and the control device are connected in a wired manner or in a wireless manner.
Preferably, in the aforesaid human-computer interaction system, the control device is arranged in a computer, game console, mobile phone, tablet computer, set-top box, television set-top-box all-in-one or television set.
Through the above technical scheme, the human-computer interaction system of the utility model has at least the following advantages and beneficial effects: the video capture device captures images, the extraction circuit module supplies the control module with the facial expression information corresponding to the hotspot region information of the user's face in the captured images, and the control module outputs a control command according to the relationship between this facial expression information and the information stored in the storage module. Human-computer interaction is thus realized on the basis of facial expressions, which diversifies the ways in which human-computer interaction can be implemented and is well suited for practical use.
In summary, the utility model is a clear technical improvement with obvious positive effects, and is truly a novel, progressive and practical new design.
The above description is only an overview of the technical scheme of the utility model. So that the technical means of the utility model can be understood more clearly and implemented according to the contents of the specification, and so that the above and other purposes, features and advantages of the utility model become more readily apparent, preferred embodiments are set out below and described in detail with reference to the accompanying drawings.
Description of drawings
Fig. 1 is a schematic diagram of the human-computer interaction system of the utility model.
Embodiment
To further explain the technical means adopted by the utility model to achieve its intended purpose and their effects, the specific embodiments, structures, features and effects of the human-computer interaction system proposed according to the utility model are described in detail below with reference to the accompanying drawing and preferred embodiments.
Fig. 1 shows a human-computer interaction system according to a specific embodiment of the utility model. The system shown in Fig. 1 comprises a video capture device 1 and a control device 2. The control device 2 comprises a receiving module 21, an extraction circuit module 22, a storage module 23 and a control module 24. The receiving module 21 is connected to the extraction circuit module 22, and the control module 24 is connected to both the extraction circuit module 22 and the storage module 23.
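The wiring of Fig. 1 can be illustrated with the minimal Python sketch below. The class and attribute names (VideoCaptureDevice, ControlDevice, extract_expression, expression_to_command) are illustrative assumptions and not part of the utility model; the sketch only mirrors the connections just described: module 21 receives a frame, module 22 turns it into expression information, and module 24 looks up the command held by module 23.

```python
# Illustrative sketch of the Fig. 1 module layout; all names are assumptions.
from dataclasses import dataclass, field
from typing import Callable, Dict, Optional


@dataclass
class ControlDevice:
    """Control device 2: receiving module 21 -> extraction circuit module 22
    -> control module 24, which consults storage module 23."""
    extract_expression: Callable[[bytes], Optional[str]]                 # module 22
    expression_to_command: Dict[str, str] = field(default_factory=dict)  # module 23

    def receive(self, frame: bytes) -> Optional[str]:
        """Receiving module 21: accept one frame and pass it down the chain."""
        expression = self.extract_expression(frame)
        if expression is None:
            return None
        return self.expression_to_command.get(expression)                # module 24


class VideoCaptureDevice:
    """Video capture device 1: pushes frames captured in real time to the control device."""
    def __init__(self, control_device: ControlDevice):
        self.control_device = control_device

    def push_frame(self, frame: bytes) -> Optional[str]:
        return self.control_device.receive(frame)


# Example use with a stand-in extractor and a one-entry correspondence table.
device = VideoCaptureDevice(
    ControlDevice(lambda frame: "smile", {"smile": "tv.next_channel"}))
print(device.push_frame(b"frame"))  # -> tv.next_channel
```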
The video capture device 1 mainly captures images in real time and sends the captured images to the control device 2. Capturing in real time means, for example, that the video capture device 1 samples images at a predetermined sampling frequency. The video capture device 1 can be integrated with the control device 2, or the two can be set up as separate, independent units. When the video capture device 1 and the control device 2 are separate, independent units, the video capture device 1 can be connected to the control device 2 in a wired manner or in a wireless manner, i.e. the video capture device transmits the images it captures to the control device 2 over a wired or wireless link.
The video capture device 1 can be an existing imaging device such as a camera or a video camera; the utility model does not limit the particular type of the video capture device 1.
The control device 2 mainly determines, from the images captured in real time by the video capture device 1 and from its pre-stored correspondence information between the user's facial expression information and control commands, the control command corresponding to the user's captured facial expression. By executing the control command it has determined, the control device 2 realizes human-computer interaction based on facial expressions.
Because the system provided by the utility model realizes human-computer interaction solely on the basis of the user's facial expressions, the user can sit at a desk, stand in front of it, or be in any other nearby position while carrying out human-computer interaction such as motion-sensing games. Close-range motion-sensing games thus become possible, the user performs facial movements during play, and the user's fitness routine becomes more interesting.
The receiving module 21 in the control device 2 mainly receives the image sequence transmitted by the video capture device 1. When the video capture device 1 and the control device 2 are separate, independent units, the receiving module 21 can receive the images transmitted by the video capture device 1 over a wired or wireless link. As a concrete example, the receiving module 21 can receive the images transmitted by the video capture device 1 over Bluetooth, 2.4 GHz, WiFi, infrared or USB; that is, the receiving module 21 can be a Bluetooth module, 2.4 GHz module, WiFi module, infrared module or USB module. When the video capture device 1 and the control device 2 are integrated in the same electronic device, the receiving module 21 can be a cache/buffer medium. The utility model does not limit the concrete implementation of the receiving module 21. A sketch of the two cases follows.
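As a hedged illustration of those two cases, a receiving module could be backed by a transport link when the devices are separate, or by a plain in-memory buffer when they are integrated. The sketch below uses only the Python standard library; the TCP socket stands in generically for the Bluetooth/2.4 GHz/WiFi/infrared/USB transports mentioned above, and the class names and single-recv framing are assumptions made for illustration.

```python
# Illustrative receiving-module variants; class names and framing are assumptions.
import queue
import socket


class BufferReceiver:
    """Integrated case: the receiving module is simply a cache/buffer medium."""
    def __init__(self) -> None:
        self._frames: "queue.Queue[bytes]" = queue.Queue()

    def put(self, frame: bytes) -> None:
        self._frames.put(frame)

    def next_frame(self) -> bytes:
        return self._frames.get()


class LinkReceiver:
    """Separate-device case: frames arrive over a communication link
    (a TCP socket here, standing in for Bluetooth, 2.4 GHz, WiFi, infrared or USB)."""
    def __init__(self, host: str = "0.0.0.0", port: int = 9000) -> None:
        self._server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self._server.bind((host, port))
        self._server.listen(1)

    def next_frame(self, max_bytes: int = 64_000) -> bytes:
        conn, _ = self._server.accept()
        with conn:
            return conn.recv(max_bytes)
```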
The extraction circuit module 22 in the control device 2 mainly extracts the hotspot region information of the user's face from the images received by the receiving module 21, determines the facial expression information corresponding to that hotspot region information, and then outputs the facial expression information it has determined to the control module 24. The facial expression information here can be, for example, key values of one or more facial features such as the eyebrows, eyes, mouth and cheeks, or an index value of a facial expression. The utility model does not limit the concrete form of the facial expression information output by the extraction circuit module 22. In addition, the extraction circuit module 22 in the utility model can use existing techniques (such as smile-detection techniques) to extract the hotspot region information of specific parts of the user's face.
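One way to realize such an existing technique is a stock face and smile detector plus a few crude key values, as in the sketch below. It assumes the opencv-python package; the particular key values returned (face size and a smile flag) are illustrative assumptions, not the utility model's definition of facial expression information.

```python
# Illustrative extraction circuit module 22: hotspot regions -> expression info.
# Assumes opencv-python; the key values chosen here are assumptions.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")


def extract_expression_info(image_bgr):
    """Return a dict of crude facial-expression key values, or None if no face is found."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]                      # hotspot region: the user's face
    face_roi = gray[y:y + h, x:x + w]
    smiles = smile_cascade.detectMultiScale(face_roi, scaleFactor=1.7, minNeighbors=20)
    return {
        "face_width": int(w),                  # example key values only
        "face_height": int(h),
        "smiling": len(smiles) > 0,
    }
```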
After receiving an image transmitted by the video capture device 1, the extraction circuit module 22 can first perform optimization processing on the received image, and only then extract the hotspot region information and determine the facial expression. The optimization processing performed on the image can include one or more of the following operations: removing invalid information, removing interference information, removing red-eye, correcting lens distortion, and enhancing effective information. A sketch covering this step appears after the next paragraph.
In the course of extracting the hotspot region information, the extraction circuit module 22 can first convert the image transmitted by the video capture device 1 into a black-and-white (grayscale) image, so that the extraction circuit module 22 can extract the hotspot region information of the user's face according to the gray values of the pixels in the black-and-white image.
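A minimal sketch of the preceding two paragraphs, assuming opencv-python: a small clean-up pass (denoising, contrast enhancement, optional lens-distortion correction) followed by a gray-value threshold that keeps dark facial regions such as the eyes, eyebrows and mouth as candidate hotspot regions. The blur kernel, threshold value and the "dark pixels as hotspots" heuristic are all assumptions for illustration.

```python
# Illustrative preprocessing and gray-value hotspot extraction; parameters are assumptions.
import cv2
import numpy as np


def preprocess(image_bgr, camera_matrix=None, dist_coeffs=None):
    """Optional optimization: correct lens distortion, remove noise, enhance contrast."""
    if camera_matrix is not None and dist_coeffs is not None:
        image_bgr = cv2.undistort(image_bgr, camera_matrix, dist_coeffs)
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)   # black-and-white image
    gray = cv2.GaussianBlur(gray, (5, 5), 0)             # remove interference information
    return cv2.equalizeHist(gray)                        # enhance effective information


def dark_hotspot_mask(gray, threshold=60):
    """Mark pixels darker than `threshold`; eyes, eyebrows and mouth tend to be dark."""
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY_INV)
    return mask


if __name__ == "__main__":
    frame = np.full((240, 320, 3), 128, dtype=np.uint8)  # stand-in for a captured frame
    print(dark_hotspot_mask(preprocess(frame)).shape)    # (240, 320)
```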
The storage module 23 in the control device 2 mainly stores the correspondence information between facial expression information and control commands. For example, the storage module 23 can store correspondence information between facial expression index numbers and control commands, or correspondence information between individual facial expression key values and control commands. The key values mentioned above can be, for example, the degree to which the eyes are open, the curvature of the eyebrows, the degree to which the mouth is open, the bending direction of the mouth, and the puffing or sinking degree of the cheeks. The control commands can be concrete control commands addressed to a certain device or application, for example changing the channel of a television, paging forward or backward while browsing pictures on a computer, closing the web page being browsed, or commands within a motion-sensing game.
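The correspondence information can be as simple as a lookup table keyed by an expression index or name. The entries below are invented examples in the spirit of the commands listed above (changing channel, paging through pictures, closing a page); they are not values prescribed by the utility model.

```python
# Illustrative storage module 23: facial expression info -> control command.
# All entries are example values only.
EXPRESSION_TO_COMMAND = {
    "smile":        "tv.next_channel",       # e.g. change the television channel
    "mouth_open":   "viewer.next_picture",   # e.g. page forward while browsing pictures
    "brows_raised": "browser.close_page",    # e.g. close the web page being browsed
}
```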
The correspondence information in the storage module 23 can be set dynamically in the utility model. As a concrete example: first, the video capture device 1 is switched on and begins capturing; the picture captured by the video capture device 1, which includes the user's face, is shown on the display screen of the device in which the control device 2 is located, and the control device 2 can display a facial region frame on the display screen; the user adjusts his or her sitting or standing position so that the face captured by the video capture device 1 falls within this facial region frame. The user then presses the key associated with capturing the picture (for example, the Enter key on a computer keyboard, a registration button shown on the screen, or the confirm key on a remote control). After detecting the capture command produced by this key, the control device 2 obtains the picture containing the user's facial region, which can be a black-and-white or colour picture, extracts the hotspot region image information from this picture, determines the facial expression index value by analysing that hotspot region image information, and then stores the facial expression index value together with the control command currently being configured as a corresponding record in a table in the storage module 23.
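The registration flow just described can be sketched as follows. The helpers capture_picture and extract_expression_index are hypothetical stand-ins for the camera capture triggered by the register key and for the hotspot-region analysis; they are not functions defined by the utility model.

```python
# Illustrative dynamic registration of a facial expression -> control command record.
# capture_picture() and extract_expression_index() are hypothetical stand-ins.
from typing import Callable, Dict


def register_expression(capture_picture: Callable[[], bytes],
                        extract_expression_index: Callable[[bytes], int],
                        command: str,
                        table: Dict[int, str]) -> int:
    """Called once the user, with the face inside the on-screen region frame,
    presses the register/confirm key."""
    picture = capture_picture()                 # black-and-white or colour picture
    index = extract_expression_index(picture)   # analyse the hotspot region information
    table[index] = command                      # store the record in storage module 23
    return index


# Example: bind the expression the user is currently making to a game command.
table: Dict[int, str] = {}
register_expression(lambda: b"fake-frame",      # stand-in for video capture device 1
                    lambda picture: 7,          # stand-in expression index value
                    "game.jump", table)
print(table)                                    # {7: 'game.jump'}
```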
Of course, the control device 2 in the utility model can also use other procedures to store the correspondence information between facial expression information and control commands in the storage module 23; the utility model does not limit the concrete way in which the storage module 23 stores the correspondence information between facial expression information and control commands.
The control module 24 in the control device 2 mainly converts the facial expression information transmitted by the extraction circuit module 22 into a control command on the basis of the information stored in the storage module 23, and provides this control command to the other modules in the control device 2, so as to realize human-computer interaction such as motion-sensing games. As a concrete example, the control module 24 can use the facial expression information output by the extraction circuit module 22 to perform a matching search among the correspondence records stored in the storage module 23; the control command in the matching record is then the control command corresponding to the facial expression information it has received.
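A minimal sketch of that matching step, assuming the dictionary-style table shown earlier; dispatch is a hypothetical stand-in for handing the command to the other modules of the control device 2.

```python
# Illustrative control module 24: matched search and dispatch; names are assumptions.
from typing import Dict, Optional


def control_step(expression: str,
                 table: Dict[str, str],
                 dispatch=print) -> Optional[str]:
    """Convert facial expression information into a control command and hand it on."""
    command = table.get(expression)   # matched search among the stored records
    if command is not None:
        dispatch(command)             # provide the command to the other modules
    return command


control_step("smile", {"smile": "tv.next_channel"})   # prints: tv.next_channel
```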
The control device 2 can also comprise elements such as a display screen and a power supply module. The display screen mainly shows pictures (such as game scenes and the images captured by the video capture device 1) and can be a computer display, mobile phone display, television display, tablet computer display or game console display. The power supply module mainly supplies power to the powered elements (such as each module of the control device 2, or the video capture device 1 together with each module of the control device 2). The power supply module can be the power supply that the electronic device itself carries, such as a computer, mobile phone, television, tablet computer or game console, or a power supply provided specifically for the human-computer interaction system of the utility model.
The human-computer interaction system described in the above embodiment can be arranged in either of the following two forms.
In the first form, the whole human-computer interaction system is integrated into a single electronic device and becomes part of that device; the electronic device can specifically be a computer, mobile phone, television, tablet computer, set-top box, television set-top-box all-in-one or game console. As a concrete example, when the human-computer interaction system is integrated into a computer, the video capture device 1 can be integrated in the upper-left corner of the computer display while the control device 2 is integrated into the internal circuitry of the computer host.
In the second form, the video capture device 1 and the control device 2 of the human-computer interaction system are separate, independent units connected in a wired or wireless manner. Specifically, the video capture device 1 can be a stand-alone device such as a camera or video camera, while the control device 2 is integrated into an electronic device such as a computer, mobile phone, television, tablet computer (PAD), set-top box, television set-top-box all-in-one or game console.
The above is only a preferred embodiment of the utility model and does not limit the utility model in any form. Although the utility model has been disclosed above by way of a preferred embodiment, this is not intended to limit it. Any person skilled in the art can, without departing from the scope of the technical scheme of the utility model, use the technical content disclosed above to make minor changes or modifications amounting to equivalent embodiments; any simple amendment, equivalent change or modification made to the above embodiment according to the technical essence of the utility model, provided it does not depart from the content of the technical scheme, still falls within the scope of the technical scheme of the utility model.

Claims (5)

1. A human-computer interaction system, characterized in that the system comprises: a video capture device and a control device;
the video capture device transmits images captured in real time to the control device;
the control device comprises:
a receiving module connected to the video capture device, which receives the images transmitted by the video capture device;
an extraction circuit module connected to the receiving module, which outputs facial expression information corresponding to the hotspot region information of the user's face in the images received by the receiving module;
a storage module storing correspondence information between facial expression information and control commands; and
a control module connected to the extraction circuit module and the storage module respectively, which receives the facial expression information output by the extraction circuit module and outputs to the control device the control command that the storage module associates with the received facial expression information.
2. The human-computer interaction system as claimed in claim 1, characterized in that the video capture device and the control device are integrated in the same electronic device.
3. The human-computer interaction system as claimed in claim 2, characterized in that the electronic device comprises: a computer, game console, mobile phone, tablet computer, set-top box, television set-top-box all-in-one or television set.
4. The human-computer interaction system as claimed in claim 1, characterized in that the video capture device and the control device are separate, independent units, and the video capture device and the control device are connected in a wired manner or in a wireless manner.
5. The human-computer interaction system as claimed in claim 4, characterized in that the control device is arranged in a computer, game console, mobile phone, tablet computer, set-top box, television set-top-box all-in-one or television set.
CN 201120323511 2011-08-31 2011-08-31 Human-computer interaction system Expired - Fee Related CN202275357U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201120323511 CN202275357U (en) 2011-08-31 2011-08-31 Human-computer interaction system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201120323511 CN202275357U (en) 2011-08-31 2011-08-31 Human-computer interaction system

Publications (1)

Publication Number Publication Date
CN202275357U true CN202275357U (en) 2012-06-13

Family

ID=46195703

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201120323511 Expired - Fee Related CN202275357U (en) 2011-08-31 2011-08-31 Human-computer interaction system

Country Status (1)

Country Link
CN (1) CN202275357U (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104765443A (en) * 2014-01-03 2015-07-08 伊吉士摩斯科技股份有限公司 Image type virtual interaction device and implementation method thereof
CN104765443B (en) * 2014-01-03 2017-08-11 异奇科技股份有限公司 Image type virtual interaction device and implementation method thereof
CN104777907A (en) * 2015-04-17 2015-07-15 中国科学院计算技术研究所 System for group human-computer interaction
CN105664434A (en) * 2016-03-11 2016-06-15 中国人民解放军63680部队 Running machine and video playing synchronous control system

Similar Documents

Publication Publication Date Title
CN102955565A (en) Man-machine interaction system and method
CN103034323A (en) Man-machine interaction system and man-machine interaction method
US9319632B2 (en) Display apparatus and method for video calling thereof
CN103034322A (en) Man-machine interaction system and man-machine interaction method
EP2897028A1 (en) Display device and method for controlling the same
CN100512420C (en) Method and apparatus for composing images during video communications
US20150049924A1 (en) Method, terminal device and storage medium for processing image
CN108712603B (en) Image processing method and mobile terminal
CN102737238A (en) Gesture motion-based character recognition system and character recognition method, and application thereof
CN110365974A (en) For Video coding and decoded adaptive transmission function
CN108536367B (en) Interactive page jamming processing method, terminal and readable storage medium
CN102750121A (en) Electronic display device intelligent expanding and accurate controlling system and multi-user multi-task encryption sharing method thereof
CN107656792A (en) Method for displaying user interface, device and terminal
CN202275357U (en) Human-computer interaction system
CN102760198A (en) Close somatosensory interaction device and method
CN202362731U (en) Man-machine interaction system
CN107704828A (en) Methods of exhibiting, mobile terminal and the computer-readable recording medium of reading information
CN102929547A (en) Intelligent terminal contactless interaction method
CN202093528U (en) Character recognition system and translation system based on gestures
CN109151162A (en) A kind of multi-panel screen interaction control method, equipment and computer readable storage medium
CN202270343U (en) Somatosensory interaction device
CN108401173A (en) Interactive terminal, method and the computer readable storage medium of mobile live streaming
US20220172440A1 (en) Extended field of view generation for split-rendering for virtual reality streaming
CN103150021B (en) Electronic book reading control system and electronic book reading control method
CN203537531U (en) Full-function remote controller

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120613

Termination date: 20170831