CN202433830U - Human-machine interactive system - Google Patents

Human-machine interactive system

Info

Publication number
CN202433830U
CN202433830U (application CN2011204245086U)
Authority
CN
China
Prior art keywords
human
image
computer interaction
scene depth
control device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2011204245086U
Other languages
Chinese (zh)
Inventor
董德福
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yunhu Times Technology Co., Ltd.
Original Assignee
BEIJING DEXIN INTERACTIVE NETWORK TECHNOLOGY CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BEIJING DEXIN INTERACTIVE NETWORK TECHNOLOGY CO LTD filed Critical BEIJING DEXIN INTERACTIVE NETWORK TECHNOLOGY CO LTD
Priority to CN2011204245086U priority Critical patent/CN202433830U/en
Application granted granted Critical
Publication of CN202433830U publication Critical patent/CN202433830U/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The utility model discloses a human-machine interaction system comprising a 3D camera device and a control device. The 3D camera device transmits scene depth images captured in real time to the control device. The control device comprises a receiving module, a storage module, and a control chip. The receiving module receives the scene depth images sent by the 3D camera device; the storage module stores the correspondence between gesture identification information and control commands; and the control chip, connected to both the receiving module and the storage module, outputs the control command in the storage module corresponding to the gesture identification information of the operator's gesture in the received scene depth images. By adopting this technical scheme, the system realizes human-machine interaction based on a 3D camera device and gestures, diversifies the ways of interaction, and is highly practical.

Description

Human-machine interactive system
Technical field
The utility model relates to human-computer interaction technology, and in particular to a human-computer interaction system.
Background technology
Human-computer interaction technology is widely used in daily life and work, for example in motion-sensing games and in the control of electrical appliances. Motion-sensing games in particular are well liked because they combine exercise with entertainment.
Existing human-computer interaction technology is usually realized with a control device. For example, motion-sensing games are normally implemented with a computer and a motion-sensing controller, or with a television, a set-top box, and a motion-sensing controller. A motion-sensing controller, such as a gamepad, is usually held in one or both of the user's hands, and the user performs control operations with it.
In the course of realizing the utility model, the inventor found that a control device is usually a physical device, commonly made up of elements such as buttons, joysticks, light sources, acceleration sensors, and a small screen. Human-computer interaction, however, need not be limited to physical control devices; for example, touch-screen devices such as tablet computers allow interaction through finger touches on the screen. The ways of implementing human-computer interaction still await further enrichment.
In view of the above needs of existing human-computer interaction technology, the inventor, drawing on many years of practical experience and professional knowledge in the design and manufacture of such products, and applying scientific research methods, actively studied and innovated in order to create a human-computer interaction system of new structure that satisfies these needs and is more practical. After continuous research and design, and repeated prototyping and improvement, the utility model of real practical value was finally created.
Summary of the invention
The purpose of the utility model is to satisfy the needs of existing human-computer interaction technology by providing a human-computer interaction system of new structure. The technical problem to be solved is to diversify the ways of implementing human-computer interaction, making the technology highly practical.
The purpose of the utility model and the solution of its technical problem are achieved by the following technical scheme.
The human-computer interaction system proposed by the utility model comprises a 3D camera device and a control device. The 3D camera device transmits scene depth images captured in real time to the control device. The control device comprises: a receiving module, which receives the scene depth images sent by the 3D camera device; a storage module, which stores the correspondence between gesture identification information and control commands; and a control chip, connected to both the receiving module and the storage module, which outputs the control command in the storage module corresponding to the gesture identification information of the operator's gesture in the scene depth images received by the receiving module.
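The division of labor among the claimed modules can be pictured with a minimal Python sketch (not part of the patent; all class and function names are hypothetical, and the gesture recognizer is stubbed out, since the claim only fixes the module arrangement):

```python
class ReceiverModule:
    """Receives depth frames sent by the 3D camera device."""
    def __init__(self):
        self.frames = []

    def receive(self, frame):
        self.frames.append(frame)


class ControlDevice:
    """Receiver + storage + control chip, wired as in the claim."""
    def __init__(self, gesture_to_command, recognize):
        self.receiver = ReceiverModule()
        self.storage = dict(gesture_to_command)  # storage module: gesture ID -> command
        self.recognize = recognize               # hypothetical recognizer stub

    def on_frame(self, frame):
        # Control chip: recognize the gesture in the received frames, then
        # output the corresponding command from the storage module.
        self.receiver.receive(frame)
        gesture_id = self.recognize(self.receiver.frames)
        return self.storage.get(gesture_id)


# A stub recognizer that "sees" a wave once three frames have arrived.
device = ControlDevice(
    {"WAVE": "NEXT_VIDEO"},
    recognize=lambda frames: "WAVE" if len(frames) >= 3 else None,
)
outputs = [device.on_frame(f) for f in ("f1", "f2", "f3")]
print(outputs)  # [None, None, 'NEXT_VIDEO']
```

The sketch only illustrates the data flow the claim describes: frames enter through the receiver, the control chip recognizes a gesture, and the stored correspondence turns the gesture into an output command.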
The purpose of the utility model and the solution of its technical problem can be further achieved by the following technical measures.
Preferably, in the aforesaid human-computer interaction system, the 3D camera device comprises: an infrared light source, an LED light source, or a laser light source; a CMOS image sensor, which outputs an infrared-light, LED-light, or laser coded image; and a scene depth processing module, connected with the CMOS image sensor, which receives the coded image output by the CMOS image sensor and outputs scene depth images to the control device.
Preferably, the aforesaid human-computer interaction system comprises a handheld electronic terminal device.
Preferably, in the aforesaid human-computer interaction system, the handheld electronic terminal device comprises a mobile phone, notebook computer, tablet computer, or handheld game console.
Through the above technical scheme, the human-computer interaction system of the utility model has at least the following advantages and beneficial effects: by using a 3D camera device to capture and produce scene depth images, and using a control chip to identify the gesture identification information of the operator in those images, the control chip converts the operator's gestures into control commands, realizing human-computer interaction based on a 3D camera device and gestures. This diversifies the ways of implementing human-computer interaction and makes the technology highly practical.
In summary, the utility model is a clear technical improvement with obvious positive effects, and is truly a novel, progressive, and practical new design.
The above description is only an overview of the technical scheme of the utility model. In order that the technical means of the utility model may be understood more clearly and implemented according to the contents of the specification, and in order to make the above and other purposes, features, and advantages of the utility model more readily apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Description of drawings
Fig. 1 is a schematic diagram of the human-computer interaction system of the utility model.
Embodiment
To further explain the technical means and effects adopted by the utility model to achieve its intended purpose, the embodiments, structures, features, and effects of the human-computer interaction system proposed according to the utility model are described in detail below with reference to the accompanying drawings and preferred embodiments.
Fig. 1 shows a human-computer interaction system according to a specific embodiment of the utility model. The human-computer interaction system can be a handheld electronic terminal device such as a mobile phone, notebook computer, tablet computer, or handheld game console.
The above human-computer interaction system comprises a 3D camera device 1 and a control device 2. The 3D camera device 1 and the control device 2 may both be built into the human-computer interaction system. Of course, other arrangements are possible: for example, the 3D camera device 1 and the control device 2 may be set up separately, with only the control device 2 built into the system and the 3D camera device 1 exchanging information with the control device 2 through a wired connection (such as USB) or a wireless connection. The control device 2 may specifically comprise a receiving module 21, a storage module 22, and a control chip 23.
The 3D camera device 1 is connected with the control device 2, for example with the receiving module 21 in the control device 2. The 3D camera device 1 can be an existing 3D camera. It is mainly used to capture scene depth images in real time and transmit the captured images to the control device 2. Real-time capture here means, for example, sampling images at a predetermined sampling frequency. If the 3D camera device 1 and the control device 2 are both built into the human-computer interaction system, they can be connected by a signal line; that is, the 3D camera device 1 transmits the image information it captures to the control device 2 over a wired connection. The utility model does not limit the particular type of the 3D camera device 1 or the manner of its connection with the control device 2.
The 3D camera device 1 can comprise an infrared light source, a CMOS image sensor, and a scene depth processing module. The infrared light source can instead be an LED light source or a laser light source.
The infrared light source should meet the Class 1 safety requirements of the IEC 60825-1 standard. An LED or laser light source should likewise meet the safety requirements of the corresponding standards.
The CMOS image sensor is mainly used to receive the infrared light emitted by the infrared light source (or the LED light or laser emitted by an LED or laser light source), generate an infrared-light coded image (or LED-light or laser coded image) from the received light, and then transmit the generated coded image to the scene depth processing module.
The scene depth processing module is connected with the CMOS image sensor. It can be a PS1080 chip or another chip of similar function. The scene depth processing module is mainly used to process the coded infrared (or LED-light or laser) image, generate scene depth images frame by frame, and transmit the generated scene depth images to the control device 2.
The control device 2 is mainly used to recognize, from the scene depth images captured by the 3D camera device 1, the control command expressed by the operator's gesture (that is, to analyze which control command the operator wishes to express with the gesture) and to output that command, thereby controlling the controlled object without any physical control device. The control command output by the control device 2 can be provided to other modules in the human-computer interaction system, or to other equipment connected to the system.
The receiving module 21 in the control device 2 is connected, wired or wirelessly, with the 3D camera device 1. It is mainly used to receive, in a wired or wireless manner, the sequence of scene depth images transmitted by the 3D camera device 1. The receiving module 21 can specifically comprise a USB interface, signal lines, a cache medium, and so on.
The storage module 22 in the control device 2 is connected with the control chip 23. The storage module 22 can be memory or flash storage, etc. It stores the correspondence between gesture identification information and control commands; for example, it may store the correspondence between gesture index numbers and control commands. The control commands can be commands for a specific application in the human-computer interaction system, for example: switching the video being played on a mobile phone, paging forward or backward while browsing pictures on a tablet computer, closing the web page being browsed, or commands in a motion-sensing game.
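For illustration only (the patent does not specify a data structure), the gesture-index-to-command correspondence held by the storage module 22 could be as simple as a lookup table; the gesture indices and command names below are invented:

```python
# Hypothetical correspondence table: gesture index number -> control command.
GESTURE_COMMANDS = {
    1: "VIDEO_NEXT",      # e.g. switch the video playing on a phone
    2: "PAGE_FORWARD",    # page forward while browsing pictures
    3: "PAGE_BACKWARD",   # page backward
    4: "CLOSE_WEBPAGE",   # close the web page being browsed
}

def lookup_command(gesture_index):
    """Return the control command for a gesture index, or None if unknown."""
    return GESTURE_COMMANDS.get(gesture_index)

print(lookup_command(2))  # PAGE_FORWARD
```

An unrecognized index simply yields no command, which matches the behavior of matching only gestures whose identification information is stored in advance.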
The control chip 23 in the control device 2 is connected with both the receiving module 21 and the storage module 22. The control chip 23 is mainly used to compare the series of scene depth images received by the receiving module 21 in order to determine the operator's gesture in the images and the identification information of that gesture. It then looks up, in the correspondence between gesture identification information and control commands stored in advance in the storage module 22, the control command matching the determined gesture identification information, and outputs the matching control command to the human-computer interaction system.
The control chip 23 can start from the point in time at which the receiving module 21 first receives a scene depth image, and compare the scene depth images received within a predetermined period (e.g., 3 seconds) to determine the operator's gesture represented by the images received during that period. After the comparison for the period finishes, the control chip 23 can delete some or all of the compared images, for example the images received in the first second, and then continue to recognize the images received in the next predetermined period.
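The windowing scheme just described (compare the frames gathered over a predetermined period, then drop the oldest ones and continue) can be sketched as follows. This is an illustrative assumption, not the patent's implementation; the frame rate and window lengths are invented for the example.

```python
from collections import deque

FPS = 30       # assumed sampling frequency of the 3D camera device
WINDOW_S = 3   # predetermined comparison period, in seconds
DROP_S = 1     # oldest span deleted after each comparison, in seconds

class FrameWindow:
    """Accumulates depth frames; once a full window is present it is handed
    to a comparison routine, then the oldest DROP_S seconds of frames are
    deleted so recognition can continue on newer frames."""
    def __init__(self):
        self.frames = deque()

    def add(self, frame, compare):
        self.frames.append(frame)
        if len(self.frames) >= FPS * WINDOW_S:
            result = compare(list(self.frames))
            for _ in range(FPS * DROP_S):  # delete the first second of frames
                self.frames.popleft()
            return result
        return None

w = FrameWindow()
# Using len() as a stand-in comparison routine: it fires first at frame 90.
results = [w.add(i, compare=len) for i in range(120)]
```

With 30 frames per second, the first comparison fires once 90 frames (3 seconds) have accumulated; after each comparison the oldest 30 frames are dropped, so the next comparison fires 30 frames later.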
The control chip 23 can determine the operator's gesture by judging the overlap of the hand and forearm between adjacent images. For example, by comparing the overlap across a series of adjacent images, the control chip 23 may judge that the operator's hand moves from left to right, and thus determine the action to be a left-to-right wave; or it may judge that the hand moves cyclically in the vertical direction, and determine the action to be drawing a vertical circle with the hand. The above examples are all dynamic gestures; it should be noted that the gesture can also be static, such as an OK gesture.
The control chip 23 can store in advance the correspondence between motion information and gesture identification information. In this way, after determining the action performed by the operator, the control chip 23 can determine from this stored correspondence the gesture identification information to output.
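As a hypothetical sketch (not the patent's algorithm), classifying a left-to-right wave from the hand's position in successive frames might look like this, using normalized horizontal centroid coordinates extracted from each depth frame; the threshold is invented:

```python
def classify_wave(xs, min_travel=0.3):
    """Classify a dynamic wave gesture from the hand centroid's horizontal
    position in successive frames (normalized 0..1 coordinates).
    Returns a gesture ID string, or None if no wave is recognized."""
    if len(xs) < 2:
        return None
    steps = [b - a for a, b in zip(xs, xs[1:])]  # frame-to-frame motion
    travel = xs[-1] - xs[0]                      # net displacement
    if all(s >= 0 for s in steps) and travel >= min_travel:
        return "WAVE_LEFT_TO_RIGHT"
    if all(s <= 0 for s in steps) and travel <= -min_travel:
        return "WAVE_RIGHT_TO_LEFT"
    return None

print(classify_wave([0.1, 0.2, 0.35, 0.5, 0.7]))  # WAVE_LEFT_TO_RIGHT
```

A circular or static gesture would need a richer feature (e.g., vertical coordinates or a shape match), but the same idea applies: reduce the overlap between adjacent frames to a trajectory, then match it against known motion patterns.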
The control commands output by the control chip 23 (for example, to the human-computer interaction system) can include: switching the playing video backward; switching the playing video forward; turning the system volume up; turning the system volume down; changing the aspect ratio of the playing video; increasing the contrast of the playing video; or decreasing the contrast of the playing video, and so on. The format of the control commands the control chip 23 outputs to the human-computer interaction system should be a command format the system supports, and the control chip 23 can generate the commands using a protocol the system supports.
A concrete example of the operation performed by the control chip 23: after the receiving module 21 receives scene depth images, the control chip 23 determines from them the gesture identification information of a horizontal circle-drawing action, and then, according to the correspondence stored in the storage module 22, sends to the executive component in the tablet computer the control command to display the main menu of the operating system. Another concrete example: after the receiving module 21 receives scene depth images, the control chip 23 determines from them the gesture identification information of a left-to-right waving action, and then, according to the correspondence stored in the storage module 22, sends to the executive component in the tablet computer the control command to switch the playing video backward.
The control chip in this embodiment can be an FPGA chip, etc. This embodiment does not limit the concrete structure of the control chip 23, the concrete gestures of the operator determined by the control chip 23, the concrete way the control chip 23 determines gesture identification information, the concrete form of the correspondence stored in the control chip 23, the concrete manner in which the control chip 23 transmits control commands to the human-computer interaction system, or the concrete protocol the control commands use.
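Since the patent leaves the command format and protocol open, here is one hedged illustration of serializing a command into a host-supported message; the JSON shape and field names are invented for the example, and a real system would use whatever format its executive component actually accepts:

```python
import json

def encode_command(command, protocol_version=1):
    """Serialize a control command into a hypothetical message format.
    sort_keys makes the output deterministic for logging or comparison."""
    return json.dumps({"v": protocol_version, "cmd": command}, sort_keys=True)

msg = encode_command("VOLUME_UP")
print(msg)  # {"cmd": "VOLUME_UP", "v": 1}
```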
The above are only preferred embodiments of the utility model and do not limit the utility model in any form. Although the utility model is disclosed above by way of preferred embodiments, these are not intended to limit it. Any person skilled in the art may, without departing from the scope of the technical scheme of the utility model, use the technical content disclosed above to make minor changes or modifications into equivalent embodiments of equivalent variation; any simple amendment, equivalent change, or modification made to the above embodiments according to the technical essence of the utility model remains within the scope of the technical scheme of the utility model.

Claims (4)

1. A human-computer interaction system, characterized in that the system comprises a 3D camera device and a control device;
the 3D camera device transmits scene depth images captured in real time to the control device;
the control device comprises:
a receiving module, which receives the scene depth images transmitted by the 3D camera device;
a storage module, which stores the correspondence between gesture identification information and control commands; and
a control chip, connected to both the receiving module and the storage module, which outputs the control command in the storage module corresponding to the gesture identification information of the operator's gesture in the scene depth images received by the receiving module.
2. The human-computer interaction system of claim 1, characterized in that the 3D camera device comprises:
an infrared light source, an LED light source, or a laser light source;
a CMOS image sensor, which outputs an infrared-light, LED-light, or laser coded image; and
a scene depth processing module, connected with the CMOS image sensor, which receives the coded image output by the CMOS image sensor and outputs scene depth images to the control device.
3. The human-computer interaction system of claim 1 or 2, characterized in that the human-computer interaction system comprises a handheld electronic terminal device.
4. The human-computer interaction system of claim 3, characterized in that the handheld electronic terminal device comprises a mobile phone, notebook computer, tablet computer, or handheld game console.
CN2011204245086U 2011-10-31 2011-10-31 Human-machine interactive system Expired - Fee Related CN202433830U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011204245086U CN202433830U (en) 2011-10-31 2011-10-31 Human-machine interactive system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011204245086U CN202433830U (en) 2011-10-31 2011-10-31 Human-machine interactive system

Publications (1)

Publication Number Publication Date
CN202433830U true CN202433830U (en) 2012-09-12

Family

ID=46783266

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011204245086U Expired - Fee Related CN202433830U (en) 2011-10-31 2011-10-31 Human-machine interactive system

Country Status (1)

Country Link
CN (1) CN202433830U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103777750A (en) * 2012-10-23 2014-05-07 三星电子株式会社 Mobile system including image sensor, method of operating image sensor, and method of operating mobile system
CN106056994A (en) * 2016-08-16 2016-10-26 安徽渔之蓝教育软件技术有限公司 Assisted learning system for gesture language vocational education



Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20160425

Address after: 100176 Beijing street Rongchang Beijing economic and Technological Development Zone No. 5 Building No. 3, B zone 2

Patentee after: Beijing Yunhu Times Technology Co., Ltd.

Address before: 100176 Beijing city street Rongchang Daxing District Yizhuang Economic Development Zone No. 5 Longsheng building block C

Patentee before: Beijing Dexin Interactive Network Technology Co.,Ltd.

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120912

Termination date: 20191031

CF01 Termination of patent right due to non-payment of annual fee