CN201540535U - Non-contact human-computer interaction system based on blue point identification - Google Patents

Info

Publication number
CN201540535U
Authority
CN
China
Prior art keywords
display device
data
blue
image
system based
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2009202081883U
Other languages
Chinese (zh)
Inventor
陈林志
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SHANGHAI AISHUO SOFTWARE TECHNOLOGY CO LTD
Original Assignee
SHANGHAI AISHUO SOFTWARE TECHNOLOGY CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SHANGHAI AISHUO SOFTWARE TECHNOLOGY CO LTD filed Critical SHANGHAI AISHUO SOFTWARE TECHNOLOGY CO LTD
Priority to CN2009202081883U
Application granted
Publication of CN201540535U
Anticipated expiration
Current legal status: Expired - Fee Related

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

A non-contact human-computer interaction system based on blue point identification comprises a motion capture device, an engine data processor, gloves provided with blue identification blocks, a large-screen display device, and an auxiliary audio system. The motion capture device, mounted above the display device, captures and acquires the image and motion data of a participant. The engine data processor, connected with the display device, receives the data captured by the motion capture device, analyzes the acquired image and motion data, and controls the signal output of the display device. The system further comprises identification devices worn or held by participants; the motion capture device acquires the image and motion data of these identification devices, and the audio system outputs the audio data. The system makes full use of the naturalness, directness and convenience of the visual channel in acquiring and expressing information, making the human-computer interaction process simpler, faster and easier to operate.

Description

A non-contact human-computer interaction system based on blue point identification
Technical field
The utility model relates to a non-contact human-computer interaction system, belongs to the field of human-computer interaction, and in particular relates to a non-contact human-computer interaction system based on blue identification points.
Background art
Human-computer interaction technology (Human-Computer Interaction Techniques) refers to the technology that enables dialogue between people and computers in an effective way through computer input and output devices. It covers the machine supplying the person with large amounts of relevant information, prompts and requests through output or display devices, and the person supplying the machine with relevant information and answers through input devices. Human-computer interaction technology is one of the important topics in computer user interface design, and is closely related to disciplines such as cognitive science, ergonomics and psychology.
Multimedia human-computer interaction technology is the combination of multimedia technology and human-computer interaction technology. Diversifying the representation of information and interacting with the computer through multiple input and output devices are its central concerns. Multimedia human-computer interaction builds on new interaction techniques such as eye tracking, speech recognition, gesture input and sensory feedback.
However, the bottleneck of existing non-contact human-computer interaction systems based on blue identification points is that their recognition accuracy is not high, which degrades the user experience.
In view of this, it is necessary to provide a new non-contact human-computer interaction system based on blue identification points to solve the above technical problems.
Summary of the utility model
The purpose of the utility model is to provide a non-contact human-computer interaction system based on blue identification points that makes the human-computer interaction process simple, fast and easy to operate.
To solve the above technical problems, the utility model adopts the following technical solution: a non-contact human-computer interaction system based on blue identification points, comprising a display device provided with audio equipment; a motion collector mounted above the display device, for capturing and acquiring the participant's image and motion data; and a data processor connected with the motion collector, for storing and analyzing the acquired image and motion data. The data processor is connected with the display device and controls the signal output of the display device. The system further comprises an identification device used by the participant, and the motion collector acquires the image and motion data of this identification device.
Preferably, the motion collector is one or more video cameras or cameras.
Preferably, the identification device used by the participant is a glove provided with blue identification points.
Preferably, the identification device used by the participant is a pen provided with a blue identification point.
The non-contact human-computer interaction system based on blue identification points of the utility model makes full use of the naturalness, directness and convenience of the visual channel in acquiring and expressing information, making the human-computer interaction process simpler, faster and easier to operate.
Description of drawings
Fig. 1 is a schematic diagram of the system of the utility model.
Fig. 2 is a structural schematic diagram of an identification device of the utility model.
Embodiment
A non-contact human-computer interaction system based on blue point identification comprises a display device provided with audio equipment; a motion collector mounted above the display device, for capturing and acquiring the participant's image and motion data; and a data processor connected with the motion collector, for storing and analyzing the acquired image and motion data. The data processor is connected with the display device and controls the signal output of the display device. The system further comprises an identification device used by the participant, and the motion collector acquires the image and motion data of this identification device.
The motion collector is one or more video cameras or cameras.
The identification device used by the participant may be a glove 1 provided with blue identification points 2.
The identification device used by the participant may also be a pen provided with a blue identification point.
Specifically, the system of the utility model consists of three parts: the motion collector, the game engine, and the large-screen display with audio equipment. The motion collector captures and acquires the participant's image and motion data.
The game engine is the core of the system. It is the data processor that handles the real-time interaction between the player and the various game effects: it analyzes the participant's acquired image and motion data, drives the game-stage effects, and controls in real time the interactive visual effects and sound effects output to the large-screen display.
The game engine consists of four modules: video acquisition, video synthesis, video image analysis (blue point identification), and a module that uses the extracted information to control the changes and responses of graphic elements in 3D space, completing the designed game flow; a processing-loop sketch is shown below.
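As an illustration only (not taken from the patent), the following Python sketch shows one way such a four-stage loop could be wired together. The callables `capture_blocks`, `stitch_blocks`, `find_blue_point` and `update_scene` are hypothetical stand-ins for the modules; concrete sketches of capture, stitching and blue point detection follow in the next sections.

```python
import numpy as np

# Illustrative only: one possible control loop chaining the four game-engine
# modules described above. The callables passed in are hypothetical stand-ins.

def run_engine(capture_blocks, stitch_blocks, find_blue_point, update_scene,
               max_frames=None):
    """Run acquisition -> synthesis -> analysis -> 3D scene control in a loop."""
    frame_count = 0
    while max_frames is None or frame_count < max_frames:
        blocks = capture_blocks()          # video acquisition module
        frame = stitch_blocks(blocks)      # video synthesis module
        point = find_blue_point(frame)     # video image analysis (blue point)
        if point is not None:
            update_scene(point)            # drive graphic elements in 3D space
        frame_count += 1

if __name__ == "__main__":
    # Smoke test with trivial stand-ins: three blank frames, no blue point found.
    frames = iter([np.zeros((120, 160, 3), dtype=np.uint8)] * 3)
    run_engine(
        capture_blocks=lambda: [next(frames)],
        stitch_blocks=lambda blocks: blocks[0],
        find_blue_point=lambda frame: None,
        update_scene=lambda point: None,
        max_frames=3,
    )
```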
The video acquisition module obtains data from the camera and cuts out image blocks of the required size through a video capture card. The hardware is a Tianmin (天敏) SDK 3000 PCI capture card; after the card and its driver kit are installed, the camera frames can be read.
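The patent names a specific PCI capture card; as a hedged stand-in, the sketch below uses OpenCV's generic `VideoCapture` interface (an assumption, since most cameras and capture cards expose such an interface to the operating system) to show what reading a frame and cutting out a block of the required size can look like.

```python
import cv2

# Minimal sketch, assuming a camera or capture card visible to the OS as device 0
# through OpenCV's VideoCapture. The patent's Tianmin SDK 3000 vendor SDK is not
# shown here; this is a generic stand-in for the video acquisition module.

def capture_block(device_index=0, width=640, height=480):
    """Grab one frame and crop a centered image block of the requested size."""
    cap = cv2.VideoCapture(device_index)
    if not cap.isOpened():
        raise RuntimeError("capture device not available")
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("failed to read a frame")
    h, w = frame.shape[:2]
    x0 = max((w - width) // 2, 0)
    y0 = max((h - height) // 2, 0)
    return frame[y0:y0 + height, x0:x0 + width]

if __name__ == "__main__":
    print("captured block shape:", capture_block().shape)
```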
The image synthesis unit is used when the scene to be captured is very large, or when the capture distance is too short for a single camera to cover the whole interactive picture; in these cases the images obtained from several cameras need to be stitched together, so image synthesis is required. (The present system uses a single camera, but multi-camera stitching is also supported.)
For example, four W×H image blocks A1, A2, A3 and A4 can be combined in different ways to obtain an image block B, and the composition of B can take the following states.
Combination modes: A1, A2, A3, A4, A1A2, A3A4.
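As a hedged sketch of the synthesis step (assuming the blocks are already aligned and equally sized, which the patent does not elaborate on), the combinations listed above reduce to simple array concatenation:

```python
import numpy as np

# Minimal sketch, assuming pre-aligned W x H blocks. Real multi-camera stitching
# would also need calibration and overlap blending, which the patent does not detail.

def stitch_horizontal(left, right):
    """Join two equally tall blocks side by side, e.g. B = A1A2."""
    return np.hstack([left, right])

def stitch_grid(a1, a2, a3, a4):
    """Join four blocks into a 2 x 2 mosaic: top row A1A2, bottom row A3A4."""
    return np.vstack([np.hstack([a1, a2]), np.hstack([a3, a4])])

if __name__ == "__main__":
    w, h = 160, 120
    a1, a2, a3, a4 = (np.full((h, w, 3), v, dtype=np.uint8) for v in (0, 64, 128, 192))
    print(stitch_horizontal(a1, a2).shape)    # (120, 320, 3)
    print(stitch_grid(a1, a2, a3, a4).shape)  # (240, 320, 3)
```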
The image analysis module is the core module of the system. After an image block is obtained, an image contour analysis is performed to find the regions of the image with pronounced boundary features. The touch detection and analysis module then derives control instructions from these features and sends them to the main program.
Traditional camera-based capture of in-air limb movements cannot localize accurately. The present system uses blue point detection, which yields coordinates accurately and efficiently. The blue markers can be carried on gloves, a pen holder or other props. After the video image analysis, the extracted information controls the changes and responses of graphic elements in 3D space.
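The patent does not publish its detection method or thresholds; as a hedged sketch, blue point detection can be done by thresholding the frame in HSV space and taking the centroid of the largest blue contour. The HSV bounds below are assumed typical values for saturated blue, not values from the patent, and would need tuning for the actual glove or pen prop.

```python
import cv2
import numpy as np

# Illustrative blue point detection: HSV threshold -> largest contour -> centroid.
# BLUE_LOW / BLUE_HIGH are assumed bounds, not values from the patent.
BLUE_LOW = np.array([100, 120, 70])
BLUE_HIGH = np.array([130, 255, 255])

def find_blue_point(frame_bgr, min_area=50):
    """Return the (x, y) centroid of the largest blue region, or None if absent."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, BLUE_LOW, BLUE_HIGH)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < min_area:
        return None
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

if __name__ == "__main__":
    test = np.zeros((120, 160, 3), dtype=np.uint8)
    cv2.circle(test, (80, 60), 10, (255, 0, 0), -1)  # a pure-blue dot in BGR
    print(find_blue_point(test))  # expected near (80, 60)
```

The returned image coordinate can then be scaled to the resolution of the projected scene to drive the graphic elements, matching the scene-control step described above.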
In use, what the player faces is a virtual game scene projected onto the wall. Freed from the limitations of mouse operation, the player stands before the large display and is immersed in a scene such as an undersea world or a beautiful spring landscape. A child standing in front of the screen and waving a prop carrying blue markers can control the elements of the picture, interacting with the game elements on the big screen and creating a feeling of magic and mystery. Different game flows cultivate the child's hands-on, cognitive and logical-thinking abilities.
Beautiful animated background scenes are matched with the corresponding game theme music, and the player's in-air movements conduct the action, building a magical atmosphere, so that while playing, children also learn to solve problems and to observe and learn from real life, thereby improving their emotional abilities.
The above description and applications of the utility model are illustrative and are not intended to limit the scope of the utility model to the embodiments described above. Variations and changes of the embodiments disclosed here are possible, and replacements and equivalents of the various components of the embodiments are known to those of ordinary skill in the art. Those skilled in the art will appreciate that the utility model can be realized in other forms without departing from its spirit or essential characteristics. Other variations and changes can likewise be made to the embodiments disclosed here without departing from the scope and spirit of the utility model.

Claims (4)

1. A non-contact human-computer interaction system based on blue point identification, characterized in that: the system comprises a display device provided with audio equipment; a motion collector mounted above the display device, for capturing and acquiring the participant's image and motion data; and a data processor connected with the motion collector, for storing and analyzing the acquired image and motion data; the data processor is connected with the display device and controls the signal output of the display device; the system further comprises an identification device used by the participant, and the motion collector acquires the image and motion data of this identification device.
2. The non-contact human-computer interaction system based on blue point identification according to claim 1, characterized in that: the motion collector is one or more video cameras or cameras.
3. The non-contact human-computer interaction system based on blue point identification according to claim 1, characterized in that: the identification device used by the participant is a glove provided with blue identification points.
4. The non-contact human-computer interaction system based on blue point identification according to claim 1, characterized in that: the identification device used by the participant is a pen provided with a blue identification point.
CN2009202081883U 2009-08-20 2009-08-20 Non-contact human-computer interaction system based on blue point identification Expired - Fee Related CN201540535U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009202081883U CN201540535U (en) 2009-08-20 2009-08-20 Non-contact human-computer interaction system based on blue point identification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2009202081883U CN201540535U (en) 2009-08-20 2009-08-20 Non-contact human-computer interaction system based on blue point identification

Publications (1)

Publication Number Publication Date
CN201540535U true CN201540535U (en) 2010-08-04

Family

ID=42591996

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009202081883U Expired - Fee Related CN201540535U (en) 2009-08-20 2009-08-20 Non-contact human-computer interaction system based on blue point identification

Country Status (1)

Country Link
CN (1) CN201540535U (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102739640A (en) * 2012-04-06 2012-10-17 北京千松科技发展有限公司 Novel omnimedia popular science window
CN104731334A (en) * 2015-03-26 2015-06-24 广东工业大学 Spatial gesture interactive type maritime silk road dynamic history GIS and implementation method
CN108628767A (en) * 2013-02-26 2018-10-09 搜诺思公司 The pre-cache of audio content


Similar Documents

Publication Publication Date Title
Schiel et al. The SmartKom Multimodal Corpus at BAS.
Wang et al. EGGNOG: A continuous, multi-modal data set of naturally occurring gestures with ground truth labels
Gunes et al. Is automatic facial expression recognition of emotions coming to a dead end? The rise of the new kids on the block
CN104777911B (en) A kind of intelligent interactive method based on holographic technique
CN103440033B (en) A kind of method and apparatus realizing man-machine interaction based on free-hand and monocular cam
CN107024989A (en) A kind of husky method for making picture based on Leap Motion gesture identifications
CN104598027B (en) A kind of motion sensing control multi-media Training System based on user behavior analysis
Frank et al. Engagement detection in meetings
CN106293099A (en) Gesture identification method and system
CN111103982A (en) Data processing method, device and system based on somatosensory interaction
CN108595012A (en) Visual interactive method and system based on visual human
CN201540535U (en) Non-contact human-computer interaction system based on blue point identification
CN109739353A (en) A kind of virtual reality interactive system identified based on gesture, voice, Eye-controlling focus
CN103170115A (en) Interactive table tennis system
CN102819751A (en) Man-machine interaction method and device based on action recognition
CN108664119A (en) A kind of configuration body-sensing acts the method and device of the mapping relations between pseudo operation
CN112149599B (en) Expression tracking method and device, storage medium and electronic equipment
Zhang Computer-assisted human-computer interaction in visual communication
CN108388399A (en) The method of state management and system of virtual idol
CN104766355A (en) Splash-color painting interactive system based on handwriting analysis and method for generating digital splash-color painting in real time through system
CN104064064A (en) Teaching assistance system based on multi-mode man-machine interaction technology
CN203085064U (en) Virtual starry sky teaching device
CN110838357A (en) Attention holographic intelligent training system based on face recognition and dynamic capture
CN205460934U (en) Augmented reality game station based on motion capture
CN206991240U (en) A kind of man-machine interactive system based on virtual reality technology

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20100804

Termination date: 20110820