CN103268408A - Multi-dimensional interaction platform

Multi-dimensional interaction platform

Info

Publication number
CN103268408A
Authority
CN
China
Prior art keywords
client
platform
sensor
dimensional
interaction platform
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2013101748034A
Other languages
Chinese (zh)
Inventor
Yang Heng (杨恒)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yunnan Ruipan Science & Technology Co Ltd
Original Assignee
Yunnan Ruipan Science & Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yunnan Ruipan Science & Technology Co Ltd filed Critical Yunnan Ruipan Science & Technology Co Ltd
Priority to CN2013101748034A priority Critical patent/CN103268408A/en
Publication of CN103268408A publication Critical patent/CN103268408A/en
Pending legal-status Critical Current

Abstract

The invention relates to an interaction platform for displaying animation, text, pictures and three-dimensional imagery, and in particular to a multi-dimensional interaction platform which comprises a client, a somatosensory sensor, a voice sensor and display units, the somatosensory sensor, the voice sensor and the display units being connected to the client. With this structure, the somatosensory sensor and the voice sensor collect the user's various control signals, such as gestures, body movements and voice commands; recognition and signal conversion are then performed by the central processing unit, and the control action is applied directly to the display unit. Because many display units are available, different display units can be selected for different demonstration situations. When using the platform, the demonstrator can control it through body movements and speech coordinated with the multi-dimensional presentation, achieving a smooth and complete demonstration.

Description

Multi-dimensional interaction platform
Technical field
The present invention relates to an interaction platform for displaying animation, text, pictures and three-dimensional imagery, and in particular to a multi-dimensional interaction platform.
Background technology
With the continuing development of the market economy, competition between manufacturers grows ever more intense. Every manufacturer's product is carefully refined before its formal release, but that alone is not enough to expand the market; considerable effort and money must also be spent to let customers understand the product's features and highlights in depth and to arouse the desire to purchase. A product must not only be well made, it must also be well presented when it is promoted. Government agencies, enterprises and other institutions likewise need to present their image and show their achievements in international exchanges.
In the prior art, achievements are usually demonstrated with PPT slides, sometimes together with video, accompanied by a live presenter. Such an interaction platform is rather dull and troublesome to operate; in particular, when various operations are needed halfway through a demonstration, the atmosphere is broken and the flow of the demonstration is interrupted, so the purpose of a full demonstration is not achieved.
Summary of the invention
The object of the present invention is to provide a multi-dimensional interaction platform that offers multiple means of control, is convenient to operate, and keeps the demonstration smooth and complete.
The technical solution adopted by the present invention to solve the technical problem is a multi-dimensional interaction platform comprising a client, a somatosensory sensor, a voice sensor and a display unit, wherein the somatosensory sensor, the voice sensor and the display unit are each connected to the client.
In a further arrangement of the present invention, the platform also comprises a feature recognition module, and the feature recognition module is connected to the client.
In a further arrangement of the present invention, the client comprises a central processing unit and a control module, and the somatosensory sensor and the voice sensor are each connected to the control module.
In a further arrangement of the present invention, the display unit and the feature recognition module are each connected to the central processing unit.
In a further arrangement of the present invention, the display unit is one of, or a combination of two or more of, a projector, a three-dimensional projector, a monitor and a touch-screen monitor.
With the above structure, the somatosensory sensor and the voice sensor collect the user's various control signals, such as gestures, body movements and voice commands; the central processing unit then performs recognition and signal conversion, and the control action is applied directly to the display unit. Because many display units are available, different display units can be selected for different demonstration situations. When using the platform, the demonstrator can therefore control it through body movements or speech, coordinated with the multi-dimensional presentation, to achieve a smooth and complete demonstration.
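As an informal illustration of this control path (not part of the original patent text), the following Python sketch maps recognized sensor signals to display commands; ControlSignal, recognize and MockDisplay are hypothetical stand-ins for the recognition, signal conversion and display-unit roles described above.

from dataclasses import dataclass

@dataclass
class ControlSignal:
    source: str    # "gesture", "body" or "voice"
    command: str   # e.g. "zoom_in", "next_page", "select_menu_item"

def recognize(raw_event) -> ControlSignal:
    # Stand-in for the recognition and signal conversion done by the central processing unit.
    source, payload = raw_event
    return ControlSignal(source=source, command=payload)

class MockDisplay:
    # Stand-in for a projector, 3D projector, monitor or touch-screen monitor.
    def execute(self, command: str) -> None:
        print(f"display <- {command}")

def control_loop(events, display) -> None:
    # events: iterable of (source, payload) pairs produced by the sensors.
    for event in events:
        display.execute(recognize(event).command)

if __name__ == "__main__":
    control_loop([("gesture", "zoom_in"), ("voice", "next_page")], MockDisplay())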
Description of drawings
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic block diagram of the present invention;
Fig. 2 is a schematic block diagram of the present embodiment.
Embodiment
As shown in Fig. 1, a multi-dimensional interaction platform of the present invention comprises a client, a somatosensory sensor, a voice sensor and a display unit; the somatosensory sensor, the voice sensor and the display unit are each connected to the client, and a feature recognition module is also connected to the client.
As shown in Fig. 2, the client comprises a central processing unit and a control module; the somatosensory sensor and the voice sensor are each connected to the control module, and the display unit and the feature recognition module are each connected to the central processing unit.
The display unit in the present invention is one of, or a combination of two or more of, a projector, a three-dimensional projector, a monitor and a touch-screen monitor.
Taking the three-dimensional projector as an example, the working process of the present invention is described below:
Step a: on the hardware side, a computer is used as the client, and the somatosensory sensor, the voice sensor and the three-dimensional projector are connected externally to the computer.
Step b: through the somatosensory sensor, the client captures the user's gestures, body movements and facial information, and the voice sensor captures the speech uttered by the user. The client recognizes the captured content: first, the feature recognition module recognizes the user's face to confirm the user's identity (identification) and thereby determines whether the user is allowed to log in to the client and operate it.
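The patent does not describe how the face check is implemented; as one possible sketch, the following Python code uses the open-source face_recognition package and assumes a hypothetical directory of enrolled user photos, one image per user.

import os
from typing import Optional

import face_recognition  # third-party package; one possible basis for the feature recognition module

def load_enrolled_users(photo_dir: str) -> dict:
    # Build a name -> face-encoding table from enrolled photos (hypothetical layout: <name>.jpg per user).
    known = {}
    for filename in os.listdir(photo_dir):
        name, _ = os.path.splitext(filename)
        image = face_recognition.load_image_file(os.path.join(photo_dir, filename))
        encodings = face_recognition.face_encodings(image)
        if encodings:
            known[name] = encodings[0]
    return known

def identify(frame, known: dict) -> Optional[str]:
    # Return the enrolled user's name if the captured camera frame matches, otherwise None (login refused).
    encodings = face_recognition.face_encodings(frame)
    if not encodings:
        return None
    for name, reference in known.items():
        if face_recognition.compare_faces([reference], encodings[0])[0]:
            return name
    return None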
Step c: after logging in, the user can present his or her corporate image, products to be promoted, or any other multimedia content on the platform. The company profile, product and project information, company news, the organisational structure of the enterprise, sales reports and so on can be rendered in three dimensions by the three-dimensional projector, and the demonstrator can apply virtual actions and commands directly to the three-dimensional imagery. All of this material can be stored in a database and read out as required for display on the platform.
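As an illustration of reading stored material back for display (the patent only states that the data sit in a database), the sketch below uses the psycopg2 PostgreSQL driver; the connection string and the presentation_content table are assumptions made for this example.

import psycopg2  # PostgreSQL driver

def fetch_content(category: str):
    # Read the stored material (company profile, products, reports, ...) for one category.
    conn = psycopg2.connect("dbname=platform user=demo password=demo host=localhost")
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT title, media_path FROM presentation_content WHERE category = %s",
                (category,),
            )
            return cur.fetchall()
    finally:
        conn.close()

# e.g. rows = fetch_content("sales_reports") before handing the material to the 3D projector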
Step d: while the present invention is running, the somatosensory sensor captures the user's gestures, body movements, facial information and voice; after the signals are converted by the control module, the central processing unit understands the person's intent. The user can control the course and content of the presentation with gestures, body movements and voice commands, for example zooming the current graphic in or out, turning the pages of the displayed content, selecting the content to display from a menu, and coordinating these operations with the three-dimensional imagery.
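A minimal sketch of how such recognized intents could be mapped to the presentation operations named above; the Presentation class and the command names are hypothetical and only illustrate the dispatch idea.

class Presentation:
    # Hypothetical presentation state driven by gesture and voice intents.
    def __init__(self, pages: int):
        self.page = 0
        self.pages = pages
        self.zoom = 1.0

    def zoom_in(self) -> None:
        self.zoom *= 1.25

    def zoom_out(self) -> None:
        self.zoom *= 0.8

    def next_page(self) -> None:
        self.page = min(self.page + 1, self.pages - 1)

    def previous_page(self) -> None:
        self.page = max(self.page - 1, 0)

COMMANDS = {
    "zoom_in": Presentation.zoom_in,
    "zoom_out": Presentation.zoom_out,
    "next_page": Presentation.next_page,
    "previous_page": Presentation.previous_page,
}

def handle_intent(presentation: Presentation, intent: str) -> None:
    # Intents the platform does not understand are ignored.
    action = COMMANDS.get(intent)
    if action is not None:
        action(presentation)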
For example, when the displayed content is a three-dimensional animation, the graphic can be scaled, rotated and tilted, and dragged to a different position, and the graphic can carry animation effects. During the presentation, menus prompt the different levels of content, and the user selects the content to show by gesture or voice; the menus use polished animation effects to guide the user's operation, and each step of the presentation has a corresponding animation prompt for the operation, which ensures the quality of the demonstration.
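The scaling, rotation and dragging of the displayed figure can be expressed as an ordinary homogeneous model transform; the NumPy sketch below only illustrates that idea and is not taken from the patent.

import numpy as np

def model_matrix(scale: float, angle_deg: float, offset) -> np.ndarray:
    # 4x4 homogeneous matrix combining scaling, rotation about the vertical axis and a drag offset.
    a = np.radians(angle_deg)
    rotation = np.array([[np.cos(a), -np.sin(a), 0.0, 0.0],
                         [np.sin(a),  np.cos(a), 0.0, 0.0],
                         [0.0,        0.0,       1.0, 0.0],
                         [0.0,        0.0,       0.0, 1.0]])
    scaling = np.diag([scale, scale, scale, 1.0])
    translation = np.eye(4)
    translation[:3, 3] = offset
    return translation @ rotation @ scaling  # scale, then rotate, then drag

# e.g. a "zoom in" gesture might rebuild the matrix with scale * 1.25:
# m = model_matrix(1.25, 30.0, [0.1, 0.0, 0.0])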
Of course, the present invention is a platform, and different platforms can interact with one another through transmission by setting up a server; PostgreSql 8.0 and the IIS service need to be installed and configured on the server. In this way, even when a demonstration is given on one platform, the other platforms show exactly the same demonstration, so an operation performed on one platform is reproduced synchronously on the others. This is very convenient to operate and suits the current course of social development.
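The patent does not specify the transmission protocol between platforms; one plausible sketch is an HTTP state endpoint hosted by the server (the URL below is hypothetical), to which the operated platform publishes its presentation state and from which the other platforms poll it.

import requests  # one possible transport for the platform-to-platform transmission

SERVER_STATE_URL = "http://demo-server.example/state"  # hypothetical endpoint hosted behind IIS

def publish_state(page: int, zoom: float) -> None:
    # Called by the platform that is currently being operated.
    requests.post(SERVER_STATE_URL, json={"page": page, "zoom": zoom}, timeout=2)

def poll_state() -> dict:
    # Called periodically by the other platforms so they mirror the same demonstration.
    response = requests.get(SERVER_STATE_URL, timeout=2)
    response.raise_for_status()
    return response.json()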
Obviously, the above embodiment is only an example given for the sake of clear illustration and is not a limitation on the possible embodiments. Those of ordinary skill in the art can make other changes in different forms on the basis of the above description; it is neither necessary nor possible to list all embodiments exhaustively here, and obvious variations or changes derived from the above still fall within the protection scope of the present invention.

Claims (7)

1. A multi-dimensional interaction platform, characterized by comprising a client, a somatosensory sensor, a voice sensor and a display unit, wherein the somatosensory sensor, the voice sensor and the display unit are each connected to the client.
2. The multi-dimensional interaction platform according to claim 1, characterized by further comprising a feature recognition module, wherein the feature recognition module is connected to the client.
3. The multi-dimensional interaction platform according to claim 1 or 2, characterized in that the client comprises a central processing unit and a control module, and the somatosensory sensor and the voice sensor are each connected to the control module.
4. The multi-dimensional interaction platform according to claim 3, characterized in that the display unit and the feature recognition module are each connected to the central processing unit.
5. The multi-dimensional interaction platform according to claim 1 or 2, characterized in that the display unit is one of, or a combination of two or more of, a projector, a three-dimensional projector, a monitor and a touch-screen monitor.
6. The multi-dimensional interaction platform according to claim 3, characterized in that the display unit is one of, or a combination of two or more of, a projector, a three-dimensional projector, a monitor and a touch-screen monitor.
7. The multi-dimensional interaction platform according to claim 4, characterized in that the display unit is one of, or a combination of two or more of, a projector, a three-dimensional projector, a monitor and a touch-screen monitor.
CN2013101748034A 2013-05-13 2013-05-13 Multi-dimensional interaction platform Pending CN103268408A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2013101748034A CN103268408A (en) 2013-05-13 2013-05-13 Multi-dimensional interaction platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2013101748034A CN103268408A (en) 2013-05-13 2013-05-13 Multi-dimensional interaction platform

Publications (1)

Publication Number Publication Date
CN103268408A true CN103268408A (en) 2013-08-28

Family

ID=49012036

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2013101748034A Pending CN103268408A (en) 2013-05-13 2013-05-13 Multi-dimensional interaction platform

Country Status (1)

Country Link
CN (1) CN103268408A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103702100A (en) * 2013-12-17 2014-04-02 Tcl商用信息科技(惠州)股份有限公司 3D (three-dimensional) display method and 3D display system for scenario
CN104683720A (en) * 2013-11-28 2015-06-03 联想(北京)有限公司 Electronic equipment and control method
CN106716501A (en) * 2016-12-12 2017-05-24 深圳前海达闼云端智能科技有限公司 Visual decoration design method, apparatus therefor, and robot


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100271458A1 (en) * 2009-04-28 2010-10-28 Yashesh Shethia Multi-Input-Driven Entertainment and Communication Console With Minimum User Mobility
CN102306051A (en) * 2010-06-18 2012-01-04 微软公司 Compound gesture-speech commands
CN102707797A (en) * 2011-03-02 2012-10-03 微软公司 Controlling electronic devices in a multimedia system through a natural user interface
CN202749066U (en) * 2012-03-09 2013-02-20 无锡华轩信息科技有限公司 Non-contact object-showing interactive system
CN102945672A (en) * 2012-09-29 2013-02-27 深圳市国华识别科技开发有限公司 Voice control system for multimedia equipment, and voice control method
CN103049618A (en) * 2012-12-30 2013-04-17 江南大学 Intelligent home displaying method on basis of Kinect

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XU CHAO (徐超): "Research on Interactive Display of Online Three-Dimensional Virtual Scenes Based on ShiVa3D", Software Guide (《软件导刊》) *


Similar Documents

Publication Publication Date Title
US11080520B2 (en) Automatic machine recognition of sign language gestures
US10817760B2 (en) Associating semantic identifiers with objects
CN106325856B (en) A kind of method and system for realizing Elasticsearch Dsl rule visual edit and data exhibiting
US20140253431A1 (en) Providing a gesture-based interface
US9690784B1 (en) Culturally adaptive avatar simulator
JP2019503004A5 (en)
JP2013196157A5 (en)
CN106415446A (en) Accessibility detection of content properties through tactile interactions
JP2013080326A5 (en)
CN109782706A (en) Exhibition room control system and method, Cloud Server and terminal control equipment
CN104410911A (en) Video emotion tagging-based method for assisting identification of facial expression
Khan et al. Reframing HRI design opportunities for social robots: Lessons learnt from a service robotics case study approach using UX for HRI
US20170228034A1 (en) Method and apparatus for providing interactive content
CN103268408A (en) Multi-dimensional interaction platform
CN106796487A (en) Interacted with the user interface element for representing file
Paravati et al. Human-computer interaction in smart environments
CN202142050U (en) Interactive customer reception system
CN102004632A (en) Method and device for setting time information
CN108388399B (en) Virtual idol state management method and system
KR20210139203A (en) Commodity guiding method, apparatus, device and storage medium and computer program
KR101964192B1 (en) Smart table apparatus for simulation
Machidori et al. Implementation of multi-modal interface for VR application
JP6977408B2 (en) Information processing system, terminal device, information processing method and information processing program
US9131107B2 (en) Telepresence device communication and control system
CN107908385B (en) Holographic-based multi-mode interaction system and method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20130828