CN205007551U - Human-computer interaction system based on virtual reality technology - Google Patents

Human-computer interaction system based on virtual reality technology

Info

Publication number
CN205007551U
Authority
CN
China
Prior art keywords
virtual reality
sensor
glasses
display screen
touch sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201520627434.4U
Other languages
Chinese (zh)
Inventor
胡金晖
武健
朱锐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen You Shi Virtual Reality Technology Co Ltd
Original Assignee
Shenzhen You Shi Virtual Reality Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen You Shi Virtual Reality Technology Co Ltd
Priority to CN201520627434.4U
Application granted
Publication of CN205007551U


Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The utility model relates to a human-computer interaction system based on virtual reality technology. The system includes virtual reality glasses comprising a glasses housing, a stereoscopic lens, a display screen socket, a display screen and a headband; the display screen is connected to the virtual reality glasses through the display screen socket, and the stereoscopic lens is arranged between the user's eyes and the display screen. The system further includes a glasses control unit and a touch sensor. The utility model provides a human-computer interaction system with a lifelike sense of immersion and good interactivity.

Description

A human-computer interaction system based on virtual reality technology
Technical field
The utility model belongs to the field of virtual reality and relates to a human-computer interaction system, in particular to a human-computer interaction system based on virtual reality technology.
Background technology
In recent years, owing to its real-time and lifelike form of content presentation, virtual reality technology has drawn many companies in popular industries such as gaming and film to invest heavily in developing virtual reality products. As the key driving force behind virtual reality's entry into the market, human-computer interaction that is convenient and matches the characteristics of human interaction has become a new focus area.
In a virtual reality system, the user is usually required to wear virtual reality glasses, and a handheld remote control unit or sensors placed at various positions on the body are used to track the user's motion; this motion is finally synchronized into the virtual reality scene to enhance the user's sense of immersion. Existing virtual reality products, however, share a problem. Human perception consists of sight, sound, color, taste and touch, and the more real-time and realistic the sensory feedback the body receives, the more convincing the immersion in the virtual world becomes. In interaction with the surrounding environment in particular, haptic feedback accounts for a very significant proportion, so visual and auditory experience alone cannot give the user a truly immersive, first-person experience. A better human-computer interaction system is therefore needed to solve the above problems.
Utility model content
In order to solve the problem of haptic feedback in human-computer interaction mentioned in the background, the utility model proposes a human-computer interaction system based on virtual reality technology. When the user needs to interact with the virtual world, the user only needs to operate on a touch sensor. The touch sensor senses the status signals produced during the user's motion, such as movement speed, direction of motion, acceleration and motion frequency; these status signals are sent to the virtual reality glasses through a data transmission unit, processed by the glasses control unit, and then synchronized into the virtual reality scene on the display screen, completing the human-computer interaction. The utility model aims to solve the problems that virtual reality products offer weak interaction and a poor sense of immersion. The prior art offers no method for solving this problem. The simplest conceivable approach is to add haptic feedback, but doing so would correspondingly reduce the visual and auditory experience; the utility model therefore starts from better user experience and addresses the problem in combination with virtual reality technology.
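Purely as an illustration of the status signal described above, the motion state sensed by the touch sensor could be packaged into a small record before being handed to the data transmission unit. The field names, units and the `transmit` callback below are assumptions for the sake of the sketch, not part of the utility model.

```python
from dataclasses import dataclass

@dataclass
class MotionStatus:
    """Illustrative packaging of the status signal produced while the user
    operates the touch sensor; field names and units are assumptions."""
    speed: float         # movement speed, e.g. mm/s
    direction: float     # direction of motion, e.g. degrees on the sensor plane
    acceleration: float  # acceleration of motion, e.g. mm/s^2
    frequency: float     # motion frequency, e.g. Hz

def send_to_glasses(status: MotionStatus, transmit):
    """Hand the sensed state to the data transmission unit; `transmit` stands
    in for the wireless link to the glasses control unit."""
    transmit(status)
```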
The technical solution of the utility model is: a human-computer interaction system based on virtual reality technology, comprising virtual reality glasses; the virtual reality glasses comprise a glasses housing, a stereoscopic lens, a display screen socket, a display screen and a headband; the display screen is connected to the virtual reality glasses through the display screen socket; the stereoscopic lens is arranged between the user's eyes and the display screen; its distinguishing features are:
The system further comprises a glasses control unit and a touch sensor;
The glasses control unit comprises an inertial sensor, a first data processing unit, a first power supply unit, a first data transmission unit and control buttons;
The inertial sensor senses changes in the position and attitude of the user's head, including acceleration, angular velocity and magnetic flux;
The first data processing unit converts the raw data sensed by the inertial sensor into position and attitude information of the user's head during movement; the inertial sensor comprises an acceleration sensor and an angular velocity sensor;
The angular velocity sensor obtains three-axis angular increment information of the user's head in space;
The acceleration sensor obtains three-axis instantaneous acceleration information of the user's head in space; combined with gravity information, trigonometric functions are used to obtain the angle of each axis and the gravity component, which are compared with the angles obtained by the angular velocity sensor for correction;
The control buttons include up, down, left and right direction controls as well as select, switch and return command controls;
The first power supply unit supplies electric power to the display screen;
The first data transmission unit receives data from the touch sensor and sends control instructions to the touch sensor;
The touch sensor comprises a sensor array, a second data processing unit, a second power supply unit and a second data transmission unit;
The sensor array senses the action signals or human body attitude signals produced when the user operates the touch sensor;
The action signals and attitude signals all originate from the user; they are sent through the inertial sensor and the sensor array to the first data processing unit and the second data processing unit respectively for signal processing; the action signals and attitude signals include clicking, double-clicking, sliding, flicking, moving, translation and circular motion;
The second data processing unit monitors the output signal from the sensor array and triggers a command when trigger or release information is detected in the output signal;
The second data transmission unit receives control instructions from the glasses control unit and sends the sensing data of the touch sensor to the glasses control unit;
Data exchange between the virtual reality glasses and the touch sensor is carried out through a wireless data transmission unit;
The touch sensor further comprises a vibration sensor;
The inertial sensor further comprises a magnetometer; the magnetic flux information sensed by the magnetometer is used to perform offset correction on the angular velocity sensor;
The display screen is a smartphone, an LED display or a tablet computer;
The wireless data transmission unit 104 can be selected from Bluetooth, wireless WIFI, infrared, NFC and RFID;
The first power supply unit and the second power supply unit are disposable batteries or rechargeable batteries.
The utility model has the following advantages: from a technical standpoint, the utility model aims to solve the problems that the sense of immersion offered by virtual reality products is not lifelike and their interactivity is poor. The prior art offers no method for solving this problem. The simplest conceivable approach is to add haptic feedback, but doing so would correspondingly reduce the visual and auditory experience; the utility model therefore starts from better user experience and addresses the problem in combination with virtual reality technology.
Description of the drawings
Fig. 1 shows the human-computer interaction system based on virtual reality technology according to a preferred embodiment of the utility model;
Fig. 2 shows the touch sensor of a preferred embodiment of the utility model;
Fig. 3 is a product structure diagram of the virtual reality glasses of a specific embodiment of the utility model;
Fig. 4 shows the process by which the touch sensor senses the user's sliding, clicking or moving action and triggers a command, according to an embodiment of the utility model;
Fig. 5 shows the operating procedure when the touch sensor is directly connected to the virtual reality glasses, according to an embodiment of the utility model;
Wherein: 101 - virtual reality glasses, 102 - glasses control unit, 103 - touch sensor, 104 - wireless data transmission unit, 101a - display screen socket, 102a - inertial sensor, 102b - first data processing unit, 102c - first power supply unit, 102d - first data transmission unit, 102e - control buttons, 103a - sensor array, 103b - second data processing unit, 103c - second power supply unit, 103d - second data transmission unit, 103e - vibration sensor, 301 - glasses housing, 302 - stereoscopic lens, 303 - display screen, 304 - headband.
Detailed description of the embodiments
Referring to Figs. 1-5, a human-computer interaction system based on virtual reality technology comprises virtual reality glasses 101. The virtual reality glasses 101 comprise a glasses housing, a stereoscopic lens, a display screen socket 101a, a display screen 303 and a headband 304; the display screen 303 is connected to the virtual reality glasses 101 through the display screen socket 101a; the stereoscopic lens is arranged between the user's eyes and the display screen. The system further comprises a glasses control unit 102 and a touch sensor 103. The glasses control unit 102 comprises an inertial sensor 102a, a first data processing unit 102b, a first power supply unit 102c, a first data transmission unit 102d and control buttons 102e. The inertial sensor 102a senses changes in the position and attitude of the user's head, including acceleration, angular velocity and magnetic flux. The first data processing unit 102b converts the raw data sensed by the inertial sensor into position and attitude information of the user's head during movement. The inertial sensor 102a comprises an acceleration sensor and/or an angular velocity sensor, and may further comprise a magnetometer. When the inertial sensor 102a senses data from the angular velocity sensor, three-axis angular increment information of the user's head in space is obtained; because the sampling interval is fixed, integrating the measured angular increments over time yields the rotation angle of each axis, that is, the change in the attitude angle of the user's head as it rotates. When the inertial sensor 102a senses data from the acceleration sensor, three-axis instantaneous acceleration information of the user's head in space is obtained; combined with gravity information, trigonometric functions are used to obtain the angle of each axis and the gravity component, which are compared with the angles obtained by the angular velocity sensor for correction. When the inertial sensor 102a senses data from the magnetometer, the magnetic flux information sensed by the magnetometer is used to perform offset correction on the angular velocity sensor. The control buttons 102e include up, down, left and right direction controls as well as select, switch and return command controls. The first power supply unit 102c supplies electric power to the display screen.
The first data transmission unit receives data from the touch sensor and sends control instructions to the touch sensor.
The touch sensor 103 comprises a sensor array 103a, a second data processing unit 103b, a second power supply unit 103c and a second data transmission unit 103d. The sensor array 103a senses the action signals or human body attitude signals produced when the user operates the touch sensor.
The action signals and attitude signals all originate from the user; they are sent through the inertial sensor 102a and the sensor array 103a to the first data processing unit 102b and the second data processing unit 103b respectively for signal processing. The action signals and attitude signals include clicking, double-clicking, sliding, flicking, moving, translation and circular motion. The second data processing unit 103b monitors the output signal from the sensor array and triggers a command when trigger or release information is detected in the output signal.
The second data transmission unit receives control instructions from the glasses control unit and sends the sensing data of the touch sensor to the glasses control unit.
Data exchange between the virtual reality glasses 101 and the touch sensor 103 is carried out through a wireless data transmission unit. The touch sensor 103 further comprises a vibration sensor. The display screen 303 is a smartphone, an LED display or a tablet computer. The wireless data transmission unit 104 can be selected from Bluetooth, wireless WIFI, infrared, NFC and RFID. The first power supply unit 102c and the second power supply unit 103c are disposable batteries or rechargeable batteries.
According to an embodiment of the utility model, a human-computer interaction system based on virtual reality technology consists primarily of three parts: virtual reality glasses, a glasses control unit and a touch sensor.
According to another embodiment of the utility model, a novel touch sensor is provided, with a built-in power supply, sensor array, data processing unit and wireless transmission unit.
Fig. 1 is a structural diagram of the virtual reality human-computer interaction system 100 according to a preferred embodiment of the utility model. As shown in the figure, the virtual reality human-computer interaction system 100 of the utility model comprises virtual reality glasses 101, a glasses control unit 102 and a touch sensor 103. The virtual reality glasses 101 are worn on the user's head and provide the scene display function. The glasses control unit 102 is integrated with the virtual reality glasses 101; it receives the processed data signals from the touch sensor 103 and synchronizes the information onto the display screen of the virtual reality glasses 101. The touch sensor 103 is separate from the virtual reality glasses 101, and data exchange between them is carried out through a wireless data transmission unit 104. The touch sensor 103 is the control part that senses the user's actions; the sensed signals are used for action control in the virtual reality scene, for example controlling the actions of a character in a game.
The virtual reality glasses 101 provide the scene display function for the user and need to be worn on the user's head for long periods, so their weight and comfort must ensure a good user experience. According to an embodiment of the utility model, data exchange between the virtual reality glasses 101 and the touch sensor 103 is carried out through the wireless data transmission unit 104, which not only transmits data and signals reliably but also spares the user the nuisance of connecting cables while moving, thereby improving the user experience.
As shown in Fig. 1, the virtual reality glasses 101 comprise a display screen for scene display. The display screen is selected from at least one of a smartphone, an LED display and a tablet computer. The glasses control unit 102 connected with the virtual reality glasses 101 comprises an inertial sensor 102a, which can be one or more of an acceleration sensor, an angular velocity sensor and a magnetometer, and is used for sensing changes in the position and attitude of the user's head. The first data processing unit 102b converts the raw data sensed by the inertial sensor 102a, for example one or more of acceleration, angular velocity and magnetic flux, into position and attitude information of the user's head during movement. For example, three-axis angular increment information of the user's head in space can be obtained from the angular velocity sensor; integrating over time gives the rotation angle of each axis and thus an approximate change in the attitude angle of the user's head as it rotates. Preferably, the magnetic flux information sensed by the magnetometer can be added to perform offset correction on the angular velocity sensor, making the measurement result more accurate. The first power supply unit 102c supplies power to the display screen and may take the form of a disposable battery or a rechargeable battery. The first data transmission unit 102d is used for data or signal communication between the virtual reality glasses 101 and the touch sensor 103; the wireless communication unit 104 can be one or more of Bluetooth, wireless WIFI, infrared, NFC and RFID.
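The head-attitude processing described above (integrating angular increments, then correcting the drifting angles with gravity and magnetic flux measurements) corresponds to a simple complementary filter. The sketch below is only an illustration of that idea under assumed names, units and blend factor; it is not the utility model's actual implementation.

```python
import math

class HeadAttitudeEstimator:
    """Minimal complementary-filter sketch of the processing described for
    the first data processing unit 102b. All names, units and the blend
    factor ALPHA are illustrative assumptions."""

    ALPHA = 0.98  # weight given to the integrated gyro angle (assumed value)

    def __init__(self):
        self.pitch = 0.0  # radians
        self.roll = 0.0   # radians
        self.yaw = 0.0    # radians

    def update(self, gyro, accel, mag_yaw, dt):
        # 1) Integrate the three-axis angular rate over the sampling interval
        #    dt to accumulate the rotation angle of each axis.
        self.pitch += gyro[0] * dt
        self.roll += gyro[1] * dt
        self.yaw += gyro[2] * dt

        # 2) Use the instantaneous acceleration plus gravity to recover tilt
        #    angles with trigonometric functions, then blend them in to
        #    correct gyro drift (the comparison-and-correction step).
        ax, ay, az = accel
        accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        accel_roll = math.atan2(ay, az)
        self.pitch = self.ALPHA * self.pitch + (1 - self.ALPHA) * accel_pitch
        self.roll = self.ALPHA * self.roll + (1 - self.ALPHA) * accel_roll

        # 3) The magnetometer supplies an absolute heading used to offset-
        #    correct the yaw angle accumulated from the angular-rate sensor.
        self.yaw = self.ALPHA * self.yaw + (1 - self.ALPHA) * mag_yaw
        return self.pitch, self.roll, self.yaw
```

For instance, calling `update(gyro=(0.01, 0.0, 0.02), accel=(0.0, 0.0, 9.8), mag_yaw=0.0, dt=0.01)` once per sample would track small head rotations while the accelerometer and magnetometer keep the integrated angles from drifting.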
Preferably, the glasses control unit 102 can also comprise at least one control button 102e. When it is inconvenient to use the touch sensor 103, or when the virtual reality glasses 101 are used alone for a virtual reality scene demonstration, control can be performed through the control buttons 102e. The control buttons 102e can include up, down, left and right direction controls as well as select, switch and return command controls. More preferably, the user can connect the touch sensor 103 directly to the virtual reality glasses 101, which reduces the shake imparted to the virtual reality glasses 101 during operation and further improves the user experience. The whole connection procedure is described in detail below.
As shown in Fig. 2, the touch sensor 103 comprises a sensor array 103a, a second data processing unit 103b, a second power supply unit 103c and a second data transmission unit 103d. The sensor array 103a senses the action signals or human body attitude signals produced when the user operates the touch sensor 103. The actions and attitudes are, for example, one or more of clicking, double-clicking, sliding, flicking, moving, translation and circular motion. The signal parameters include one or more of the movement position, movement speed, direction of motion, frequency, acceleration and angular velocity of the object, and single or repeated clicks. For example, when the sensor is a pressure sensor, it can output pressure signals produced by the user's sliding, flicking, translation or clicking on the touch sensor 103. The second data processing unit 103b monitors the output signal from the pressure sensor and triggers a command when trigger or release information is detected in the output signal. For example, when the user slides on the touch sensor 103 at a certain speed, the multiple sensor array elements 103a distributed over the touch sensor 103 sense the motion state and position of the user's hand, thereby triggering the command. Preferably, the sensor array can also be one or more of a capacitance sensor, a light sensor and a piezoelectric sensor, whose output indicates the trigger-type signal information produced by the user's clicking, sliding, rotation, translation and so on. The second power supply unit 103c supplies power to the touch sensor 103 and may take the form of a disposable battery or a rechargeable battery. The second data transmission unit 103d is used for data or signal communication between the virtual reality glasses 101 and the touch sensor 103; the wireless communication unit 104 can be one or more of Bluetooth, wireless WIFI, infrared, NFC and RFID.
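The role of the second data processing unit 103b, watching the array output and issuing a command on a detected press (trigger) or release, can be pictured as edge detection against a pressure threshold. The following sketch is illustrative only; the threshold value and callback names are assumptions rather than anything specified by the utility model.

```python
PRESS_THRESHOLD = 0.2  # normalized pressure level treated as a touch (assumed)

def monitor_array(samples, on_trigger, on_release):
    """Scan successive pressure samples from one array element and fire a
    command callback on each press (rising edge) or release (falling edge).
    `samples` is an iterable of (timestamp, pressure) pairs."""
    pressed = False
    for timestamp, pressure in samples:
        if not pressed and pressure >= PRESS_THRESHOLD:
            pressed = True
            on_trigger(timestamp)   # trigger information detected in the output
        elif pressed and pressure < PRESS_THRESHOLD:
            pressed = False
            on_release(timestamp)   # release/closing information detected
```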
Preferably, a vibration sensor 103e can be added to the touch sensor 103. When the user interacts with the virtual world, the resulting collision, contact and friction information is processed by the first data processing unit 102b, for example by setting a collision flag bit to 1; the information is then sent to the touch sensor 103 through the data transmission unit 102d, where the second data processing unit 103b makes the corresponding decision and controls the vibration sensor 103e to start vibrating. The vibration ends when the transmitted information no longer contains the vibration flag bit. This makes the real-time interactive experience noticeably better.
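One way to picture this flag-bit exchange is a small loop on the touch-sensor side: vibrate while incoming frames carry the collision flag, stop as soon as they do not. The sketch below is an assumption about how such a loop could look; the frame format and the motor interface are invented for illustration and are not defined by the utility model.

```python
COLLISION_FLAG = 0x01  # bit set by the glasses control unit on collision (assumed layout)

def vibration_loop(receive_frames, motor):
    """Drive the vibration unit 103e from flag bits carried in frames sent by
    the glasses control unit. `receive_frames()` yields integer flag words;
    `motor` exposes start()/stop(). Both interfaces are illustrative."""
    vibrating = False
    for flags in receive_frames():
        if flags & COLLISION_FLAG and not vibrating:
            motor.start()   # collision/contact/friction reported: start vibrating
            vibrating = True
        elif not (flags & COLLISION_FLAG) and vibrating:
            motor.stop()    # flag no longer present: end the vibration
            vibrating = False
```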
Fig. 3 shows a schematic product structure diagram of the virtual reality glasses 101 of a specific embodiment of the utility model. As shown in Fig. 3, the virtual reality glasses 300 comprise a glasses housing 301; two stereoscopic lenses 302 built into the housing near the eyes and corresponding to the left and right eyes; a display screen 303 located in front of the stereoscopic lenses 302 and aligned with their centers; and a headband 304 attached to both sides and the top of the glasses housing for fixing the glasses on the user's head.
The side of the glasses housing 301 that faces the face conforms to human facial ergonomics, with grooves at the forehead, cheeks and nose that fit the contours of the face. These features support the whole face while keeping the glasses housing 301 flush against the cheeks so that no outside light can enter, allowing the user to be better immersed in the scene content provided by the display screen 303. The stereoscopic lenses 302 present the content shown on the display screen 303 in stereoscopic form and can be selected from at least one of aspherical lenses, high-toughness lenses, photochromic lenses, tinted lenses, progressive multifocal lenses, anti-radiation lenses and resin lenses. For example, when aspherical lenses are selected, the two split-screen images on the display screen 303 are viewed through the two aspherical lenses respectively; because the two eyes see the objects at different angles, a sense of depth is produced, giving a stereoscopic effect. The display screen 303 is used for displaying the virtual reality scene and can be at least one of a smartphone, an LED display and a tablet computer. When using the virtual reality glasses 300, the user fixes them on the head with the headband 304; by adjusting the headband 304, the user's hands are freed and, unaffected by changes in outside light, the user can be fully immersed in the picture on the display screen.
As shown in Fig. 4, in some embodiments of the utility model, the touch sensor 103 senses the user's actions through a number of sensor array elements and triggers commands accordingly. These commands may correspond to a complete series of actions, such as sliding, moving or flicking, or to a single action, such as clicking, double-clicking, confirming, switching or returning. For example, when the user is in the virtual reality scene of a flying game, the touch sensor acts as a simulated aircraft joystick. When the user's hand moves across the touch sensor in the direction shown in Fig. 4, the pressure sensing array elements on the touch sensor sense the pressure produced by the hand movement and trigger commands, and the trigger time of each pressure sensing element is recorded. From these times, the state of the user's hand motion on the touch sensor can be judged. The movement speed is judged from the time interval between the triggers of two adjacent pressure sensing elements: the shorter the interval, the faster the movement, and vice versa. The direction of motion is judged from the differences in the trigger times of the different pressure sensing elements. Similarly, the acceleration and frequency of the motion can also be judged from the time interval between the triggers of two adjacent pressure sensing elements: the shorter the trigger interval, the larger the acceleration and the higher the frequency; the longer the trigger interval, the smaller the acceleration and the lower the frequency. This state is sent to the virtual reality glasses 101 through the second data transmission unit 103d over the wireless data transmission unit 104, so that the aircraft in the virtual reality flying game on the display screen starts to fly in the same state. Those skilled in the art will appreciate that other sensor array types, such as capacitance sensors, light sensors and piezoelectric sensors, can also be used for contact signal detection.
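The timing rules just described (shorter intervals between adjacent element triggers mean a faster and more sharply accelerated swipe, while the trigger order gives the direction) amount to a little arithmetic over the recorded trigger times. The sketch below is a simplified one-row illustration under assumed names, spacing and units; it is not the utility model's implementation.

```python
def estimate_swipe(trigger_times, element_pitch_mm):
    """Estimate swipe direction, speed and acceleration along one row of
    pressure-sensing elements. `trigger_times` maps element index -> trigger
    timestamp in seconds; `element_pitch_mm` is the assumed spacing between
    adjacent elements. All names and units are illustrative."""
    # Order the triggered elements by the time they fired.
    ordered = sorted(trigger_times.items(), key=lambda item: item[1])
    if len(ordered) < 2:
        return None

    # Direction: whether the element indices increase or decrease over time.
    direction = "positive" if ordered[-1][0] > ordered[0][0] else "negative"

    # Speed between adjacent triggers: a shorter interval means a faster swipe.
    speeds = []
    for (idx_a, t_a), (idx_b, t_b) in zip(ordered, ordered[1:]):
        dt = t_b - t_a
        if dt > 0:
            speeds.append(abs(idx_b - idx_a) * element_pitch_mm / dt)  # mm/s
    if not speeds:
        return None

    avg_speed = sum(speeds) / len(speeds)
    # Acceleration: change of segment speed over the whole swipe duration.
    accel = (speeds[-1] - speeds[0]) / (ordered[-1][1] - ordered[0][1])
    return {"direction": direction, "speed_mm_s": avg_speed, "accel_mm_s2": accel}
```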
Preferably, the touch sensor is made of a material with a suitable coefficient of friction, such as synthetic rubber, cloth, silica gel, plastic, glass or metal. After texturing, these surfaces offer greater friction, which makes it easy for the user to move and position the finger during use, while the back of the touch sensor retains a certain anti-slip effect so that it stays firmly in place and does not move easily.
Fig. 5 shows another embodiment of the utility model. During use of the virtual reality human-computer interaction system 100, situations may arise in which it is inconvenient to use the touch sensor 103, or the virtual reality glasses 101 are used alone for a virtual reality scene demonstration. In this case the user can connect the touch sensor 103 directly to the virtual reality glasses 101. Specifically, a slot is provided on the side of the virtual reality glasses 101 near the right cheek, and its size, shape and depth all match the touch sensor 103. When inserting the touch sensor 103, the user should orient the sensor array 103a on the touch sensor 103 outwards. The connection mode of Fig. 5 not only solves the problem of it being inconvenient to use the touch sensor 103 or of using the virtual reality glasses 101 alone, but also reduces the shake imparted to the virtual reality glasses 101 during operation, further improving the user experience.
The inertial sensor 102a senses the movement of the user's head, and the first data processing unit 102b performs signal processing on the spatial movement information of the head, such as attitude angle and position; the sensor array 103a senses the movement of the user's hand, and the second data processing unit 103b then performs the corresponding signal processing.

Claims (6)

1. A human-computer interaction system based on virtual reality technology, comprising virtual reality glasses; the virtual reality glasses comprise a glasses housing, a stereoscopic lens, a display screen socket, a display screen and a headband; the display screen is connected to the virtual reality glasses through the display screen socket; the stereoscopic lens is arranged between the user's eyes and the display screen; characterized in that:
The system further comprises a glasses control unit and a touch sensor;
The glasses control unit comprises an inertial sensor, a first data processing unit, a first power supply unit, a first data transmission unit and control buttons;
The inertial sensor senses changes in the position and attitude of the user's head, including acceleration, angular velocity and magnetic flux;
The first data processing unit converts the raw data sensed by the inertial sensor into position and attitude information of the user's head during movement; the inertial sensor comprises an acceleration sensor and an angular velocity sensor;
The angular velocity sensor obtains three-axis angular increment information of the user's head in space;
The acceleration sensor obtains three-axis instantaneous acceleration information of the user's head in space; combined with gravity information, trigonometric functions are used to obtain the angle of each axis and the gravity component, which are compared with the angles obtained by the angular velocity sensor for correction;
The control buttons include up, down, left and right direction controls as well as select, switch and return command controls;
The first power supply unit supplies electric power to the display screen;
The first data transmission unit receives data from the touch sensor and sends control instructions to the touch sensor;
The touch sensor comprises a sensor array, a second data processing unit, a second power supply unit and a second data transmission unit;
The sensor array senses the action signals or human body attitude signals produced when the user operates the touch sensor;
The action signals and attitude signals all originate from the user; they are sent through the inertial sensor and the sensor array to the first data processing unit and the second data processing unit respectively for signal processing; the action signals and attitude signals include clicking, double-clicking, sliding, flicking, moving, translation and circular motion;
The second data processing unit monitors the output signal from the sensor array and triggers a command when trigger or release information is detected in the output signal;
The second data transmission unit receives control instructions from the glasses control unit and sends the sensing data of the touch sensor to the glasses control unit;
Data exchange between the virtual reality glasses and the touch sensor is carried out through a wireless data transmission unit.
2. The human-computer interaction system based on virtual reality technology according to claim 1, characterized in that the touch sensor further comprises a vibration sensor.
3. The human-computer interaction system based on virtual reality technology according to claim 1, characterized in that the inertial sensor further comprises a magnetometer; the magnetic flux information sensed by the magnetometer is used to perform offset correction on the angular velocity sensor.
4. The human-computer interaction system based on virtual reality technology according to claim 1, 2 or 3, characterized in that the display screen is a smartphone, an LED display or a tablet computer.
5. The human-computer interaction system based on virtual reality technology according to claim 1, 2 or 3, characterized in that the wireless data transmission unit 104 can be selected from Bluetooth, wireless WIFI, infrared, NFC and RFID.
6. The human-computer interaction system based on virtual reality technology according to claim 1, 2 or 3, characterized in that the first power supply unit and the second power supply unit are disposable batteries or rechargeable batteries.
CN201520627434.4U 2015-08-19 2015-08-19 Human-computer interaction system based on virtual reality technology Expired - Fee Related CN205007551U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201520627434.4U CN205007551U (en) 2015-08-19 2015-08-19 Human-computer interaction system based on virtual reality technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201520627434.4U CN205007551U (en) 2015-08-19 2015-08-19 Human-computer interaction system based on virtual reality technology

Publications (1)

Publication Number Publication Date
CN205007551U true CN205007551U (en) 2016-02-03

Family

ID=55206181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201520627434.4U Expired - Fee Related CN205007551U (en) 2015-08-19 2015-08-19 Human-computer interaction system based on virtual reality technology

Country Status (1)

Country Link
CN (1) CN205007551U (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108700939A (en) * 2016-02-05 2018-10-23 奇跃公司 System and method for augmented reality
CN108700939B (en) * 2016-02-05 2022-07-05 奇跃公司 System and method for augmented reality
WO2017177394A1 (en) * 2016-04-13 2017-10-19 华为技术有限公司 Method and apparatus for controlling operating state of wearable electronic device
US10694018B2 (en) 2016-04-13 2020-06-23 Huawei Technologies Co., Ltd. Method and apparatus for controlling running status of wearable electronic device
CN106741750A (en) * 2016-11-30 2017-05-31 广东法诺文化传媒有限公司 A kind of virtual reality protector under water
CN107807774A (en) * 2017-01-05 2018-03-16 北京行云时空科技有限公司 The control method and split type glasses of a kind of Split type intelligent glasses
CN106886285A (en) * 2017-01-20 2017-06-23 西安电子科技大学 A kind of historical relic interactive system and operating method based on virtual reality
CN107272891A (en) * 2017-05-31 2017-10-20 广东南海鹰视通达科技有限公司 A kind of the intelligent display helmet and its display methods including display system

Similar Documents

Publication Publication Date Title
CN105031918B (en) A kind of man-machine interactive system based on virtual reality technology
CN205007551U (en) Human-computer interaction system based on virtual reality technology
CN106445176B (en) Man-machine interactive system based on virtual reality technology and exchange method
US11745097B2 (en) Spatially-correlated human-machine interface
US11173392B2 (en) Spatially-correlated human-machine interface
JP6670884B2 (en) System and method for tactile-use adaptive and multi-faceted displays
CN103877726B (en) A kind of virtual reality components system
JP6027747B2 (en) Multi-display human machine interface with spatial correlation
US20170212589A1 (en) Providing fingertip tactile feedback from virtual objects
US9740305B2 (en) Operation method, control apparatus, and program
WO2015180497A1 (en) Motion collection and feedback method and system based on stereoscopic vision
EP3364272A1 (en) Automatic localized haptics generation system
WO2017153771A1 (en) Virtual reality
CN205015835U (en) Head-mounted intelligent interaction system
CN103077633A (en) Three-dimensional virtual training system and method
WO2007100204A1 (en) Stereovision-based virtual reality device
JP2012115414A (en) Game device, method of providing game, game program, and game system
JP6203346B1 (en) Method, program, and recording medium for providing virtual space
JP6330072B1 (en) Information processing method, apparatus, and program for causing computer to execute information processing method
CN115033105B (en) Large-space movable platform supporting bare-hand multi-touch force sense natural interaction
US20240012496A1 (en) Computer Systems with Handheld Controllers
JPWO2021059358A1 (en) Animation production system

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160203

Termination date: 20200819

CF01 Termination of patent right due to non-payment of annual fee