CN103877726A - Virtual reality assembly system - Google Patents
- Publication number
- CN103877726A CN103877726A CN201410143435.1A CN201410143435A CN103877726A CN 103877726 A CN103877726 A CN 103877726A CN 201410143435 A CN201410143435 A CN 201410143435A CN 103877726 A CN103877726 A CN 103877726A
- Authority
- CN
- China
- Prior art keywords
- sensor
- controller
- user
- eye
- display control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/428—Processing input control signals involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/837—Shooting of targets
Abstract
The invention discloses a virtual reality system comprising a near-eye display, a display control unit, and a controller, the near-eye display and the display control unit being separable. The near-eye display includes a first sensor for sensing the three-axis attitude and/or three-axis position of the near-eye display. The display control unit includes a first connector for connecting to the controller; a communication module for receiving and sending data or control signals in a wired or wireless manner, transmitting them to the near-eye display, and receiving signals sent by the near-eye display; and a power supply module for powering the display control unit. The controller includes a second connector for connecting to the display control unit and a second sensor for sensing the three-axis attitude and/or three-axis position of the controller.
Description
Technical field
The present invention relates to the field of virtual reality, and in particular to a system for realizing virtual reality scenes.
Background art
Virtual reality (VR) is a high-tech field that has emerged in recent years. Virtual reality uses computer simulation to generate a three-dimensional virtual world and provides the user with simulated visual, auditory, tactile, and other sensory input, so that the user feels physically present and can observe objects in the three-dimensional space in real time and without restriction. Virtual reality technology encompasses real-time 3D computer graphics, wide-angle (wide-field-of-view) stereoscopic display, tracking of the observer's head, eyes, and hands, haptic/force feedback, stereo sound, network transmission, and voice input and output, among other technologies. It is widely used in medicine, entertainment, military and aerospace, training, and many other fields.
In typical virtual reality applications, the user wears a head-mounted display while a controller and various sensors track the motions of the head and body, giving the user an immersive experience. To reduce the weight of the head-mounted display, existing products separate the display part from the control part: for example, the circuit board is housed in a separate control box that is connected to the head-mounted display by a cable or other wired link. This, however, makes operation very inconvenient. Because the control box must be placed separately — on a desk or on the floor, for example — the user cannot easily hold it, so the user's range of activity is restricted. In addition, the cable between the control box and the head-mounted display obstructs the user's movement, causing inconvenience and degrading the user experience.
Moreover, many games that apply virtual reality technology are first-person games, for example first-person shooters (FPS). Existing FPS games, when simulating a first-person perspective with virtual reality technology, do not feel sufficiently real. For example, when the player must throw an object in the game, the throw can only be simulated by pressing a button on the controller, which cannot reproduce the experience of actually throwing. When the character moves, existing game controls steer the character with buttons or a joystick, which differs greatly from how one moves in a real environment. Furthermore, virtual reality devices that track the user with computer-vision methods to achieve motion tracking require large and complex systems that are ill-suited to consumer use.
A virtual reality system with a better user experience is therefore needed to solve the above problems.
Summary of the invention
An object of the present invention is to provide a virtual reality system comprising a near-eye display, a display control unit, and a controller, the near-eye display and the display control unit being separable. The near-eye display comprises a first sensor for sensing the three-axis attitude and/or three-axis position of the near-eye display. The display control unit comprises a first connector for connecting to the controller; a communication module for receiving and sending data/control signals in a wired or wireless manner, transmitting them to the near-eye display, and receiving signals sent from the near-eye display; and a power supply module for powering the display control unit. The controller comprises a second connector for connecting to the display control unit, and a second sensor for sensing the three-axis attitude and/or three-axis position of the controller.
Preferably, the near-eye display and the display control unit are connected by a cable that carries data and/or power signals.
Preferably, the display control unit comprises an interface module for connecting to other display terminals or peripherals.
Preferably, the first sensor and the second sensor are each selected from one or a combination of an acceleration sensor, an angular rate sensor, and a magnetic induction sensor.
Preferably, the acceleration sensor senses whether the linear acceleration or linear velocity of the controller in an inertial frame reaches a certain threshold, and if so triggers a control signal.
Preferably, the angular rate sensor senses whether the angular acceleration or angular velocity of the controller in an inertial frame reaches a certain threshold, and if so triggers a control signal.
Preferably, when the motion sensed by the sensor reaches a certain threshold, the parameters sensed by the sensor at the moment the threshold is reached are recorded as initial parameters, and an event composed of multiple instructions is triggered.
Preferably, when the first sensor or the second sensor senses that the user has produced a displacement or tilt along a first direction in its own coordinate frame, and the displacement or tilt exceeds a certain threshold, the sensor triggers an instruction to continuously send a movement control signal in that first direction.
Preferably, when the first sensor or the second sensor senses that the user has produced a movement or tilt in a second direction different from the first direction, and the displacement or tilt in the second direction exceeds a certain threshold, a control signal is triggered to stop the previous continuous movement along the first direction, or a displacement control signal toward the second direction is triggered.
Preferably, the display control unit and the controller form a detachable assembly, combined by a snap-fit or magnetic coupling.
Preferably, the assembly is shaped like a hand-held gun, the display control unit forming the barrel and the controller forming the grip.
Preferably, the system further comprises a release mechanism that detaches the display control unit from the controller when pulled by a finger of one hand in a first direction.
Preferably, the release mechanism triggers an operation of the controller when pulled by the user in a second direction different from the first direction.
Preferably, the operation of the controller triggered by the release mechanism starts sensing of the user's throwing action.
Preferably, the wireless communication module is selected from at least one of WiFi, Bluetooth, infrared, ultrasonic, Wireless USB, RFID, and NFC.
Preferably, the power supply module powers the display control unit and/or the controller.
Preferably, the battery compartment of the power supply module passes through the body of the display control unit, so that when the user inserts a new battery on one side of the body, the old battery is simultaneously ejected from the other side.
Preferably, the display control unit further comprises at least one manipulation button.
The virtual reality system according to the present invention lets the user hold and manipulate the display control unit comfortably and naturally, avoiding the restriction that an external cable places on the user's head. The one-handed release mechanism removes the need for two-handed operation and improves convenience. The system according to the present invention simulates a throwing operation with a real swinging motion, and controls the motion of the virtual character by sensing the real movement of the user's body, improving the user experience.
It should be understood that the foregoing general description and the following detailed description are exemplary illustrations and explanations, and should not be construed as limiting the claimed content of the present invention.
Description of the drawings
With reference to the accompanying drawings, further objects, functions, and advantages of the present invention will be illustrated by the following description of embodiments, in which:
Fig. 1 is a schematic diagram of the general structure of a virtual reality system according to the present invention.
Fig. 2 shows a connection form in which the display control unit wraps around the controller, according to an embodiment of the invention.
Fig. 3 shows the product structure of a near-eye display according to a specific embodiment of the present invention.
Fig. 4 shows the process of triggering an instruction when the sensor in the controller senses that the user's swinging or throwing action reaches a certain threshold, according to an embodiment of the invention.
Fig. 5 shows the process of triggering an instruction when the sensor in the near-eye display or the controller senses that the user's displacement movement reaches a certain threshold, according to an embodiment of the invention.
Detailed description
The objects and functions of the present invention, and methods for achieving those objects and functions, will be illustrated by reference to exemplary embodiments. However, the present invention is not limited to the exemplary embodiments disclosed below; it may be realized in different forms. The essence of this description is only to help those skilled in the art gain an integrated understanding of the specific details of the invention.
Hereinafter, embodiments of the invention will be described with reference to the drawings. In the drawings, identical reference numerals denote identical or similar parts, or identical or similar steps.
Fig. 1 is a schematic diagram of the structure of a virtual reality system 100 according to the present invention. As shown in Fig. 1, the virtual reality system 100 of the present invention comprises a near-eye display 101, a display control unit 102, and a controller 103. The near-eye display 101 is worn on the user's head and provides the display function. The near-eye display 101 and the display control unit 102 are separate parts that can be connected by a cable 104 to transmit data and/or power signals. The display control unit 102 is the part the user holds and operates for interaction in virtual reality, for example controlling the actions of a character in training or in a game. The display control unit 102 receives control signals from the controller 103 and generates display control signals for the near-eye display 101.
Preferably, the display control unit 102 and the controller 103 can form a detachable assembly. As shown in Fig. 1, according to an embodiment of the invention, the assembly may be shaped like a hand-held gun, the display control unit 102 forming the barrel and the controller 103 forming the grip. The barrel and the grip are joined along the double-arrow direction shown in Fig. 1 by connectors 102a and 103a. The coupling of connectors 102a and 103a may be, for example, a snap-fit or magnetic coupling: a male-female snap as shown in Fig. 1, or a wrap-around snap in which the display control unit 102 wraps around the controller 103, as in the example connection form shown in Fig. 2. It will be understood that the assembly can also take other forms convenient for the user to hold, such as a joystick, a handle, or a steering wheel.
The near-eye display 101 is worn on the user's head for long periods, so it must be lightweight to remain comfortable. According to an embodiment of the invention, the near-eye display 101 and the display control unit are preferably connected by a wired cable 104. The cable 104 not only transmits data and signals but can also carry power from the power supply module 102e in the display control unit 102 (described below), sparing the near-eye display 101 the extra weight of its own power supply unit.
As shown in Fig. 1, the controller 103 includes a release mechanism 103b for controlling the detachment of the display control unit 102 from the controller 103. In the embodiment in which the assembly is shaped like a hand-held gun, the release mechanism 103b can be shaped like a trigger, so that the user can detach the display control unit 102 from the controller 103 with one hand during use, greatly improving convenience. As shown in Fig. 2, the user can pull the release mechanism 103b with a finger of one hand in the direction d1 to detach the display control unit 102 from the controller 103.
More preferably, the release mechanism 103b can also serve as one of the control buttons of the controller 103, so that motion in another direction triggers an additional control function. For example, the user can pull the release mechanism 103b in a direction d2 different from d1, as shown in Fig. 2, to trigger another operation of the controller. These operations are described in detail below.
As shown in Fig. 1, the display control unit 102 comprises a connector 102a, an interface module 102b, a CPU 102c, a communication module 102d, and a power supply module 102e. The interface module 102b includes at least one video/audio display interface for connecting to other display terminals; it can, for example, be selected from the video/audio output interfaces conventional in the art, such as an HDMI interface, a micro-HDMI interface, or a VGA interface. When connected to another video display device, the display control unit 102 can route the video signal that would otherwise go to the near-eye display 101 to the external display device, such as a television or projector, providing it with a video output signal. The interface module 102b can also include a USB interface for connecting other peripherals, such as a charger or external sensors (e.g., infrared or laser ranging sensors).
Preferably, the display control unit 102 can further comprise at least one manipulation button 102f, used to operate the virtual reality scene when the buttons on the controller 103 are inconvenient to use or when the display control unit 102 is used on its own. The manipulation button 102f can include up/down/left/right direction keys and buttons such as select, start, and switch.
Preferably, the controller 103 can further comprise a communication module 103d and manipulation buttons 103e. The communication module 103d receives and sends data/control signals to the host or the display control unit in a wired or wireless manner, preferably wirelessly. The wireless communication module 103d can be selected from at least one of WiFi, Bluetooth, infrared, ultrasonic, Wireless USB, RFID, and NFC. Those skilled in the art will appreciate that a wired connection can also be used for data and signal transmission. The manipulation buttons 103e can include, for example, up/down/left/right direction keys and buttons such as select, start, and switch. More preferably, the controller 103 can be equipped with an independent power supply unit (not shown) to power it when the controller 103 operates independently of the display control unit 102.
As shown in Fig. 1, the near-eye display 101 comprises a display screen 101a and a sensor 101b. The sensor 101b senses the motion signals produced when the user moves while wearing the near-eye display 101. The sensor 101b can, for example, sense the three-axis attitude and three-axis position of the near-eye display 101, and can be selected from at least one of an acceleration sensor, a gyroscope, a gravity sensor, and an angular rate sensor. The sensing operation of the sensor is described in detail below.
Fig. 3 shows the product structure of a near-eye display 300 according to a specific embodiment of the present invention. As shown in Fig. 3, the near-eye display can be built in the form of goggles, comprising a display housing 301, display screens 302 embedded in the face side of the housing 301 and corresponding to the user's left and right eyes, and a ring-shaped headband 303 connected to the display housing 301 and extending back around the user's head to hold the housing in place.
The display housing 301 resembles goggles in appearance: its face side is recessed inward to a certain depth, and the display screens 302 are arranged in the recess. The recess is surrounded by a forehead support 304 at the top, a cheek support 305 at the bottom, a nose support 306 in the middle, and the headband 303 at the left and right sides, so that the housing fits tightly against the face and external visible light cannot enter. By adjusting the headband 303, the user wearing this goggle-type display is unaffected by outside light and can clearly see the game picture on the screen.
According to embodiments of the invention, the controller 103 can include a sensor for sensing the motion signals or body-attitude signals produced when the user moves the hand-held controller. Such motions and attitudes include one or more of throwing, shaking, swinging, moving, rotating, and tilting. For example, the sensor can be an acceleration sensor (gravity sensor) that outputs a signal indicating accelerated motion of the controller caused by the user's throwing, shaking, swinging, and so on. The controller 103 can monitor the output signal from this acceleration sensor and trigger an instruction when the output signal crosses a threshold — for example, when the user swings or throws the controller 103 with enough force that it reaches a certain acceleration. The sensor can also be an angular rate sensor (gyroscope) that outputs a signal indicating the angular velocity change produced by the user's throwing, shaking, swinging, and so on.
More preferably, the sensor can also include a magnetic induction sensor, which senses the magnetic field changes produced by the user's motion and is used to correct the drift of the acceleration sensor or angular rate sensor, making the sensed result more accurate.
As shown in Fig. 4, in certain embodiments an instruction can be triggered when the sensor senses that the user's action reaches a certain threshold; this instruction can be the instruction for the next step in a process consisting of a series of steps. For example, when the user is in the virtual reality environment of a combat game and holds the controller 400 as a grenade in the virtual game, the user swings or throws the controller 400 in the direction shown in Fig. 4. When the acceleration sensor 401 senses that the linear acceleration or linear velocity of the thrown controller 400 in the inertial frame reaches a certain threshold, an instruction is triggered — for example, an instruction causing the virtual grenade in the game to leave the user's hand and fly on its own toward the intended target. The trajectory of this free flight can be obtained by a parabolic-motion simulation, using the parameters sensed at the moment the threshold was reached — the user's acceleration, throw direction, and so on — as initial parameters.
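The parabolic-motion simulation described above can be sketched as follows. This is a minimal illustration, not part of the patent; the function name, parameters, and time-stepping scheme are assumptions, and a real implementation would seed `v0` and the throw angle from the sensor readings captured at the threshold-crossing moment.

```python
import math

def simulate_throw(v0, angle_deg, g=9.81, dt=0.01):
    """Simulate the free flight of a virtual grenade from the initial
    speed (m/s) and throw angle captured when the throw threshold was
    crossed. Returns the (x, y) trajectory until it reaches the ground."""
    angle = math.radians(angle_deg)
    vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
    x, y = 0.0, 0.0
    points = [(x, y)]
    while True:
        x += vx * dt          # constant horizontal velocity
        vy -= g * dt          # gravity decelerates vertical velocity
        y += vy * dt
        if y < 0:             # stop once the grenade hits the ground
            break
        points.append((x, y))
    return points

traj = simulate_throw(v0=12.0, angle_deg=45)
# the numeric range should land near the analytic value v0²·sin(2θ)/g
print(f"range ≈ {traj[-1][0]:.1f} m")
```

The game engine would replay `traj` frame by frame to animate the grenade after it "leaves" the user's hand.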
Alternatively, the sensor can be an angular rate sensor, such as a gyroscope, for sensing angular velocity. When the user swings or throws the controller 400 in the direction shown in Fig. 4, the angular rate sensor 401 senses that the angular acceleration or angular velocity of the thrown controller 400 in the inertial frame reaches a certain threshold, and an instruction is triggered.
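Either variant — linear or angular — reduces to the same threshold-crossing pattern: watch a stream of sensor magnitudes and fire once, passing along the reading at the crossing as the initial parameter. A minimal sketch, with hypothetical names not taken from the patent:

```python
def make_threshold_trigger(threshold, on_trigger):
    """Return a feed() callback that watches a stream of sensor
    magnitudes and fires on_trigger exactly once, with the reading at
    the moment the threshold was first crossed (the initial parameter
    for any follow-on event, e.g. the parabolic simulation)."""
    state = {"fired": False}

    def feed(reading):
        if not state["fired"] and abs(reading) >= threshold:
            state["fired"] = True
            on_trigger(reading)   # reading at crossing = initial parameter

    return feed

events = []
feed = make_threshold_trigger(threshold=15.0, on_trigger=events.append)
for a in [0.5, 3.2, 9.8, 16.4, 22.1]:   # m/s² samples during a swing
    feed(a)
print(events)   # fires once, at the first crossing: [16.4]
```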
Preferably, the sensor used by the controller 103 according to the present invention is a nine-axis spatial-degree-of-freedom motion sensor integrating an accelerometer, a gyroscope, and a magnetic induction sensor in one unit.
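One common way such multi-sensor units are fused — offered here only as an illustrative sketch, since the patent does not specify a fusion algorithm — is a complementary filter: the gyroscope gives smooth short-term attitude, while the accelerometer (and, in a full nine-axis setup, the magnetometer) corrects long-term drift.

```python
import math

def complementary_pitch(samples, dt=0.01, alpha=0.98):
    """Fuse gyro pitch rate (deg/s) with accelerometer tilt (ax, az in g)
    using a complementary filter. alpha weights the gyro integration;
    (1 - alpha) slowly pulls the estimate toward the accelerometer tilt,
    cancelling gyro drift."""
    pitch = 0.0
    for gyro_rate, ax, az in samples:
        accel_pitch = math.degrees(math.atan2(ax, az))
        pitch = alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch
    return pitch

# Stationary device tilted 30°: gyro reads ~0, accelerometer sees the tilt;
# the estimate converges to 30° despite starting at 0.
tilt = math.radians(30)
samples = [(0.0, math.sin(tilt), math.cos(tilt))] * 2000
print(round(complementary_pitch(samples), 1))
```

A production device would extend this to all three axes and use the magnetometer for heading, but the drift-correction principle is the same.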
More preferably, the controller 103 can also work with external sensors for further correction and measurement. For example, an external rangefinder (infrared or laser) connected through the interface of the display control unit 102 can measure the distance between the human body and a fixed reference object, sensing the motion or attitude change of the body more accurately.
It will be understood that in certain embodiments this instruction can be predefined by the user. The threshold magnitude of the instruction can also be predefined by the user. The threshold determines the sensitivity of the trigger: too low a threshold may cause an inadvertent movement of the controller to mistakenly trigger the corresponding operation in the virtual reality environment.
Preferably, the controller 400 includes a flexible strap with which the user can tie the controller 400 to the wrist, preventing the controller from slipping out of the hand during a throw and being lost or damaged. According to an embodiment, because the strap is pulled taut during the user's throw and produces a characteristic change in velocity or acceleration, this change can also serve as the sensor signal that triggers a control signal or a subsequent event.
According to still another embodiment of the invention, instructions in the virtual reality environment can also be triggered through the release mechanism 103b shown in Fig. 2. Triggering via the release mechanism 103b is similar to a physical button press and can be realized in various key modes, such as a long press, a short press, or repeated presses; by assigning different press durations, different functions can be triggered. For example, when the user is in the virtual reality environment of a combat game and holds the controller 400 as a grenade, the user holds the release mechanism 103b pressed in one direction (for example direction d2 in Fig. 2) for a specific time, for example 3-5 seconds; this duration can also be set by the user. When the hold meets or exceeds the predetermined duration, releasing the release mechanism 103b triggers the instruction that performs the grenade throw.
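The duration-based dispatch above can be sketched as a small classifier. This is an illustration only; the function name, the event labels, and the default 3-second boundary are assumptions chosen to match the 3-5 second example in the text.

```python
def classify_press(press_time, release_time, long_press_s=3.0):
    """Map a release-mechanism press to an action by how long it was
    held: releases at or after long_press_s (user-configurable) fire
    the 'throw' event, shorter holds count as a normal click."""
    held = release_time - press_time
    return "throw" if held >= long_press_s else "click"

print(classify_press(0.0, 3.5))   # held 3.5 s -> throw
print(classify_press(0.0, 0.2))   # held 0.2 s -> click
```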
According to still another embodiment of the invention, instructions in the virtual reality environment can also be triggered by operating the release mechanism 103b shown in Fig. 2 to detach the display control unit 102 from the controller 103. At least one electrical contact can be arranged at the joint between the display control unit 102 and the controller 103. When the user pulls the release mechanism 103b in one direction during use, the display control unit 102 and the controller 103 detach in a one-handed operation, the electrical contact at the joint opens, and an instruction is triggered. For example, in the virtual reality environment of a combat game, pulling the release mechanism 103b detaches the display control unit 102 from the controller 103 and opens at least one electrical contact at the joint. A trigger signal is then produced that starts sensing the user's throwing action — for example a "throw grenade" mode in the virtual reality game — after which the user can trigger subsequent operations by the sensor-based motion detection described above, or by a long press of the release mechanism 103b.
According to embodiments of the invention, the near-eye display can also include a sensor for sensing the motion signal produced when the near-eye display worn on the user's head translates or tilts in some direction. The sensor can, for example, be an acceleration (gravity) sensor that outputs a signal indicating the motion produced by a change in the user's position, thereby triggering further instructions. The display control unit can monitor the signal sensed by this sensor in the near-eye display and trigger an instruction when the output signal crosses a threshold. For example, when the user shifts or tilts away along one direction in the sensor's own coordinate frame, and the displacement or tilt angle in that direction exceeds a certain threshold, an instruction can be triggered. In this way the virtual reality system of the present invention tracks the user's position, keeping the real and the virtual synchronized. Alternatively, the sensing of the user's displacement or tilt along a direction can also be performed by the sensor in the display control unit.
Fig. 5 schematically shows this position-tracking process. As shown in Fig. 5, in certain embodiments the sensor senses whether the user has been displaced by a certain amplitude in any direction, and an instruction can be triggered when the displacement reaches a certain threshold; this instruction may be an instruction to perform the next step in a process comprising a series of steps. For example, while the user's virtual character is walking in the virtual reality environment of a game, the user, wearing the near-eye display 501, moves in either direction shown in Fig. 5. When the displacement sensor in the near-eye display 501 senses that the user has produced a displacement along one of the sensor's own coordinate axes, and the displacement in that direction exceeds a certain threshold, the sensor triggers an instruction to continuously send a movement control signal in that direction, so that the character in the virtual scene starts walking in the direction of the user's movement. When the sensor then senses that the user has moved in a second direction different from the previous direction of displacement, and the displacement in this second direction exceeds a certain threshold, a second control signal can be triggered to stop the continuous movement along the previous direction; or, after the displacement in this second direction exceeds another threshold, a movement control signal toward this second direction can be triggered.
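The start/stop/redirect logic of this tracking process can be sketched as a small state machine; the direction labels, threshold values, and method names are illustrative assumptions, not the patent's specified implementation:

```python
class CharacterController:
    """Sketch of the Fig. 5 tracking logic: a displacement beyond
    start_threshold starts continuous movement in that direction; a
    later displacement in a different direction stops it, or, beyond
    the larger redirect_threshold, redirects the movement."""

    def __init__(self, start_threshold=0.3, redirect_threshold=0.6):
        self.start_threshold = start_threshold
        self.redirect_threshold = redirect_threshold
        self.moving = None  # current movement direction, or None when idle

    def on_displacement(self, direction, magnitude):
        if self.moving is None:
            if magnitude > self.start_threshold:
                self.moving = direction       # first trigger: start walking
        elif direction != self.moving:
            if magnitude > self.redirect_threshold:
                self.moving = direction       # larger threshold: redirect
            elif magnitude > self.start_threshold:
                self.moving = None            # stop the previous movement

ctl = CharacterController()
ctl.on_displacement("forward", 0.4)  # exceeds start threshold: walk forward
ctl.on_displacement("left", 0.4)     # different direction: stop walking
print(ctl.moving)  # None
```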
According to another embodiment, when the user moves in either direction shown in Fig. 5, the sensor in the near-eye display 501 or in the display control module 502 can also trigger an instruction by sensing that the user has tilted about one of the sensor's own coordinate axes. For example, after the tilt angle in one direction exceeds a certain threshold (a certain angle), the sensor triggers an instruction to continuously send a movement control signal in that direction, so that the character in the virtual scene starts walking in the direction of the user's movement. When the sensor then senses that the user has tilted in a second direction different from the previous one, and the tilt in this second direction exceeds a threshold angle, a second control signal can be triggered to stop the continuous movement along the previous direction. Or, after the tilt in this second direction exceeds another threshold, a movement control signal toward this second direction can be triggered. Preferably, the angle between the previous direction and the second direction is greater than 90 degrees.
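The preference that the two directions differ by more than 90 degrees can be checked with a simple dot-product test; representing tilt directions as 2-D vectors is an assumption for illustration:

```python
import math

def angle_between(v1, v2):
    """Angle in degrees between two tilt directions given as 2-D vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos_a = dot / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

def cancels_previous_tilt(prev_dir, new_dir):
    # Per the stated preference, treat the new tilt as cancelling the
    # previous movement only when the directions differ by > 90 degrees.
    return angle_between(prev_dir, new_dir) > 90.0

print(cancels_previous_tilt((1, 0), (-1, 0)))   # True  (180 degrees apart)
print(cancels_previous_tilt((1, 0), (1, 0.1)))  # False (nearly the same)
```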
When the change in the user's displacement sensed by the sensor reaches a specific threshold, it can also serve as the initial signal that triggers the next operation or a subsequent sequence of operations. For example, after movement toward a certain direction is triggered, the character can keep moving automatically in that direction until a predetermined target is reached. The trajectory, initial velocity, and so on of the automatic movement can be obtained by a motion-simulation calculation that takes as initial parameters the sensor readings recorded at the moment the activation threshold is crossed, such as the user's movement amplitude, acceleration, and linear velocity.
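One way the motion-simulation calculation could use the parameters captured at the activation moment is a point-mass ballistic sketch; the model, function name, and parameter choices are assumptions, not the patent's specified method:

```python
import math

def throw_trajectory(v0, angle_deg, g=9.81, dt=0.01):
    """The linear velocity recorded at the moment the trigger threshold
    is crossed becomes the initial parameter of a simple point-mass
    ballistic simulation integrated with fixed time step dt."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    path = [(x, y)]
    while True:
        vy -= g * dt          # gravity acts on the vertical velocity
        x += vx * dt
        y += vy * dt
        if y < 0.0:           # stop once the simulated object lands
            break
        path.append((x, y))
    return path  # sampled positions from launch until landing

path = throw_trajectory(v0=10.0, angle_deg=45.0)
# Analytic range for these parameters is about 10.2 m; the path ends nearby.
```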
It will be understood that in certain embodiments this instruction can be predefined by the user. The threshold of the instruction can likewise be predefined by the user; the size of the threshold determines the sensitivity of the trigger, and too low a threshold may cause an inadvertent movement of the near-eye display to mistakenly trigger the corresponding operation in the virtual reality environment.
According to still another embodiment of the invention, the above-described sensor for sensing the user's displacement, arranged in the near-eye display, can also be arranged in the controller or in the display control module, so as to sense the user's displacement when the user connects an external display through the display control module.
The virtual reality system according to the present invention allows the user to manipulate the display control module comfortably and naturally, avoiding the restriction that external cables place on the user's head. The one-handed release mechanism requires no two-handed operation, improving convenience. The virtual reality system according to the present invention uses a real arm swing to simulate the user's throwing operation and controls the motion of the virtual character by sensing the real movement of the user's body, improving the user experience.
Other embodiments of the invention will be readily apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed here. The specification and embodiments are to be considered exemplary only, with the true scope and spirit of the invention being defined by the claims.
Claims (10)
1. A virtual reality system, comprising a near-eye display, a display control module, and a controller, the near-eye display and the display control module being separable, wherein:
the near-eye display comprises a first sensor for sensing the three-axis attitude and/or the three-axis position of the near-eye display;
the display control module comprises a first connector, a communication module, and a power supply module, wherein the first connector is used to connect the controller, the communication module is used to receive and send data/control signals in a wired or wireless manner and to transmit signals to and receive signals from the near-eye display, and the power supply module is used to supply power to the display control module; and
the controller comprises a second connector and a second sensor, wherein the second connector is used to connect the display control module, and the second sensor is used to sense the three-axis attitude and/or the three-axis position of the controller.
2. The system as claimed in claim 1, wherein the first sensor and the second sensor are each at least one of, or a combination of, an acceleration sensor, an angular-rate sensor, and a magnetic induction sensor.
3. The system as claimed in claim 2, wherein the acceleration sensor is used to sense whether the linear acceleration or linear velocity of the controller in an inertial frame reaches a certain threshold and, if so, to trigger a control signal.
4. The system as claimed in claim 2, wherein the angular-rate sensor is used to sense whether the angular acceleration or angular velocity of the controller in an inertial frame reaches a certain threshold and, if so, to trigger a control signal.
5. The system as claimed in claim 1, wherein, after the motion sensed by the sensor reaches a certain threshold, an event composed of a plurality of instructions is triggered, with the parameters sensed by the sensor at the moment the threshold is activated recorded as initial parameters.
6. The system as claimed in claim 1, wherein, when the first sensor or the second sensor senses that the user has produced a displacement or tilt along a first direction of the sensor's own coordinate axes, and the displacement or tilt exceeds a certain threshold, the sensor triggers an instruction to continuously send a movement control signal in the first direction.
7. The system as claimed in claim 6, wherein, when the first sensor or the second sensor senses that the user has produced a movement or tilt in a second direction different from the first direction, and the displacement or tilt in the second direction exceeds a certain threshold, a control signal is triggered to stop the continuous movement along the first direction, or a movement control signal toward the second direction is triggered.
8. The system as claimed in claim 1, wherein the display control module and the controller form a detachable combined assembly, joined in a snap-fit or magnetic manner.
9. The system as claimed in claim 8, further comprising a release mechanism for enabling the user to detach the display control module from the controller with the fingers of a single hand by pulling in a first direction.
10. The system as claimed in claim 9, wherein the release mechanism is used to trigger an operation of the controller when the user pulls in a second direction different from the first direction.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410143435.1A CN103877726B (en) | 2014-04-10 | 2014-04-10 | A kind of virtual reality components system |
PCT/CN2015/075213 WO2015154627A1 (en) | 2014-04-10 | 2015-03-27 | Virtual reality component system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410143435.1A CN103877726B (en) | 2014-04-10 | 2014-04-10 | A kind of virtual reality components system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103877726A (en) | 2014-06-25 |
CN103877726B (en) | 2017-09-26 |
Family
ID=50947053
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410143435.1A Active CN103877726B (en) | 2014-04-10 | 2014-04-10 | A kind of virtual reality components system |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN103877726B (en) |
WO (1) | WO2015154627A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110833690B (en) * | 2019-10-30 | 2023-07-07 | 上海国威互娱文化科技有限公司 | Virtual synchronous all-in-one machine based on field interaction |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060217243A1 (en) * | 2002-12-04 | 2006-09-28 | Philip Feldman | Isometric exercise system and method of facilitating user exercise during video game play |
CN2892214Y (en) * | 2006-04-30 | 2007-04-25 | 吴铁励 | Entertainment machine by human's body gesture operation |
CN200983877Y (en) * | 2005-10-16 | 2007-12-05 | 王飞 | Virtual reality helmet |
CN101101666A (en) * | 2007-08-09 | 2008-01-09 | 中国科学院计算技术研究所 | Dummy role movement synthesis method based on movement capturing data |
EP2281228A1 (en) * | 2008-05-26 | 2011-02-09 | Microsoft International Holdings B.V. | Controlling virtual reality |
CN102023700A (en) * | 2009-09-23 | 2011-04-20 | 吴健康 | Three-dimensional man-machine interactive system |
CN102348068A (en) * | 2011-08-03 | 2012-02-08 | 东北大学 | Head gesture control-based following remote visual system |
KR20120052783A (en) * | 2010-11-16 | 2012-05-24 | 한국전자통신연구원 | Apparatus for managing a reconfigurable platform for virtual reality based training simulator |
JP2013065341A (en) * | 2008-12-05 | 2013-04-11 | Social Communications Company | Realtime kernel |
US20130093788A1 (en) * | 2011-10-14 | 2013-04-18 | James C. Liu | User controlled real object disappearance in a mixed reality display |
CN103149689A (en) * | 2011-12-06 | 2013-06-12 | 微软公司 | Augmented reality virtual monitor |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100809479B1 (en) * | 2006-07-27 | 2008-03-03 | 한국전자통신연구원 | Face mounted display apparatus and method for mixed reality environment |
CN101024125B (en) * | 2007-03-28 | 2010-04-14 | 深圳市飞达荣电子有限公司 | Multi-platform wireless image-sound reality-virtualizing game system |
US7993107B2 (en) * | 2010-10-25 | 2011-08-09 | General Electric Company | Onshore wind turbine with tower support system |
CN102553232A (en) * | 2010-12-07 | 2012-07-11 | 鼎亿数码科技(上海)有限公司 | Human posture capture apparatus and implementing method thereof |
CN202724664U (en) * | 2012-05-30 | 2013-02-13 | 深圳市宇恒互动科技开发有限公司 | Game gun capable of breaking away from game and being used independently and target system |
CN203075636U (en) * | 2012-11-09 | 2013-07-24 | 西安景行数创信息科技有限公司 | Interactive clay-pigeon shooting game system |
CN103877726B (en) * | 2014-04-10 | 2017-09-26 | 北京蚁视科技有限公司 | A kind of virtual reality components system |
2014
- 2014-04-10: CN application CN201410143435.1A filed (granted as CN103877726B), status Active
2015
- 2015-03-27: WO application PCT/CN2015/075213 filed (WO2015154627A1), Application Filing
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015154627A1 (en) * | 2014-04-10 | 2015-10-15 | 北京蚁视科技有限公司 | Virtual reality component system |
CN105721856A (en) * | 2014-12-05 | 2016-06-29 | 北京蚁视科技有限公司 | Remote image display method for near-to-eye display |
CN107667328A (en) * | 2015-06-24 | 2018-02-06 | 谷歌公司 | System for tracking handheld device in enhancing and/or reality environment |
CN105031918A (en) * | 2015-08-19 | 2015-11-11 | 深圳游视虚拟现实技术有限公司 | Man-machine interaction system based on virtual reality technology |
CN105031918B (en) * | 2015-08-19 | 2018-02-23 | 深圳游视虚拟现实技术有限公司 | A kind of man-machine interactive system based on virtual reality technology |
CN105159450A (en) * | 2015-08-25 | 2015-12-16 | 中国运载火箭技术研究院 | Portable interactive desktop-level virtual reality system |
CN105159450B (en) * | 2015-08-25 | 2018-01-05 | 中国运载火箭技术研究院 | One kind is portable can interactive desktop level virtual reality system |
CN105353871A (en) * | 2015-10-29 | 2016-02-24 | 上海乐相科技有限公司 | Target object control method and apparatus in virtual reality scene |
CN105353871B (en) * | 2015-10-29 | 2018-12-25 | 上海乐相科技有限公司 | The control method and device of target object in a kind of virtual reality scenario |
CN105528079A (en) * | 2015-12-18 | 2016-04-27 | 北京诺亦腾科技有限公司 | Method for locating physical prop of virtual reality system and virtual reality system |
CN105511627A (en) * | 2015-12-18 | 2016-04-20 | 北京诺亦腾科技有限公司 | Prop for virtual reality system |
CN105607737A (en) * | 2015-12-18 | 2016-05-25 | 北京诺亦腾科技有限公司 | Positioning method for physical prop of virtual reality system, and virtual reality system |
CN105487672A (en) * | 2015-12-18 | 2016-04-13 | 北京诺亦腾科技有限公司 | Prop applied to virtual reality system |
CN105487671A (en) * | 2015-12-18 | 2016-04-13 | 北京诺亦腾科技有限公司 | Prop applied to virtual reality system |
CN105487670A (en) * | 2015-12-18 | 2016-04-13 | 北京诺亦腾科技有限公司 | Method for positioning physical prop of virtual reality system and virtual reality system |
CN105657408A (en) * | 2015-12-31 | 2016-06-08 | 北京小鸟看看科技有限公司 | Method for implementing virtual reality scene and virtual reality apparatus |
CN105657408B (en) * | 2015-12-31 | 2018-11-30 | 北京小鸟看看科技有限公司 | The implementation method and virtual reality device of virtual reality scenario |
US10277882B2 (en) | 2015-12-31 | 2019-04-30 | Beijing Pico Technology Co., Ltd. | Virtual reality scene implementation method and a virtual reality apparatus |
WO2017120768A1 (en) * | 2016-01-12 | 2017-07-20 | 深圳多哚新技术有限责任公司 | Heat dissipation apparatus based on hand-held terminal of vr glasses, hand-held terminal and vr glasses |
CN107153446A (en) * | 2016-03-02 | 2017-09-12 | 宏达国际电子股份有限公司 | Virtual reality system and tracker device |
TWI625151B (en) * | 2016-03-02 | 2018-06-01 | 宏達國際電子股份有限公司 | Virtual reality system and tracker device |
CN107153446B (en) * | 2016-03-02 | 2021-06-01 | 宏达国际电子股份有限公司 | Virtual reality system and tracker device |
CN105920838A (en) * | 2016-06-08 | 2016-09-07 | 北京行云时空科技有限公司 | System and method for movement collection and control |
US10589176B2 (en) | 2016-06-10 | 2020-03-17 | Colopl, Inc. | Method of providing a virtual space, medium for causing a computer to execute the method, and system for providing a virtual space |
CN109069927A (en) * | 2016-06-10 | 2018-12-21 | Colopl株式会社 | For providing the method for Virtual Space, for making computer realize the program of this method and for providing the system of Virtual Space |
WO2018049624A1 (en) * | 2016-09-14 | 2018-03-22 | 深圳市柔宇科技有限公司 | Head-mounted display device |
CN106737604A (en) * | 2017-02-03 | 2017-05-31 | 重庆梦神科技有限公司 | The direction controlling arm of force and virtual reality experience equipment |
CN106737604B (en) * | 2017-02-03 | 2024-04-02 | 释空(上海)品牌策划有限公司 | Direction control arm of force and virtual reality experience equipment |
TWI635319B (en) * | 2017-04-24 | 2018-09-11 | 英華達股份有限公司 | Virtual reality system and method |
CN109983415A (en) * | 2017-07-24 | 2019-07-05 | 深圳市大疆创新科技有限公司 | Remote controler and unmanned vehicle system |
CN108037827A (en) * | 2017-12-08 | 2018-05-15 | 北京凌宇智控科技有限公司 | The virtual objects throwing emulation mode and its system of Virtual actual environment |
TWI664995B (en) * | 2018-04-18 | 2019-07-11 | 鴻海精密工業股份有限公司 | Virtual reality multi-person board game interacting system, initeracting method, and server |
US10569163B2 (en) | 2018-04-18 | 2020-02-25 | Hon Hai Precision Industry Co., Ltd. | Server and method for providing interaction in virtual reality multiplayer board game |
CN109213323A (en) * | 2018-08-28 | 2019-01-15 | 北京航空航天大学青岛研究院 | A method of screen Attitude estimation is realized based on eye movement interaction technique |
US11789276B1 (en) | 2020-04-06 | 2023-10-17 | Apple Inc. | Head-mounted device with pivoting connectors |
Also Published As
Publication number | Publication date |
---|---|
CN103877726B (en) | 2017-09-26 |
WO2015154627A1 (en) | 2015-10-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103877726B (en) | A kind of virtual reality components system | |
US9884248B2 (en) | Display control method for head-mounted display (HMD) and image generation device | |
JP6212667B1 (en) | Method executed by computer to communicate via virtual space, program causing computer to execute the method, and information processing apparatus | |
US20200159389A1 (en) | Information processing method and apparatus for executing the information processing method | |
US9703102B2 (en) | Information processing device including head mounted display | |
JP6642432B2 (en) | Information processing apparatus, information processing method, and image display system | |
KR101548156B1 (en) | A wireless exoskeleton haptic interface device for simultaneously delivering tactile and joint resistance and the method for comprising the same | |
JP6470796B2 (en) | Information processing method, program, and computer | |
CN105031918B (en) | A kind of man-machine interactive system based on virtual reality technology | |
US20050059489A1 (en) | Motion sensing applications | |
KR101827912B1 (en) | Walkable virtual reality apparatus | |
JP6229089B1 (en) | Method executed by computer to communicate via virtual space, program causing computer to execute the method, and information processing apparatus | |
WO2012154938A1 (en) | Headset computer that uses motion and voice commands to control information display and remote devices | |
JP6368411B1 (en) | Method, program and computer executed on a computer to provide a virtual experience | |
CN205007551U (en) | Human -computer interaction system based on virtual reality technology | |
KR20160123017A (en) | System for providing a object motion data using motion sensor and method for displaying a a object motion data using thereof | |
WO2018184232A1 (en) | Body sensing remote control method, control apparatus, gimbal and unmanned aerial vehicle | |
JP2018072604A (en) | Method for suppressing vr sickness, program for causing computer to execute the method, and information processing device | |
JP6248219B1 (en) | Information processing method, computer, and program for causing computer to execute information processing method | |
JP6495398B2 (en) | Method and program for providing virtual space, and information processing apparatus for executing the program | |
JP6263292B1 (en) | Information processing method, computer, and program for causing computer to execute information processing method | |
JP2019020836A (en) | Information processing method, device, and program for causing computer to execute the method | |
JP6330072B1 (en) | Information processing method, apparatus, and program for causing computer to execute information processing method | |
KR20190113453A (en) | Method and apparatus for controling head mounted device and rider device in virtual reality system | |
JP2019015972A (en) | Method for suppressing vr sickness, program for causing computer to execute the method, and information processing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
EE01 | Entry into force of recordation of patent licensing contract |
Application publication date: 20140625 Assignee: Google Inc. Assignor: BEIJING ANTVR TECHNOLOGY Co.,Ltd. Contract record no.: X2024990000126 Denomination of invention: A Virtual Reality Component System Granted publication date: 20170926 License type: Common License Record date: 20240411 |