CN107918482A - Method and system for avoiding overstimulation in an immersive VR system - Google Patents


Info

Publication number
CN107918482A
Authority
CN
China
Prior art keywords
user
control
guide rail
virtual reality
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610882538.9A
Other languages
Chinese (zh)
Other versions
CN107918482B (en)
Inventor
谢炯坤
党建勋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Silan Zhichuang Technology Co ltd
Original Assignee
Tianjin Feng Time Interactive Technology Co Ltd Shenzhen Branch
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Feng Time Interactive Technology Co Ltd Shenzhen Branch
Priority to CN201610882538.9A
Publication of CN107918482A
Application granted
Publication of CN107918482B
Legal status: Active

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 — Head tracking input arrangements
    • G06F3/013 — Eye tracking input arrangements
    • G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/16 — Sound input; sound output
    • G06F2203/00 — Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 — Indexing scheme relating to G06F3/01
    • G06F2203/012 — Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

A method and system for avoiding overstimulation of a user in an immersive VR system are disclosed. The disclosed method of avoiding overstimulating the user in an immersive virtual display system includes: presenting a virtual reality scene; and, in response to identifying that the user is experiencing discomfort, modifying the virtual reality scene.

Description

Method and system for avoiding overstimulation in an immersive VR system
Technical field
The present invention relates to virtual reality (VR) technology. In particular, the present invention relates to methods and systems for identifying a user's uncomfortable experience in an immersive VR system and avoiding overstimulation of the user.
Background art
In human-computer interaction technology, a control is a reusable software component for building graphical user interfaces. In general, one control corresponds to one function. For example, Fig. 1 illustrates a "confirm" control in a two-dimensional graphical user interface. The "confirm" control includes a prompt window, and the prompt window includes a "Confirm" button and a "Cancel" button. When the "confirm" control is invoked, the prompt window shown in Fig. 1 pops up; the user's click on the "Confirm" button or the "Cancel" button is recognized to obtain the user's operation intent, and human-computer interaction is thereby realized. In the prior-art slide-to-unlock technique, the user conveys an input intent to the information processing device by sliding a hand across the touchscreen.
Novel human-computer interaction technologies are also evolving, and gesture-recognition-based interaction is one of the hot spots. Hand motion can be recognized in many ways. US20100199228A1 from Microsoft (published August 5, 2010) provides a scheme in which a depth camera captures and analyzes the user's body posture, which is then interpreted as computer commands. US20080291160A1 from Nintendo (published November 27, 2008) provides a scheme that captures the user's hand position using an infrared sensor and an acceleration sensor. CN1276572A from Matsushita Electric Industrial Co., Ltd. provides a scheme that photographs the hand with a camera, normalizes and analyzes the image, spatially projects the normalized image, and compares the resulting projection coordinates with the projection coordinates of pre-stored images. Fig. 2 shows the system and method for perceiving gestures and spatial positions provided by patent application CN201110100532.9 from Tianjin Fengshi Interaction Technology Co., Ltd. As shown in Fig. 2, the gesture recognition system includes: a host computer 101 of the system, a multi-camera control circuit 102, multiple cameras 103, the user's hand 104, an application program 105 running on the host computer 101, an object 106 to be operated within the application program 105, and a virtual hand cursor 107. The gesture recognition system further includes, not shown in Fig. 2, an infrared illumination source for illuminating the user's hand 104 and an infrared filter placed in front of each camera. The multiple cameras 103 capture images of the user's hand 104; the control circuit 102 processes the hand images gathered by the cameras 103 and recognizes the posture and/or position of the hand. In addition, the prior art also includes schemes that use data gloves to assist in recognizing hand postures.
An immersive VR system combines the latest achievements of computer graphics, wide-angle stereoscopic display, sensing and tracking, distributed computing, artificial intelligence, and other technologies. A virtual world is generated by computer simulation and presented before the user's eyes, providing a lifelike audiovisual experience so that the user is wholly immersed in the virtual world. When everything the user sees and hears appears as real as the real world, the user can interact with the virtual world naturally.
As the various technologies of head-mounted VR equipment mature, immersive VR brings users an on-the-scene experience and raises users' demands for three-dimensional interaction to a new level. Existing head-mounted VR equipment generally comprises a head-mounted display and a VR content generation device.
The head-mounted display is worn on the user's head and provides the user with an immersive view of the virtual scene. The head-mounted display also includes sensors for head positioning.
The VR content generation device includes a computing module, a storage module, and a head positioning module. The head positioning module obtains data in real time from the head positioning sensors in the head-mounted display; by processing the data with sensor-fusion algorithms, the head positioning module derives the current user's head pose.
The VR content generation device obtains the current head pose, renders the virtual scene from the viewpoint of the current head pose, and presents it to the user through the head-mounted display. The head-mounted display may be integrated with the VR content generation device (e.g., a mobile VR all-in-one headset) or connected to it by a display data cable such as HDMI (e.g., the HTC Vive).
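The sensor-fusion processing mentioned above is not specified further in this application. Purely as an illustrative sketch, a minimal complementary filter for estimating head pitch and roll from gyroscope and accelerometer data might look as follows; the function name, parameters, and blending constant are all assumptions for illustration, not part of the disclosure:

```python
import math

def fuse_head_pose(pitch, roll, gyro, accel, dt, alpha=0.98):
    """Complementary filter: blend gyro integration with an accelerometer gravity estimate."""
    # High-frequency component: integrate angular rates (rad/s) over the time step.
    pitch_gyro = pitch + gyro[0] * dt
    roll_gyro = roll + gyro[1] * dt
    # Low-frequency component: estimate pitch/roll from the gravity direction.
    ax, ay, az = accel
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll_acc = math.atan2(ay, az)
    # Blend: trust the gyro short-term, the accelerometer long-term.
    return (alpha * pitch_gyro + (1 - alpha) * pitch_acc,
            alpha * roll_gyro + (1 - alpha) * roll_acc)
```

Called once per sensor sample, the filter suppresses gyro drift while smoothing accelerometer noise; real head-tracking stacks typically use quaternion-based fusion, which this sketch deliberately simplifies.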
Chinese patent application CN201310407443 discloses an immersive virtual reality system based on motion capture, which proposes capturing the user's motion with inertial sensors and correcting the accumulated error of the inertial sensors using biomechanical constraints of the human limbs, thereby achieving accurate positioning and tracking of the user's limbs.
Chinese patent application CN201410143435 discloses a virtual reality component system in which the user interacts with the virtual environment through a controller, and the controller locates and tracks the user's limbs using inertial sensors.
Chinese patent application CN2015104695396 provides an immersive virtual reality system 100, which is incorporated herein by reference in its entirety. The virtual reality system 100 can be worn on the head by the user. When the user walks about and turns around indoors, the virtual reality system 100 can detect changes in the user's head pose and update the rendered scene accordingly. When the user stretches out a hand, the virtual reality system 100 renders a virtual hand according to the current hand pose, allowing the user to manipulate other objects in the virtual environment and interact with the virtual reality environment in three dimensions. The virtual reality system 100 can also recognize, locate, and track other moving objects in the scene. The virtual reality system 100 includes a stereoscopic display device 110, a visual perception device 120, a visual processing device 160, and a scene generation device 150. Optionally, the virtual reality system may also include a stereo sound output device 140 and an auxiliary lighting device 130. The auxiliary lighting device 130 assists visual positioning; for example, it can emit infrared light to illuminate the field of view observed by the visual perception device 120, facilitating image acquisition by the visual perception device 120.
Each device in the virtual reality system can exchange data/control signals by wired or wireless means. The stereoscopic display device 110 may be, but is not limited to, a liquid crystal display, a projection device, etc. The stereoscopic display device 110 projects the rendered virtual images to the user's two eyes respectively to form stereoscopic vision. The visual perception device 120 may include a camera, a video camera, a depth vision sensor, and/or a group of inertial sensors (a three-axis angular rate sensor, a three-axis acceleration sensor, a three-axis geomagnetic sensor, etc.). The visual perception device 120 captures images of the surrounding environment and objects in real time and/or measures its own motion state. The visual perception device 120 can be fixed on the user's head while keeping a fixed relative pose to the user's head, so that if the pose of the visual perception device 120 is obtained, the pose of the user's head can be calculated. The stereo sound device 140 produces the audio of the virtual environment. The visual processing device 160 processes and analyzes the captured images, performs self-positioning of the user's head, and locates and tracks moving objects in the environment. The scene generation device 150 updates the scene information according to the user's current head pose and the tracking of moving objects, can also predict captured image information from the inertial sensor data, and renders the corresponding virtual images in real time.
The visual processing device 160 and the scene generation device 150 can be implemented by software running on a computer processor, by configuring an FPGA (field-programmable gate array), or by an ASIC (application-specific integrated circuit). The visual processing device 160 and the scene generation device 150 can be embedded in a device carried by the user, or located on a host or server remote from the user-carried device and communicating with it by wired or wireless means. The visual processing device 160 and the scene generation device 150 can be implemented by a single hardware device, or distributed over different computing devices, homogeneous and/or heterogeneous.
Summary of the invention
In gesture-recognition-based interaction, controls need to be designed to facilitate application development. A control takes a gesture as input and produces an event or message as output. The event or message may indicate the user's "confirm" or "cancel" operation intent, or indicate user intents with a variety of different meanings. Moreover, because human biological characteristics mean that the trajectory of the user's hand in the interaction space cannot be perfectly straight or standardized, existing human-computer interaction technology has difficulty effectively understanding the intent behind gesture input.
Prior-art VR systems focus on product industrial design, interaction design, and content design capable of providing an immersive experience. When experiencing stimulating VR content — for example, watching a horror film or immersively playing an exciting game — some users may develop psychological discomfort or resistance, and when exposed to the 3D scene of a VR system for a long time, some users become dizzy. Prior-art VR systems lack effective technical means to address these problems; they ignore the uncomfortable experience the user is undergoing, causing the user to resist the virtual reality experience or causing overstimulation of the user's mind. The prior art also does not consider how to identify that a user is being overstimulated in an immersive VR experience, or how to respond to it.
According to a first aspect of the invention, a first human-computer interaction method based on gesture recognition is provided, wherein a control recognizes the user's command based on the user's gesture information. The control includes an entrance area and a rail area; the entrance area divides the user interface into a first part and a second part, the first part not including the rail area and the second part including the rail area. The method includes: in response to a cursor moving from the first part of the user interface through the entrance area into the second part, the control enters an activated state and a slider is displayed in the rail area of the control; and in response to the slider leaving the rail area from a first end of the rail area, the control generates a first event.
According to the first human-computer interaction method based on gesture recognition of the first aspect of the invention, a second human-computer interaction method based on gesture recognition according to the first aspect of the invention is provided, including: in response to the slider leaving the rail area from a second end of the rail area, the control generates a second event.
According to any of the foregoing human-computer interaction methods of the first aspect of the invention, a third human-computer interaction method based on gesture recognition according to the first aspect of the invention is provided, including: in response to the slider leaving the rail area from the first end or the second end of the rail area, the control enters an inactive state.
According to any of the foregoing human-computer interaction methods of the first aspect of the invention, a fourth human-computer interaction method according to the first aspect of the invention is provided, including: initializing the control, whereupon the control enters an inactive state.
According to any of the foregoing human-computer interaction methods of the first aspect of the invention, a fifth human-computer interaction method according to the first aspect of the invention is provided, wherein the first end is the part of the rail area close to the entrance area.
According to the second human-computer interaction method based on gesture recognition of the first aspect of the invention, a sixth human-computer interaction method based on gesture recognition according to the first aspect of the invention is provided, wherein the second end is the part of the rail area away from the entrance area.
According to any of the foregoing human-computer interaction methods of the first aspect of the invention, a seventh human-computer interaction method according to the first aspect of the invention is provided, including: displaying a cursor on the user interface according to the gesture information, the gesture information indicating the position and/or posture of the user's hand as extracted from hand images of the user captured by an image capture device.
According to any of the foregoing human-computer interaction methods of the first aspect of the invention, an eighth human-computer interaction method according to the first aspect of the invention is provided, including: drawing the slider in the rail area according to the projected position of the cursor on the center line of the rail area.
According to any of the foregoing human-computer interaction methods of the first aspect of the invention, a ninth human-computer interaction method according to the first aspect of the invention is provided, including: in response to the control entering the activated state, hiding the cursor on the user interface and changing the appearance of the control, to prompt the user that the control has entered the activated state.
According to any of the foregoing human-computer interaction methods of the first aspect of the invention, a tenth human-computer interaction method according to the first aspect of the invention is provided, including: in response to the control entering the inactive state, displaying the cursor on the user interface at the position indicated by the gesture information.
According to any of the foregoing human-computer interaction methods of the first aspect of the invention, an eleventh human-computer interaction method according to the first aspect of the invention is provided, including: in response to the slider leaving the rail area, hiding the slider.
According to any of the foregoing human-computer interaction methods of the first aspect of the invention, a twelfth human-computer interaction method according to the first aspect of the invention is provided, including: in the activated state of the control, in response to a "grasp" action indicated by the gesture information, fixing the cursor on the slider and drawing the slider according to the gesture information.
According to the first human-computer interaction method based on gesture recognition of the first aspect of the invention, a thirteenth human-computer interaction method according to the first aspect of the invention is provided, wherein the rail area has a plurality of first ends.
According to the second human-computer interaction method based on gesture recognition of the first aspect of the invention, a fourteenth human-computer interaction method according to the first aspect of the invention is provided, wherein the rail area has a plurality of second ends.
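The activation and event-generation behavior recited above can be sketched, under assumptions, as a small state machine. The class name, state labels, and the convention that slider positions below zero or beyond the rail length correspond to the two rail ends are illustrative choices, not part of the application:

```python
# States of the rail control
INACTIVE, ACTIVE = "inactive", "active"

class RailControl:
    """The entrance area splits the UI; crossing it activates the control."""
    def __init__(self, rail_length=100.0):
        self.state = INACTIVE          # initialization -> inactive state
        self.rail_length = rail_length
        self.slider = None             # slider position along the rail, or None (hidden)
        self.events = []

    def on_cursor_enter_second_part(self, via_entrance):
        # The control activates only when the cursor crosses the entrance area.
        if via_entrance and self.state == INACTIVE:
            self.state = ACTIVE
            self.slider = 0.0          # slider shown at the rail's first end

    def on_slider_move(self, position):
        if self.state != ACTIVE:
            return
        if position < 0.0:                 # slider leaves via the first end
            self.events.append("first_event")
            self._deactivate()
        elif position > self.rail_length:  # slider leaves via the second end
            self.events.append("second_event")
            self._deactivate()
        else:
            self.slider = position

    def _deactivate(self):
        self.slider = None             # hide the slider when it leaves the rail
        self.state = INACTIVE
```

For example, activating the control and dragging the slider past the far end of the rail emits the second event and returns the control to the inactive state, matching the second and third methods above.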
According to a second aspect of the invention, a first human-computer interaction apparatus based on gesture recognition is provided, wherein a control recognizes the user's command based on the user's gesture information. The control includes an entrance area and a rail area; the entrance area divides the user interface into a first part and a second part, the first part not including the rail area and the second part including the rail area. The apparatus includes: an activation module for, in response to a cursor moving from the first part of the user interface through the entrance area into the second part, putting the control into an activated state and displaying a slider in the rail area of the control; and an event generation module for, in response to the slider leaving the rail area from a first end of the rail area, causing the control to generate a first event.
According to the first human-computer interaction apparatus based on gesture recognition of the second aspect of the invention, a second human-computer interaction apparatus according to the second aspect of the invention is provided, including: a second-event generation module for, in response to the slider leaving the rail area from a second end of the rail area, causing the control to generate a second event.
According to any of the foregoing human-computer interaction apparatuses of the second aspect of the invention, a third human-computer interaction apparatus according to the second aspect of the invention is provided, including: a deactivation module for, in response to the slider leaving the rail area from the first end or the second end of the rail area, putting the control into an inactive state.
According to any of the foregoing human-computer interaction apparatuses of the second aspect of the invention, a fourth human-computer interaction apparatus according to the second aspect of the invention is provided, including: an initialization module for initializing the control, whereupon the control enters an inactive state.
According to any of the foregoing human-computer interaction apparatuses of the second aspect of the invention, a fifth human-computer interaction apparatus according to the second aspect of the invention is provided, wherein the second end is the part of the rail area close to the entrance area.
According to the second human-computer interaction apparatus of the second aspect of the invention, a sixth human-computer interaction apparatus according to the second aspect of the invention is provided, wherein the first end is the part of the rail area away from the entrance area.
According to any of the foregoing human-computer interaction apparatuses of the second aspect of the invention, a seventh human-computer interaction apparatus according to the second aspect of the invention is provided, further including: a device for displaying a cursor on the user interface according to the gesture information, the gesture information indicating the position and/or posture of the user's hand as extracted from hand images of the user captured by an image capture device.
According to any of the foregoing human-computer interaction apparatuses of the second aspect of the invention, an eighth human-computer interaction apparatus according to the second aspect of the invention is provided, including: a slider drawing module for drawing the slider in the rail area according to the projected position of the cursor on the center line of the rail area.
According to any of the foregoing human-computer interaction apparatuses of the second aspect of the invention, a ninth human-computer interaction apparatus according to the second aspect of the invention is provided, including: an appearance change module for, in response to the control entering the activated state, hiding the cursor on the user interface, playing a specified sound, displaying specified text, and/or providing haptic feedback, to prompt the user that the control has entered the activated state.
According to any of the foregoing human-computer interaction apparatuses of the second aspect of the invention, a tenth human-computer interaction apparatus according to the second aspect of the invention is provided, including: a cursor display module for, in response to the control entering the inactive state, displaying the cursor on the user interface at the position indicated by the gesture information, playing a specified sound, displaying specified text, and/or providing haptic feedback.
According to any of the foregoing human-computer interaction apparatuses of the second aspect of the invention, an eleventh human-computer interaction apparatus according to the second aspect of the invention is provided, including: a slider hiding module for, in response to the slider leaving the rail area, hiding the slider.
According to any of the foregoing human-computer interaction apparatuses of the second aspect of the invention, a twelfth human-computer interaction apparatus according to the second aspect of the invention is provided, including: a cursor fixing module for, in the activated state of the control, in response to a "grasp" action indicated by the gesture information, fixing the cursor on the slider and drawing the slider according to the gesture information.
According to the first human-computer interaction apparatus of the second aspect of the invention, a thirteenth human-computer interaction apparatus according to the second aspect of the invention is provided, wherein the rail area has a plurality of first ends.
According to the second human-computer interaction apparatus of the second aspect of the invention, a fourteenth human-computer interaction apparatus according to the second aspect of the invention is provided, wherein the rail area has a plurality of second ends.
According to a third aspect of the invention, an information processing device is provided, the information processing device including a processor, a memory, and a display device, the information processing device being further coupled to a gesture recognition device and receiving the gesture information provided by the gesture recognition device. The memory stores a program, and the processor runs the program so that the information processing device performs one of the foregoing human-computer interaction methods according to the first aspect of the invention.
According to a fourth aspect of the invention, a computer program is provided which, when run by the processor of an information processing device, causes the information processing device to perform one of the foregoing human-computer interaction methods according to the first aspect of the invention.
According to a fifth aspect of the invention, a first method of avoiding overstimulating a user in an immersive virtual display system is provided, including: presenting a virtual reality scene; and, in response to identifying that the user is experiencing discomfort, modifying the virtual reality scene.
According to the first method of avoiding overstimulating the user in an immersive virtual display system of the fifth aspect of the invention, a second method of avoiding overstimulating the user in an immersive virtual display system according to the fifth aspect of the invention is provided, wherein the user is identified as experiencing discomfort by recognizing that the user's head moves in a specified pattern.
According to the first or second method of the fifth aspect of the invention, a third method of avoiding overstimulating the user in an immersive virtual display system according to the fifth aspect of the invention is provided, wherein the user is identified as experiencing discomfort by recognizing that the user's hand moves in a specified pattern.
According to one of the first through third methods of the fifth aspect of the invention, a fourth method of avoiding overstimulating the user in an immersive virtual display system according to the fifth aspect of the invention is provided, wherein the user is identified as experiencing discomfort by specified features appearing in the captured audio and/or the audio including speech of a specified phrase or sentence.
According to one of the first through fourth methods of the fifth aspect of the invention, a fifth method of avoiding overstimulating the user in an immersive virtual display system according to the fifth aspect of the invention is provided, wherein the user is identified as experiencing discomfort by receiving the user's indication from an interactive device.
According to one of the first through fifth methods of the fifth aspect of the invention, a sixth method of avoiding overstimulating the user in an immersive virtual display system according to the fifth aspect of the invention is provided, wherein the virtual reality scene is modified by switching, attenuating, and/or closing the video content being played.
According to the sixth method of the fifth aspect of the invention, a seventh method of avoiding overstimulating the user in an immersive virtual display system according to the fifth aspect of the invention is provided, wherein the virtual reality scene is modified by switching, lowering, and/or closing the audio content being played.
According to the sixth or seventh method of the fifth aspect of the invention, an eighth method of avoiding overstimulating the user in an immersive virtual display system according to the fifth aspect of the invention is provided, wherein the virtual reality scene is modified by popping up a prompt window.
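The scene-modification measures just recited (switching or attenuating video, lowering audio, popping up a prompt window) could be sketched as follows; the scene representation, key names, and all values are assumptions for illustration only:

```python
def soften_scene(scene, measures=("attenuate_video", "lower_audio", "popup")):
    """Apply de-escalation measures to a scene; the dict keys are illustrative."""
    out = dict(scene)
    if "attenuate_video" in measures:
        out["brightness"] = min(out.get("brightness", 1.0), 0.4)  # dim stimulating video
        out["video"] = "calm_placeholder"                         # or switch the content
    if "lower_audio" in measures:
        out["volume"] = min(out.get("volume", 1.0), 0.2)          # lower the audio content
    if "popup" in measures:
        out["popup"] = "Are you feeling OK? Press Confirm to pause."
    return out
```

Returning a modified copy, rather than mutating the live scene, lets the system restore the original stimulation level once the user confirms they are comfortable again.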
Second according to a fifth aspect of the present invention avoids overstimulation user method in immersion virtual display system, carries Supplied according to a fifth aspect of the present invention the 9th overstimulation user method is avoided in immersion virtual display system, wherein logical The speed and/or acceleration for crossing detection user's head movement identify that user's head is moved with designated mode more than threshold value.
The 3rd according to a fifth aspect of the present invention avoids overstimulation user method in immersion virtual display system, carries Supplied according to a fifth aspect of the present invention the tenth overstimulation user method is avoided in immersion virtual display system, wherein logical Cross the speed of detection user hand movement and/or acceleration is more than threshold value, or the distance of hand opposing headers or eyes is persistently small Identify that user's hand is moved with designated mode in threshold value.
The 4th according to a fifth aspect of the present invention avoids overstimulation user method in immersion virtual display system, carries Supplied according to a fifth aspect of the present invention the 11st overstimulation user method is avoided in immersion virtual display system, wherein The word occurred in threshold value, or identification audio is exceeded by the frequency and/or loudness that detect capture audio and belongs to specified set, To identify that user experiences discomfort.
The 5th according to a fifth aspect of the present invention avoids overstimulation user method in immersion virtual display system, carries Supplied according to a fifth aspect of the present invention the 12nd overstimulation user method is avoided in immersion virtual display system, wherein By identifying that user presses the specified button of interactive device, user exceeds threshold value, and/or user to handing over to the grip of interactive device Mutual equipment makes the action thrown away and/or got rid of, to identify that user experiences discomfort.
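The discomfort-identification rules of the ninth and tenth methods above can be sketched as threshold tests. The sketch below is illustrative only: all threshold values and function names are assumptions, since the claims leave concrete numbers to the implementation.

```python
# Sketch of discomfort detection from head and hand state.
# All threshold values are illustrative assumptions.
SPEED_THRESHOLD = 2.0            # m/s, head movement speed
ACCEL_THRESHOLD = 15.0           # m/s^2, head movement acceleration
HAND_EYE_DIST_THRESHOLD = 0.15   # m, "hand held close to head or eyes"
HAND_EYE_DURATION = 1.0          # s, how long the hand must stay that close

def head_moves_in_designated_pattern(speed, acceleration):
    # Head speed and/or acceleration exceeding a threshold counts as
    # movement in the designated pattern.
    return speed > SPEED_THRESHOLD or acceleration > ACCEL_THRESHOLD

def hand_covers_eyes(distances, dt):
    # distances: hand-to-eye distance per frame; dt: frame duration in s.
    # True when the distance stays below threshold continuously long enough.
    run = 0.0
    for d in distances:
        run = run + dt if d < HAND_EYE_DIST_THRESHOLD else 0.0
        if run >= HAND_EYE_DURATION:
            return True
    return False

def user_experiences_discomfort(head_speed, head_accel, distances, dt):
    return (head_moves_in_designated_pattern(head_speed, head_accel)
            or hand_covers_eyes(distances, dt))
```

In practice the audio-based and interactive-device-based tests of the eleventh and twelfth methods would be further disjuncts of the same predicate.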
According to a sixth aspect of the present invention, there is provided the first immersive virtual reality system according to the sixth aspect of the present invention, comprising a computing unit, a display device, and a sensor module. The computing unit is configured to run a virtual reality application to construct a virtual reality scene; the sensor module is configured to perceive the state of the user; and the computing unit also runs a program to identify, based on the state of the user perceived by the sensor module, whether the user experiences discomfort, and, in response to identifying that the user experiences discomfort, to change the constructed virtual reality scene.
According to the first immersive virtual reality system of the sixth aspect of the present invention, there is provided the second immersive virtual reality system according to the sixth aspect of the present invention, wherein the computing unit changes the virtual reality scene constructed by the virtual reality application by instructing the virtual reality application.
According to the first immersive virtual reality system of the sixth aspect of the present invention, there is provided the third immersive virtual reality system according to the sixth aspect of the present invention, wherein the computing unit changes the virtual reality scene constructed by the virtual reality application by instructing a real-world device.
According to one of the first to third immersive virtual reality systems of the sixth aspect of the present invention, there is provided the fourth immersive virtual reality system according to the sixth aspect of the present invention, wherein the sensor module includes a head pose capture device configured to capture and output the pose of the user's head.
According to the fourth immersive virtual reality system of the sixth aspect of the present invention, there is provided the fifth immersive virtual reality system according to the sixth aspect of the present invention, wherein the sensor module includes a gesture capture device configured to capture and output the user's gestures.
According to the fifth immersive virtual reality system of the sixth aspect of the present invention, there is provided the sixth immersive virtual reality system according to the sixth aspect of the present invention, wherein the sensor module includes an audio capture device configured to capture the sound and/or speech uttered by the user and/or to recognize words and phrases in the speech.
According to the sixth immersive virtual reality system of the sixth aspect of the present invention, there is provided the seventh immersive virtual reality system according to the sixth aspect of the present invention, further comprising an interactive device configured to indicate that the user presses a button, that the user's grip force exceeds a threshold, the speed and acceleration of the device, and/or that the user makes a throwing and/or flinging motion with the interactive device.
According to the fourth immersive virtual reality system of the sixth aspect of the present invention, there is provided the eighth immersive virtual reality system according to the sixth aspect of the present invention, wherein the computing unit also runs a program to obtain the pose of the user's head through the sensor module, to identify from the head pose that the user's head moves in a designated pattern by detecting that the speed and/or acceleration of the head movement exceeds a threshold, and to take the head moving in the designated pattern as the basis for identifying that the user experiences discomfort.
According to the fifth immersive virtual reality system of the sixth aspect of the present invention, there is provided the ninth immersive virtual reality system according to the sixth aspect of the present invention, wherein the computing unit also runs a program to obtain the pose of the user's hand through the sensor module, and to identify that the user experiences discomfort by detecting from the hand pose that the speed and/or acceleration of the hand movement exceeds a threshold, or that the distance of the hand relative to the head remains below a threshold.
According to the sixth immersive virtual reality system of the sixth aspect of the present invention, there is provided the tenth immersive virtual reality system according to the sixth aspect of the present invention, wherein the computing unit also runs a program to obtain the audio uttered by the user through the sensor module, and to identify that the user experiences discomfort by detecting that the frequency and/or loudness of the captured audio exceeds a threshold, or by recognizing that words occurring in the audio belong to a designated set.
According to the seventh immersive virtual reality system of the sixth aspect of the present invention, there is provided the eleventh immersive virtual reality system according to the sixth aspect of the present invention, wherein the computing unit also runs a program to obtain an indication from the interactive device so as to identify that the user experiences discomfort.
According to one of the first to eleventh immersive virtual reality systems of the sixth aspect of the present invention, there is provided the twelfth immersive virtual reality system according to the sixth aspect of the present invention, wherein the computing unit changes the constructed virtual reality scene by switching, attenuating, and/or closing the video content being played, by switching, attenuating, and/or closing the audio content being played, and/or by popping up a prompt window.
According to a seventh aspect of the present invention, there is provided an apparatus for avoiding overstimulating the user in an immersive virtual reality system, comprising: a presentation module for presenting a virtual reality scene; and a changing module for changing the virtual reality scene in response to identifying that the user experiences discomfort.
According to an eighth aspect of the present invention, there is provided an information processing device comprising a processor, a memory, and a display device, the information processing device being further coupled to a sensor module and receiving the state of the user perceived by the sensor module; the memory stores a program, and the processor runs the program to cause the information processing device to perform one of the methods for avoiding overstimulating the user in an immersive virtual reality system provided according to the fifth aspect of the present invention.
Brief description of the drawings
The present invention, together with preferred modes of use and further objects and advantages thereof, will be best understood by reference to the following detailed description of illustrative embodiments when read in conjunction with the accompanying drawings, in which:
Fig. 1 illustrates a "confirm" control of a two-dimensional graphical user interface in the prior art;
Fig. 2 is a structural diagram of a gesture recognition system in the prior art;
Fig. 3 is a block diagram of a human-computer interaction system based on gesture recognition according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a control based on gesture recognition in a two-dimensional user interface according to an embodiment of the present invention;
Figs. 5A-5D are schematic diagrams of various states of the control based on gesture recognition in a two-dimensional user interface according to an embodiment of the present invention;
Fig. 6 is a flowchart of a human-computer interaction method based on gesture recognition in a two-dimensional user interface according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of a control based on gesture recognition in a three-dimensional user interface according to an embodiment of the present invention;
Figs. 8A-8D are schematic diagrams of various states of the control based on gesture recognition in a three-dimensional user interface according to an embodiment of the present invention;
Fig. 9 is a block diagram of an information processing device implementing an embodiment of the present invention;
Fig. 10 is a block diagram of a virtual reality system according to an embodiment of the present invention;
Fig. 11 is a flowchart of a method for avoiding overstimulation of the user by a virtual reality system according to an embodiment of the present invention; and
Fig. 12 is a flowchart of another method for avoiding overstimulation of the user by a virtual reality system according to an embodiment of the present invention.
Detailed Description of the Embodiments
Embodiments of the present invention are described in detail below, examples of which are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements, or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended only to explain the present invention, and are not to be construed as limiting the present invention. On the contrary, the embodiments of the present invention include all changes, modifications, and equivalents falling within the spirit and scope of the appended claims.
In the description of the present invention, it should be understood that the terms "first", "second", and so on are used for descriptive purposes only and are not to be understood as indicating or implying relative importance. It should also be noted that, unless otherwise expressly specified and limited, the terms "connected" and "connection" are to be interpreted broadly: a connection may, for example, be fixed, detachable, or integral; mechanical or electrical; and direct, or indirect through an intermediary. Those of ordinary skill in the art can understand the specific meanings of the above terms in the present invention according to the specific circumstances. In addition, in the description of the present invention, unless otherwise indicated, "a plurality of" means two or more.
Any process or method description in a flowchart, or otherwise described herein, should be understood as representing a module, segment, or portion of code comprising one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present invention also includes other implementations in which functions may be performed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functionality involved, as should be understood by those skilled in the art to which the embodiments of the present invention pertain.
Fig. 3 is a block diagram of a human-computer interaction system based on gesture recognition according to an embodiment of the present invention. The human-computer interaction system according to this embodiment includes a gesture input device 310, an information processing device 320, and a display device 330, coupled to one another. In one example, the gesture input device 310 captures images of the user's hand and sends the captured images to the information processing device for processing. The information processing device 320 receives the hand images sent by the gesture input device and recognizes the gesture information of the user's hand in the images. The information processing device 320 also presents graphics and/or images to the user through the display device 330, for example drawing a virtual image of the user's hand on the display device 330. The information processing device may be, for example, a computer, a mobile phone, or a dedicated gesture recognition device. The display device 330 may be, for example, a flat-panel display, a projector, or a head-mounted display.
In another example, the gesture input device 310 perceives the position and/or pose of the user's hand, recognizes the gesture information of the user's hand, and sends the user's hand information to the information processing device 320. The information processing device 320 takes the user's hand information provided by the gesture input device 310 as input provided by the user, and provides output to the user through the display device 330, thereby realizing human-computer interaction. Obviously, the information processing device 320 may also interact with the user through sound, mechanical action, and other forms.
As still another example, the gesture input device 310 may be, for example, a depth sensor, a distance sensor, a VR controller (such as Oculus Rift Touch), a gamepad, a data glove (such as CyberGlove), a motion capture system (such as OptiTracker), or a gyroscope, for perceiving the position and/or pose of the user's hand.
From the gestures and/or actions made by the user in the real world, gesture information (i) based on a virtual coordinate system is extracted. The gesture information (i) may be a vector, formalized as i = {C, palm, thumb, index, mid, ring, little}, where C represents the shape of the whole hand, for example a fist, five fingers open, or a victory gesture; palm represents the position information of the palm; and thumb, index, mid, ring, and little represent the position information and/or orientation information of the thumb, index finger, middle finger, ring finger, and little finger, respectively. The virtual coordinate system expresses position information in the virtual world constructed by the information processing device 320, while a real coordinate system expresses the position information of objects or space in the real world. The virtual world constructed by the information processing device 320 may be, for example, a two-dimensional space such as a two-dimensional graphical user interface, a three-dimensional space, or a virtual reality scene immersing the user. The real coordinate system and the virtual coordinate system may each be a two-dimensional or a three-dimensional coordinate system. The gesture information (i) may be updated at a fixed frequency or time interval, or updated whenever the position and/or pose of the user's hand changes.
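The gesture information vector above can be sketched as a simple data structure. The field names follow the formalization i = {C, palm, thumb, index, mid, ring, little}; the concrete types (2-D tuples, a string for the hand shape) are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import Tuple

Vec = Tuple[float, float]  # 2-D virtual-coordinate position; a 3-D UI would use a 3-tuple

@dataclass
class Finger:
    position: Vec             # position information of the finger
    orientation: float = 0.0  # optional orientation information

@dataclass
class GestureInfo:
    """Formalization of i = {C, palm, thumb, index, mid, ring, little}."""
    C: str        # overall hand shape, e.g. "fist", "open", "victory"
    palm: Vec     # palm position
    thumb: Finger
    index: Finger
    mid: Finger
    ring: Finger
    little: Finger

# Example: an open hand with the index finger at (0.4, 0.2) in virtual coordinates
i = GestureInfo(
    C="open",
    palm=(0.3, 0.2),
    thumb=Finger((0.35, 0.25)),
    index=Finger((0.4, 0.2)),
    mid=Finger((0.41, 0.18)),
    ring=Finger((0.4, 0.16)),
    little=Finger((0.38, 0.14)),
)
```

An instance like this would be re-emitted at the update frequency described above, or whenever the hand pose changes.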
On the user interface, a cursor may be displayed according to the gesture information (i), so as to provide visual feedback to the user. The position of the cursor on the graphical interface is represented as a function of the gesture information (i), for example func_a(i). Those skilled in the art will appreciate that the function func_a differs according to different application scenarios or settings.
For example, in a two-dimensional user interface, the position at which the cursor is drawn is calculated by formula (1):
func_a(i) = C*0 + palm*0 + index.position*0.5 + mid*0 + little*0    (1)
In formula (1), index.position denotes the position of the user's index finger. It can thus be seen from formula (1) that the position of the cursor on the user interface depends only on the position of the user's index finger, and that the distance the cursor moves on the user interface is half the distance the user's index finger moves.
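Formula (1) can be read as a cursor-update rule: only the index-finger term carries a nonzero weight, so the cursor displacement is half the finger displacement. A minimal frame-to-frame sketch (the incremental form and the starting cursor position are assumptions for illustration):

```python
INDEX_WEIGHT = 0.5  # the 0.5 coefficient of index.position in formula (1)

def func_a(cursor, index_prev, index_now):
    """Move the cursor by half the index-finger displacement; all other
    components of gesture information (i) carry weight 0."""
    dx = (index_now[0] - index_prev[0]) * INDEX_WEIGHT
    dy = (index_now[1] - index_prev[1]) * INDEX_WEIGHT
    return (cursor[0] + dx, cursor[1] + dy)

# The finger moves 0.2 to the right; the cursor moves only 0.1.
print(func_a((0.5, 0.5), (0.0, 0.0), (0.2, 0.0)))  # → (0.6, 0.5)
```

A different application scenario would simply substitute a different func_a, for example one with a nonzero palm weight.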
The cursor may have a single appearance, such as the shape of a hand, or a variety of appearances corresponding to different hand shapes.
How controls are operated by gestures in a two-dimensional user interface is described below with reference to Figs. 4-6.
Fig. 4 is a schematic diagram of a control based on gesture recognition in a two-dimensional user interface according to an embodiment of the present invention. Referring to Fig. 4, the control based on gesture recognition in a two-dimensional user interface according to an embodiment of the present invention includes an entrance region and a guide-rail region. The entrance region in Fig. 4 is a line segment; in another example, the entrance region may be a curve. The entrance region divides the two-dimensional plane of the user interface into two parts: the side including the guide-rail region is called the rail side, and the other side is called the free side. In Fig. 4, the guide-rail region is a rectangle. Obviously, in other examples, the guide-rail region may have other shapes, such as a line segment, a triangle, or an ellipse. The entrance region and the guide-rail region may be drawn in the two-dimensional user interface to prompt the user about the position of the control. In another example, the entrance region and/or the guide-rail region may be hidden so as not to affect the content displayed in the user interface. The guide-rail region is close to or adjacent to the entrance region. The part of the guide-rail region close to the entrance region is called the entrance end, and the part of the guide-rail region away from the entrance region is called the exit end. In another example, to make the user's mid-air hand operations easier to recognize, the entrance region and guide-rail region of the control are rendered as a notch or bell-mouth shape, so as to guide the user to move the cursor into the guide-rail region by gesture.
In the example of Fig. 4, the guide-rail region further includes a slider. The slider can move along the guide-rail region.
Figs. 5A-5D are schematic diagrams of various states of the control based on gesture recognition in a two-dimensional user interface according to an embodiment of the present invention.
The control based on gesture recognition has an activated state and an inactive state. The inactive state is the initial state of the control. Fig. 5A illustrates the control in the inactive state, together with the cursor associated with the gesture information (i). Note that in Fig. 5A no slider is drawn on the guide-rail region, or the slider is hidden. The absence of a slider in the guide-rail region can serve as a prompt informing the user that the control is in the inactive state.
When the user's gesture causes the cursor to enter the rail side from the free side through the entrance region, the control is converted from the inactive state to the activated state. The control receives events indicating the gesture information (i) and recognizes the change in cursor position caused by the gesture information (i). When the gesture information (i) causes the cursor position to enter the rail side from the free side of the control via the entrance region, the control state is changed to the activated state, and the control is drawn in its activated state.
Alternatively, when the user's gesture causes the cursor to enter the rail side from the free side through the entrance region, the slider is drawn on the guide-rail region and the cursor continues to be displayed; the user moves the cursor onto the slider by gesture and fixes the cursor to the slider through a "grasp" action, after which the slider follows the cursor. The "grasp" action is not essential: in one embodiment, when the control is in the activated state, the slider follows the cursor, or the control changes the slider position based on the gesture information (i). Still alternatively, as the slider moves, a designated sound is played, the visual appearance changes, and/or mechanical feedback is provided to the user. For example, as the slider moves toward the exit end, the volume and/or frequency of the played sound gradually increases, and as the slider moves toward the entrance end, the volume and/or frequency of the played sound gradually decreases.
Fig. 5B shows the control in the activated state. In the activated state, the guide-rail region includes the slider. The slider is associated with the gesture information (i) to prompt the user about the hand position, and the cursor is hidden. As an example, the position at which the slider is drawn is determined according to the rule for determining the position at which the cursor is drawn. Further, the position at which the slider is drawn is confined to the guide-rail region, so that the slider appears to move along the guide rail. In the example of Fig. 5B, the guide-rail region includes a centerline, and the projected position of the (undrawn) cursor on the centerline is used as the position at which the slider is drawn in the guide-rail region. Alternatively, the appearance of the control is changed to prompt the user that the control has been activated and entered the activated state, for example by drawing a shadow along the edge of the control, changing the color of the control region, and/or displaying designated text. Still alternatively, mechanical feedback is provided to the user and/or a designated sound is played to prompt the user that the control has been activated.
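Confining the slider to the rail while the (hidden) cursor moves freely amounts to projecting the cursor position onto the centerline segment and clamping to its ends. A sketch under assumed 2-D coordinates:

```python
def slider_position(cursor, line_start, line_end):
    """Project the cursor onto the guide-rail centerline and clamp the
    result to the segment, so the slider always moves along the rail."""
    ax, ay = line_start
    bx, by = line_end
    abx, aby = bx - ax, by - ay
    length_sq = abx * abx + aby * aby
    if length_sq == 0.0:
        return line_start
    # Parameter t of the orthogonal projection, clamped to [0, 1]
    t = ((cursor[0] - ax) * abx + (cursor[1] - ay) * aby) / length_sq
    t = max(0.0, min(1.0, t))
    return (ax + t * abx, ay + t * aby)

# Horizontal rail from (0, 0) to (10, 0): the cursor's vertical offset is
# ignored, and positions past an end are clamped to that end of the rail.
print(slider_position((4.0, 3.0), (0.0, 0.0), (10.0, 0.0)))    # → (4.0, 0.0)
print(slider_position((12.0, -1.0), (0.0, 0.0), (10.0, 0.0)))  # → (10.0, 0.0)
```

Whether the slider has "moved out" of an end can then be decided from the unclamped parameter t (t > 1 at the exit end, t < 0 at the entrance end).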
If the user's gesture causes the cursor to enter the guide-rail region from the rail side, or causes the cursor to bypass the entrance region and enter the rail side, the control in the inactive state is not converted to the activated state; the slider is not shown, and the cursor is still displayed.
In an embodiment according to the present invention, if the user moves the slider out of the guide rail from the exit end of the guide-rail region by gesture, this indicates a "confirm" command from the user to the control; and if the user moves the slider out of the guide rail from the entrance end of the guide-rail region by gesture, this indicates a "cancel" command from the user to the control.
Fig. 5C illustrates the control receiving a "confirm" command. For the control in the activated state, as an example, the user moves the index finger to the right, and the position of the slider (shown as the dashed slider in Fig. 5C) correspondingly moves to the right along the guide-rail region, thereby providing visual feedback so that the user knows whether the control has correctly recognized his or her intention. When, following the movement of the user's index finger, the control detects that the slider moves out of the guide-rail region from the exit end, the control generates an event representing the "confirm" command. By processing the event representing the "confirm" command, the operation associated with the control is acknowledged. As the slider moves out of the guide-rail region, the slider is hidden; for example, the slider is drawn only within the guide-rail region, and the part of the slider beyond the guide-rail region is hidden. Alternatively, the appearance of the control is changed to prompt the user that the control has recognized the user's intention and generated the "confirm" event, for example by flashing the control region, changing the color of the control region, and/or displaying designated text. Still alternatively, mechanical feedback is provided to the user and/or a designated sound is played to prompt the user that the control has recognized the user's intention. Further, as the slider moves out of the guide-rail region, the state of the control is converted to the inactive state, and the cursor is drawn to track the user's gesture.
Fig. 5D illustrates the control receiving a "cancel" command. For the control in the activated state, as an example, the user moves the index finger to the left, and the position of the slider (shown as the dashed slider in Fig. 5D) correspondingly moves to the left along the guide-rail region, thereby providing visual feedback so that the user knows whether the control has correctly recognized his or her intention. When, following the movement of the user's index finger, the control detects that the slider moves out of the guide-rail region from the entrance end, the control generates an event representing the "cancel" command, or generates no event, indicating that the user has made no "confirm" instruction, or that the user has abandoned or canceled the original attempt. By processing the event representing the "cancel" command, the operation associated with the control is canceled or ignored. As the slider moves out of the guide-rail region, the slider is hidden. Alternatively, the appearance of the control is changed and/or mechanical feedback is provided to the user, to prompt the user that the control has recognized the user's intention and generated the "cancel" event. Further, as the slider moves out of the guide-rail region, the state of the control is converted to the inactive state, and the cursor is drawn to track the user's gesture.
In another embodiment according to the present invention, the guide-rail region has a cross shape. In the activated state, when the slider moves out of the guide rail from the right side or above, the control generates a "confirm" event; and when the slider moves out of the guide rail from the left side or below, the control generates a "cancel" event.
Those of ordinary skill in the art will realize that the guide-rail region may have multiple exits. When the slider moves out from some of the exits, the control generates a "confirm" event, and when the slider moves out from the other exits, the control generates a "cancel" event. To prompt the user about the different meanings of the exits, different indications of the meaning of each exit direction may be provided to the user through video, audio, and/or mechanical feedback.
In still another embodiment according to the present invention, the guide-rail region has a cross shape and has multiple exits, each branch of the cross region corresponding to one exit, with each exit indicating a different meaning, or the multiple exits indicating multiple meanings. For example, when the slider moves out from the left exit, the control generates a "cancel" event, indicating that the user abandons the attempt to play music; when the slider moves out from the right exit, the control generates a "mute" event, indicating that the user wishes to immediately reduce the volume of the audio output of a certain application to 0; when the slider moves out from the top exit, the control generates a "set to high sampling rate" event; and when the slider moves out from the bottom exit, the control generates a "set to low sampling rate" event. In response to receiving events of different meanings or different commands output by the control, the application processes them accordingly.
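The four-branch cross rail can be sketched as a mapping from exit direction to generated event. The event names follow the example above; which direction carries which meaning is a per-application choice (the "cancel" direction here is an assumption, since the original example is ambiguous about it):

```python
# Exit-direction → event mapping for the four-branch cross-shaped rail.
# The assignment follows the example in the text; applications may remap it.
CROSS_EXIT_EVENTS = {
    "left": "cancel",                 # abandon the attempt (e.g. to play music)
    "right": "mute",                  # drop the application's audio volume to 0
    "top": "set_high_sampling_rate",
    "bottom": "set_low_sampling_rate",
}

def on_slider_exit(direction):
    """Generate the event for the exit the slider left through;
    unknown directions generate no event."""
    return CROSS_EXIT_EVENTS.get(direction)

print(on_slider_exit("right"))  # → mute
```

The per-direction video, audio, or mechanical feedback described above would hang off the same table, keyed by direction.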
Fig. 6 is a flowchart of a human-computer interaction method based on gesture recognition in a two-dimensional user interface according to an embodiment of the present invention. To use a control according to an embodiment of the present invention, the control is initialized (610). The control initialization process includes drawing the control on the user interface, for example drawing the control shown in Fig. 5A, and setting the control to receive the gesture information (i). Alternatively, a cursor is also drawn on the user interface, and the position at which the cursor is drawn is associated with the gesture information (i). In another example, the cursor is drawn on the user interface by the program using the control or by another program. The control receives the gesture information (i) and obtains the position of the cursor from the gesture information (i). In response to the cursor moving from the free side through the entrance region to the rail side, the control enters the activated state (620). Fig. 5B shows the control in the activated state. Alternatively, when the control enters the activated state, the appearance of the control is also changed, a designated sound is generated, and/or mechanical feedback is provided, to prompt the user that the control has entered the activated state. The control also draws the slider in the guide-rail region. The position at which the slider is drawn is confined to the guide-rail region, so that the slider appears to move along the guide rail, and the slider is set to follow the user's gesture. As an example, the position at which the slider is drawn is determined according to the rule for determining the position at which the cursor is drawn. Further, the projection of the cursor position onto the centerline of the guide-rail region is used as the position at which the slider is drawn.
The control obtains the position of the slider from the gesture information (i), and detects whether the slider moves out of the guide-rail region from a side of the guide-rail region (640). Referring to Fig. 5C, when the control detects that the slider moves out of the guide-rail region from the exit end, the control generates a first event (650); as an example, the first event may be a "confirm" event, a "mute" event, and so on. And referring to Fig. 5D, when the control detects that the slider moves out of the guide-rail region from the entrance end, the control generates a second event (660). The second event may be, for example, a "cancel" event or a "set to high sampling rate" event.
In step 650, upon generating the first event, the control enters the inactive state. Alternatively, the appearance of the control is changed, a designated sound is generated, and/or mechanical feedback is provided, to prompt the user that the control has recognized the user's intention and generated the first event. And alternatively, the cursor is drawn to track the user's gesture.
In step 660, upon generating the second event, the control enters the inactive state. Alternatively, the appearance of the control is changed, a designated sound is generated, and/or mechanical feedback is provided, to prompt the user that the control has recognized the user's intention and generated the second event. And alternatively, the cursor is drawn to track the user's gesture.
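The flow of steps 610-660 can be sketched as a small state machine. Which side the slider leaves through is assumed to be supplied by separate geometry code; event names and class structure are illustrative only:

```python
INACTIVE, ACTIVATED = "inactive", "activated"

class GestureControl:
    def __init__(self):
        # Step 610: the control starts in its initial, inactive state.
        self.state = INACTIVE
        self.events = []

    def on_cursor_crossing(self, via_entrance, to_rail_side):
        # Step 620: activate only when the cursor crosses the entrance
        # region from the free side to the rail side; entering the rail
        # side any other way leaves the control inactive.
        if self.state == INACTIVE and via_entrance and to_rail_side:
            self.state = ACTIVATED

    def on_slider_exit(self, end):
        # Steps 640-660: leaving via the exit end generates the first
        # ("confirm") event, via the entrance end the second ("cancel")
        # event; either way the control returns to the inactive state.
        if self.state != ACTIVATED:
            return
        self.events.append("confirm" if end == "exit" else "cancel")
        self.state = INACTIVE

c = GestureControl()
c.on_cursor_crossing(via_entrance=True, to_rail_side=True)  # activate (620)
c.on_slider_exit("exit")                                    # confirm (650)
print(c.events, c.state)  # → ['confirm'] inactive
```

Appearance changes, sound, and mechanical feedback would be side effects triggered at the same transitions.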
In a further embodiment, user creates space according to embodiments of the present invention in virtual world, and/or sets Put or change the position of control.Control can be arranged on position easy to operation by user.For example, user's arm is fully stretched to side During expansion, the position where cursor.So as to not only indicate the order such as " confirmation "/" cancellation " easy to user, but also do not influence to virtual generation The operation of other objects in boundary.
Illustrated with reference to Fig. 7-8, gesture identification control is based on according to the present invention in three-dimensional user interface Embodiment.Fig. 7 is the schematic diagram of the control based on gesture identification in three-dimensional user interface according to embodiments of the present invention.Referring to Fig. 7, the control based on gesture identification includes in three-dimensional user interface according to embodiments of the present invention:Entrance area and guide rail area Domain.Entrance area in Fig. 7 is a finite rectangular plane.In another example, entrance area can be in curved surface or plane The region surrounded by closed curve.In Fig. 7, the three dimensions of user interface is divided into two parts by the plane where entrance area, By the side including guide rail area, it is known as rail-sides, and opposite side is known as free side.In Fig. 7, guide rail area is cuboid. Obviously, in other examples, guide rail area can have other shapes, such as cylinder, sphere, spheroid etc..Entrance area with Guide rail area can be plotted on three-dimensional user interface, to prompt the position where user control.Guide rail area can be with three The object of dimension user interface blends, for example, the vase of user interface, mailbox etc..In another example, entrance area and/ Or guide rail area can be hidden, not influence the content shown in user interface.Guide rail area closes on or is adjacent to inlet region Domain.Guide rail area is known as arrival end close to the part of entrance area, and part of the guide rail area away from entrance area is known as exporting End.
In the example of Fig. 7, the guide rail area further includes a slider, which can move along the guide rail area. Optionally, the control also includes a guide rail line; in Fig. 7, the guide rail line is the center line along the long axis of the cuboid guide rail area. One endpoint of the guide rail line lies on the entrance area. The slider moves along the guide rail line.
Figs. 8A-8D are schematic diagrams of the various states of the gesture-recognition-based control in a three-dimensional user interface according to embodiments of the present invention.
Fig. 8A illustrates the control in the inactive state, together with the cursor associated with gesture information (i). On a control in the inactive state, the slider is not drawn on the guide rail area, or the slider is hidden. The absence of the slider from the guide rail area serves as a prompt informing the user that the control is inactive.
When the user's gesture moves the cursor from the free side through the entrance area into the rail side, the control changes from the inactive state to the activated state. The control receives events indicating gesture information (i) and recognizes the cursor position changes caused by gesture information (i). When gesture information (i) moves the cursor position from the free side of the control through the entrance area into the rail side, the control state is changed to activated, and the activated control is drawn.
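The activation condition above, the cursor crossing from the free side to the rail side through the finite entrance rectangle, can be sketched as a segment-plane test. This is a minimal illustration under assumed geometry (the entrance is represented by a center point, a unit normal pointing to the rail side, two spanning unit vectors, and half extents); the patent does not fix a particular representation:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def lerp(a, b, t):
    return tuple(x + t * (y - x) for x, y in zip(a, b))

def crossed_entrance(p_prev, p_curr, origin, normal, u, v, half_w, half_h):
    """True if the cursor moved from the free side to the rail side THROUGH
    the finite rectangular entrance area between two frames (a sketch)."""
    s_prev = dot(sub(p_prev, origin), normal)
    s_curr = dot(sub(p_curr, origin), normal)
    if not (s_prev < 0 <= s_curr):          # must cross free side -> rail side
        return False
    t = s_prev / (s_prev - s_curr)          # where the segment meets the plane
    local = sub(lerp(p_prev, p_curr, t), origin)
    # the crossing point must lie inside the finite rectangle
    return abs(dot(local, u)) <= half_w and abs(dot(local, v)) <= half_h
```

A cursor that reaches the rail side but misses the rectangle (or crosses in the opposite direction) returns False, consistent with the bypass case described for the control.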
Optionally, when the user's gesture moves the cursor from the free side through the entrance area into the rail side, the slider is drawn on the guide rail area while the cursor remains displayed; the user moves the cursor onto the slider by gesture and fixes the cursor to the slider with a "grasp" action; thereafter, the slider follows the cursor.
Fig. 8B shows the control in the activated state, in which the guide rail area includes the slider. The slider is associated with gesture information (i) as a prompt to the user of his or her hand position, and the cursor is hidden. As an example, the position at which the slider is drawn is determined by the same rule used to determine the cursor drawing position. Further, the drawing position of the slider is confined to the guide rail area, so that the slider appears to move along the rail. In the example of Fig. 8B, the guide rail area includes a guide rail line; the position of the (undrawn) cursor is projected onto the guide rail line to obtain the position at which the slider is drawn in the guide rail area. Optionally, the appearance of the control is changed to prompt the user that the control has been activated, for example by drawing a shadow along the edge of the control, changing the color of the control region, playing a specified sound, and/or displaying specified text.
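The projection rule described for Fig. 8B, placing the slider at the cursor's projection onto the guide rail line while keeping it inside the rail, can be sketched as follows (a minimal sketch assuming the rail line is given by its two endpoints; names are illustrative):

```python
def slider_position(cursor, rail_start, rail_end):
    """Project the (hidden) cursor onto the guide rail line and clamp the
    result to the segment; the returned point is where the slider is drawn."""
    axis = tuple(e - s for s, e in zip(rail_start, rail_end))
    rel = tuple(c - s for s, c in zip(rail_start, cursor))
    t = sum(a * r for a, r in zip(axis, rel)) / sum(a * a for a in axis)
    t = max(0.0, min(1.0, t))               # slider never leaves the rail area
    return tuple(s + t * a for s, a in zip(rail_start, axis))
```

Clamping the parameter t realizes the constraint that the slider is confined to the guide rail area even when the hand moves past its ends.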
If the user's gesture moves the cursor into the guide rail area from the rail side, or moves the cursor around the entrance area into the rail side, the control in the inactive state is not changed into the activated state; the slider is not shown, and the cursor remains displayed.
In an embodiment according to the present invention, if the user's gesture moves the slider out of the guide rail area through its outlet end, the user is indicating a "confirm" command to the control; if the user's gesture moves the slider out of the guide rail area through its inlet end, the user is indicating a "cancel" command. Optionally, if the user's gesture attempts to move the slider out of the guide rail area anywhere other than the inlet end or the outlet end, the slider is confined to the guide rail area, and the control remains in the activated state.
Fig. 8C illustrates the control receiving a "confirm" command. With the control in the activated state, as an example, the user moves the index finger to the right; the slider position (shown in dotted outline in Fig. 8C) moves correspondingly to the right along the guide rail area, providing visual feedback so that the user can tell whether the control has correctly recognized his or her intent. When, following the movement of the user's index finger, the control detects that the slider has left the guide rail area through the outlet end, the control generates an event representing the "confirm" command. By processing the event representing the "confirm" command, the operation associated with the control is confirmed. As the slider leaves the guide rail area, the slider is hidden; further, the state of the control changes to inactive, and the cursor is drawn to track the user's gesture.
Fig. 8D illustrates the control receiving a "cancel" command. With the control in the activated state, as an example, the user moves the index finger to the left; the slider position (shown in dotted outline in Fig. 8D) moves correspondingly to the left along the guide rail area, providing visual feedback so that the user can tell whether the control has correctly recognized his or her intent. When, following the movement of the user's index finger, the control detects that the slider has left the guide rail area through the inlet end, the control generates an event representing the "cancel" command, or generates no event, indicating that the user has not issued a "confirm" instruction or has abandoned or canceled the original attempt. By processing the event representing the "cancel" command, the operation associated with the control is canceled or ignored. As the slider leaves the guide rail area, the slider is hidden; further, the state of the control changes to inactive, and the cursor is drawn to track the user's gesture.
One of ordinary skill in the art will appreciate that the guide rail area may have multiple outlets. When the slider leaves through certain outlets, the control generates a "confirm" event; when the slider leaves through other outlets, the control generates a "cancel" event. To indicate the different meanings of the outlets to the user, different cues may be given in each outlet direction through video, audio, and/or haptic feedback.
In yet another embodiment according to the present invention, each of the multiple outlets of the guide rail area indicates a different meaning, or the multiple outlets indicate a variety of meanings. For example, when the slider leaves through the first outlet, the control generates a "cancel" event; when the slider leaves through the second outlet, the control generates a "mute" event; when the slider leaves through the third outlet, the control generates a "set high sampling rate" event; and when the slider leaves through the fourth outlet, the control generates a "set low sampling rate" event. In response to receiving the events indicating the different meanings or commands from the control, the application processes them accordingly.
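The outlet-to-command association above can be sketched as a simple dispatch table. The exit identifiers and handler names below are hypothetical; the patent only fixes the association between outlets and meanings, not any identifiers:

```python
# Hypothetical mapping from the exit the slider left through to the event
# the control generates (the four-outlet example from the text).
EXIT_EVENTS = {
    "exit_1": "cancel",
    "exit_2": "mute",
    "exit_3": "set_high_sampling_rate",
    "exit_4": "set_low_sampling_rate",
}

def on_slider_exit(exit_id, handlers):
    """Generate the event associated with the exit and, if the application
    registered a handler for it, invoke that handler."""
    event = EXIT_EVENTS.get(exit_id)
    if event is not None and event in handlers:
        handlers[event]()
    return event
```

The application supplies the `handlers` dictionary, which corresponds to "the application processes them accordingly" in the text.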
In another embodiment of the present invention, the control is displayed in the three-dimensional user interface as follows. Initially, while the control is in the inactive state, the cursor is displayed, the entrance area is hidden, and the guide rail area is shown. When the control changes from the inactive state to the activated state, the gesture cursor is dimmed, the slider is shown on the guide rail area, and the guide rail area and/or guide rail line are shown. At the moment the control enters the activated state, a visual effect is triggered, including gradually fading out the cursor and displaying the guide rail area and/or guide rail line. When the user confirms or cancels the operation associated with the current control, the control triggers another effect, including: the slider gradually fades and disappears after being highlighted, the display of the cursor is gradually restored, and the guide rail area and/or guide rail line segment gradually revert to their appearance in the inactive state.
In another embodiment according to the present invention, operating the control provided by embodiments of the present invention with gestures in an application is illustrated. (1) The control in the inactive state and the cursor associated with gesture information (i) are displayed in the user interface of the display device. (2) The user changes the gesture or moves the hand and observes the changes of the control and cursor on the display device. By changing the gesture or moving the hand, the user steers the cursor from the free side through the entrance area into the guide rail area, and the control is activated; upon activation, the user interface prompts the user that the control is active. (3) With the control in the activated state, the user controls the slider by changing the gesture or moving the hand, so that the slider moves along the guide rail area. To perform a confirm operation, the user moves the slider out through the outlet end of the guide rail area; to perform a cancel operation, the user moves the slider out through the inlet end. Optionally, when a confirm or cancel operation is performed, the user interface prompts the user with the operation performed.
Fig. 9 is a block diagram of an information processing device implementing an embodiment of the present invention. In an embodiment according to the present invention, the information processing device 900 generates the control on the user interface, recognizes the user's gesture information (i) or receives gesture information (i) provided by a gesture input/gesture recognition device, recognizes the user's instructions, and provides feedback to the user for interaction with the user. The information processing device 900 shown in Fig. 9 is a computer. The computer is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the present invention. Neither should the information processing device shown in Fig. 9 be interpreted as having any dependency or requirement relating to any one component or combination of components illustrated.
The information processing device 900 includes a memory 912, one or more processors 914, one or more presentation components 916, I/O components 920, and a power supply 922, coupled directly or indirectly to a bus 910. The bus 910 may represent one or more kinds of buses (such as an address bus, a data bus, or a combination thereof). In practice, the various components are not necessarily delineated as in Fig. 9; for example, a presentation component such as a display device may be regarded as an I/O component 920, and a processor may have memory. The inventors recognize that this is the nature of the art, and reiterate that the diagram of Fig. 9 merely illustrates an exemplary computer system that can be used in connection with one or more embodiments of the present invention.
The information processing device 900 typically includes a variety of memories 912. By way of example and not limitation, the memory 912 may include: random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CDROM), digital versatile disc (DVD) or other optical or holographic media, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. The computer storage medium may be non-volatile.
The information processing device 900 includes one or more processors 914, which read data from various entities such as the bus 910, the memory 912, or the I/O components 920. One or more presentation components 916 present data indications to a user or other device. Exemplary presentation components 916 include a display device, speaker, printing component, vibrating component, flat-panel display, projector, head-mounted display, and the like. The presentation components 916 may also be I/O ports used to couple a display device, speaker, printing component, vibrating component, flat-panel display, projector, head-mounted display, etc. Illustrative I/O components 920 include a camera, microphone, joystick, game pad, satellite dish antenna, scanner, printer, wireless device, and the like.
The gesture-recognition-based control according to the present invention may also be implemented in a gesture recognition device or gesture input device. The gesture recognition device or gesture input device may be integrated into input devices such as a keyboard, mouse, or remote control.
Fig. 10 is a block diagram of a virtual reality system according to an embodiment of the present invention. The virtual reality system consists of hardware and software running on that hardware; for clarity, Fig. 10 shows the hardware layer and the software layer separated by a dashed line. The hardware layer shows the hardware devices of the virtual reality system according to the embodiment of the present invention, and the software layer shows the software of the virtual reality system.
The virtual reality system includes a computing unit 1010, a display device 1020, a memory/storage device 1030, and a sensor module 1040, coupled to one another. The computing unit 1010 includes a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit) for accelerating tasks such as video display, image recognition, and scene rendering. The computing unit 1010 may include one or more CPUs and one or more GPUs. The display device 1020 presents graphics and picture content to the user under the instruction of the computing unit 1010, presenting the virtual reality scene. The display device 1020 may be, for example, a head-mounted display, a projector, or virtual reality glasses. The memory/storage device 1030 may include: random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CDROM), digital versatile disc (DVD) or other optical or holographic media, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. The memory/storage device 1030 stores programs and the temporary data and/or result data produced while programs run. The sensor module 1040 detects the user's state and provides input to the virtual reality system regarding that state.
The sensor module 1040 includes multiple sensors and/or input devices. For example, the sensor module 1040 includes a head pose acquisition device 1042, a gesture acquisition device 1044, an audio/speech acquisition device 1046, and/or an interactive device 1048.
As an example, the head pose acquisition device 1042 includes a gyroscope inside the head-mounted display, which provides head pose information such as the position, orientation, and pitch of the user's head to the VR system in real time at a certain frequency. Optionally, the head pose acquisition device also processes the head pose information to recognize head actions of the user, for example recognizing from the head pose information actions such as head shaking, violent head rocking, or rapid backward head movement. In another example, the head pose acquisition device 1042 includes a video capture device that captures images of the user's head and recognizes the pose information and/or actions of the head.
The gesture acquisition device 1044 may include the gesture recognition system provided in Chinese patent application CN201110100532.9, which is hereby incorporated by reference in its entirety. The gesture acquisition device 1044 may also be a data glove (CyberGlove). The gesture acquisition device 1044 extracts a gesture posture (i) from gestures and/or actions the user performs in the real world. The gesture posture (i) may be a vector, formalized as i = {C, palm, thumb, index, mid, ring, little}, where C represents the shape of the whole hand, for example a fist, five fingers spread, or a victory gesture; palm represents the position information of the palm; and thumb, index, mid, ring, and little represent the position and/or orientation information of the thumb, index finger, middle finger, ring finger, and little finger, respectively.
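The formalized gesture posture i = {C, palm, thumb, index, mid, ring, little} can be sketched as a small record type. Field types are an assumption (the text leaves them open; per-finger fields may also carry orientation in addition to position):

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class GesturePosture:
    """One frame of gesture information (i), following the formalization
    i = {C, palm, thumb, index, mid, ring, little} in the text."""
    C: str          # overall hand shape, e.g. "fist", "open", "victory"
    palm: Vec3      # position information of the palm
    thumb: Vec3     # position (and/or orientation) of each finger
    index: Vec3
    mid: Vec3
    ring: Vec3
    little: Vec3
```

A gesture acquisition device would emit one such record per sampling frame; downstream modules read the fields they need (for example only `palm` for the motion-pattern checks described later).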
The audio/speech acquisition device 1046 may be a microphone/microphone array, a speech processing/recognition module, a language processing module, or the like. The audio/speech acquisition device 1046 recognizes the user's audio information and/or recognizes the user's distress sounds such as screaming, crying, or shouting, or language such as complaints. The audio/speech acquisition device 1046 may also extract features such as the power and frequency of the audio, and/or convert speech into text.
The interactive device 1048 may be, for example, a VR controller (such as Oculus Rift Touch), a game pad, a data glove (such as CyberGlove), or a motion capture system (such as OptiTracker), used to recognize the user's instructions. For example, when the user presses a button or makes a predetermined action/gesture, the user gives a specified instruction to the virtual reality system. In another example, the interactive device 1048 includes a pressure sensor that senses the magnitude of the user's grip force, and the user's tension or discomfort is recognized from that grip force. In yet another example, the interactive device 1048 includes an inertial sensor that, through velocity and/or acceleration, senses interactive actions such as violently shaking or throwing/flinging the device; the degree to which the user is startled is recognized by detecting these actions.
The VR application software 1060 runs on the computing unit 1010 to construct the virtual reality scene and interact with the user. The stimulus detection and processing module 1070 collects the information obtained from the sensor module 1040, recognizes the degree to which the user is disturbed or the discomfort the user is experiencing, and issues instructions to the VR application software 1060, instructing it to change the constructed virtual reality scene to lessen the stimulation to the user or reduce the discomfort caused to the user. The stimulus detection and processing module 1070 may instruct the VR application software 1060 to mask part or all of the VR scene to reduce the stimulation to the user. Alternatively, the stimulus detection and processing module 1070 instructs the VR application software 1060 to pop up a dialog box, or to suspend or pause construction of the VR scene. The stimulus detection and processing module 1070 may further instruct the VR application software 1060 to soften the current VR scene, for example by reducing the realism of the current VR scene or changing its content to reduce its degree of horror or violence.
Optionally, the stimulus detection and processing module 1070 directly changes the VR scene shown by the display device 1020 without the assistance of the VR application software 1060; embodiments of the present invention can therefore be applied to third-party VR application software that provides no corresponding interface or control mechanism. The stimulus detection and processing module 1070 may also instruct the audio device to adjust the audio being played, reduce the volume, change the rhythm, or play cheerful music, without the assistance of the VR application software 1060.
The stimulus detection and processing module 1070 may poll the sensor module 1040 to identify the degree to which the user is disturbed or the discomfort the user is experiencing. In another example, when the sensor module senses that the user is disturbed or experiencing discomfort, or under specified conditions (for example, when a timer expires), the sensor module sends a message or interrupt signal to the stimulus detection and processing module 1070.
Fig. 11 is a flowchart of a method of avoiding overstimulation of the user by a virtual reality system according to an embodiment of the present invention. According to an embodiment of the present invention, the VR application 1060 (see also Fig. 10) constructs a VR scene (1110). The stimulus detection and processing module 1070 recognizes, through the sensor module 1040, that the user in the VR scene is undergoing an uncomfortable experience and/or the degree of that uncomfortable experience (1120). When necessary, to avoid overstimulating the user and to reduce the user's discomfort in the VR scene, the constructed VR scene is changed (1130) to lessen the stimulation to the user, thereby relieving the user's tension/discomfort.
Fig. 12 is a flowchart of another method of avoiding overstimulation of the user by a virtual reality system according to an embodiment of the present invention. The VR application 1060 (see also Fig. 10) constructs a VR scene (1210). The user's head pose is obtained through the head pose capture device 1042 (see also Fig. 10) (1220). As an example, the head pose capture device 1042 samples the head pose at a fixed frequency, obtaining the head pose of frame t, denoted P_head(t). The head pose includes 3 degrees of freedom, i.e. P_head(t) = (ψ, θ, φ), where ψ, θ, and φ represent the angles the head rotates around the yaw axis, pitch axis, and roll axis, respectively. In another example, the head pose includes 6 degrees of freedom, i.e. P_head(t) = (ψ, θ, φ, x, y, z); in addition to the rotation angles around the three axes, x, y, and z represent the offsets of the head translated along the yaw axis, pitch axis, and roll axis, respectively.
From the head pose, the motion pattern of the head is recognized. When the specified motion pattern appears in the user's head, the user is considered to be experiencing tension or discomfort (1221). In one example, a startled user rapidly throws the head back; the head motion pattern of rapid backward movement can be recognized by detecting that the speed of the user's head movement exceeds a specified threshold τ1, for example P_head(t) − P_head(t−1) > τ1, where P_head(t) here takes the roll-axis offset component of the head pose of frame t. In another example, whether the difference between the roll-axis offsets of the head pose of frame t and the head pose of frame t−n exceeds the threshold τ1 is checked, to determine whether the speed of the user's head movement exceeds τ1, where t and n are positive integers.
In another example, a startled user continuously and violently shakes the head to resist the VR scene being seen. The motion pattern of continuous violent head shaking can be recognized by detecting that the acceleration of the user's head rotation exceeds a specified threshold τ1′, for example P_head(t) + P_head(t−2) − 2·P_head(t−1) > τ1′, where P_head(t) here takes the yaw-axis rotation angle of the head pose of frame t. In still another example, whether the second-order difference of the yaw-axis rotation angles of the head pose of frame t and the head pose of frame t−n exceeds the threshold τ1′ is checked.
If, from the user's head pose, it is recognized that the user is experiencing tension or discomfort, the VR scene is changed (1230).
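The two head-motion tests described above, a first-difference (speed) check for rapid pull-back and a second-difference (acceleration) check for violent shaking, can be sketched directly. The pose components are represented as plain per-frame sequences here; thresholds are illustrative:

```python
def rapid_pullback(roll_offset, t, tau1):
    """Speed test from the text: P_head(t) - P_head(t-1) > tau1,
    applied to the roll-axis offset of the head pose."""
    return roll_offset[t] - roll_offset[t - 1] > tau1

def violent_shaking(yaw_angle, t, tau1_prime):
    """Acceleration test from the text:
    P_head(t) + P_head(t-2) - 2*P_head(t-1) > tau1',
    applied to the yaw-axis rotation angle of the head pose."""
    return yaw_angle[t] + yaw_angle[t - 2] - 2 * yaw_angle[t - 1] > tau1_prime
```

The frame-gap variants (comparing frame t with frame t−n) follow the same shape with the index t−1 replaced by t−n.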
Optionally, the user's gesture posture is obtained through the gesture acquisition device 1044 (see also Fig. 10) (1222). As an example, the gesture acquisition device 1044 provides the gesture posture (i) at a specified frequency. The gesture posture (i) may be a vector, formalized as i = {C, palm, thumb, index, mid, ring, little}, where C represents the shape of the whole hand, for example a fist, five fingers spread, or a victory gesture; palm represents the position information of the palm; and thumb, index, mid, ring, and little represent the position and/or orientation information of the thumb, index finger, middle finger, ring finger, and little finger, respectively. In another example, the gesture posture includes 3 degrees of freedom, denoted (ψ, θ, φ), where ψ, θ, and φ represent the angles the hand rotates around the yaw axis, pitch axis, and roll axis, respectively. As still another example, the gesture posture includes 6 degrees of freedom, i.e. (ψ, θ, φ, x, y, z); in addition to the rotation angles around the three axes, x, y, and z represent the offsets of the hand translated along the yaw axis, pitch axis, and roll axis, respectively.
From the gesture posture, the motion pattern of the user's hand is recognized. When the specified motion pattern appears in the user's hand, the user is considered to be experiencing tension or discomfort (1223). In one example, a startled user rapidly pushes the hand forward, wishing to fend off a perceived danger; the hand motion pattern of rapid forward pushing can be recognized by detecting that the speed of the user's hand movement exceeds a specified threshold τ2, for example P_palm(t) − P_palm(t−1) > τ2, where P_palm(t) takes the roll-axis offset component of the hand posture of frame t. In another example, a startled user continuously and violently waves to resist the VR content being seen; the motion pattern of continuous violent waving can be recognized by detecting that the acceleration of the user's hand movement exceeds a threshold τ2′, for example P_palm(t) + P_palm(t−2) − 2·P_palm(t−1) > τ2′, where P_palm(t) takes the roll-axis rotation angle of the hand posture of frame t. In yet another example, the behavior of covering the eyes implies that the user wants to avoid the content being seen; the eye-covering hand motion pattern can be recognized by detecting that the user's palm stays in front of the eyes for a certain time, for example that over T consecutive frames the average distance between the hand position in the hand posture and the eye position q is less than a threshold τ2″, expressed as (1/T)·Σ‖palm(t) − q‖ < τ2″, where T is a positive integer. Optionally, the eye position q is represented by the head pose provided by the head pose capture device 1042, or the eyes are recognized from the head pose and their position obtained.
If, from the motion pattern of the user's hand, it is recognized that the user is experiencing tension or discomfort, the VR scene is changed (1230).
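The eye-covering test, a mean palm-to-eye distance over the last T frames below τ2″, can be sketched as follows (positions are 3-tuples; the threshold value is illustrative):

```python
import math

def covering_eyes(palm_track, eye_pos, tau2_pp):
    """Eye-covering pattern from the text:
    (1/T) * sum over the last T frames of ||palm(t) - q|| < tau2'',
    where q is the eye position and palm_track holds the T palm positions."""
    dists = [math.dist(p, eye_pos) for p in palm_track]
    return sum(dists) / len(dists) < tau2_pp
```

The caller keeps a sliding window of the last T palm positions from the gesture acquisition device and supplies the eye position derived from the head pose.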
According to another embodiment of the present invention, a control according to the embodiments of the present invention shown in Fig. 4 or Fig. 7 is constructed in the VR scene, and the control is activated by recognizing the user's gesture. When the user indicates a "confirm" command to the control by gesture, this shows that the user is tense or overstimulated; in response, the VR scene is changed to lessen the stimulation to the user and reduce the user's tension. When the user indicates a "cancel" command to the control by gesture, this shows that the user can accept the content of the VR scene, and the constructed VR scene need not be changed for the time being.
Optionally, the user's audio is obtained through the audio/speech acquisition device 1046 (see also Fig. 10) (1224). As an example, the audio/speech acquisition device 1046 provides the captured audio, acoustic features, speech, and/or text recognized from the speech to the stimulus detection and processing module 1070 (see also Fig. 10). The acoustic features include the frequency, amplitude, etc. of the audio.
If the captured audio signal exhibits specific features, or the speech includes specified words or phrases, the user is considered to be experiencing tension or discomfort (1225). In one example, a startled user calls out words such as "help". A word set Y is provided, whose elements are words or phrases a user utters when experiencing tension or discomfort. The audio x emitted by the user is captured, the words in the audio x are recognized and converted into a word y; if the word y is in the predefined word set Y, the user is considered to be expressing tension, discomfort with, or resistance to the VR content.
In another example, a startled user screams or shouts. The frequency and/or loudness of the audio emitted by the user is recognized; if the frequency and/or loudness exceeds a prescribed threshold, the user is considered to be expressing tension, discomfort with, or resistance to the VR content.
If, from the audio, it is recognized that the user is experiencing tension or discomfort, the VR scene is changed (1230).
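The word-set check (is the recognized word y in the predefined set Y?) can be sketched in a few lines. The vocabulary below is purely illustrative; the patent only defines the set abstractly:

```python
# Hypothetical word set Y of distress words/phrases.
DISTRESS_WORDS = {"help", "stop", "no"}

def is_distress_utterance(recognized_words, word_set=DISTRESS_WORDS):
    """Flag the user as tense/uncomfortable when any recognized word y
    falls within the predefined word set Y."""
    return any(w.lower() in word_set for w in recognized_words)
```

The `recognized_words` list would come from the speech-to-text output of the audio/speech acquisition device; the loudness/frequency test described in the text would be combined with this check in practice.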
Still optionally, a signal of the interactive device is obtained through the interactive device 1048 (see also Fig. 10) (1226). The signal of the interactive device indicates the user's instruction, and the user's instruction indicates the user's discomfort (1227). In one example, the user is informed that, when feeling uncomfortable in the VR system, he or she can notify the VR system of the discomfort by pressing a button or making a predetermined action/gesture. For example, a button is provided on the interactive device 1048, such as the user's VR controller, game pad, or data glove; the user presses the button, and the interactive device 1048 sends a signal to the stimulus detection and processing module 1070, indicating that the user has issued an indication of discomfort via the button. In another example, a user who is tense or uncomfortable may clench a handheld object. The interactive device 1048 includes a pressure sensor and is held by the user; when the sensed grip force of the user exceeds a threshold, or continuously exceeds a threshold for a long time, the interactive device 1048 sends a signal to the stimulus detection and processing module 1070, indicating the user's discomfort (1227).
If, from the signal of the interactive device, it is recognized that the user is experiencing tension or discomfort, the VR scene is changed (1230).
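The grip-force condition, force exceeding a threshold continuously for long enough, can be sketched over a stream of pressure-sensor samples. The sampling model and threshold values are assumptions; the patent fixes only the "exceeds a threshold, or continuously for a long time" criterion:

```python
def grip_alarm(samples, grip_threshold, min_duration):
    """Signal discomfort when grip force stays above grip_threshold for at
    least min_duration consecutive samples."""
    run = 0                                   # length of the current over-threshold run
    for g in samples:
        run = run + 1 if g > grip_threshold else 0
        if run >= min_duration:
            return True
    return False
```

With `min_duration=1` this reduces to the simple "exceeds a threshold" case; larger values implement the sustained-grip variant.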
Still optionally, the motion pattern of the user's hand is recognized from the motion pattern of the interactive device held by the user. When the specified motion pattern appears in the user's hand, the user is considered to be experiencing tension or discomfort. For example, a gyroscope is provided on the interactive device, providing the pose of the handheld device in real time or at a certain frequency; or images of the interactive device are captured, and the pose of the interactive device is recognized from the images. The interactive device may also emit guidance information such as ultrasound, infrared, or visible light, so that the pose of the handheld device is recognized by receiving that information. From the pose of the handheld device, its motion pattern is obtained; if the speed and/or acceleration of the handheld device exceeds a threshold, the user holding the handheld device is considered to be experiencing tension or discomfort.
It should be pointed out that the above presents, by way of example, recognizing that the user is experiencing tension or discomfort from the user's head motion pattern, the user's hand motion pattern, the user's audio/speech features, or the signals of the interactive device. One of ordinary skill in the art will appreciate that these means of recognizing that the user is experiencing tension or discomfort can be combined; by combining one or more of them, it can be recognized more accurately, more concisely, or more effectively that the user is experiencing tension or discomfort.
In response to recognizing that the user is experiencing tension or discomfort, the VR scene is changed (1230). In embodiments according to the present invention, the VR scene is changed by multiple means. In one example, the VR scene is changed by switching, softening, and/or closing the video content currently presented (1232), to lessen the stimulation to the user and reduce the user's tension. For example, a terrifying VR scene currently shown to the user is switched to relaxing content; the fidelity/resolution of the current VR scene is reduced; the currently presented VR scene is blurred; or the brightness of the current VR scene is changed. In another example, the VR scene is changed by switching, softening, and/or closing the audio content currently playing (1234), to reduce the stimulation to the user, for example by changing tense, stimulating music into light, cheerful music, moving a sound source that causes tension away from the user, changing the volume, or even stopping sound playback. In still another example, a prompt window is popped up to the user (1236), to inquire about the user's wishes and relieve the user's tension, for example asking through the pop-up window whether the user is overly tense or needs a rest. The multiple means provided above can be combined to change the VR scene.
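The mitigation step (1230) and its three sub-means (1232, 1234, 1236) can be sketched as a small dispatcher. The actual rendering and audio calls are application-specific; here they stand in as recorded action names, which are hypothetical:

```python
class SceneMitigator:
    """Sketch of step 1230: apply one or more mitigations when the user's
    tension or discomfort has been recognized."""

    def __init__(self):
        self.actions = []                        # record of mitigations applied

    def mitigate(self, soften_video=True, soften_audio=True, prompt=False):
        if soften_video:                         # step 1232
            self.actions.append("reduce_fidelity")   # blur / dim / switch scene
        if soften_audio:                         # step 1234
            self.actions.append("lower_volume")      # calmer music, move source away
        if prompt:                               # step 1236
            self.actions.append("show_rest_dialog")  # ask whether the user needs a break
        return self.actions
```

In a real system, each recorded action would be replaced by a call into the VR application (or, per the alternative embodiment, applied directly at the display/audio device level).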
Still in another embodiment according to the present invention, the stimulation detection and processing module 1070 (referring to Figure 10) also monitors the VR scene. When the VR scene presents tense or stimulating content, the sensor module 1040 is accessed to identify whether the user is experiencing discomfort. Alternatively, the VR application 1060 informs the stimulation detection and processing module 1070 about its content; for example, when tense or stimulating content is about to appear or is appearing, it notifies the stimulation detection and processing module 1070 to start or strengthen the detection of the user, so as to effectively identify whether the user is experiencing discomfort.
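One way to picture this application-informed detection (a sketch under assumed names and sampling rates; the patent does not specify these) is a detector whose sensor sampling rate is raised when the VR application announces tense content:

```python
class StimulationDetector:
    """Sketch of a module in the spirit of 1070: the VR application
    (cf. 1060) announces upcoming tense content so user monitoring
    can be started or strengthened in time."""

    BASE_RATE_HZ = 10     # routine background monitoring
    BOOSTED_RATE_HZ = 60  # strengthened detection during tense content

    def __init__(self):
        self.sampling_rate_hz = self.BASE_RATE_HZ

    def notify_content(self, is_tense):
        # Called by the VR application before or while tense/stimulating
        # content appears; detection intensity follows the announcement.
        self.sampling_rate_hz = (self.BOOSTED_RATE_HZ if is_tense
                                 else self.BASE_RATE_HZ)
```

The design choice here is that the application, which knows its own content, drives the detection schedule, rather than the detector polling sensors at full rate at all times.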
Although embodiments of the present invention have been shown and described, those of ordinary skill in the art will understand that various changes, modifications, replacements, and variations can be made to these embodiments without departing from the principles and spirit of the present invention; the scope of the present invention is defined by the appended claims and their equivalents.

Claims (10)

1. A method for avoiding over-stimulating a user in an immersive virtual reality system, comprising:
presenting a virtual reality scene;
in response to identifying that the user is experiencing discomfort, changing the virtual reality scene.
2. The method according to claim 1, wherein
the user is identified as experiencing discomfort by identifying that the user's head moves in a designated pattern, by identifying that the user's hand moves in a designated pattern, by capturing audio that has a specified feature and/or audio that includes speech indicating a specified phrase or sentence, and/or by receiving an instruction of the user from an interactive device.
3. The method according to one of claims 1-2, wherein
the virtual reality scene is changed by switching, weakening, and/or closing the video content and/or audio content being played.
4. The method according to one of claims 1-3, wherein
the virtual reality scene is changed by popping up a prompt window.
5. The method according to claim 2, wherein
the user's head is identified as moving in the designated pattern by detecting that the speed and/or acceleration of the user's head movement exceeds a threshold.
6. The method according to claim 2 or 5, wherein
the user's hand is identified as moving in the designated pattern by detecting that the speed and/or acceleration of the user's hand movement exceeds a threshold, or that the distance of the hand relative to the head or eyes is continuously less than a threshold.
7. The method according to claim 2, 5 or 6, wherein
the user is identified as experiencing discomfort by detecting that the frequency and/or loudness of the captured audio exceeds a threshold, or by recognizing that words occurring in the audio belong to a specified set.
8. An immersive virtual reality system, comprising a computing unit, a display device, and a sensor module;
the computing unit is used to run a virtual reality application to construct a virtual reality scene;
the sensor module is used to perceive the state of the user;
the computing unit also runs a program to identify, based on the state of the user perceived by the sensor module, whether the user is experiencing discomfort, and, in response to identifying that the user is experiencing discomfort, to change the constructed virtual reality scene.
9. A device for avoiding over-stimulating a user in an immersive virtual reality system, comprising:
a presentation module for presenting a virtual reality scene;
a changing module for changing the virtual reality scene in response to identifying that the user is experiencing discomfort.
10. An information processing device, comprising a processor, a memory, and a display device, the information processing device being further coupled to a sensor module and receiving the state of the user perceived by the sensor module;
the memory stores a program, and the processor runs the program to make the information processing device perform the method according to one of claims 1-7.
CN201610882538.9A 2016-10-08 2016-10-08 Method and system for avoiding overstimulation in immersive VR system Active CN107918482B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610882538.9A CN107918482B (en) 2016-10-08 2016-10-08 Method and system for avoiding overstimulation in immersive VR system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610882538.9A CN107918482B (en) 2016-10-08 2016-10-08 Method and system for avoiding overstimulation in immersive VR system

Publications (2)

Publication Number Publication Date
CN107918482A true CN107918482A (en) 2018-04-17
CN107918482B CN107918482B (en) 2023-12-12

Family

ID=61891759

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610882538.9A Active CN107918482B (en) 2016-10-08 2016-10-08 Method and system for avoiding overstimulation in immersive VR system

Country Status (1)

Country Link
CN (1) CN107918482B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030133600A1 (en) * 2002-01-11 2003-07-17 Yea-Shuan Huang Image preprocessing method capable of increasing the accuracy of face detection
US20110140994A1 (en) * 2009-12-15 2011-06-16 Noma Tatsuyoshi Information Presenting Apparatus, Method, and Computer Program Product
CN103491432A (en) * 2012-06-12 2014-01-01 联想(北京)有限公司 Method, device and system for multimedia information output control
CN103677243A (en) * 2012-09-25 2014-03-26 联想(北京)有限公司 Control method, control device and multimedia input and output system
CN104076513A (en) * 2013-03-26 2014-10-01 精工爱普生株式会社 Head-mounted display device, control method of head-mounted display device, and display system
CN104407766A (en) * 2014-08-28 2015-03-11 联想(北京)有限公司 Information processing method and wearable electronic equipment
CN104951479A (en) * 2014-03-31 2015-09-30 小米科技有限责任公司 Video content detecting method and device
US20150302651A1 (en) * 2014-04-18 2015-10-22 Sam Shpigelman System and method for augmented or virtual reality entertainment experience
WO2016001908A1 (en) * 2014-07-03 2016-01-07 Imagine Mobile Augmented Reality Ltd 3 dimensional anchored augmented reality
CN105704468A (en) * 2015-08-31 2016-06-22 深圳超多维光电子有限公司 Stereoscopic display method, device and electronic equipment used for virtual and reality scene
CN105704478A (en) * 2015-08-31 2016-06-22 深圳超多维光电子有限公司 Stereoscopic display method, device and electronic equipment used for virtual and reality scene
CN105893780A (en) * 2016-05-10 2016-08-24 华南理工大学 Mental and psychological assessment method based on VR interaction and brain wave and cerebral blood flow monitoring
US20160267720A1 (en) * 2004-01-30 2016-09-15 Electronic Scripting Products, Inc. Pleasant and Realistic Virtual/Augmented/Mixed Reality Experience
CN105975083A (en) * 2016-05-27 2016-09-28 北京小鸟看看科技有限公司 Vision correction method in virtual reality environment

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108671522A (en) * 2018-04-23 2018-10-19 万赢体育科技(上海)有限公司 A kind of exercise guidance method and system for sport in the future shop
CN110413105A (en) * 2018-04-30 2019-11-05 苹果公司 The tangible visualization of virtual objects in virtual environment
CN110413105B (en) * 2018-04-30 2021-12-07 苹果公司 Tangible visualization of virtual objects within a virtual environment
US11756269B2 (en) 2018-04-30 2023-09-12 Apple Inc. Tangibility visualization of virtual objects within a computer-generated reality environment
CN113178019A (en) * 2018-07-09 2021-07-27 上海交通大学 Indication information identification method, system and storage medium based on video content
CN113178019B (en) * 2018-07-09 2023-01-03 上海交通大学 Indication information identification method, system and storage medium based on video content
WO2022127478A1 (en) * 2020-12-17 2022-06-23 Huawei Technologies Co., Ltd. Methods and systems for multi-precision discrete control of user interface control element of gesture-controlled device
WO2022161386A1 (en) * 2021-01-30 2022-08-04 华为技术有限公司 Pose determination method and related device
CN113192376A (en) * 2021-04-28 2021-07-30 深圳市思麦云科技有限公司 System based on virtual reality glasses VR technique is applied to electric power trade
WO2024077872A1 (en) * 2022-10-09 2024-04-18 网易(杭州)网络有限公司 Display position adjustment method and apparatus, storage medium, and electronic device

Also Published As

Publication number Publication date
CN107918482B (en) 2023-12-12

Similar Documents

Publication Publication Date Title
CN107918482A (en) The method and system of overstimulation is avoided in immersion VR systems
Rautaray Real time hand gesture recognition system for dynamic applications
LaViola et al. 3D spatial interaction: applications for art, design, and science
JP2022549853A (en) Individual visibility in shared space
EP3519926A1 (en) Method and system for gesture-based interactions
WO2017153771A1 (en) Virtual reality
CN107077229B (en) Human-machine interface device and system
CN107918481A (en) Man-machine interaction method and system based on gesture identification
US11069137B2 (en) Rendering captions for media content
CN114630738B (en) System and method for simulating sensed data and creating a perception
JPWO2019187862A1 (en) Information processing equipment, information processing methods, and recording media
CN113892075A (en) Corner recognition gesture-driven user interface element gating for artificial reality systems
Kim et al. Real-time hand gesture-based interaction with objects in 3D virtual environments
Takács et al. The virtual human interface: A photorealistic digital human
Kirakosian et al. Near-contact person-to-3d character dance training: Comparing ar and vr for interactive entertainment
McMenemy et al. A hitchhiker's guide to virtual reality
US11287971B1 (en) Visual-tactile virtual telepresence
Dhanasree et al. Hospital emergency room training using virtual reality and leap motion sensor
CN106125927B (en) Image processing system and method
CN109144598A (en) Electronics mask man-machine interaction method and system based on gesture
KR101525011B1 (en) tangible virtual reality display control device based on NUI, and method thereof
CN105224910B (en) A kind of system and method for the common notice of training
Spanogianopoulos et al. Human computer interaction using gestures for mobile devices and serious games: A review
US11244516B2 (en) Object interactivity in virtual space
Verma et al. Hand Gesture Recognition Techniques, A Review

Legal Events

Date Code Title Description
PB01 Publication
TA01 Transfer of patent application right

Effective date of registration: 20191121
Address after: 300450 room 203b-3, building 3, No.4, Haitai development road 2, Huayuan Industrial Zone (outside the ring), Binhai high tech Zone, Binhai New Area, Tianjin
Applicant after: TIANJIN SHARPNOW TECHNOLOGY Co.,Ltd.
Address before: 518000 A2, Shenzhen City, Guangdong Province, the 12 building of Kang Jia R & D building, south of science and technology south twelve
Applicant before: TIANJIN FENGSHI HUDONG TECHNOLOGY Co.,Ltd. SHENZHEN BRANCH

SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210119
Address after: 230000 4th floor, L / F, Laobao building, 206 Jinzhai Road, Luyang District, Hefei City, Anhui Province
Applicant after: Anhui Bifang Education Investment Co.,Ltd.
Address before: Room 203b-3, building 3, No.4, Haitai development road 2, Huayuan Industrial Zone, Binhai high tech Zone, Binhai New Area, Tianjin 300450
Applicant before: Tianjin Sharpnow Technology Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20210402
Address after: 518000 509, xintengda building, building M8, Maqueling Industrial Zone, Maling community, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province
Applicant after: Shenzhen Silan Zhichuang Technology Co.,Ltd.
Address before: 230000 4th floor, L / F, Laobao building, 206 Jinzhai Road, Luyang District, Hefei City, Anhui Province
Applicant before: Anhui Bifang Education Investment Co.,Ltd.

GR01 Patent grant