CN107885311A - Visual interaction confirmation method, system, and device - Google Patents
Visual interaction confirmation method, system, and device
- Publication number: CN107885311A (application number CN201610861394.9A)
- Authority
- CN
- China
- Prior art keywords
- visual interactive
- user
- signal
- interactive
- bone conduction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Dermatology (AREA)
- General Health & Medical Sciences (AREA)
- Neurology (AREA)
- Neurosurgery (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a visual interaction confirmation method, system, and device. The method comprises the following steps: selecting and activating an object in an interactive interface by gaze tracking; and confirming the selected object in the interactive interface by bone conduction. The technical effects achievable by the invention are an enhanced interactive experience and improved interaction efficiency: first, selecting with the eyes and confirming with a limb tap improves input efficiency; second, a limb tap is among the most natural and familiar of actions, eliminating the user's learning cost; third, the dependence on physical buttons is removed, improving portability.
Description
Technical field
The invention belongs to the technical field of virtual reality, and in particular relates to a visual interaction confirmation method, system, and device.
Background technology
In prior-art virtual reality (VR) experiences, some systems employ eye-tracking technology: when the user's gaze rests on a point of the interactive interface, a fixed dwell time of fixation is required for confirmation, which undoubtedly reduces input efficiency. Others confirm through buttons on the VR device, so the user must keep a hand raised, which easily causes fatigue; still others use an external handheld controller, which reduces the integrity of the device and makes it inconvenient to carry and store. Moreover, during visual interaction with eye tracking, the resulting cursor image on the UI (i.e., the interactive interface) tends to be unstable and to drift, degrading the user's experience.
Summary of the invention
The technical problem to be solved by the invention is to provide a visual interaction confirmation method, system, and device that reduce the user's learning cost, increase the efficiency of product interaction, and improve device integrity, enabling quick confirmation after the user has made a selection by gaze.
To solve the above problems, the invention adopts the following technical scheme:
A visual interaction confirmation method, characterized in that it comprises the following steps:
Step 1: an object selection operation is performed in the interactive interface by gaze tracking;
Step 2: the object selected in Step 1 is activated in the interactive interface;
Step 3: the object activated in Step 2 is confirmed by bone conduction.
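The three steps above can be sketched as a small state machine. This is an illustrative sketch only; the class, state, and event names are invented, not taken from the patent:

```python
# Hypothetical sketch of the select -> activate -> confirm flow.
class GazeConfirmFSM:
    IDLE, ACTIVATED, CONFIRMED = "idle", "activated", "confirmed"

    def __init__(self):
        self.state = self.IDLE
        self.target = None

    def on_gaze_hit(self, icon_id):
        # Steps 1-2: gaze tracking selects an object, which is activated
        # (highlighted) immediately, with no dwell-time countdown.
        self.target = icon_id
        self.state = self.ACTIVATED

    def on_tap_detected(self):
        # Step 3: a bone-conduction tap confirms the activated object.
        if self.state == self.ACTIVATED:
            self.state = self.CONFIRMED
            return self.target
        return None  # a tap with nothing activated is ignored

fsm = GazeConfirmFSM()
fsm.on_gaze_hit("icon_play")
confirmed = fsm.on_tap_detected()  # -> "icon_play"
```

The key design point the patent stresses is that confirmation is event-driven (a tap) rather than time-driven (a dwell), so a stray glance never triggers an action by itself.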
Preferably, the user's gaze is tracked by identifying the positions of the Purkinje image and the pupil, which an algorithm converts into the corresponding position on the screen; that is, the object selection operation in the interactive interface is performed by gaze tracking.
Preferably, when the corresponding position on the screen overlaps the on-screen coordinates of a UI (interactive interface) icon, the UI icon in that region is activated.
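A minimal sketch of this mapping and overlap test, assuming a pre-calibrated linear transform from the pupil-minus-Purkinje-image vector to screen coordinates; the coefficients and icon layout are invented for illustration:

```python
# Illustrative gaze-to-screen mapping plus icon hit test (not the patent's
# actual conversion algorithm, whose details are unspecified).

def gaze_to_screen(pupil, purkinje, coeffs):
    """Map the pupil-center minus Purkinje-image vector to a screen point.
    coeffs = (ax, bx, cx, ay, by, cy) from a prior calibration step."""
    dx, dy = pupil[0] - purkinje[0], pupil[1] - purkinje[1]
    ax, bx, cx, ay, by, cy = coeffs
    return (ax * dx + bx * dy + cx, ay * dx + by * dy + cy)

def hit_icon(point, icons):
    """Return the id of the first icon rectangle containing the point."""
    x, y = point
    for icon_id, (left, top, w, h) in icons.items():
        if left <= x <= left + w and top <= y <= top + h:
            return icon_id  # this icon would be activated
    return None

icons = {"menu": (100, 100, 80, 80), "back": (300, 100, 80, 80)}
pt = gaze_to_screen((12.0, 8.0), (10.0, 6.0), (60, 0, 10, 0, 60, 20))
# pt == (130.0, 140.0), which falls inside the "menu" rectangle
```

In practice the calibration would be fit per user (e.g. a short look-at-targets procedure), since the pupil-to-screen mapping varies with headset fit and eye geometry.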
Icon activation is the stage at which the user of the VR device selects an image in the interactive interface. Its form of expression may be a highlighted icon, for example: lighting up the icon, enlarging the icon, shallowing the icon's depth of field, or moving the icon relative to other icons; or any other information prompt perceptible by the human body, for example: a text prompt, a voice prompt, a vibration prompt, and so on.
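The activation feedback can be sketched as a lookup of feedback styles applied to an icon; the style names, dictionary representation, and scale factor here are invented for the sketch:

```python
# Illustrative activation-prompt sketch: each style returns a copy of the
# icon with one of the feedback forms listed above applied.
FEEDBACK = {
    "light_up": lambda icon: {**icon, "brightness": 1.5},
    "enlarge": lambda icon: {**icon, "scale": icon.get("scale", 1.0) * 1.2},
    "text_prompt": lambda icon: {**icon, "label": icon["name"] + " (selected)"},
}

def activate(icon, style="enlarge"):
    """Return a copy of the icon with the chosen activation feedback applied."""
    return FEEDBACK[style](icon)

icon = {"name": "menu", "scale": 1.0}
activated = activate(icon)  # {"name": "menu", "scale": 1.2}
```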
Confirmation of the activated icon by bone conduction is realized either by a bone conduction sensor built into the visual interaction device that identifies the signal emitted by the user, or by a bone conduction sensor connected to the visual interaction device in the form of an accessory that identifies the signal emitted by the user.
The accessory may be connected to the visual interaction device in a wired or wireless manner.
The user generates the signal through the vibration produced by tapping a rigid object with a finger or any part of the limbs.
The bone conduction sensor is arranged on the user's body to receive the signal emitted by the user; experiments show that arranging the bone conduction sensor at locations such as the user's forehead or behind the ear gives better results.
After the bone conduction sensor receives the signal, the signal is amplified, noise-reduced, and subjected to identification processing; according to the processing result, a conversion algorithm performs the confirmation operation on the corresponding icon.
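The amplify / denoise / identify chain can be sketched as follows. The gain, smoothing window, threshold, and run-length limits are invented for illustration; the patent does not specify its recognition algorithm:

```python
# Hedged sketch of tap detection on bone-conduction samples: amplify,
# smooth with a moving average (crude denoising), then report a tap when
# the envelope exceeds a threshold for a tap-like duration.

def detect_tap(samples, gain=4.0, win=3, threshold=2.0,
               min_len=2, max_len=10):
    # Amplify.
    amplified = [gain * s for s in samples]
    # Denoise: centered moving average of absolute values.
    env = []
    for i in range(len(amplified)):
        lo, hi = max(0, i - win // 2), min(len(amplified), i + win // 2 + 1)
        env.append(sum(abs(v) for v in amplified[lo:hi]) / (hi - lo))
    # Identify: a contiguous supra-threshold run of plausible tap length.
    run = 0
    for v in env:
        if v > threshold:
            run += 1
        else:
            if min_len <= run <= max_len:
                return True
            run = 0
    return min_len <= run <= max_len

quiet = [0.05, -0.04, 0.06, -0.05, 0.04, -0.03]
tap = [0.05, 0.05, 1.5, 1.8, 1.2, 0.05, 0.04, 0.05]
# detect_tap(quiet) is False; detect_tap(tap) is True
```

A production recognizer would more plausibly use a band-pass filter tuned to the tap's vibration frequency band, but the threshold-plus-duration idea matches the frequency-and-duration patterns the description says the sensor collects.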
The invention also provides a visual interaction system employing the above visual interaction confirmation method, and a visual interaction device employing the above visual interaction confirmation method.
Its working principle is as follows: exploiting the property that a bone conduction sensor can receive sound vibrations transmitted through the human body, the bone conduction sensor is placed close to the head or wherever bone is plentiful (such as the forehead, behind the ear, or the back of the head). Striking a rigid object with a finger produces vibrations; the bone conduction sensor collects the vibration frequency, duration, and other patterns in real time, and then identifies whether a tap action occurred through signal amplification, denoising, and a recognition algorithm.
The technical effects achievable by the invention are: an improved user experience in virtual reality interaction, faster system response, a jitter-free cursor, and an improved visual quality of the interactive interface.
Brief description of the drawings
Fig. 1: schematic structural diagram of the device of Embodiment 1 of the invention;
Fig. 2: schematic structural diagram of the device of Embodiment 2 of the invention;
Fig. 3: flow chart of the embodiments of the invention.
Reference numerals: 1. visual interaction device; 2. bone conduction sensor; 3. bone conduction sensor signal generating means; 4. bone conduction sensor signal receiving means.
Detailed description of the embodiments
Embodiment 1: the invention adopts the following technical scheme:
A visual interaction confirmation method, characterized in that it comprises the following steps:
Step 1: an object selection operation is performed in the interactive interface of the visual interaction device 1 by eye tracking;
Step 2: the object selected in Step 1 is activated in the interactive interface;
Step 3: the object activated in Step 2 is confirmed by bone conduction.
First, the positions of the pupil and the Purkinje image are identified and then converted by a conversion algorithm into the corresponding region on the screen, and the object selection operation is performed. If the object is an icon, the icon's coordinates are compared against the coordinates of the corresponding region, and the icon is activated if the condition is met.
Icon activation is the prompting stage before the VR device confirms the user's selection of an image in the interactive interface. Its form of expression may be a highlighted icon, for example: lighting up the icon, enlarging the icon, shallowing the icon's depth of field, or moving the icon relative to other icons; or any other information prompt perceptible by the human body, for example: a text prompt, a voice prompt, a vibration prompt, and so on.
In Embodiment 1, bone conduction confirmation of the activated icon is realized by the built-in bone conduction sensor 2 of the visual interaction device, which identifies the signal emitted by the user; the built-in bone conduction sensor 2 is arranged at the user's forehead, placed flush against the bone.
After the icon chosen by visual tracking is activated, the user taps a rigid object with a finger to produce a confirmation signal. The bone conduction sensor collects the signal in real time; the signal is amplified, noise-reduced, and subjected to identification processing, and according to the processing result a conversion algorithm performs the confirmation operation on the corresponding icon.
Embodiment 2:
The bone conduction sensor is divided into a bone conduction sensor signal generating means 3 and a bone conduction sensor signal receiving means 4.
The bone conduction sensor signal generating means 3 is located at the user's wrist; it collects the signal in real time, converts it into a wireless signal, and transmits it to the bone conduction sensor signal receiving means 4. The bone conduction sensor signal receiving means 4 is located on the visual interaction device 1; it receives the wireless signal transmitted by the signal generating means 3, decodes the signal, then amplifies, noise-reduces, and identifies it, and according to the processing result a conversion algorithm performs the confirmation operation on the corresponding icon.
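Embodiment 2's split sensor can be sketched as a transmitter/receiver pair. The message format, the duration-based validity check, and the queue standing in for the wireless channel are all assumptions made for this sketch:

```python
# Illustrative sketch of the wrist-worn generator and headset-side receiver.
import json
import queue

radio_link = queue.Queue()  # stands in for the wireless channel

def wrist_transmit(freq_hz, duration_ms):
    """Generating means: package the sensed vibration and 'transmit' it."""
    msg = json.dumps({"freq": freq_hz, "dur": duration_ms})
    radio_link.put(msg.encode("utf-8"))

def headset_receive(active_icon, min_dur=20, max_dur=200):
    """Receiving means: decode the message, validate it as a tap-like
    vibration, and confirm the currently activated icon."""
    raw = radio_link.get_nowait()
    msg = json.loads(raw.decode("utf-8"))
    if min_dur <= msg["dur"] <= max_dur:  # tap-like duration -> confirm
        return active_icon
    return None

wrist_transmit(freq_hz=900, duration_ms=60)
confirmed = headset_receive("icon_settings")  # -> "icon_settings"
```

Splitting sensing (at the wrist, near the tapping hand) from decision-making (on the headset, which knows which icon is active) is what lets the headset stay free of buttons while the accessory stays simple.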
The remaining process is identical to Embodiment 1.
The foregoing is only a specific embodiment of the invention, but the protection scope of the invention is not limited thereto; any change or replacement conceivable without creative work shall fall within the protection scope of the invention. Therefore, the protection scope of the invention shall be determined by the scope of protection defined in the claims.
Claims (10)
1. A visual interaction confirmation method, characterized in that it comprises the following steps:
Step 1: an object selection operation is performed in the interactive interface by gaze tracking;
Step 2: the object selected in Step 1 is activated in the interactive interface;
Step 3: the object activated in Step 2 is confirmed by bone conduction.
2. The visual interaction confirmation method according to claim 1, characterized in that, in the activation stage of Step 2, the result selected in Step 1 is signaled prominently to the user.
3. The visual interaction confirmation method according to claim 1, characterized in that the confirmation of the activated object by bone conduction is realized by a bone conduction sensor built into the visual interaction device that identifies the signal emitted by the user.
4. The visual interaction confirmation method according to claim 1, characterized in that the confirmation of the activated object by bone conduction is realized by a bone conduction sensor existing in the form of an accessory that identifies the signal emitted by the user, the bone conduction sensor being connected to the interaction device in a wired or wireless manner.
5. The visual interaction confirmation method according to claim 3 or 4, characterized in that: the bone conduction sensor is arranged on the user's body, close against the bone, or on the user's head, close against the bone, to receive the signal emitted by the user; there is at least one bone conduction sensor.
6. The visual interaction confirmation method according to claim 5, characterized in that: the signal emitted by the user is realized by the vibration produced by the user's finger tapping a rigid object.
7. The visual interaction confirmation method according to claim 5, characterized in that: the signal emitted by the user is realized by the vibration produced by the user's limbs tapping a rigid object.
8. The visual interaction confirmation method according to claim 5, characterized in that: after the bone conduction sensor receives the signal, the signal is amplified, filtered, and identified; the required information is extracted and compared against a preset confirmation criterion, and the comparison result is output to other systems.
9. A visual interaction system employing the visual interaction confirmation method according to any one of claims 1-8.
10. A visual interaction device employing the visual interaction confirmation method according to any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610861394.9A CN107885311A (en) | 2016-09-29 | 2016-09-29 | Visual interaction confirmation method, system, and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610861394.9A CN107885311A (en) | 2016-09-29 | 2016-09-29 | Visual interaction confirmation method, system, and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107885311A true CN107885311A (en) | 2018-04-06 |
Family
ID=61769622
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610861394.9A Pending CN107885311A (en) | 2016-09-29 | 2016-09-29 | Visual interaction confirmation method, system, and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107885311A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101539804A (en) * | 2009-03-11 | 2009-09-23 | 上海大学 | Real time human-machine interaction method and system based on augmented virtual reality and anomalous screen |
US20100110368A1 (en) * | 2008-11-02 | 2010-05-06 | David Chaum | System and apparatus for eyeglass appliance platform |
CN102184011A (en) * | 2011-05-06 | 2011-09-14 | 中国科学院计算技术研究所 | Human-computer interaction equipment |
CN103946732A (en) * | 2011-09-26 | 2014-07-23 | 微软公司 | Video display modification based on sensor input for a see-through near-to-eye display |
CN104866105A (en) * | 2015-06-03 | 2015-08-26 | 深圳市智帽科技开发有限公司 | Eye movement and head movement interactive method for head display equipment |
CN105759839A (en) * | 2016-03-01 | 2016-07-13 | 深圳市大疆创新科技有限公司 | Unmanned aerial vehicle (UAV) visual tracking method, apparatus, and UAV |
- 2016-09-29: Application CN201610861394.9A filed; published as CN107885311A; status Pending
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110209276A (en) * | 2019-05-28 | 2019-09-06 | 网易(杭州)网络有限公司 | Object selection method and device, electronic equipment, storage medium in virtual reality |
CN114356482A (en) * | 2021-12-30 | 2022-04-15 | 业成科技(成都)有限公司 | Method for interacting with human-computer interface by using sight line drop point |
CN114356482B (en) * | 2021-12-30 | 2023-12-12 | 业成科技(成都)有限公司 | Method for interaction with human-computer interface by using line-of-sight drop point |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210064132A1 (en) | Systems, methods, and interfaces for performing inputs based on neuromuscular control | |
JP6777089B2 (en) | Information processing equipment, information processing methods and programs | |
US10933321B2 (en) | Information processing device and information processing method | |
KR101959609B1 (en) | Performance of an operation based at least in part on tilt of a wrist worn apparatus | |
US10860104B2 (en) | Augmented reality controllers and related methods | |
WO2016188258A1 (en) | Eye-controlled apparatus, eye-controlled method and eye-controlled system | |
US10162594B2 (en) | Information processing device, method of information processing, and program | |
US11409371B2 (en) | Systems and methods for gesture-based control | |
WO2016063801A1 (en) | Head mounted display, mobile information terminal, image processing device, display control program, and display control method | |
CN104023802B (en) | Use the control of the electronic installation of neural analysis | |
CN110824979B (en) | Unmanned equipment control system and method | |
US20100090877A1 (en) | Remote control device | |
US20190272040A1 (en) | Manipulation determination apparatus, manipulation determination method, and, program | |
CN111580661A (en) | Interaction method and augmented reality device | |
WO2021154612A1 (en) | Determining a geographical location based on human gestures | |
WO2021073743A1 (en) | Determining user input based on hand gestures and eye tracking | |
US10275018B2 (en) | Touch interaction methods and systems using a human body as a signal transmission medium | |
KR20170107229A (en) | cognitive training apparatus and method with eye-tracking | |
CN107885311A (en) | Visual interaction confirmation method, system, and device | |
US20210232224A1 (en) | Human-machine interface | |
KR102048551B1 (en) | System and Method for Virtual reality rehabilitation training using Smart device | |
CN108008810A (en) | Confirmation method and system based on mental imagery | |
JP6594235B2 (en) | Work support apparatus and program | |
CN108881622A (en) | Reminding method, flexible-screen terminal, and computer-readable storage medium | |
CN108257441A (en) | Motion-sensing-controlled virtual cooking training system and method supporting multiple menus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 2018-04-06 |