CN106527725A - Method of inputting information or command into equipment through view field center track in VR/AR environment - Google Patents
- Publication number
- CN106527725A (application number CN201611009545.4A)
- Authority
- CN
- China
- Prior art keywords
- input
- area
- regions
- environment
- equipment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a method of interaction between humans and equipment in a VR/AR environment. Its main characteristic is that a specific trajectory of the view-field center is used to judge the user's input intention and to input information or commands accurately. The input area is divided into three regions from inside to outside: a trigger area A, a confirmation area B, and an ending area C. The trigger condition is a transition from the central trigger area A into the confirmation area B; the ending condition is that the input information is confirmed as finished or that the ending area C is entered.
Description
Technical field:
The present invention relates to the VR/AR field, mainly including but not limited to the various helmets (head-mounted devices) that place people in an AR or VR environment.
Background technology:
The present invention is a method for interaction between a person and a device in a VR/AR environment. Its key point is to use a specific trajectory of the view-field center to judge the user's input intention and to input information or commands accurately. Its purpose is to solve the difficulty of human-computer interaction in VR/AR environments while providing a good user experience based on ergonomics.
The content of the invention:
In order to achieve the above purpose, the technical solution adopted by the present invention is a method for inputting information or commands to a terminal in a VR/AR-based environment. The described input method involves the following conditions:
Area A is the trigger area, indicating readiness: when the user's view-field center is in this area, the user is preparing to input one piece of information. Area B is the confirmation area of the input: the position at which the user enters this area, together with the information each set position represents, determines the user's input intention, which is then passed to the system. Area C is the closing area: when the user's view-field center is in this area, the user wants to leave the input state. Areas A, B, and C extend outward in sequence, as shown in Fig. 1.
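The three nested areas amount to a simple hit-test on the view-field-center point. A minimal sketch, assuming circular regions centered on the view field with hypothetical radii r_a < r_b < r_c (none of these values appear in the patent):

```python
import math

def classify_gaze(x, y, cx=0.0, cy=0.0, r_a=0.2, r_b=0.6, r_c=1.0):
    """Return which region the view-field center (x, y) falls in:
    trigger area 'A' (innermost), confirmation area 'B',
    ending area 'C' (outermost), or 'outside'."""
    d = math.hypot(x - cx, y - cy)  # distance from the area center
    if d <= r_a:
        return "A"   # user is preparing to input
    if d <= r_b:
        return "B"   # position here selects the input
    if d <= r_c:
        return "C"   # user wants to leave the input state
    return "outside"
```

For non-circular input areas, the same classification would replace the radius comparison with the corresponding shape test.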
Description of the drawings:
Fig. 1 is a schematic diagram of the areas of the method.
Fig. 2 is a flowchart of the method.
Figs. 3 to 8 are structural schematic diagrams of the method.
Specific embodiment:
The steps of the described AR/VR environment input mode are as follows:
Step 1: The user is in the input state with the view-field center located in area A, entering the input-ready state.
Step 2: Detect whether the user's view-field center leaves area A and enters area B; leaving area A indicates that the user is making an input.
Step 4: Record the position at which the user's view-field center enters area B.
Step 5: Determine the user's input behavior from the position entered in area B and the meaning assigned to that position within area B.
Step 6: Pass the analyzed user behavior to the system as input; if the input is finished, leave the input state; otherwise detect area A again and repeat steps 1 to 5.
Step 7: Detect whether the user enters area C; if so, close the input state.
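The step flow above amounts to a small state machine: arm in area A, select in area B, close in area C. A sketch under stated assumptions — the sector index identifying where the gaze entered area B and the keymap assigning a character to each sector are illustrative, not part of the patent text:

```python
from enum import Enum, auto

class InputState(Enum):
    READY = auto()      # step 1: view-field center rests in trigger area A
    SELECTING = auto()  # steps 2-5: gaze has crossed into confirmation area B
    CLOSED = auto()     # step 7: gaze entered ending area C

class GazeInputMachine:
    """Sketch of the step 1-7 flow. `keymap` is a hypothetical
    sector -> character mapping for the positions in area B."""

    def __init__(self, keymap):
        self.keymap = keymap
        self.state = InputState.READY
        self.buffer = []

    def on_region(self, region, sector=None):
        """Feed the region ('A', 'B', or 'C') the gaze currently occupies."""
        if self.state is InputState.CLOSED:
            return                                    # input already closed
        if region == "C":
            self.state = InputState.CLOSED            # step 7: close input
        elif region == "B" and self.state is InputState.READY:
            # steps 2-5: record where the gaze entered B, map it to a key
            self.buffer.append(self.keymap.get(sector, "?"))
            self.state = InputState.SELECTING         # step 6: input delivered
        elif region == "A":
            self.state = InputState.READY             # re-arm, repeat steps 1-5

    def text(self):
        return "".join(self.buffer)
```

A usage pass over a short gaze trace: entering A arms the machine, each A-to-B crossing emits one key, and entering C closes the input state.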
Claims (6)
1. A method in which a person in a VR/AR environment uses the view-field-center trajectory to input information or commands to a device, characterized by comprising:
the input area is divided into three regions from inside to outside: a trigger region A, a confirmation region B, and an ending region C;
the trigger condition is a transition from the central trigger region A into the confirmation region B;
the ending condition is that the input information is confirmed as finished or that the ending region C is entered.
2. The method according to claim 1, wherein the input area can be circular, and the key information to be input is distributed in region B.
3. The method according to claim 1, wherein the input area can be elliptical, and the key information to be input is distributed in region B.
4. The method according to claim 1, wherein the input area can be rectangular, and the key information to be input is distributed in region B.
5. The method according to claim 1, wherein the input area can be polygonal, and the key information to be input is distributed in region B.
6. The method according to claim 1, wherein the keys distributed in region B can have any shape.
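Claims 2 to 5 vary only the shape of the input area, so an implementation needs one point-in-region test per shape. A sketch of the standard tests (circle, ellipse, axis-aligned rectangle, and even-odd ray casting for an arbitrary polygon); all coordinates and parameters are illustrative assumptions:

```python
def in_circle(x, y, cx, cy, r):
    """Circular input area (claim 2)."""
    return (x - cx) ** 2 + (y - cy) ** 2 <= r * r

def in_ellipse(x, y, cx, cy, rx, ry):
    """Elliptical input area (claim 3), semi-axes rx and ry."""
    return ((x - cx) / rx) ** 2 + ((y - cy) / ry) ** 2 <= 1.0

def in_rect(x, y, x0, y0, x1, y1):
    """Axis-aligned rectangular input area (claim 4)."""
    return x0 <= x <= x1 and y0 <= y <= y1

def in_polygon(x, y, verts):
    """Arbitrary polygonal input area (claim 5): even-odd ray casting.
    `verts` is a list of (x, y) vertices in order."""
    inside = False
    n = len(verts)
    for i in range(n):
        x0, y0 = verts[i]
        x1, y1 = verts[(i + 1) % n]
        # toggle when a horizontal ray from (x, y) crosses this edge
        if (y0 > y) != (y1 > y) and x < (x1 - x0) * (y - y0) / (y1 - y0) + x0:
            inside = not inside
    return inside
```

Whichever shape is chosen, the region-classification step only swaps in the matching test; the trigger and ending conditions of claim 1 are unchanged.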
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611009545.4A CN106527725A (en) | 2016-11-16 | 2016-11-16 | Method of inputting information or command into equipment through view field center track in VR/AR environment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106527725A | 2017-03-22 |
Family
ID=58353313
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611009545.4A (status: Pending) | Method of inputting information or command into equipment through view field center track in VR/AR environment | 2016-11-16 | 2016-11-16 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106527725A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109891371A (en) * | 2017-04-21 | 2019-06-14 | SZ DJI Technology Co., Ltd. | Aircraft control method, remote control device and mobile device |
History
- 2016-11-16: Application filed as CN201611009545.4A (published as CN106527725A); status: Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102770843A (en) * | 2010-01-29 | 2012-11-07 | Olaworks, Inc. | Method for providing information on an object not included in the visual field of a terminal device, terminal device and computer-readable recording medium |
CN105324733A (en) * | 2013-06-25 | 2016-02-10 | Fujitsu Limited | Information processing device and program |
US20160077586A1 (en) * | 2013-06-25 | 2016-03-17 | Fujitsu Limited | Information processing device that has function to detect line of sight of user |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104572001B (en) | Open method and the mobile terminal of split screen | |
CN106411663A (en) | Method for controlling smart home | |
CN102931727B (en) | Topological anti-error check method of power dispatching master station type integrated intelligent anti-error system | |
CN102436308A (en) | Materialized mapping method of touch screen virtual key and touch screen game mobile phone using same | |
CN104865931B (en) | Controlled terminal and the correlating method and device of control terminal | |
CN105225417A (en) | The system and method preventing from child from climbing window falling | |
CN106527725A (en) | Method of inputting information or command into equipment through view field center track in VR/AR environment | |
WO2014048170A1 (en) | Method and device for in-air gesture identification applied in terminal | |
CN103607303B (en) | Signaling flow analysis system and signaling flow analysis method | |
CN107194542A (en) | On-site emergency disposal method, system and moving emergency server | |
CN106647301A (en) | Smart home safety operation method and system | |
CN103558987A (en) | Electronic equipment communication method and electronic equipment communication system | |
CN103135910B (en) | The method and device of edit contact information | |
CN106293485B (en) | A kind of terminal control method and device based on touch track | |
CN105228008A (en) | A kind of orientation gesture that utilizes inputs instruction to manipulate the method for intelligent television | |
CN209911970U (en) | Management system for control authority of kitchen equipment | |
CN106933122A (en) | Train display intelligent interactive method and system | |
CN110045631A (en) | Smart home intersection control routine and its implementation with self-teaching function | |
CN105260028A (en) | Method for controlling onboard computer by motion sensing through mobile phone camera | |
CN103167114A (en) | Gesture unlocking method of smartphone | |
CN112672277B (en) | Network distribution method and network distribution system | |
CN202025295U (en) | Device for rapidly inputting emoticons | |
CN202995422U (en) | Identity intelligence identification home appliance control system | |
CN106066613A (en) | The calling system of a kind of AGV and method of calling thereof | |
CN208188794U (en) | A kind of mouse with distant control function |
Legal Events
Date | Code | Title | Description
---|---|---|---
| C06 | Publication | |
| PB01 | Publication | |
| DD01 | Delivery of document by public notice | Addressee: Shanghai Louding Network Technology Co.,Ltd.; Document name: Notification of Publication of the Application for Invention |
| SE01 | Entry into force of request for substantive examination | |
| DD01 | Delivery of document by public notice | Addressee: Shang Weisi; Document name: First notice of examination opinions |
| DD01 | Delivery of document by public notice | Addressee: Shang Weisi; Document name: Deemed withdrawal notice |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20170322 |