US10417826B2 - Information input method in 3D immersive environment - Google Patents
- Publication number
- US10417826B2 (Application US15/292,988 / US201615292988A)
- Authority
- US
- United States
- Prior art keywords
- key position
- selected key
- immersive environment
- information input
- virtual
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
- G06F3/0236—Character input methods using selection techniques to select from displayed items
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
- G06F3/0237—Character input methods using prediction or retrieval techniques
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0489—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
- G06F3/04892—Arrangements for controlling cursor position based on codes indicative of cursor displacements from one discrete location to another, e.g. using cursor control keys associated to different directions or using the tab key
Definitions
- the present disclosure relates to the technical field of virtual reality, and particularly to an information input method in a 3D immersive environment.
- the present disclosure provides an information input method in a 3D immersive environment, comprising:
- a system where the 3D immersive environment lies comprises a handle
- the selecting a key position on the virtual keyboard and determining the selected key position specifically comprises:
- a system where the 3D immersive environment lies comprises a handle having a gyroscope function
- the selecting a key position on the virtual keyboard and determining the selected key position specifically comprises:
- the sending a confirmation instruction to the selected key position, and inputting information corresponding to the selected key position into an information input box specifically comprises:
- binding the confirmation instruction to a certain key on the handle; when the key is pressed, sending the confirmation instruction to the selected key position, and inputting information corresponding to the currently-selected key position into the information input box.
- a system where the 3D immersive environment lies comprises a handle on which a touch panel is disposed; or, a touch panel is disposed on a headset device of a system where the 3D immersive environment lies;
- the selecting a key position on the virtual keyboard and determining the selected key position specifically comprises:
- the sending a confirmation instruction to the selected key position, and inputting information corresponding to the selected key position into an information input box specifically comprises:
- binding the confirmation instruction to a certain key on the handle; when the key is pressed, sending the confirmation instruction to the selected key position, and inputting information corresponding to the currently-selected key position into the information input box;
- the selecting a key position on the virtual keyboard and determining the selected key position specifically comprises:
- the sending a confirmation instruction to the selected key position, and inputting information corresponding to the selected key position into an information input box specifically comprises:
- a system where the 3D immersive environment lies comprises a handle, binding the confirmation instruction to a certain key on the handle; when the key is pressed, inputting information corresponding to the current selected key position into the information input box.
- Advantageous effects of embodiments of the present disclosure are as follows: displaying the virtual keyboard to the user in the 3D immersive environment and receiving the user's selection and confirmation of a key position on the virtual keyboard solves the problem that a user wearing a virtual reality headset device cannot see the actual key positions of a real keyboard and thus cannot input information, and enables the user to conveniently and quickly input information by typing in the virtual environment.
- The embodiments provide many manners of controlling input, e.g., operating a key on the handle, moving the handle, sliding on the touch panel, or moving the head. These manners may be implemented simultaneously in the same virtual reality system so that the user selects a suitable input manner according to his own needs and different users' demands are satisfied.
- FIG. 1 is a flow chart of an information input method in a 3D immersive environment according to an embodiment of the present disclosure.
- FIG. 2 is a schematic view of controlling input through keys on a handle in an information input method in a 3D immersive environment according to an embodiment of the present disclosure.
- FIG. 3 is a schematic view of controlling input by moving the handle in an information input method in a 3D immersive environment according to an embodiment of the present disclosure.
- FIG. 4 is a schematic view of controlling input by sliding on a touch panel in an information input method in a 3D immersive environment according to an embodiment of the present disclosure.
- FIG. 5 is a schematic view of controlling input by turning the head in an information input method in a 3D immersive environment according to an embodiment of the present disclosure.
- FIG. 1 is a flow chart of an information input method in a 3D immersive environment according to an embodiment of the present disclosure.
- the information input method in a 3D immersive environment according to an embodiment of the present disclosure comprises:
- Step S 110 displaying a virtual keyboard in the 3D immersive environment.
- the embodiment of the present disclosure provides a virtual keyboard to the user in the 3D immersive environment to enable the user to input information by using the virtual keyboard.
- Step S 120 selecting a key position on the virtual keyboard and determining the selected key position.
- Various key positions are arranged on the virtual keyboard by a certain rule and include various character keys and function keys.
- Upon inputting information, the user first selects a desired key position on the virtual keyboard; for example, if the user needs to input the letter m, he selects the key position representing the letter m on the virtual keyboard.
- Step S 130 sending a confirmation instruction to the selected key position, and inputting information corresponding to the selected key position into an information input box, thereby completing one input of information.
- a confirmation instruction is sent to the selected key position to indicate that the user confirms inputting information corresponding to the currently-selected key position. For example, after the letter m is selected on the virtual keyboard, the confirmation instruction is sent to input the letter m into the information input box; if the confirmation instruction is not sent, even if a certain key position is already selected, the corresponding information will not be input into the input box.
- Displaying the virtual keyboard to the user in the 3D immersive environment and receiving the user's selection and confirmation of a key position on the virtual keyboard solves the problem that a user wearing the virtual reality headset device cannot see the actual key positions of a real keyboard and thus cannot input information, and enables the user to conveniently and quickly input information by typing in the virtual environment.
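The three steps S 110 to S 130 above can be sketched as a minimal state machine, in which a selection alone inputs nothing and only a confirmation instruction commits the selected key. The class and method names below are illustrative assumptions, not from the disclosure:

```python
# Minimal sketch of steps S110-S130: selection alone inputs nothing;
# only a confirmation instruction commits the selected key position.
# Class and method names are illustrative, not from the disclosure.

class VirtualKeyboardInput:
    def __init__(self, layout):
        self.layout = layout      # step S110: the displayed key positions
        self.selected = None      # step S120: currently-selected key position
        self.input_box = ""       # the information input box

    def select(self, key):
        # step S120: selecting a key position does not input anything yet
        if key in self.layout:
            self.selected = key

    def confirm(self):
        # step S130: the confirmation instruction inputs the selected key
        if self.selected is not None:
            self.input_box += self.selected

kb = VirtualKeyboardInput(layout={"g", "m", "4", "5"})
kb.select("m")   # key position selected, input box still empty
kb.confirm()     # confirmation instruction: letter m is input
```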
- FIG. 2 is a schematic view of controlling input through keys on a handle in an information input method in a 3D immersive environment according to an embodiment of the present disclosure.
- When a system where the 3D immersive environment lies comprises a handle, the “selecting a key position on the virtual keyboard and determining the selected key position” in step S 120 specifically comprises: setting a certain key position of the virtual keyboard as an initial selected key position; and sending a key position selection instruction via the up, down, left or right direction key of the handle, and taking the key position adjacent to the currently-selected key position in the corresponding direction as the new selected key position.
- The key position is selected by operating keys of the handle. The handle is connected to the system where the 3D immersive environment lies in a wired or wireless manner; for example, in a virtual reality device, the handle may be connected to the device via Bluetooth.
- When input is required, a virtual keyboard pops up to the user, and a default selected key position is set on the keyboard, e.g., the key position of the letter g in the middle of the keyboard.
- The selected key position may be displayed with a highlight effect so that the user can visually see which key position is currently selected. For example, the digit key 4 in FIG. 4 is highlighted, indicating that the user has currently selected the digit key 4.
- When the up, down, left or right direction key of the handle is pressed, the selected key position moves accordingly; for example, when the right direction key on the handle is pressed, the digit key 5 becomes selected and highlighted.
- the “sending a confirmation instruction to the selected key position, and inputting information corresponding to the selected key position into an information input box” in step S 130 specifically comprises: binding the confirmation instruction to a certain key on the handle; when the key is pressed, sending the confirmation instruction to the selected key position, and inputting information corresponding to the current selected key position into the information input box.
- For example, the key X on the handle may be set as a confirmation key; when the user presses the key X on the handle, information corresponding to the currently-selected key, for example, the digit 4 in FIG. 1, is input into the information input box.
- Input of characters, letters, digits, symbols and the like on the virtual keyboard in the immersive environment may be implemented simply by connecting a handle to the system where the 3D immersive environment lies. Furthermore, this manner is adapted for various handles available on the market, so that the user may conveniently and quickly complete input.
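The direction-key selection described above can be sketched as follows; the QWERTY layout and the edge-clamping behavior are illustrative assumptions, not specified by the disclosure:

```python
# Sketch of direction-key selection (handle embodiment): pressing a
# direction key moves the selection to the adjacent key position.
# The layout and edge-clamping behavior are assumptions.

ROWS = ["1234567890", "qwertyuiop", "asdfghjkl", "zxcvbnm"]

def move_selection(pos, direction):
    """pos is (row, col); return the adjacent key position, clamped at edges."""
    r, c = pos
    if direction == "up":
        r = max(r - 1, 0)
    elif direction == "down":
        r = min(r + 1, len(ROWS) - 1)
    elif direction == "left":
        c = max(c - 1, 0)
    elif direction == "right":
        c = min(c + 1, len(ROWS[r]) - 1)
    c = min(c, len(ROWS[r]) - 1)  # rows differ in length
    return (r, c)

pos = (2, 4)                        # default selection: the letter g
pos = move_selection(pos, "right")  # adjacent key in that direction
```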
- FIG. 3 is a schematic view of controlling input by moving the handle in an information input method in a 3D immersive environment according to an embodiment of the present disclosure.
- the system where the immersive environment lies comprises a handle having a gyroscope function, and a gyroscope sensor of the handle is used to collect handle movement information to simulate movement of a virtual cursor in the immersive environment.
- the “selecting a key position on the virtual keyboard and determining the selected key position” in step S 120 specifically comprises:
- The virtual cursor corresponds to the mouse pointer on a PC desktop and is used to select an object in the immersive environment.
- The gyroscope of the handle is used to collect spatial movement information of the handle and thereby control movement of the virtual cursor in the immersive environment.
- the handle in the user's hand corresponds to the virtual cursor in the immersive environment
- The user may hold the handle and move it freely in real space, and the virtual cursor in the immersive environment moves correspondingly at the same time.
- the handle is moved to control the virtual cursor to move into the area of the virtual keyboard, the virtual cursor becomes a selection box for selecting a key position, the handle is moved to control the selection box to move to a certain key position of the virtual keyboard, and the key position where the selection box lies is taken as the selected key position.
- the virtual cursor is a cross-shaped cursor, and when the cross-shaped cursor is moved into the area of the virtual keyboard, the cross-shaped cursor becomes a selection box, wherein the handle corresponds to the selection box.
- As the handle moves, the selection box moves accordingly to select a desired key position. As shown in FIG. 2, a frame will appear around the key position for highlighting, so that the user can visually see which key position is selected.
- the “sending a confirmation instruction to the selected key position, and inputting information corresponding to the selected key position into an information input box” in step S 130 specifically comprises: binding the confirmation instruction to a certain key on the handle; when the key is pressed, sending the confirmation instruction to the selected key position, and inputting information corresponding to the current selected key position into the information input box. This may apply to various handles available in the market.
- Movement of the virtual cursor in the immersive environment is controlled to select a key position on the virtual keyboard and achieve information input in the immersive environment. Furthermore, since the virtual cursor moves along with the user's hand motion, the immersive feeling is enhanced and a better user experience is provided.
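A minimal sketch of how handle orientation could drive the virtual cursor, reduced to a 2D plane for illustration: the projection model and the keyboard rectangle below are assumptions, and a real system would ray-cast against the keyboard's plane in 3D.

```python
import math

# Sketch of the gyroscope embodiment: the handle's yaw/pitch is projected
# to a cursor position on a plane in front of the user, and a hit test
# decides when the cross-shaped cursor becomes a selection box.

def cursor_from_orientation(yaw_deg, pitch_deg, distance=1.0):
    """Project the handle's pointing direction onto a plane `distance`
    in front of the user; returns (x, y) in plane units."""
    x = distance * math.tan(math.radians(yaw_deg))
    y = distance * math.tan(math.radians(pitch_deg))
    return (x, y)

def on_keyboard(cursor, origin, size):
    """True when the cursor lies inside the virtual keyboard rectangle."""
    x, y = cursor
    ox, oy = origin
    w, h = size
    return ox <= x <= ox + w and oy <= y <= oy + h

cursor = cursor_from_orientation(10.0, -5.0)  # handle turned right, tilted down
# inside this rectangle the cross-shaped cursor becomes a selection box
selection_active = on_keyboard(cursor, origin=(-0.5, -0.5), size=(1.0, 0.6))
```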
- FIG. 4 is a schematic view of controlling input by sliding on a touch panel in an information input method in a 3D immersive environment according to an embodiment of the present disclosure.
- the system where the 3D immersive environment lies comprises a handle on which a touch panel is disposed; or a touch panel is disposed on a headset device of the system where the 3D immersive environment lies.
- the “selecting a key position on the virtual keyboard and determining the selected key position” in step S 120 specifically comprises: displaying a virtual cursor in the immersive environment; sliding a finger up, down, to the left or to the right on a surface of the touch panel, controlling the virtual cursor to move into the area of the virtual keyboard, and changing the virtual cursor into a selection box for selecting a key position; and sliding a finger up, down, to the left or to the right on a surface of the touch panel to control the selection box to move to a certain key position of the virtual keyboard, and taking the key position where the selection box lies as the selected key position.
- In this embodiment, the finger slides on the surface of the touch panel to control the movement of the virtual cursor in the immersive environment. After moving into the area of the virtual keyboard, the virtual cursor changes from the cross-shaped cursor to a selection box, and the finger continues to slide on the surface of the touch panel to control the selection box to move.
- the “sending a confirmation instruction to the selected key position, and inputting information corresponding to the selected key position into an information input box” in step S 130 specifically comprises: binding the confirmation instruction to a certain key on the handle; when the key is pressed, sending the confirmation instruction to the selected key position, and inputting information corresponding to the currently-selected key position into the information input box; or, by clicking the touch panel once, sending the confirmation instruction to the selected key position, and inputting information corresponding to the currently-selected key position into the information input box. Similar to the preceding two preferred embodiments, in this preferred embodiment, after the key position is determined, it is feasible to send the confirmation instruction by pressing a key on the handle or to complete confirmation by clicking the touch panel.
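One plausible way to turn touch-panel slide distances into discrete selection-box steps is a threshold accumulator; the threshold value and carry behavior below are illustrative assumptions, not from the disclosure:

```python
# Sketch of the touch-panel embodiment: accumulated slide distance steps
# the selection box one key position at a time. The threshold value
# (panel units per key step) is an illustrative assumption.

SLIDE_THRESHOLD = 30.0

def apply_slide(col, dx, carry):
    """Accumulate horizontal slide distance dx; step the selected column
    each time a full threshold is crossed. Returns (new_col, new_carry)."""
    carry += dx
    while carry >= SLIDE_THRESHOLD:
        col, carry = col + 1, carry - SLIDE_THRESHOLD
    while carry <= -SLIDE_THRESHOLD:
        col, carry = col - 1, carry + SLIDE_THRESHOLD
    return col, carry

col, carry = 4, 0.0
col, carry = apply_slide(col, 35.0, carry)  # one step to the right
```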
- FIG. 5 is a schematic view of controlling input by turning head in an information input method in a 3D immersive environment according to an embodiment of the present disclosure.
- the “selecting a key position on the virtual keyboard and determining the selected key position” in step S 120 specifically comprises: locking the virtual keyboard in the immersive environment so that the virtual keyboard does not move along with the user's head motion; displaying the virtual cursor straight ahead in the user's sight line in the immersive environment so that the virtual cursor moves along with the user's head motion; and moving the virtual cursor to a certain key position of the virtual keyboard through head motion, and taking the key position where the virtual cursor lies as the selected key position.
- the key position is selected by controlling the virtual cursor. If the virtual keyboard, like the virtual cursor, moves as the user's head turns, it is impossible to select a desired key position by using the virtual cursor. Hence, it is necessary to lock the virtual keyboard in the immersive environment so that the virtual keyboard does not move along with the user's head motion.
- the “sending a confirmation instruction to the selected key position, and inputting information corresponding to the selected key position into an information input box” in step S 130 specifically comprises: when a time period for which the virtual cursor stays on the selected key position reaches a preset value, sending the confirmation instruction to the selected key position, and inputting information corresponding to the selected key position into the information input box; or, the system where the 3D immersive environment lies comprises a handle, binding the confirmation instruction to a certain key on the handle; when the key is pressed, inputting information corresponding to the current selected key position into the information input box.
- Alternatively, the confirmation instruction is sent by pressing a key on the handle. When the virtual cursor stays on a key position, a countdown dynamic effect occurs; if the virtual cursor does not leave the key position before the countdown ends, information such as the letter, digit, symbol or character corresponding to the key position is input. If the user does not want to input the information corresponding to the key position, he may cancel the countdown, and thereby cancel the input, simply by moving his head to move the virtual cursor away.
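The dwell countdown with cancel-on-move described above can be sketched as follows; the 2-second preset value and the class name are illustrative assumptions:

```python
# Sketch of gaze-dwell confirmation: when the head-driven cursor stays
# on one key position for a preset time, confirmation fires; moving the
# cursor away restarts (cancels) the countdown.

DWELL_SECONDS = 2.0  # preset dwell time (assumed value)

class DwellConfirm:
    def __init__(self):
        self.key = None       # key position the cursor currently lies on
        self.elapsed = 0.0    # countdown progress in seconds

    def update(self, key_under_cursor, dt):
        """Advance the countdown by dt; return the confirmed key or None."""
        if key_under_cursor != self.key:
            # cursor moved to a different key (or away): restart countdown
            self.key = key_under_cursor
            self.elapsed = 0.0
            return None
        self.elapsed += dt
        if self.key is not None and self.elapsed >= DWELL_SECONDS:
            self.elapsed = 0.0   # reset after confirming the input
            return self.key
        return None

d = DwellConfirm()
d.update("m", 0.0)  # cursor arrives on key m; countdown starts
```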
- The information input method in a 3D immersive environment solves the problem that a user wearing a virtual reality headset device cannot see the actual key positions of a real keyboard and thus cannot input information, and enables the user to conveniently and quickly input by typing in the virtual environment.
- Solutions of the preferred embodiments of the present disclosure may be used in combination, and it is also feasible to implement them simultaneously in the same virtual reality system so that the user selects a suitable input method according to his own needs. For example, a user without a handle may input information via the touch panel or head motion; a user who tends to feel dizzy when moving his head may input information via the handle; and a user who pursues a stronger immersive feeling may resort to head motion or handle movement. In this way, different users' demands are satisfied.
- the information input method in a 3D immersive environment according to the present disclosure has the following advantageous effects:
- The information input method in a 3D immersive environment solves the problem that a user wearing a virtual reality headset device cannot see the actual key positions of a real keyboard and thus cannot input information, and enables the user to conveniently and quickly input by typing in the virtual environment.
- The information input method in a 3D immersive environment provides many manners of controlling input, e.g., operating a key on the handle, moving the handle, sliding on the touch panel, or moving the head. These manners may be implemented simultaneously in the same virtual reality system so that the user selects a suitable input manner according to his own needs and different users' demands are satisfied.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (9)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201610237717.7A CN105955453A (en) | 2016-04-15 | 2016-04-15 | Information input method in 3D immersion environment |
| CN201610237717 | 2016-04-15 | ||
| CN201610237717.7 | 2016-04-15 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20170301138A1 US20170301138A1 (en) | 2017-10-19 |
| US10417826B2 true US10417826B2 (en) | 2019-09-17 |
Family
ID=56918123
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/292,988 Active US10417826B2 (en) | 2016-04-15 | 2016-10-13 | Information input method in 3D immersive environment |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US10417826B2 (en) |
| CN (1) | CN105955453A (en) |
Families Citing this family (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107992213B (en) * | 2016-10-27 | 2021-07-16 | 腾讯科技(深圳)有限公司 | Identification generation method and identity verification method based on virtual reality |
| EP3502939B1 (en) | 2016-08-19 | 2023-06-14 | Tencent Technology (Shenzhen) Company Limited | Authentication method based on virtual reality scene, virtual reality device, and storage medium |
| CN106527916A (en) * | 2016-09-22 | 2017-03-22 | 乐视控股(北京)有限公司 | Operating method and device based on virtual reality equipment, and operating equipment |
| CN106980362A (en) * | 2016-10-09 | 2017-07-25 | 阿里巴巴集团控股有限公司 | Input method and device based on virtual reality scenario |
| CN106774823A (en) * | 2016-11-11 | 2017-05-31 | 奇酷互联网络科技(深圳)有限公司 | Virtual reality device and its input method |
| CN108121439A (en) * | 2016-11-30 | 2018-06-05 | 成都理想境界科技有限公司 | Dummy keyboard input method and device based on head-mounted display apparatus |
| CN108121438B (en) * | 2016-11-30 | 2021-06-01 | 成都理想境界科技有限公司 | Virtual keyboard input method and device based on head-mounted display equipment |
| CN108121492A (en) * | 2016-11-30 | 2018-06-05 | 成都理想境界科技有限公司 | Virtual keyboard display method and device based on head-mounted display apparatus |
| CN108064372A (en) * | 2016-12-24 | 2018-05-22 | 深圳市柔宇科技有限公司 | Head-mounted display apparatus and its content input method |
| CN106873828A (en) * | 2017-01-21 | 2017-06-20 | 司承电子科技(上海)有限公司 | A kind of implementation method of the 3D press key input devices for being applied to virtual reality products |
| CN108427194A (en) * | 2017-02-14 | 2018-08-21 | 深圳梦境视觉智能科技有限公司 | A kind of display methods and equipment based on augmented reality |
| CN108427195A (en) * | 2017-02-14 | 2018-08-21 | 深圳梦境视觉智能科技有限公司 | A kind of information processing method and equipment based on augmented reality |
| CN106933364B (en) * | 2017-03-15 | 2019-09-27 | 京东方科技集团股份有限公司 | Character input method, character input device, and wearable device |
| CN107302763B (en) * | 2017-05-26 | 2019-11-08 | 北京小鸟看看科技有限公司 | Realize method, client terminal device and the system for wearing display equipment and accessory binding |
| CN108932100A (en) * | 2017-05-26 | 2018-12-04 | 成都理想境界科技有限公司 | A kind of operating method and head-mounted display apparatus of dummy keyboard |
| CN108700957B (en) * | 2017-06-30 | 2021-11-05 | 广东虚拟现实科技有限公司 | Electronic system and method for text entry in virtual environments |
| CN108170506B (en) * | 2017-11-27 | 2021-09-17 | 北京硬壳科技有限公司 | Method and device for controlling app and control system |
| WO2019203837A1 (en) * | 2018-04-19 | 2019-10-24 | Hewlett-Packard Development Company, L.P. | Inputs to virtual reality devices from touch surface devices |
| CN109613979B (en) * | 2018-11-29 | 2022-02-08 | 武汉中地地科传媒文化有限责任公司 | Character input method and device, AR equipment and computer storage medium |
| CN112218134B (en) * | 2020-09-08 | 2023-06-02 | 华为技术加拿大有限公司 | Input method and related equipment |
| CN116400839B (en) * | 2023-06-01 | 2023-08-22 | 北京虹宇科技有限公司 | Input method, device and equipment in three-dimensional space |
Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6098086A (en) * | 1997-08-11 | 2000-08-01 | Webtv Networks, Inc. | Japanese text input method using a limited roman character set |
| US20110201387A1 (en) * | 2010-02-12 | 2011-08-18 | Microsoft Corporation | Real-time typing assistance |
| US20110221656A1 (en) * | 2010-02-28 | 2011-09-15 | Osterhout Group, Inc. | Displayed content vision correction with electrically adjustable lens |
| US20140218294A1 (en) * | 2013-02-05 | 2014-08-07 | Shenzhen Skyworth-RGB electronics Co. Ltd. | Method for remote control to input characters to display device |
| CN104076512A (en) | 2013-03-25 | 2014-10-01 | Seiko Epson Corporation | Head-mounted display device and method of controlling head-mounted display device |
| US20150293644A1 (en) * | 2014-04-10 | 2015-10-15 | Canon Kabushiki Kaisha | Information processing terminal, information processing method, and computer program |
| US20160124926A1 (en) * | 2014-10-28 | 2016-05-05 | Idelan, Inc. | Advanced methods and systems for text input error correction |
| US20170076502A1 (en) * | 2015-09-16 | 2017-03-16 | Google Inc. | Touchscreen hover detection in an augmented and/or virtual reality environment |
| US20170212596A1 (en) * | 2016-01-22 | 2017-07-27 | Sharp Laboratories Of America, Inc. | Systems and methods for determining input movement |
2016
- 2016-04-15 CN CN201610237717.7A patent/CN105955453A/en active Pending
- 2016-10-13 US US15/292,988 patent/US10417826B2/en active Active
Also Published As
| Publication number | Publication date |
|---|---|
| CN105955453A (en) | 2016-09-21 |
| US20170301138A1 (en) | 2017-10-19 |
Similar Documents
| Publication | Title |
|---|---|
| US10417826B2 (en) | Information input method in 3D immersive environment |
| US20130307796A1 (en) | Touchscreen Device Integrated Computing System And Method |
| US10216286B2 (en) | On-screen diagonal keyboard |
| US10032346B2 (en) | Haptic device incorporating stretch characteristics |
| Hinckley et al. | Input/Output Devices and Interaction Techniques |
| US9813768B2 (en) | Configured input display for communicating to computational apparatus |
| US20160098094A1 (en) | User interface enabled by 3D reversals |
| CN107209582A (en) | Method and apparatus for a highly intuitive human-machine interface |
| US10238960B2 (en) | Dual input multilayer keyboard |
| EP3262505A1 (en) | Interactive system control apparatus and method |
| Brancati et al. | Touchless target selection techniques for wearable augmented reality systems |
| DE102015016443A1 (en) | Touch input device and vehicle including the same |
| DE102015222420A1 (en) | Touch device and vehicle including the same |
| DE102016206719A1 (en) | Touch input device and vehicle with a touch input device |
| JP2015135572A (en) | Information processing apparatus and control method thereof |
| CN106873763B (en) | Virtual reality device and information input method thereof |
| Kim et al. | A tangible user interface with multimodal feedback |
| Hofmann et al. | Design and Implementation of a Virtual Workstation for a Remote AFISO |
| Bernhaupt et al. | Absolute indirect touch interaction: impact of haptic marks and animated visual feedback on usability and user experience |
| TW200807284A (en) | Programmable touch system |
| KR102238697B1 (en) | Tabletop interface apparatus, multi-touch object, and method thereof |
| GB2535730A (en) | Interactive system control apparatus and method |
| US20160253038A1 (en) | Method and System for Precise Object Control on Touch Screen Device |
| Pietroszek | 3D Pointing with Everyday Devices: Speed, Occlusion, Fatigue |
| Darbar | Extending Interaction Space in Augmented Reality: Contributions in Optical-See-Through and Projection-Based Augmented Environments |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: BEIJING PICO TECHNOLOGY CO., LTD., CHINA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HAN, JIN;REEL/FRAME:040029/0720. Effective date: 20161010 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY. Year of fee payment: 4 |
| | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |