CN109791429B - Head-mounted display device and input control method thereof - Google Patents

Head-mounted display device and input control method thereof

Info

Publication number
CN109791429B
CN109791429B CN201780058749.3A
Authority
CN
China
Prior art keywords
distance
optical lens
pointer
sensitivity
head
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201780058749.3A
Other languages
Chinese (zh)
Other versions
CN109791429A (en)
Inventor
叶泽钢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Royole Technologies Co Ltd
Original Assignee
Shenzhen Royole Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Royole Technologies Co Ltd
Publication of CN109791429A
Application granted granted Critical
Publication of CN109791429B
Legal status: Active (current)
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

Abstract

A head-mounted display device (100) includes a display device (10), a processor (20), and a sensor unit (30). The display device (10) includes a display screen (11) and an optical lens (12). The display screen (11) outputs a display image that includes a pointer. The optical lens (12) is located in the output path of the display image. The sensor unit (30) detects the distance between the optical lens (12) and the display screen (11) to obtain a detected distance. The processor (20) adjusts the sensitivity of pointer movement at least according to the detected distance detected by the sensor unit (30). The device and the method can automatically adjust the pointer movement sensitivity according to how the wearer adjusts the device, so that they suit different users.

Description

Head-mounted display device and input control method thereof
Technical Field
The present disclosure relates to display devices, and particularly to a head-mounted display device and an input control method of the head-mounted display device.
Background
Head-mounted display devices have gradually gained favor because they are convenient and can provide stereoscopic display and stereo sound. In recent years, with the rise of Virtual Reality (VR) technology, head-mounted display devices have become even more widely used as the hardware platform for VR. With current head-mounted display devices, however, once a user puts the device on, input control such as moving the pointer generally has to be performed with an external touch panel, smart terminal, or similar input device. In existing products the sensitivity of pointer movement is fixed and cannot meet the needs of different users.
Disclosure of Invention
The embodiments of the invention disclose a head-mounted display device and an input control method thereof, which can automatically adjust the sensitivity of the displayed pointer according to the way the user wears and adjusts the head-mounted display device.
The head-mounted display device disclosed in the embodiments of the invention includes a display device and a processor. The display device includes a display screen and an optical lens. The display screen outputs a display image that includes a pointer. The optical lens is located in the output path of the display image. The head-mounted display device further includes a sensor unit that detects the distance between the optical lens and the display screen to obtain a detected distance. The processor adjusts the sensitivity of pointer movement at least according to the detected distance detected by the sensor unit.
The input control method disclosed in the embodiments of the invention is used to automatically adjust the pointer sensitivity of a head-mounted display device and includes the following steps: detecting the distance between an optical lens of the head-mounted display device and a display screen through a sensor unit of the head-mounted display device to obtain a detected distance; and adjusting the sensitivity of pointer movement at least according to the detected distance detected by the sensor unit.
With the head-mounted display device and its input control method, after the optical lens has been moved to a position that matches the diopter of the user wearing the device, the pointer movement sensitivity is automatically adjusted according to the distance between the optical lens and the display screen, so that the device suits that user.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed for the embodiments are briefly described below. The drawings described below show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a block diagram of a head-mounted display device according to an embodiment of the invention.
Fig. 2 is a schematic side view of a partial structure of a display device according to an embodiment of the invention.
Fig. 3 is an enlarged schematic view of a screen of a display device according to an embodiment of the invention.
Fig. 4 is a schematic overall structure diagram of a head-mounted display device according to an embodiment of the invention.
Fig. 5 is a flowchart of an input control method according to an embodiment of the invention.
Fig. 6 is a sub-flowchart of step S503 in fig. 5 in one embodiment.
Fig. 7 is a sub-flowchart of step S503 in fig. 5 in another embodiment.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art from the disclosed embodiments without creative effort fall within the protection scope of the present invention.
Referring to fig. 1 and fig. 2 together, fig. 1 is a block diagram of a head-mounted display apparatus 100 according to an embodiment of the invention. As shown in fig. 1, the head-mounted display apparatus 100 includes a display device 10, a processor 20, and a sensor unit 30.
The display device 10 includes a display screen 11 and an optical lens 12. The display screen 11 outputs a display image that includes a pointer. The optical lens 12 is located in the output path of the display image. That is, when the user wears the head-mounted display apparatus 100, the optical lens 12 is located on the side of the display screen 11 close to the user's eyes, so that the display image output by the display screen 11 passes through the optical lens 12 before reaching the user's eyes.
As shown in fig. 2, the sensor unit 30 detects the distance between the optical lens 12 and the display screen 11 to obtain a detected distance D1. The sensor unit 30 is disposed on the display screen 11 and the optical lens 12.
The processor 20 is configured to adjust the sensitivity of the pointer movement at least according to the detection distance D1 detected by the sensor unit 30.
In some embodiments, the processor 20 adjusts a pointer sensitivity factor at least according to the detected distance detected by the sensor unit 30, thereby adjusting the sensitivity of pointer movement. The distance the pointer moves on the display screen 11 equals the physical movement distance of the input device 200 (shown in fig. 4) multiplied by the pointer sensitivity factor; when the pointer sensitivity factor changes, the movement sensitivity of the pointer changes with it, so the pointer movement sensitivity can be adjusted by adjusting the pointer sensitivity factor.
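As an illustration of this relationship only, the following minimal Python sketch maps a physical movement of the input device to a pointer movement on the display image; the function and variable names are ours, not from the patent.

    def pointer_displacement(physical_dx_mm, physical_dy_mm, sensitivity_factor):
        """Scale a physical movement of the input device (mm) by the
        pointer sensitivity factor to get the pointer movement on screen."""
        return physical_dx_mm * sensitivity_factor, physical_dy_mm * sensitivity_factor

    # The same 10 mm swipe moves the pointer half as far when the factor is halved.
    print(pointer_displacement(10.0, 0.0, 1.0))  # (10.0, 0.0)
    print(pointer_displacement(10.0, 0.0, 0.5))  # (5.0, 0.0)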
The optical lens 12 is movable in response to a user operation. For example, when the user adjusts the diopter of the head-mounted display apparatus 100, the optical lens 12 moves to different positions, and different positions of the optical lens 12 produce different corrective power. Therefore, when a user, such as a nearsighted user, uses the head-mounted display apparatus 100 and adjusts the position of the optical lens 12, the sensitivity of pointer movement can be automatically adjusted according to the detected distance between the optical lens 12 and the display screen 11, so that the device suits different users.
As shown in fig. 2, in an embodiment, the processor 20 obtains an actual distance W1 between the display screen 11 and an exit pupil position C1 of the display device 10, calculates the difference between the actual distance W1 and the detected distance D1 to obtain an exit pupil distance Dc, calculates the ratio of the exit pupil distance Dc to a first preset distance J1 to obtain a target pointer sensitivity factor, and adjusts the current pointer sensitivity factor to the target pointer sensitivity factor. That is, the target pointer sensitivity factor equals the ratio of the exit pupil distance Dc to the first preset distance J1. The exit pupil position C1 is the position of the user's eyes when the head-mounted display apparatus 100 is worn. The first preset distance J1 is the distance between the optical lens 12 and the exit pupil position C1 when the optical lens 12 is at its initial position. In some embodiments, an eyepiece is disposed at the exit pupil position C1.
That is, in the illustrated embodiment, the processor 20 calculates the target pointer sensitivity factor according to the formula Dc/J1, where, as described above, Dc = W1 − D1 and J1 is the first preset distance. When the optical lens 12 is at the initial position, the optical lens 12 performs no power correction, and the display image seen through the optical lens 12 is a normal display image without vision correction.
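A minimal Python sketch of this first embodiment, assuming all distances are expressed in the same unit; the function and variable names are ours, not from the patent.

    def target_factor_exit_pupil(actual_distance_w1, detected_distance_d1, first_preset_distance_j1):
        """First embodiment: Dc = W1 - D1, target factor = Dc / J1."""
        exit_pupil_distance_dc = actual_distance_w1 - detected_distance_d1
        return exit_pupil_distance_dc / first_preset_distance_j1

    # At the initial lens position Dc equals J1, so the factor is 1.0;
    # moving the lens toward the display screen increases Dc and the factor.
    print(target_factor_exit_pupil(60.0, 20.0, 40.0))  # 1.0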
In another embodiment, the processor 20 determines the correction degree Y1 of the optical lens 12 for the human eye at its current position according to the detected distance D1, calculates the zoom factor of the display image seen through the optical lens 12 according to the detected distance D1 and the correction degree Y1, takes the reciprocal of the zoom factor as the target pointer sensitivity factor, and adjusts the current pointer sensitivity factor to the target pointer sensitivity factor. The detected distance D1 and the correction degree Y1 have a preset correspondence, from which the correction degree Y1 corresponding to the detected distance D1 can be obtained. The preset correspondence can be obtained in advance through repeated tests or through a specific relational expression.
Specifically, the processor 20 calculates the zoom factor F1 according to the formula F1 = P1/P2 = (D1 × Y1 × ppi × S1) / (d × Y2 × ppi × S1), where D1 is the detected distance, Y1 is the correction degree, and d is the distance between the optical lens and the display screen 11 when the optical lens is at the initial position. ppi is the pixel density of the display image, and S1 is the physical movement distance of the input device 200 (shown in fig. 4) that controls the pointer movement. Y2 is the normal correction degree, typically 1.
Therefore, the zoom factor F1 = P1/P2 = (D1 × Y1 × ppi × S1) / (d × Y2 × ppi × S1), which simplifies to F1 = (D1 × Y1)/d.
Here P1 = D1 × Y1 × ppi × S1 is the distance the pointer moves on the display screen when the input device 200 moves a distance S1 and the distance between the optical lens 12 and the display screen 11 is D1, and P2 = d × Y2 × ppi × S1 is the distance the pointer moves on the display screen when the input device 200 moves the same distance S1 and the distance between the optical lens 12 and the display screen 11 is the initial distance d.
P1/P2 is the ratio of the pointer's movement distance in the zoomed display image to its movement distance in the normal (initial) display image when the input device 200 moves the same distance, which is equivalent to the zoom multiple of the display image. The larger the magnification, the lower the sensitivity factor needs to be, to avoid the movement distance being magnified so much that the sensitivity becomes too high.
The zoom factor is therefore inversely proportional to the sensitivity. The processor 20 takes the reciprocal of the zoom factor as the target pointer sensitivity factor, so the target pointer sensitivity factor is d/(D1 × Y1).
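A minimal Python sketch of this second embodiment; the function names and the illustrative lookup table standing in for the preset distance-to-correction-degree correspondence are ours, not from the patent.

    def target_factor_zoom(detected_distance_d1, correction_degree_y1, initial_distance_d):
        """Second embodiment: zoom factor F1 = (D1 * Y1) / d,
        target pointer sensitivity factor = 1 / F1 = d / (D1 * Y1)."""
        zoom_factor_f1 = (detected_distance_d1 * correction_degree_y1) / initial_distance_d
        return 1.0 / zoom_factor_f1

    # Hypothetical preset correspondence between detected distance and correction
    # degree, obtained in advance by testing, as the description suggests.
    CORRECTION_TABLE = {40.0: 1.0, 45.0: 1.1, 50.0: 1.25}

    d1 = 50.0
    print(target_factor_zoom(d1, CORRECTION_TABLE[d1], initial_distance_d=40.0))  # 0.64
    # The image appears about 1.56x larger, so the factor drops to 0.64 to keep
    # the perceived pointer speed unchanged.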
As shown in fig. 1, the sensor unit 30 includes a signal generating unit 31 and a signal receiving unit 32. One of them is disposed on the optical lens 12 and the other on the display screen 11. The signal generating unit 31 generates a specific signal, and the signal receiving unit 32 receives the specific signal and determines the distance between the optical lens 12 and the display screen 11 from the received signal.
For example, in some embodiments, the signal generating unit 31 is an infrared transmitter that transmits an infrared signal, and the signal receiving unit 32 is an infrared receiver that receives the infrared signal and determines the distance between the optical lens 12 and the display screen 11 from the time interval between the transmission of the infrared signal by the infrared transmitter and its reception, together with the propagation speed of the infrared signal.
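For illustration only, a sketch of that distance computation, assuming a one-way path from the transmitter on one element to the receiver on the other; the patent states only that the distance follows from the time interval and the signal speed, and the names below are ours.

    # Infrared propagates at roughly the speed of light (here in mm/s).
    IR_SPEED_MM_PER_S = 2.998e11

    def distance_from_time_of_flight(transmit_time_s, receive_time_s):
        """One-way time of flight: distance = signal speed * time interval."""
        return IR_SPEED_MM_PER_S * (receive_time_s - transmit_time_s)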
As another example, in some embodiments the signal generating unit 31 is a magnet and the signal receiving unit 32 is a magnetic sensor that detects the magnetic force generated by the magnet and determines the distance between the optical lens 12 and the display screen 11 from the strength of the detected magnetic force.
As shown in fig. 1, the display device 10 further includes a driving unit 13. The driving unit 13 is connected to the optical lens 12 and drives the optical lens 12 to move along the output path of the display image in response to a user operation, so as to increase or decrease the distance between the optical lens 12 and the display screen 11.
The optical lens 12 is mounted on an axis (not shown) parallel to the user's line of sight and/or to the output path of the display image. Driven by the driving unit 13, it moves along this axis, closer to or farther from the user's eye/exit pupil position C1 (that is, farther from or closer to the display screen 11), for focusing, and thus acts as a corrective lens with a corresponding correction power.
In the present embodiment, the optical lens 12 is a magnifying lens, such as a convex lens. When the optical lens 12 moves under the drive of the driving unit 13, the magnification at which the display image on the display screen 11 enters the human eye changes, which changes the correction power.
Referring to fig. 3, when the optical lens 12 is driven to a position that enlarges the display image, the image output by the display screen 11 and entering the human eye after passing through the optical lens 12 is an enlarged virtual image M1, so the human eye sees an enlarged display.
Please refer to fig. 4, which is a schematic diagram of the overall structure of the head-mounted display apparatus 100. The head-mounted display apparatus 100 is connected to an input device 200. The input device 200 is used for input control such as moving the pointer. The input device 200 may be a touch pad, a mouse, or the touch screen of a mobile phone, tablet computer, or the like.
The processor 20 adjusts the pointer sensitivity factor according to the detected distance detected by the sensor unit, thereby adjusting the sensitivity of pointer movement, as follows: when the input device 200 controls the pointer to move, the physical movement distance of the input device 200 is multiplied by the pointer sensitivity factor to obtain the movement distance of the pointer on the display image, and the pointer is moved that distance on the display image, so the sensitivity of pointer movement changes accordingly. The sensitivity is thus adjusted by adjusting the sensitivity factor.
When the input device 200 is a mouse, the physical movement distance is the movement distance of the input device 200 itself; when the input device 200 is a touch pad or a touch screen, the physical movement distance may be the movement distance of a finger, stylus, or the like touching the input device 200.
As shown in fig. 4, the head-mounted display device 100 further includes an adjustment button 40, located on the outside of the head-mounted display device 100 where the user can operate it. The driving unit 13 drives the optical lens 12 to move in response to the user's operation of the adjustment button 40. For example, the adjustment button 40 may be a knob directly connected to the driving unit 13, so that rotating the adjustment button 40 moves the driving unit 13, and the driving unit 13 in turn drives the optical lens 12 to move.
In some embodiments, the adjustment button 40 generates an adjustment signal in response to a user operation; the processor 20 is further connected to the adjustment button 40 and the driving unit 13 and, after receiving the adjustment signal, controls the driving unit 13 to drive the optical lens 12 to move accordingly.
Therefore, in the invention, when the user wears the head-mounted display device 100 and adjusts the position of the optical lens 12 according to the power of his or her eyes, thereby zooming the display image, the pointer sensitivity is adjusted correspondingly. No matter how much the display image is zoomed for a given user's power, the sensitivity stays matched to normal, meeting the needs of different user groups.
As shown in fig. 4, the head-mounted display device 100 further includes an earphone device 60. The earphone device 60 includes an earphone band 61 and two earphones 62, and the earphone band 61 electrically connects the two earphones 62. The adjustment button 40 may be disposed on one of the earphones 62.
The processor 20 may be a microcontroller, a microprocessor, a single chip, a digital signal processor, or the like.
The head-mounted display device 100 may be a head-mounted device such as an intelligent helmet or intelligent glasses.
Fig. 5 is a flowchart of an input control method according to an embodiment of the invention. The input control method is applied to the head-mounted display device 100 described above, and its steps are not limited to the following execution order. The input control method includes the following steps:
The distance between the optical lens 12 and the display screen 11 is detected by the sensor unit 30 to obtain a detected distance D1 (S501).
The sensitivity of pointer movement is adjusted at least according to the detected distance D1 detected by the sensor unit 30 (S503). Step S503 specifically includes: adjusting a pointer sensitivity factor at least according to the detected distance detected by the sensor unit 30 to adjust the sensitivity of pointer movement. The movement distance of the pointer equals the physical movement distance of the input device 200 multiplied by the pointer sensitivity factor; when the pointer sensitivity factor changes, the movement sensitivity of the pointer changes with it, so the pointer movement sensitivity can be adjusted by adjusting the pointer sensitivity factor.
In some embodiments, the input control method further includes the following step: controlling the optical lens 12 to move along the output path of the display image displayed on the display screen 11 in response to a user operation. This step can be executed before step S501; when the position of the optical lens 12 moves, the detected distance D1 detected by the sensor unit 30 changes accordingly.
Please refer to fig. 6, which is a sub-flowchart of step S503 in an embodiment. In one embodiment, the step S503 includes:
An actual distance W1 between the display screen 11 and the exit pupil position C1 of the display device 10 is acquired (S5031).
The difference between the actual distance W1 and the detected distance D1 is calculated to obtain the exit pupil distance Dc (S5032).
The ratio of the exit pupil distance Dc to a first preset distance J1 is calculated to obtain a target pointer sensitivity factor (S5033). The first preset distance J1 is the distance between the optical lens 12 and the exit pupil position C1 when the optical lens 12 is at its initial position.
The pointer sensitivity factor is adjusted to the target pointer sensitivity factor to effect adjustment of the pointer movement sensitivity (S5034).
Please refer to fig. 7, which is a sub-flowchart of step S503 in another embodiment. In another embodiment, the step S503 includes:
The correction degree Y1 of the optical lens 12 for the human eye at its current position is determined according to the detected distance D1 (S5051). The detected distance D1 and the correction degree Y1 have a preset correspondence, from which the correction degree Y1 corresponding to the detected distance D1 can be obtained.
The zoom factor of the display image seen through the optical lens 12 is calculated according to the detected distance D1 and the correction degree Y1, and the reciprocal of the zoom factor is taken as the target pointer sensitivity factor (S5052). Specifically, the zoom factor F1 is calculated according to the formula F1 = (D1 × Y1)/d, where D1 is the detected distance, Y1 is the correction degree, and d is the distance between the optical lens and the display screen 11 when the optical lens is at the initial position.
The pointer sensitivity factor is adjusted to the target pointer sensitivity factor to effect adjustment of the sensitivity of pointer movement (S5053).
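Putting the two sub-flows together, the overall S501/S503 pipeline might be sketched in Python as follows; the `method` switch and all names are ours, and either branch reproduces the corresponding formula above.

    def adjust_pointer_sensitivity(detected_distance_d1, method="exit_pupil",
                                   actual_distance_w1=0.0, first_preset_distance_j1=1.0,
                                   correction_degree_y1=1.0, initial_distance_d=1.0):
        """S501: detected_distance_d1 comes from the sensor unit.
        S503: return the target pointer sensitivity factor using one of the
        two embodiments described above (fig. 6 or fig. 7)."""
        if method == "exit_pupil":   # fig. 6: factor = (W1 - D1) / J1
            return (actual_distance_w1 - detected_distance_d1) / first_preset_distance_j1
        if method == "zoom":         # fig. 7: factor = d / (D1 * Y1)
            return initial_distance_d / (detected_distance_d1 * correction_degree_y1)
        raise ValueError("unknown method: " + method)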
As shown in fig. 1, the head-mounted display device 100 further includes a memory 50 that stores a number of program instructions. After the processor 20 calls and executes these program instructions, it performs any of the methods shown in fig. 5 to 7 to automatically adjust the pointer sensitivity.
In some embodiments, the present invention also provides a computer-readable storage medium storing a number of program instructions that are called and executed by the processor 20 to perform any of the method steps of fig. 5 to 7. In some embodiments, the computer storage medium is the memory 50, and may be any storage device capable of storing information, such as a memory card, solid-state memory, micro hard disk, optical disc, and the like.
Therefore, with the head-mounted display device 100 and the input control method of the present invention, when the user adjusts the position of the optical lens 12 according to his or her diopter, the pointer sensitivity is adjusted as well, so that the device automatically adapts to people with different diopters.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention.

Claims (20)

1. A head-mounted display device comprising a display device and a processor, the display device comprising:
a display screen for outputting a display image including a pointer;
an optical lens located in an output path of the display image;
the head mounted display device further includes:
the sensor unit is arranged on the display screen and the optical lens and used for detecting the distance between the optical lens and the display screen to obtain a detection distance;
the processor is used for adjusting the sensitivity of the pointer movement at least according to the detection distance detected by the sensor unit.
2. The head-mounted display device of claim 1, wherein the processor adjusts a pointer sensitivity factor based at least on the detected distance detected by the sensor unit to adjust the sensitivity of the pointer movement.
3. The head-mounted display device of claim 2, wherein the processor obtains an actual distance between the display screen and an exit pupil location of the display device, calculates a difference between the actual distance and the detected distance to obtain an exit pupil distance, then calculates a ratio of the exit pupil distance to a first predetermined distance to obtain a target pointer sensitivity factor, and adjusts a current pointer sensitivity factor to the target pointer sensitivity factor to adjust the sensitivity of the pointer movement.
4. The head-mounted display device as claimed in claim 2, wherein the processor determines a correction degree of the optical lens to the human eye at the current position according to the detected distance, calculates a zoom factor of a display viewed through the optical lens according to the detected distance and the correction degree, determines a reciprocal of the zoom factor as a target pointer sensitivity factor, and adjusts the current pointer sensitivity factor to the target pointer sensitivity factor to adjust the sensitivity of the pointer movement.
5. The head-mounted display device as claimed in claim 4, wherein the processor calculates the zoom factor according to the formula (D1 × Y1)/D, wherein D1 is the detected distance, Y1 is the correction degree, and D is the distance between the optical lens and the display screen when the optical lens is at the initial position.
6. The head-mounted display apparatus of claim 1, wherein the display device further comprises a driving unit for driving the optical lens to move along the output path of the display image in response to a user operation to increase or decrease the distance between the optical lens and the display screen.
7. The head-mounted display device according to claim 6, wherein the optical lens is a magnifying lens, and the magnification at which the display image on the display screen enters the human eye changes when the optical lens moves in response to the driving of the driving unit, thereby changing the correction power.
8. The head-mounted display apparatus according to claim 1, wherein the sensor unit includes a signal generating unit and a signal receiving unit, one of the signal generating unit and the signal receiving unit is disposed on the optical lens, the other of the signal generating unit and the signal receiving unit is disposed on the display screen, the signal generating unit is configured to generate a specific signal, and the signal receiving unit is configured to receive the specific signal and determine a distance between the optical lens and the display screen according to the received specific signal.
9. The head-mounted display device of claim 8, wherein the signal generating unit is an infrared transmitter for transmitting an infrared signal, and the signal receiving unit is an infrared receiver for receiving the infrared signal and determining the distance between the optical lens and the display screen according to a time interval between receiving the infrared signal and the infrared transmitter transmitting the infrared signal and a speed of the infrared signal.
10. The head-mounted display device of claim 8, wherein the signal generating unit is a magnet, and the signal receiving unit is a magnetic force receiving sensor for determining a distance between the optical lens and the display screen according to the intensity of the received magnetic force.
11. The head-mounted display device of claim 7, wherein the head-mounted display device further comprises an adjustment button, and the driving unit drives the optical lens to move in response to the user operating the adjustment button.
12. The head-mounted display device of claim 11, wherein the adjustment button is a knob, and the adjustment button is directly connected to the driving unit and moves the driving unit in response to a rotational operation of a user, so that the driving unit drives the optical lens to move.
13. The head-mounted display device of claim 11, wherein the processor is connected to the adjustment button and the driving unit, the adjustment button generates an adjustment signal in response to a user operation, and the processor controls the driving unit to drive the optical lens to move correspondingly after receiving the adjustment signal.
14. An input control method for automatically adjusting the sensitivity of a pointer of a head-mounted display device, the input control method comprising:
detecting the distance between an optical lens of the head-mounted display device and a display screen through a sensor unit of the head-mounted display device to obtain a detection distance; and
adjusting the sensitivity of the pointer movement at least according to the detection distance detected by the sensor unit.
15. The input control method of claim 14, wherein the step of adjusting the sensitivity of the pointer movement based at least on the detected distance detected by the sensor unit comprises:
adjusting a pointer sensitivity factor at least according to the detection distance detected by the sensor unit so as to adjust the sensitivity of the pointer movement.
16. The input control method of claim 15, wherein the step of adjusting a pointer sensitivity factor based at least on the detected distance detected by the sensor unit to achieve the adjustment of the sensitivity of the pointer movement comprises:
acquiring an actual distance between the display screen and the exit pupil position;
calculating the difference between the actual distance and the detection distance to obtain an exit pupil distance;
calculating the ratio of the exit pupil distance to a first preset distance to obtain a target pointer sensitivity factor;
adjusting the pointer sensitivity factor to the target pointer sensitivity factor to effect adjustment of the sensitivity of pointer movement.
17. The input control method of claim 16, wherein the first predetermined distance is a distance between the optical lens and the exit pupil position when the optical lens is at an initial position.
18. The input control method of claim 15, wherein the step of adjusting a pointer sensitivity factor based at least on the detected distance detected by the sensor unit to achieve the adjustment of the sensitivity of the pointer movement comprises:
determining the correction degree of the optical lens to human eyes at the current position according to the detection distance;
calculating the zoom factor of the display image seen through the optical lens according to the detection distance and the correction degree, and determining the reciprocal of the zoom factor as the target pointer sensitivity factor;
and adjusting the pointer sensitivity factor to the target pointer sensitivity factor to realize the adjustment of the pointer movement sensitivity.
19. The input control method of claim 18, wherein the step of calculating a zoom factor of a display viewed through the optical lens according to the detected distance and the correction power, and determining the reciprocal of the zoom factor as the target pointer sensitivity factor comprises:
calculating a zoom factor F1 according to the formula F1 = (D1 × Y1)/D, and determining the reciprocal of the zoom factor F1 as the target pointer sensitivity factor, wherein D1 is the detection distance, Y1 is the correction degree, and D is the distance between the optical lens and the display screen (11) when the optical lens is at the initial position.
20. The input control method of claim 14, wherein the method further comprises the steps of:
controlling the optical lens to move along the output path of the display image displayed by the display screen in response to a user operation.
CN201780058749.3A 2017-07-27 2017-07-27 Head-mounted display device and input control method thereof Active CN109791429B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/094673 WO2019019094A1 (en) 2017-07-27 2017-07-27 Head-mounted display device and input control method therefor

Publications (2)

Publication Number Publication Date
CN109791429A CN109791429A (en) 2019-05-21
CN109791429B true CN109791429B (en) 2021-12-03

Family

ID=65039441

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780058749.3A Active CN109791429B (en) 2017-07-27 2017-07-27 Head-mounted display device and input control method thereof

Country Status (2)

Country Link
CN (1) CN109791429B (en)
WO (1) WO2019019094A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1942850A (en) * 2004-01-30 2007-04-04 皇家飞利浦电子股份有限公司 3-D cursor control system
CN1996092A (en) * 2005-12-28 2007-07-11 大学光学科技股份有限公司 Focus-adjustable headset type display system with virtual keyboard and device therefor
CN102005200A (en) * 2009-08-28 2011-04-06 英华达(上海)电子有限公司 System, device and method for regulating display direction
CN102346544A (en) * 2010-07-30 2012-02-08 鸿富锦精密工业(深圳)有限公司 Head-worn display system with interactive function and display method thereof
WO2015100205A1 (en) * 2013-12-26 2015-07-02 Interphase Corporation Remote sensitivity adjustment in an interactive display system
CN105094377B (en) * 2015-07-21 2018-10-16 三星电子(中国)研发中心 Cursor display method, equipment and the system of intelligent controller

Also Published As

Publication number Publication date
CN109791429A (en) 2019-05-21
WO2019019094A1 (en) 2019-01-31

Similar Documents

Publication Publication Date Title
US9465237B2 (en) Automatic focus prescription lens eyeglasses
EP3029552B1 (en) Virtual reality system and method for controlling operation modes of virtual reality system
CN105700142B (en) The display lightness regulating method and display brightness regulating system of virtual reality device
KR101260287B1 (en) Method for simulating spectacle lens image using augmented reality
US8736692B1 (en) Using involuntary orbital movements to stabilize a video
US20160353094A1 (en) Calibration of a head mounted eye tracking system
CN104076512A (en) Head-mounted display device and method of controlling head-mounted display device
US20150097772A1 (en) Gaze Signal Based on Physical Characteristics of the Eye
US9696798B2 (en) Eye gaze direction indicator
KR102056221B1 (en) Method and apparatus For Connecting Devices Using Eye-tracking
WO2015123167A1 (en) Ophthalmoscope device
EP2751609A2 (en) Head mounted display with iris scan profiling
CN108762496B (en) Information processing method and electronic equipment
KR101467529B1 (en) Wearable system for providing information
KR101203921B1 (en) Information providing apparatus using an eye tracking and local based service
CN108513627A (en) Head-mounted display apparatus and diopter adaptive regulation method
CN104656257A (en) Information processing method and electronic equipment
CN111886564A (en) Information processing apparatus, information processing method, and program
EP4035348A1 (en) Automated video capture and composition system
CN110140166A (en) For providing the system for exempting to manually enter to computer
JP7081599B2 (en) Information processing equipment, information processing methods, and programs
CN109791429B (en) Head-mounted display device and input control method thereof
EP4099680A1 (en) Head-mounted electronic vision aid device and automatic image magnification method thereof
US20230043585A1 (en) Ultrasound devices for making eye measurements
US20210216146A1 (en) Positioning a user-controlled spatial selector based on extremity tracking information and eye tracking information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Building 43, Dayun software Town, No. 8288 Longgang Avenue, Henggang street, Longgang District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen Ruoyu Technology Co.,Ltd.

Address before: Building 43, Dayun software Town, No. 8288 Longgang Avenue, Henggang street, Longgang District, Shenzhen City, Guangdong Province

Applicant before: SHENZHEN ROYOLE TECHNOLOGIES Co.,Ltd.

GR01 Patent grant