CN108700745B - Position adjusting method and terminal - Google Patents

Position adjusting method and terminal

Info

Publication number
CN108700745B
Authority
CN
China
Prior art keywords
distance
eye
vision
screen
display screen
Prior art date
Legal status
Active
Application number
CN201780011872.XA
Other languages
Chinese (zh)
Other versions
CN108700745A (en)
Inventor
吴欣凯
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN108700745A
Application granted
Publication of CN108700745B

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays

Abstract

The embodiment of the application provides a position adjusting method and a terminal, relates to the technical field of terminals, and can be used for respectively adjusting the positions of two display screens and improving the visual experience of users with vision defects. The specific scheme is as follows: the first adjusting member 33 includes a first motor 331, a first rotating shaft 332, and a first rotating drum 333 disposed perpendicular to the first display screen 312; the first rotating drum 333 is connected with the first motor 331, one end of the first rotating shaft 332 is in threaded fit with the first rotating drum 333, and the other end is fixedly connected with the first display screen 312. The first adjusting member 33 further includes a second motor 334, a second rotating shaft 335, and a second rotating drum 336 disposed perpendicular to the second display screen 322; the second rotating drum 336 is connected with the second motor 334, one end of the second rotating shaft 335 is in threaded fit with the second rotating drum 336, and the other end is fixedly connected with the second display screen 322. The embodiment of the application is used for adjusting the position of the display screen.

Description

Position adjusting method and terminal
The present application claims priority to Chinese Patent Application No. 201611219941.X, filed with the Chinese Patent Office on December 26, 2016 and entitled "A Method and Apparatus for Focal Length Adjustment", and to Chinese Patent Application No. 201710370609.1, filed with the Chinese Patent Office on May 23, 2017 and entitled "A Method and Apparatus for Focal Length Adjustment", both of which are incorporated herein by reference in their entireties.
Technical Field
The embodiment of the application relates to the technical field of terminals, in particular to a position adjusting method and a terminal.
Background
A head mounted display (HMD), or head display for short, delivers optical signals to a person's eyes and can thereby realize effects such as virtual reality, augmented reality, and mixed reality. Head displays have developed rapidly in recent years, but they are designed mainly for users with normal vision, and most of them are not suitable for users with vision defects.
To address this problem, one prior-art solution is to adjust the position of the lens in the head display by manually operating a button, thereby adjusting the focal length so that a user with a vision defect can see a clear image. However, adjusting the focal length by moving the lens affects the user's field of view, which degrades the user experience.
Disclosure of Invention
The embodiment of the application provides a position adjusting method and a terminal, which can respectively adjust the positions of two display screens and improve the visual experience of users with visual defects.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
in a first aspect, the present embodiment provides a terminal 30 including a first eyepiece 31, a second eyepiece 32, and a first adjusting member 33. The first eyepiece 31 includes a first barrel 311 and a first display screen 312 disposed in the first barrel 311, and the first display screen 312 is slidably connected to the first barrel 311. The second eyepiece 32 includes a second barrel 321 and a second display screen 322 disposed in the second barrel 321, and the second display screen 322 is slidably connected to the second barrel 321. The first adjusting member 33 includes a first motor 331, a first rotating shaft 332, and a first rotating drum 333 disposed perpendicular to the first display screen 312; the first rotating drum 333 is connected to the first motor 331, one end of the first rotating shaft 332 is in threaded engagement with the first rotating drum 333, and the other end of the first rotating shaft 332 is fixedly connected to the first display screen 312. The first adjusting member 33 further includes a second motor 334, a second rotating shaft 335, and a second rotating drum 336 disposed perpendicular to the second display screen 322; the second rotating drum 336 is connected to the second motor 334, one end of the second rotating shaft 335 is in threaded engagement with the second rotating drum 336, and the other end of the second rotating shaft 335 is fixedly connected to the second display screen 322.
In this way, the terminal 30 can use the first motor and the second motor to adjust the position of the first display screen 312 corresponding to the left eye and the position of the second display screen 322 corresponding to the right eye so that images are formed clearly, without affecting the user's field of view, which improves the user experience.
With reference to the first aspect, in a possible implementation manner, at least one first sliding chute 313 is disposed on an inner wall of the first barrel 311, at least one second sliding chute 323 is disposed on an inner wall of the second barrel 321, the first display screen 312 is slidably connected to the first barrel 311 through the at least one first sliding chute 313, and the second display screen 322 is slidably connected to the second barrel 321 through the at least one second sliding chute 323.
Thus, the display screen can be slidably coupled to the lens barrel in a simple manner by means of the sliding slot.
With reference to the first aspect and the possible implementation manners described above, in another possible implementation manner, the terminal 30 further includes a second adjusting member 34, the second adjusting member 34 includes a third motor 343, a transmission assembly 344, and a screw assembly 345, and the screw assembly 345 is connected to the first barrel 311 and the second barrel 321 through threads. The third motor 343 drives the screw assembly 345 to rotate through the transmission assembly 344, so that the first barrel 311 and the second barrel 321 move closer to or away from each other.
In this way, the terminal 30 can adjust the eyepiece spacing by rotating a motor, so as to adapt to the pupil distances of different users and improve their visual experience. Moreover, because no manual adjustment is needed, the adjustment is more convenient.
With reference to the first aspect and the possible implementation manners described above, in another possible implementation manner, the screw assembly 345 includes a first screw 51, a second screw 52, and a third screw 53, one end of the worm wheel 42 is engaged with the thread on the surface of the first screw 51, two ends of the first screw 51 are respectively provided with a gear, one ends of the second screw 52 and the third screw 53 are respectively provided with a gear, two ends of the first screw 51 are respectively engaged with the second screw 52 and the third screw 53 through a gear, and one ends of the second screw 52 and the third screw 53, which are not provided with a gear, are respectively connected with the first barrel 311 and the second barrel 321 through a thread.
In this way, the cooperation of the first screw 51, the second screw 52, and the third screw 53 makes it easier to provide a fastening member that fixes the first screw 51, the second screw 52, and the third screw 53 so that their positions are not easily moved.
With reference to the first aspect and the possible implementation manners described above, in another possible implementation manner, the screw assembly 345 includes a fourth screw 54 and a fifth screw 55, and one end of each of the fourth screw 54 and the fifth screw 55 is in threaded connection with the first barrel 311 and the second barrel 321 respectively. The transmission assembly 344 includes an output shaft 41 of the third motor 343, a connecting rod 43, a first bevel gear 44, a second bevel gear 45, a third bevel gear 46, a fourth bevel gear 47, and a fifth bevel gear 48; the first bevel gear 44 is sleeved on the output shaft 41, the second bevel gear 45 and the third bevel gear 46 are arranged at two ends of the connecting rod 43, the first bevel gear 44 is meshed with the second bevel gear 45, the fourth bevel gear 47 is connected with the other end of the fourth screw 54, the fifth bevel gear 48 is connected with the other end of the fifth screw 55, and the third bevel gear 46 is meshed with the fourth bevel gear 47 and the fifth bevel gear 48 respectively.
In this way, the distance between the eyepieces can be adjusted through the matching of the bevel gear and the screw rod.
With reference to the first aspect and the possible implementations described above, in another possible implementation, the first motor 331, the second motor 334, and the third motor 343 may be a stepping motor, a direct current motor, an asynchronous motor, or a synchronous motor.
Thus, the specific implementation mode of the motor can be more flexible.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the terminal 30 further includes a vision input component 35, configured to detect vision information input by the user, where the vision information includes at least one of a vision parameter and a pupil distance parameter, and the vision parameter includes a left-eye vision parameter and/or a right-eye vision parameter.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the terminal 30 further includes a memory 36 for storing vision information corresponding to the identification information of the user.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the terminal 30 further includes a microprocessor 37, configured to control the first adjusting member 33 or the second adjusting member 34 to perform the adjusting operation according to the vision information detected by the vision input unit 35 or stored in the memory.
In this way, the terminal 30 can perform the adjusting operation by means of the first adjusting member 33 or the second adjusting member 34 according to accurate vision information that the user inputs through the vision input part 35 or that is stored in the memory 36 in correspondence with the user.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the microprocessor 37 is configured to control the first adjusting member 33 to perform an adjusting operation according to the vision information, which specifically includes: the microprocessor 37 determines a first screen-eye distance reference value and/or a second screen-eye distance reference value according to the vision information; and determines a first rotation direction and a first number of rotation turns of the first motor 331 according to the first screen-eye distance reference value, and/or determines a second rotation direction and a second number of rotation turns of the second motor 334 according to the second screen-eye distance reference value. The first screen-eye distance is the distance between the position of the first display screen 312 and the left-eye reference position, and the second screen-eye distance is the distance between the position of the second display screen 322 and the right-eye reference position. The first adjusting member 33 adjusts the position of the first display screen 312 according to the first rotation direction and the first number of rotation turns of the first motor 331, so that the distance between the adjusted position of the first display screen 312 and the left-eye reference position is equal to the first screen-eye distance reference value. The first adjusting member 33 adjusts the position of the second display screen 322 according to the second rotation direction and the second number of rotation turns of the second motor 334, so that the distance between the adjusted position of the second display screen 322 and the right-eye reference position is equal to the second screen-eye distance reference value.
In this way, the terminal 30 can adjust the position of the display screen according to accurate user vision information, so that the adjusted screen-eye distance is equal to the screen-eye distance reference value.
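For illustration only, the following sketch shows one way a screen-eye distance reference value could be turned into a rotation direction and a number of turns for the corresponding motor; the thread lead, function name, and direction labels are assumptions, not part of the embodiment.

```python
def motor_command(current_distance_mm, reference_distance_mm, thread_lead_mm=0.5):
    """Map a change in screen-eye distance to a rotation direction and turn count.

    Assumes one full turn of the rotating drum moves the display screen by
    thread_lead_mm along the barrel axis (a hypothetical lead value).
    """
    delta = reference_distance_mm - current_distance_mm
    direction = "away from eye" if delta > 0 else "toward eye"  # sign picks the rotation direction
    turns = abs(delta) / thread_lead_mm                         # magnitude picks the number of turns
    return direction, turns

# Example: move a display screen from a 30.0 mm to a 33.5 mm screen-eye distance
print(motor_command(30.0, 33.5))  # ('away from eye', 7.0)
```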
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, when the vision information includes a vision parameter of the left eye and/or a vision parameter of the right eye, the determining, by the microprocessor 37, a first screen-eye distance reference value and/or a second screen-eye distance reference value according to the vision information includes: determining the first screen-eye distance reference value according to the vision parameter of the left eye and expression one; and determining the second screen-eye distance reference value according to the vision parameter of the right eye and expression one. Expression one is: u_i = 1/(D_i - 1/k), where k represents a constant; when u_i represents the first screen-eye distance reference value, D_i represents the focal power of the left eye; and when u_i represents the second screen-eye distance reference value, D_i represents the focal power of the right eye.
In this way, the terminal 30 may determine the reference value of the screen-eye distance according to the accurate user vision information, so that the position of the display screen may be adjusted according to the reference value of the screen-eye distance.
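A minimal sketch of expression one as stated above, assuming the vision parameter D_i is supplied as a signed diopter value and the constant k is known; the placeholder value of k and the unit handling are assumptions, since the description does not fix them here.

```python
def screen_eye_reference(diopter_d_i, k=0.05):
    """Expression one from the description: u_i = 1 / (D_i - 1/k).

    diopter_d_i is the focal power D_i of one eye; k is the constant named in
    the text (the default 0.05 is a placeholder assumption, not a patented value).
    """
    return 1.0 / (diopter_d_i - 1.0 / k)

# Hypothetical usage: one reference value per eye
u_left = screen_eye_reference(diopter_d_i=-3.0)   # vision parameter of the left eye
u_right = screen_eye_reference(diopter_d_i=-1.5)  # vision parameter of the right eye
```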
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the microprocessor 37 is configured to control the second adjusting component 34 to perform an adjusting operation according to the vision information, and specifically includes: the microprocessor 37 determines a third rotation direction and a third rotation number of the third motor 343 according to the interpupillary distance parameter. The second adjusting member 34 is configured to adjust a distance between the first eyepiece 31 and the second eyepiece 32 according to a third rotation direction and a third rotation number of the third motor 343, so that the adjusted distance between the first eyepiece and the second eyepiece is equal to the interpupillary distance parameter.
In this way, the terminal 30 can adjust the ocular distance according to the interpupillary distance parameter of the user.
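As a sketch (not the claimed control logic), driving the eyepiece spacing toward the interpupillary distance parameter might look like the following; the screw lead per turn is a hypothetical parameter.

```python
def eyepiece_spacing_command(current_spacing_mm, pupil_distance_mm, lead_mm_per_turn=0.5):
    """Third rotation direction and number of turns so that the adjusted
    eyepiece spacing equals the interpupillary distance parameter."""
    delta = pupil_distance_mm - current_spacing_mm
    direction = "outward" if delta > 0 else "inward"   # barrels move apart or together
    return direction, abs(delta) / lead_mm_per_turn

# Example: widen the eyepiece spacing from 60 mm to a 63 mm pupil distance
print(eyepiece_spacing_command(60.0, 63.0))  # ('outward', 6.0)
```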
With reference to the first aspect and the possible implementations described above, in another possible implementation, the vision parameter includes at least one of myopia or hyperopia.
In this way, the terminal 30 can adjust the position of the display screen according to the user's near-sighted or far-sighted condition so that the user with the near-sighted and far-sighted vision defects can see a clear image.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the vision input part 35 may include at least one of a voice input unit and a manual input unit.
That is, the user can input accurate vision information in various ways, such as by voice or manually.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the vision input unit 35 may specifically include a touch panel, an input panel with keys, a knob-type input panel, or the like.
In this way, the implementation of manual input can be made more flexible.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, when the vision information includes a value of the first screen-eye distance and/or a value of the second screen-eye distance, the determining, by the microprocessor 37, a first screen-eye distance reference value and/or a second screen-eye distance reference value according to the vision information includes: the microprocessor 37 determines the value of the first screen-eye distance as the first screen-eye distance reference value, and/or determines the value of the second screen-eye distance as the second screen-eye distance reference value.
In this way, the terminal 30 can simply and directly determine the screen-eye distance reference value from the screen-eye distance value corresponding to the user, which is input by the user or stored in the memory 36.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the terminal 30 further includes: wearing detection section 310 for detecting whether terminal 30 has been worn by the user.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the terminal 30 further includes: an indication input part 38, configured to detect first indication information input by the user when the wearing detection part 310 detects that the terminal 30 has been worn by the user, where the first indication information is used to indicate an adjustment of the vision parameter of the left eye and/or the vision parameter of the right eye. The microprocessor 37 is further configured to determine a first adjustment value and/or a second adjustment value according to the first indication information, where the first adjustment value is an adjustment value of the first screen-eye distance and the second adjustment value is an adjustment value of the second screen-eye distance; and to determine a fourth rotation direction and a fourth number of rotation turns of the first motor 331 according to the first adjustment value, and/or a fifth rotation direction and a fifth number of rotation turns of the second motor 334 according to the second adjustment value. The first adjusting member 33 is further configured to adjust the position of the first display screen 312 according to the fourth rotation direction and the fourth number of rotation turns of the first motor 331, and/or to adjust the position of the second display screen 322 according to the fifth rotation direction and the fifth number of rotation turns of the second motor 334.
In this way, the terminal 30 can also finely adjust the position of the display screen according to the adjustment value of the vision parameter indicated by the user, so that the image of the display screen is clearer.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the indication input part 38 is further configured to detect second indication information of the user when the wearing detection part 310 detects that the terminal 30 has been worn by the user, where the second indication information is used to indicate that the first screen-eye distance and/or the second screen-eye distance is increased or decreased, or the second indication information is used to indicate that the position of the first display screen 312 and/or the second display screen 322 is adjusted forward or backward. The microprocessor 37 is further configured to determine a sixth rotation direction and a sixth number of rotation turns of the first motor 331 according to the second indication information, and/or a seventh rotation direction and a seventh number of rotation turns of the second motor 334 according to the second indication information. The first adjusting member 33 is further configured to adjust the position of the first display screen 312 according to the sixth rotation direction and the sixth number of rotation turns of the first motor 331, and/or to adjust the position of the second display screen 322 according to the seventh rotation direction and the seventh number of rotation turns of the second motor 334.
In this way, the terminal 30 can also finely adjust the position of the display screen according to the indication information for indicating and adjusting the position of the display screen, so that the image of the display screen is clearer.
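Purely as an illustration of the fine-tuning described in the last two implementation manners, a fixed-step adjustment could look like the following; the step size and command strings are assumptions.

```python
STEP_MM = 0.2  # assumed fine-adjustment step per user indication

def fine_tune(screen_eye_distance_mm, indication):
    """Apply one 'increase'/'decrease' indication to a single screen-eye distance."""
    if indication == "increase":   # move the display screen farther from the eye
        return screen_eye_distance_mm + STEP_MM
    if indication == "decrease":   # move the display screen closer to the eye
        return screen_eye_distance_mm - STEP_MM
    return screen_eye_distance_mm  # unrecognized indication: leave the position unchanged
```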
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the indication input part 38 is further configured to detect third indication information of the user when the wearing detection part 310 detects that the terminal 30 has been worn by the user, where the third indication information is used to indicate that the interpupillary distance parameter or the eyepiece spacing is increased or decreased. The microprocessor 37 is configured to determine an eighth rotation direction and an eighth number of rotation turns of the third motor 343 according to the third indication information. The second adjusting member 34 is further configured to adjust the eyepiece spacing according to the eighth rotation direction and the eighth number of rotation turns of the third motor 343.
In this way, the terminal 30 can finely adjust the distance between the eyepieces according to the indication information of the user, so as to improve the visual experience of the user.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the indication input component 38 includes at least one of a voice indication input component and a posture indication input component; the voice indication input component comprises a microphone, and the posture indication input component comprises a camera.
In this way, after wearing the terminal 30, the user can finely adjust the position of the display screen or the eyepiece spacing by voice, gesture, or similar means, without being hindered by the blocked view.
With reference to the first aspect and the possible implementations described above, in another possible implementation, the indication input component 38 is the same as or different from the vision input component 35.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the terminal 30 further includes: an identification input part 39, configured to detect identification information input by the user, where the identification information of the user includes at least one of character information, voice information, fingerprint information, iris information, and face information corresponding to the user.
In this way, the terminal 30 can use the identification information to determine the correspondence between the user and the vision information.
In a second aspect, an embodiment of the present application provides a terminal 60, including: a vision input part 61, configured to detect vision information input by the user, where the vision information includes a vision parameter of the left eye and/or a vision parameter of the right eye; a memory 62, configured to store vision information corresponding to the identification information of the user; and a microprocessor 63, configured to determine a screen-eye distance reference value according to the vision information detected by the vision input part 61 or the vision information stored in the memory 62, where the screen-eye distance reference value includes a first screen-eye distance reference value and/or a second screen-eye distance reference value, the first screen-eye distance is the distance between the first display screen and the left-eye reference position, and the second screen-eye distance is the distance between the second display screen and the right-eye reference position; and to adjust the position of the display screen according to the screen-eye distance reference value, so that the distance between the adjusted position of the first display screen and the left-eye reference position is equal to the first screen-eye distance reference value, or the distance between the adjusted position of the second display screen and the right-eye reference position is equal to the second screen-eye distance reference value.
In this way, the terminal can separately adjust the position of the first display screen corresponding to the left eye and the position of the second display screen corresponding to the right eye according to accurate vision information input by the user or stored vision information corresponding to the user, so that images are formed clearly without affecting the user's field of view, which improves the user experience. Moreover, the position of the display screen is adjusted by rotating a motor, so the user does not need to adjust it manually, which makes the adjustment more convenient.
In a third aspect, an embodiment of the present application provides a position adjustment method, including: the terminal determines a screen-eye distance reference value according to vision information of the user, where the vision information includes vision information input by the user or vision information that is stored in the terminal and corresponds to the identification information of the user, the vision information includes a vision parameter of the left eye and/or a vision parameter of the right eye, the screen-eye distance reference value includes a first screen-eye distance reference value and/or a second screen-eye distance reference value, the first screen-eye distance is the distance between the first display screen and the left-eye reference position, and the second screen-eye distance is the distance between the second display screen and the right-eye reference position. The terminal adjusts the position of the display screen according to the screen-eye distance reference value, so that the distance between the adjusted position of the first display screen and the left-eye reference position is equal to the first screen-eye distance reference value, or the distance between the adjusted position of the second display screen and the right-eye reference position is equal to the second screen-eye distance reference value.
In this way, the terminal can separately adjust the position of the first display screen corresponding to the left eye and the position of the second display screen corresponding to the right eye according to accurate vision information input by the user or stored vision information corresponding to the user, so that images are formed clearly without affecting the user's field of view, which improves the user experience. Moreover, the position of the display screen is adjusted by rotating a motor, so the user does not need to adjust it manually, which makes the adjustment more convenient.
With reference to the third aspect, in a possible implementation manner, the vision information further includes an interpupillary distance parameter, where the interpupillary distance parameter is a reference value of the eyepiece spacing, the eyepiece spacing is the distance between the first eyepiece and the second eyepiece, the first eyepiece includes the first display screen, and the second eyepiece includes the second display screen. The method further includes: the terminal adjusts the eyepiece spacing according to the interpupillary distance parameter so that the adjusted eyepiece spacing is equal to the interpupillary distance parameter.
In this way, the terminal can adjust the eyepiece spacing by rotating a motor, so as to adapt to the pupil distances of different users and improve their visual experience. Moreover, because no manual adjustment is needed, the adjustment is more convenient.
With reference to the third aspect and the foregoing possible implementation manners, in another possible implementation manner, the method further includes: after the terminal is worn by a user, the terminal receives indication information of the user; and the terminal responds to the indication information and adjusts at least one of the position of the first display screen, the position of the second display screen and the distance between the eyepieces.
In this way, before being worn by the user, the terminal can coarsely adjust the position of the display screen or the eyepiece spacing according to the vision information, and after being worn by the user, it can finely adjust the position of the display screen or the eyepiece spacing, so that imaging is clearer and the user experience is further improved.
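One possible reading of this coarse-then-fine sequence, written as an illustrative sketch; the terminal methods used here are hypothetical stand-ins for the adjusting members and input parts described above, not an API defined by the embodiment.

```python
def adjust_positions(terminal, vision_info):
    """Coarse adjustment before wearing, fine adjustment after wearing (illustrative only)."""
    # Coarse stage: move screens and barrels to the values derived from the vision information.
    terminal.set_screen_eye_distances(vision_info.left_reference, vision_info.right_reference)
    terminal.set_eyepiece_spacing(vision_info.pupil_distance)

    # Fine stage: once the terminal is worn, react to user indications until the image is clear.
    while terminal.is_worn():
        indication = terminal.read_indication()   # voice or gesture; may be None
        if indication is None:
            continue
        if indication == "done":
            break
        terminal.apply_indication(indication)     # small step on a screen position or the spacing
```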
In a fourth aspect, an embodiment of the present application provides a terminal, including: a processor, a memory, a bus, and a communication interface. The memory is configured to store computer-executable instructions, and the processor is connected to the memory through the bus. When the terminal runs, the processor executes the computer-executable instructions stored in the memory, so that the terminal performs the position adjustment method according to any one of the implementations of the third aspect.
Drawings
Fig. 1 is a scene schematic diagram of a user wearing a terminal according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a basic principle of a head display provided in an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a head display according to an embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of another head display provided in an embodiment of the present application;
FIG. 5a is a schematic structural diagram of another head display provided in an embodiment of the present application;
FIG. 5b is a schematic structural diagram of another head display provided in the embodiments of the present application;
FIG. 5c is a schematic structural diagram of another head display provided in an embodiment of the present application;
FIG. 6a is a schematic structural diagram of another head display provided in an embodiment of the present application;
FIG. 6b is a schematic structural diagram of another head display provided in the embodiments of the present application;
FIG. 7 is a schematic structural diagram of another head display provided in an embodiment of the present application;
fig. 8 is a schematic view of a vision information input scenario provided in an embodiment of the present application;
fig. 9 is a schematic view of another vision information input scenario provided in the embodiment of the present application;
fig. 10 is a schematic diagram of a prompt scenario provided in the embodiment of the present application;
fig. 11 is a schematic diagram of another prompt scenario provided in the embodiment of the present application;
fig. 12 is a schematic diagram of another prompt scenario provided in the embodiment of the present application;
fig. 13 is a schematic diagram of another prompt scenario provided in the embodiment of the present application;
fig. 14 is a schematic diagram of another prompt scenario provided in the embodiment of the present application;
fig. 15 is a schematic diagram of another prompting scenario provided in the embodiment of the present application;
fig. 16 is a schematic diagram of an indication scenario provided in an embodiment of the present application;
fig. 17 is a schematic diagram of another indication scenario provided in the embodiment of the present application;
FIG. 18 is a schematic diagram of another exemplary indication scenario provided by an embodiment of the present application;
FIG. 19 is a schematic diagram of another exemplary indication scenario provided by an embodiment of the present application;
fig. 20 is a schematic structural diagram of another terminal according to an embodiment of the present application;
fig. 21 is a flowchart of a position adjustment method according to an embodiment of the present application;
fig. 22 is a flowchart of another position adjustment method provided in the embodiment of the present application;
fig. 23 is a schematic structural diagram of another terminal according to an embodiment of the present application.
Detailed Description
For ease of understanding, some concepts related to the embodiments of the present application are first explained below by way of example. As follows:
focal length: distance of lens optical center to focal point.
Focal power: the inverse of the focal length.
Refraction: when light passes from one medium into another medium with a different refractive index, its direction of travel changes; in eye optics this is called refraction.
Diopter: the unit of refractive power, denoted by D. When parallel rays passing through a refractive medium are focused at 1 m, the refractive power of that medium is 1 diopter (1D). For a lens, the diopter likewise expresses the lens power; for example, when the focal length of the lens is 1 m, the power of the lens is 1D.
Field angle (angle of view): the angle, with the lens as the vertex, formed by the two edge rays through the lens that bound the image of the object being observed. The angle of view determines the field of view: the larger the angle of view, the larger the field of view, and objects outside the angle of view are not visible.
Virtual image distance: the distance between the virtual image formed by the lens and human eyes.
Pupil distance: the distance between the pupils of the eyes of a person.
Eyepiece interval: the distance between the center points of the two eyepieces.
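Stated in the notation used later in this description, the relation between focal length and focal power implied by these definitions is (with the focal length f in metres and the focal power D in diopters):

```latex
% Focal power D is the reciprocal of the focal length f:
D = \frac{1}{f}, \qquad \text{e.g. } f = 1\,\mathrm{m} \;\Rightarrow\; D = 1\,\mathrm{D}.
```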
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments of the present application, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean that only A exists, both A and B exist, or only B exists. In addition, in the description of the embodiments of the present application, "a plurality of" means two or more.
The terminal provided in the embodiments of the present application is a display device that can be worn on the eyes, for example, a head-mounted display device, which is also called a head display, a glasses-type display, a portable cinema, or video glasses. For an example of a scenario in which a user wears the terminal according to an embodiment of the present application, refer to fig. 1.
Specifically, the embodiments of the present application take a terminal device as an example to introduce the adjustment method provided in this application. Referring to fig. 2, the basic principle of a head display is to magnify the image on the display screen through a set of optical lenses and project it onto the retina, thereby presenting a large-screen image to the user's eyes; in simple terms, it is like viewing, through a magnifying lens, the magnified virtual image that the lens forms of an object.
By way of example, fig. 3 provides a schematic illustration of a head display. The head display 200 may include: a first eyepiece 21, a second eyepiece 22, a first display screen 23, a second display screen 24, a first lens 25, and a second lens 26. The first display screen 23 combines with the first lens 25 to form an image, and the second display screen 24 combines with the second lens 26 to form an image. The first lens 25 and the second lens 26 may each be a single lens or a lens group. The first display screen 23 and the second display screen 24 may be liquid crystal display (LCD) panels or organic light-emitting diode (OLED) panels. Further, although not shown in fig. 3, it is understood that the head display may also include an earphone interface, an operation area, and the like. The operation area mainly includes a power key, a navigation key, a volume adjustment key, a menu key, and the like. Alternatively, in another implementation manner, the power key, the navigation key, the volume adjustment key, the menu key, and the like may be integrated on a separate remote controller.
For a user with a vision defect, if the prior-art method of adjusting the focal length is used to obtain a clear image, the user's field of view is affected, the sense of immersion is poor, and the user experience is reduced.
Because the crystalline lenses of users with vision defects such as myopia or hyperopia deform to different degrees, the virtual image distance at which each eyeball can actually see clearly differs. The embodiments of the present application control the imaging distance of the virtual image by adjusting the position of the display screen, so that the human eye can see a clear image; unlike the prior art, this does not affect the user's field of view and can therefore improve the user experience.
The following will be described in detail by way of specific examples.
Fig. 3 shows the basic components and functions of a head display. For clarity of description, the terminal 30 and the position adjustment method provided in the embodiments of the present application will be described below by taking the head display shown in fig. 3 as an example. In the embodiments of the present application, unless otherwise specified, "eyepiece" is a generic name for the first eyepiece and the second eyepiece, "lens barrel" is a generic name for the first lens barrel and the second lens barrel, "display screen" is a generic name for the first display screen and the second display screen, "lens" is a generic name for the first lens and the second lens, and "vision parameter" is a generic name for the vision parameter of the left eye and the vision parameter of the right eye.
Referring to fig. 4, embodiments of the present application provide a head display 300 that may include a first eyepiece 31, a second eyepiece 32, and a first adjusting member 33. The first eyepiece 31 includes a first barrel 311 and a first display screen 312 disposed in the first barrel 311, and the first display screen 312 is slidably connected to the first barrel 311. The second eyepiece 32 includes a second barrel 321 and a second display screen 322 disposed in the second barrel 321, and the second display screen 322 is slidably connected to the second barrel 321. The first adjusting member 33 includes a first motor 331, a first rotating shaft 332, and a first rotating drum 333 disposed perpendicular to the first display screen 312; the first rotating drum 333 is connected to the first motor 331, one end of the first rotating shaft 332 is in threaded engagement with the first rotating drum 333, and the other end of the first rotating shaft 332 is fixedly connected to the first display screen 312. The first adjusting member 33 further includes a second motor 334, a second rotating shaft 335, and a second rotating drum 336 disposed perpendicular to the second display screen 322; the second rotating drum 336 is connected to the second motor 334, one end of the second rotating shaft 335 is in threaded engagement with the second rotating drum 336, and the other end of the second rotating shaft 335 is fixedly connected to the second display screen 322.
The sliding connection between the first display screen 312 and the first barrel 311 may include various ways. For example, referring to fig. 4, at least one first sliding slot 313 is disposed on an inner wall of the first barrel 311, the first display screen 312 is connected to the first barrel 311 through the at least one first sliding slot 313, and the first display screen 312 can slide in the first sliding slot 313.
With the structure shown in fig. 4, the first adjusting member can perform the following first adjusting operation: when the first motor 331 of the first adjusting member 33 rotates, the first motor 331 drives the first rotating drum 333 to rotate, the rotation of the first rotating drum 333 drives the first rotating shaft 332 to screw into or out of the first rotating drum 333, and as the first rotating shaft 332 screws into or out of the first rotating drum 333 it drives the first display screen 312 to move in a direction perpendicular to the first display screen 312, that is, to move along the inner wall of the first barrel 311, thereby achieving the purpose of adjusting the position of the first display screen 312.
With the structure shown in fig. 4, the first adjusting member can also perform the following second adjusting operation: when the second motor 334 of the first adjusting member 33 rotates, the second motor 334 drives the second rotating drum 336 to rotate, the rotation of the second rotating drum 336 drives the second rotating shaft 335 to screw into or out of the second rotating drum 336, and as the second rotating shaft 335 screws into or out of the second rotating drum 336 it drives the second display screen 322 to move in a direction perpendicular to the second display screen 322, that is, to move along the inner wall of the second barrel 321, thereby achieving the purpose of adjusting the position of the second display screen 322.
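For either adjusting operation, the axial travel of a display screen follows directly from the thread: writing p for the lead of the thread between the rotating shaft and the drum (the travel per turn, a parameter the embodiment does not specify) and n for the number of turns of the motor, the displacement is

```latex
% Axial displacement of a display screen after n turns of its rotating drum,
% with p the thread lead (an assumed parameter):
\Delta x = \pm\, n \, p
```

with the sign determined by the rotation direction, that is, by whether the shaft screws into or out of the drum.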
As can be seen, the head display 300 provided in the embodiment of the present application can adjust the position of the first display screen 312 and the position of the second display screen 322, respectively. For users with eyesight defects, the head display 300 provided by the embodiment of the present application can clearly image by respectively adjusting the position of the first display screen 312 corresponding to the left eye and the position of the second display screen 322 corresponding to the right eye, without affecting the field range of the user as in the prior art, so as to improve the user experience of the user.
Moreover, the vision of the user's left eye and right eye may differ; for example, some users have a vision defect in the left eye but normal vision in the right eye. If the positions of the first display screen 312 and the second display screen 322 were adjusted synchronously, only one eye could observe a clear image while the other could not, and the overall visual experience would be poor. The head display 300 provided in the embodiments of the present application can separately adjust the position of the first display screen 312 corresponding to the left eye and the position of the second display screen 322 corresponding to the right eye according to the actual vision conditions, so that both eyes can observe clear images, which improves the user's visual experience.
In addition, the prior art uses manual adjustment, and after the user wears the head display 300 (see fig. 1), the view is blocked, which makes manual operation inconvenient. The head display 300 provided in the embodiments of the present application adjusts the position of the display screen by controlling the rotation of a motor, so the user does not need to adjust the position of the display screen manually, which makes the adjustment more convenient.
Further, referring to FIG. 5a, the headset 300 provided by embodiments of the present application may further include a second adjustment member 34. The second adjustment member 34 may include a third motor 343, a transmission assembly 344, and a screw assembly 345, and the screw assembly 345 is threadedly connected with the first and second barrels 311 and 321. The second regulation member 34 may perform the following third regulation operation: the third motor 343 drives the screw assembly 345 to rotate through the transmission assembly 344, so that the first barrel 311 and the second barrel 321 move toward or away from each other, i.e., both barrels move inward or outward at the same time.
That is, the head display 300 can adjust the positions of the first barrel 311 and the second barrel 321 by means of the second adjusting member 34 so that the first barrel 311 and the second barrel 321 move closer to or away from each other, thereby increasing or decreasing the eyepiece spacing.
The eyepiece spacing should match the user's pupil distance, and different users have different pupil distances, so the required eyepiece spacing also differs. The head display 300 provided in the embodiments of the present application can adjust the eyepiece spacing by rotating a motor so as to match the actual pupil distances of different users and improve their visual experience. Moreover, because no manual adjustment is needed, the adjustment is more convenient.
Further, referring to fig. 5b, a first sleeve 341 may be further fixed on the first barrel 311, a second sleeve 342 may be further fixed on the second barrel 321, and the screw assembly 345 is in threaded connection with the first barrel 311 and the second barrel 321 through the first sleeve 341 and the second sleeve 342, respectively.
In one specific implementation, referring to fig. 5b, the transmission assembly 344 includes an output shaft 41 of the third motor 343 and a worm wheel 42, the output shaft 41 is a worm, one end of the worm wheel 42 is engaged with the worm, and the other end of the worm wheel 42 is connected to the screw assembly 345. The screw assembly 345 may be a screw.
In the structure shown in fig. 5b, when the third motor 343 is rotated, the worm wheel 42 is rotated by the output shaft 41, and the worm wheel 42 is rotated to rotate the screw assembly 345, so that both ends of the screw assembly 345 are simultaneously screwed into the first sleeve 341 and the second sleeve 342, or simultaneously screwed out of the first sleeve 341 and the second sleeve 342, so that the first barrel 311 and the second barrel 321 are simultaneously moved inward or outward.
In another specific implementation manner, referring to fig. 5c, the screw assembly 345 includes a first screw 51, a second screw 52, and a third screw 53; one end of the worm wheel 42 is engaged with the thread on the surface of the first screw 51, gears are provided at both ends of the first screw 51 and at one end of each of the second screw 52 and the third screw 53, the two ends of the first screw 51 are engaged with the second screw 52 and the third screw 53 respectively through these gears, and the ends of the second screw 52 and the third screw 53 that are not provided with gears are connected to the first barrel 311 and the second barrel 321 respectively through threads.
In the structure shown in fig. 5c, when the third motor 343 is rotated, the worm wheel 42 is driven to rotate by the output shaft 41, the first screw 51 is driven to rotate by the rotation of the worm wheel 42, and the second screw 52 and the third screw 53 are driven to rotate by the first screw 51 through the gear, so that the second screw 52 and the third screw 53 are simultaneously screwed into the first sleeve 341 and the second sleeve 342, or simultaneously screwed out of the first sleeve 341 and the second sleeve 342, so that the first lens barrel 311 and the second lens barrel 321 are simultaneously moved inward or outward.
It should be noted that, during the adjustment, the screw assembly 345 rotates but its position does not move. In the structure shown in fig. 5c, the combination of the first screw 51, the second screw 52, and the third screw 53 replaces the single screw in fig. 5b, which makes it easier to arrange a fastener, such as a lock catch, to fix the first screw 51, the second screw 52, and the third screw 53 so that their positions are not easily moved.
In another specific implementation manner, referring to fig. 6a, the screw assembly 345 includes a fourth screw 54 and a fifth screw 55, one end of the fourth screw 54 and one end of the fifth screw 55 are respectively in threaded connection with the first barrel 311 and the second barrel 321, the transmission assembly 344 includes an output shaft 41 of the third motor 343, a connecting rod 43, a first bevel gear 44, a second bevel gear 45, a third bevel gear 46, a fourth bevel gear 47, and a fifth bevel gear 48, the first bevel gear 44 is sleeved on the output shaft 41, the second bevel gear 45 and the third bevel gear 46 are disposed at two ends of the connecting rod 43, the first bevel gear 44 is engaged with the second bevel gear 45, the fourth bevel gear 47 is connected with the other end of the fourth screw 54, the fifth bevel gear 48 is connected with the other end of the fifth screw 55, and the third bevel gear 46 is engaged with the fourth bevel gear 47 and the fifth bevel gear 48.
In the structure shown in fig. 6a, when the third motor 343 rotates, the output shaft 41 drives the first bevel gear 44 to rotate, the first bevel gear 44 drives the second bevel gear 45 to rotate, the second bevel gear 45 drives the third bevel gear 46 to rotate, the third bevel gear 46 drives the fourth bevel gear 47 and the fifth bevel gear 48 to rotate through gear engagement, and the rotation of the fourth bevel gear 47 and the fifth bevel gear 48 drives the fourth screw 54 and the fifth screw 55 to rotate, so that the fourth screw 54 and the fifth screw 55 are simultaneously screwed into the first sleeve 341 and the second sleeve 342, or simultaneously screwed out of the first sleeve 341 and the second sleeve 342, so that the first lens barrel 311 and the second lens barrel 321 move inward or outward at the same time.
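To make the transmission concrete (none of these values appear in the embodiment; they are assumptions only), the travel of the lens barrels per revolution of the third motor 343 in the fig. 6a arrangement could be sketched as:

```python
def barrel_travel_per_motor_turn(stage1_ratio=1.0, stage2_ratio=1.0, screw_lead_mm=0.5):
    """Axial travel of each lens barrel for one revolution of the third motor 343.

    stage1_ratio: assumed ratio of the first/second bevel gear pair.
    stage2_ratio: assumed ratio of the third bevel gear driving the fourth and fifth gears.
    screw_lead_mm: assumed lead of the fourth and fifth screws.
    Per the note in the description, the thread arrangement makes both barrels
    move toward or away from each other simultaneously by this amount.
    """
    screw_turns_per_motor_turn = stage1_ratio * stage2_ratio
    return screw_turns_per_motor_turn * screw_lead_mm

# Example: 2:1 reduction in the second bevel stage, 0.5 mm screw lead
print(barrel_travel_per_motor_turn(stage2_ratio=0.5))  # 0.25 (mm per motor turn)
```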
In another specific implementation, as shown in fig. 6b, the screw assembly 345 includes a single screw that replaces the fourth screw 54 and the fifth screw 55 in fig. 6a, and the fourth bevel gear 47 and the fifth bevel gear 48 are sleeved on this screw.
It should be noted that, in order to enable the two lens barrels to move inward or outward at the same time, in a possible embodiment, the threads at the two ends of the screw assembly where it connects to the two lens barrels are of opposite hand, or the threads on the two lens barrels where they connect to the screw assembly are of opposite hand.
It should be noted that the second adjusting member 34 shown in fig. 5 a-6 b is only an example, and any transmission structure that can ensure that two lens barrels are driven by the rotation of the motor to move inward or outward at the same time can be used in the embodiment of the present application.
Specifically, in the embodiment of the present application, the second adjusting member 34 can perform the third adjusting operation according to the third rotating direction and the third rotating number of the third motor 343. Wherein the third rotation direction of the third motor 343 determines whether the two lens barrels move inward or outward at the same time, and the third rotation number determines the distance that the two lens barrels move inward or outward at the same time.
In addition, in the embodiment of the present application, the first motor 331, the second motor 334, and the third motor 343 may be specifically a stepping motor, a direct current motor, an asynchronous motor, a synchronous motor, or the like. Also, since the motors occupy a certain space, the three motors may be distributed or concentrated according to the space, for example, the first motor 331 and the second motor 334 may be disposed in parallel between the two eyepieces, and the third motor 343 may be disposed above the eyepieces.
It is understood that, although not shown in fig. 5 a-6 b, the cross-section of the lens barrel, the shape of the display screen and the lens may be elliptical, rectangular or other shapes besides circular, which is not limited herein.
Fig. 7 shows another structural schematic diagram of the head display 300, and the head display 300 provided in the embodiment of the present application may further include a vision input component 35 for detecting vision information input by the user, where the vision information includes at least one of a vision parameter and a pupil distance parameter. Wherein the vision parameters comprise left-eye vision parameters and/or right-eye vision parameters.
Further, the head display 300 may further include a memory 36 for storing vision information corresponding to the identification information of the user.
The head display 300 may further include a microprocessor 37 for controlling the first adjusting member 33 or the second adjusting member 34 to perform an adjusting operation according to the vision information detected by the vision input part 35 or stored in the memory 36.
The microprocessor 37 is configured to control the first adjusting component 33 to perform an adjusting operation according to the eyesight information, and specifically may include:
the microprocessor 37 determines a reference value of the first eye distance and/or a reference value of the second eye distance according to the vision information, and determines a first rotation direction and a first number of rotation turns of the first motor 331 according to the reference value of the first eye distance, and/or determines a second rotation direction and a second number of rotation turns of the second motor 334 according to the reference value of the second eye distance. The first eye distance is a distance between the position of the first display screen 312 and the left eye reference position, and the second eye distance is a distance between the position of the second display screen 322 and the right eye reference position.
The first adjusting member 33 adjusts the position of the first display screen 312 according to the first rotating direction and the first rotating number of turns of the first motor, that is, the first adjusting member 33 performs the above-mentioned first adjusting operation, so that the distance between the adjusted position of the first display screen 312 and the left-eye reference position is equal to the first screen-to-eye distance reference value.
The first adjusting member 33 adjusts the position of the second display screen 322 according to the second rotating direction and the second number of rotating turns of the second motor 334, that is, the first adjusting member 33 performs the second adjusting operation, so that the distance between the adjusted position of the second display screen 322 and the right-eye reference position is equal to the second screen-to-eye distance reference value.
Specifically, the head display 300 may obtain the first rotation direction and the first number of rotation turns of the first motor 331 and/or the second rotation direction and the second number of rotation turns of the second motor 334 in the following manner.
In a first possible implementation, the head display 300 includes the vision input part 35, and the vision information detected by the vision input part 35 includes a vision parameter of the left eye and/or a vision parameter of the right eye.
Wherein the vision parameter may include at least one of near-sightedness or far-sightedness. The vision input part 35 may include at least one of a voice input unit and a manual input unit. That is, the user may input the visual information in a voice manner or a manual manner. When the user inputs the eyesight information in a voice manner, the eyesight input unit 35 may specifically include at least one microphone for detecting the eyesight information input by the user in a voice form. When the user manually inputs the visual information, the visual input unit 35 may specifically include a touch panel, an input panel with keys, a knob-type input panel, or the like. In this way, the user can input accurate vision parameters for the left and/or right eye in a voice or manual manner through the vision input section 35. For example, a scene in which the user inputs vision parameters through the touch panel 40 may be referred to in fig. 8. For example, a scene in which the user inputs the vision parameter by voice may be referred to fig. 9.
In the embodiment of the present application, when the user starts using the head display 300, for example, when the head display 300 is turned on, the head display 300 may further prompt the user to input visual information through a visual prompt and/or an audible prompt. Illustratively, referring to fig. 10, the head display 300 may emit an audible prompt "please input visual information" through a speaker (or an earphone). Alternatively, the head display 300 may issue a character prompt "please input visual information" through the display screen or the touch panel 40 in the visual input unit 35, for example. Alternatively, the head display 300 may illuminate the touch panel 40 in the visual input unit 35 to prompt the user to input visual information, for example. A schematic diagram of the head display 300 prompting the user through the touch panel 40 can be seen in fig. 11.
The microprocessor 37 can determine the screen-eye distance reference value, that is, the reference value of the distance between the display screen and the reference position of the human eye, according to the accurate vision parameters input by the user by voice or manually. A far-sighted user corresponds to a larger screen-eye distance, so the corresponding screen-eye distance reference value is larger; a near-sighted user corresponds to a smaller screen-eye distance, so the corresponding screen-eye distance reference value is smaller.
It should be noted that the head display 300 provided in the embodiment of the present application may perform the above-mentioned first adjustment operation, second adjustment operation, and third adjustment operation according to the eyesight information before being worn by the user. At this time, the user does not wear the head display yet, and thus cannot know the actual position of the human eyes, and the screen-to-eye distance refers to the distance between the display screen and the preset reference position of the human eyes. Wherein the reference position for the human eye is typically set between 10mm and 20mm from the lens. For example, the left eye reference position may be set 15mm from the position of the first lens 314, and the right eye position may be set 15mm from the position of the second lens 324.
In one embodiment, the microprocessor 37 determining the first screen-eye distance reference value according to the vision parameter of the left eye may include: determining the first screen-eye distance reference value according to the vision parameter of the left eye and expression one. The microprocessor 37 determining the second screen-eye distance reference value according to the vision parameter of the right eye may include: determining the second screen-eye distance reference value according to the vision parameter of the right eye and expression one. Expression one can be written as:
u_i = 1/(D_i - 1/k)
In expression one, when u_i represents the first screen-eye distance reference value, D_i represents the focal power of the left eye; when u_i represents the second screen-eye distance reference value, D_i represents the focal power of the right eye. k represents a constant, which may be related to characteristics of the actual product, such as lens parameters and product dimensions. The unit of the focal power D_i is 1/m; 1/m is also called a diopter, denoted by D, and 1 diopter corresponds to 100 degrees of vision. Specifically, 100 degrees of myopia corresponds to -1 D, and 100 degrees of hyperopia corresponds to +1 D. Therefore, among the screen-eye distance reference values calculated according to expression one, the reference value corresponding to a myopia parameter is small, and the reference value corresponding to a hyperopia parameter is large.
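As a concrete illustration (not part of the patent), the following sketch applies expression one under stated assumptions: degrees are converted to diopters at 100 degrees per diopter, myopia is taken as negative and hyperopia as positive, and the constant k is an arbitrary placeholder since its real value depends on the product.

```python
# Hypothetical sketch of expression one: u_i = 1/(D_i - 1/k).
# K is a placeholder; the real constant depends on lens parameters and
# product dimensions and is not given in the text.

K = 0.05  # metres, assumed placeholder for the product constant k

def degrees_to_diopters(degrees: float, myopia: bool) -> float:
    """100 degrees corresponds to 1 diopter; myopia negative, hyperopia positive."""
    d = degrees / 100.0
    return -d if myopia else d

def screen_eye_reference(degrees: float, myopia: bool, k: float = K) -> float:
    """Raw value of expression one, u_i = 1/(D_i - 1/k), in metres."""
    d_i = degrees_to_diopters(degrees, myopia)
    return 1.0 / (d_i - 1.0 / k)

# With this placeholder k, the magnitude of u_i is smaller for a myopic eye
# than for a hyperopic eye of the same degree, matching the text above.
assert abs(screen_eye_reference(300, myopia=True)) < abs(screen_eye_reference(300, myopia=False))
```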
Specifically, in one case, the microprocessor 37 may calculate the eye distance reference value (the first eye distance reference value and/or the second eye distance reference value) according to the vision parameter (the vision parameter for the left eye and/or the vision parameter for the right eye) input by the user each time and the expression one.
Or, in another case, the head display 300 may further include a memory 36, and the memory 36 may store a preset first lookup table, where the first lookup table includes a correspondence between vision parameters and screen-eye distance reference values, and this correspondence satisfies expression one. For example, taking myopia as an example of the vision parameter, a first lookup table provided in the embodiment of the present application may specifically be as follows:
TABLE 1
Myopia parameter | Screen-eye distance reference value
50 degrees | Reference value 1
100 degrees | Reference value 2
150 degrees | Reference value 3
200 degrees | Reference value 4
250 degrees | Reference value 5
300 degrees | Reference value 6
... | ...
When determining the first eye distance reference value, the microprocessor 37 may determine a current first eye distance, which is a distance between the current first display 312 and the left eye reference position, according to the current position of the first display 312, so as to determine a first rotation direction and a first rotation number of the first motor 331 according to a difference between the first eye distance reference value and the current first eye distance.
It should be noted that, each time the head display 300 is used, the current position of the display screen in the head display 300 may be a preset position or a reserved position after last adjustment, and is not limited herein.
For example, taking the first motor 331 as an example, if the first screen-eye distance reference value determined by the microprocessor 37 according to the vision parameter of the left eye is 25 mm and the current first screen-eye distance is 20 mm, the first display screen 312 needs to be adjusted backward (in a direction away from the left-eye reference position) by 5 mm, which may be regarded as a distance to be adjusted of +5 mm. The microprocessor 37 can calculate the first number of rotations of the first motor 331 based on the 5 mm and the first pitch of the first rotating shaft 332; the first number of rotations may be the quotient of 5 mm and the first pitch. If the first screen-eye distance reference value determined by the microprocessor 37 according to the vision parameter of the left eye is 25 mm and the current first screen-eye distance is 30 mm, the first display screen 312 needs to be adjusted forward (in a direction toward the left-eye reference position) by 5 mm, which may be regarded as a distance to be adjusted of -5 mm. In addition, the microprocessor 37 can determine whether the first rotation direction of the first motor 331 is clockwise or counterclockwise according to whether the position of the first display screen 312 needs to be adjusted forward or backward. Specifically, the correspondence between the moving direction of the first display screen 312 and the rotation direction of the first motor 331 is related to the thread direction of the first rotating drum 333 and the first rotating shaft 332. Similarly, the microprocessor 37 can also determine the second rotation direction and the second number of rotations of the second motor 334.
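To make the mapping just described concrete, the following sketch (an illustration only, not the patent's implementation) converts a signed distance to be adjusted into a rotation direction and a number of rotations; the thread pitch and the assumption that clockwise rotation moves the screen backward are placeholders.

```python
# Illustrative sketch: derive a motor command from the screen-eye distance
# reference value and the current screen-eye distance. The pitch value and
# the handedness convention are assumptions, not taken from the patent.

FIRST_PITCH_MM = 0.5  # assumed pitch of the first rotating shaft's thread

def first_motor_command(reference_mm: float, current_mm: float,
                        pitch_mm: float = FIRST_PITCH_MM,
                        clockwise_moves_backward: bool = True):
    """Return (rotation_direction, number_of_rotations) for the first motor."""
    to_adjust_mm = reference_mm - current_mm          # +5 mm or -5 mm in the examples above
    rotations = abs(to_adjust_mm) / pitch_mm          # quotient of the distance and the pitch
    move_backward = to_adjust_mm > 0                  # positive: move away from the eye
    if clockwise_moves_backward:
        direction = "clockwise" if move_backward else "counterclockwise"
    else:
        direction = "counterclockwise" if move_backward else "clockwise"
    return direction, rotations

# Example from the text: reference 25 mm, current 20 mm -> +5 mm to adjust.
print(first_motor_command(25.0, 20.0))  # ('clockwise', 10.0) under these assumptions
```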
In addition, in a possible embodiment, a preset second comparison table may be stored in the memory 36, and the second comparison table includes a corresponding relationship between the distance to be adjusted of the display screen and the rotation direction and the rotation number of the motor. After determining the distance to be adjusted, the microprocessor 37 can determine the rotation direction and the number of rotations of the motor by looking up the second lookup table.
After the microprocessor 37 determines the first rotation direction and the first number of rotations of the first motor 331, the first adjusting member 33 may perform the above-described first adjustment operation, so that the distance between the adjusted position of the first display screen 312 and the left-eye reference position is equal to the first screen-eye distance reference value. After the microprocessor 37 determines the second rotation direction and the second number of rotations of the second motor 334, the first adjusting member 33 may perform the above-described second adjustment operation, so that the distance between the adjusted position of the second display screen 322 and the right-eye reference position is equal to the second screen-eye distance reference value.
In this possible implementation, the head display can determine the screen-eye distance reference value according to the accurate vision parameters input by the user, thereby determining the rotation direction and number of rotations of the motor and adjusting the position of the display screen, so that a user with a vision defect can observe clear images on the adjusted display screen.
In a second possible implementation, the head display 300 may further include a memory 36, and the vision information stored in the memory 36 includes a vision parameter of the left eye and/or a vision parameter of the right eye corresponding to the identification information of the user. In this case, the head display 300 may determine the reference value of the screen-to-eye distance according to the accurate vision parameter corresponding to the identification information of the user stored in the memory 36, so as to determine the rotation direction and the number of rotations of the motor, and further adjust the position of the display screen, so that the user with the visual defect can observe a clear image on the adjusted display screen.
In another embodiment, the vision information detected by the vision input component 35 or stored in the memory 36 includes a value of the first screen-eye distance and a value of the second screen-eye distance. In this possible implementation, the head display can determine the screen-eye distance reference value according to the accurate screen-eye distance value input by the user, thereby determining the rotation direction and number of rotations of the motor and adjusting the position of the display screen, so that a user with a vision defect can observe clear images on the adjusted display screen.
In particular, the first possible implementation described above may be applicable to a scenario in which the user uses the head display 300 for the first time. In this scenario, the memory 36 does not store therein the vision information corresponding to the identification information of the user, and the user needs to input the vision parameter of the left eye and/or the vision parameter of the right eye through the vision input unit 35. In this case, if the user does not input the vision parameter, the head display 300 defaults that the vision of the user is normal, and the vision parameter is a preset normal vision parameter. Other possible implementations may be applicable to scenarios in which the user does not use the head display 300 for the first time. In this scenario, the memory 36 may store therein visual acuity information corresponding to the user's identification information, so that the user may adjust more efficiently based on the visual acuity information in the memory 36 the next time he uses the head display 300, without the user having to input the visual acuity information again.
Further, when the vision information detected by the vision input component 35 or the vision information stored in the memory 36 includes the pupil distance parameter, the microprocessor 37 may be further configured to control the second adjusting member 34 to perform an adjustment operation according to the vision information, which includes: the microprocessor 37 determines a third rotation direction and a third number of rotations of the third motor 343 according to the pupil distance parameter, and the second adjusting member 34 performs the third adjustment operation according to the third rotation direction and the third number of rotations of the third motor 343, so that the adjusted distance between the first eyepiece and the second eyepiece is equal to the pupil distance parameter.
The method for determining the third rotation direction and the third rotation number of the third motor 343 by the microprocessor 37 is similar to the method for determining the first rotation direction and the first rotation number of the first motor 331 by the microprocessor 37, and will not be described herein again. Wherein the difference between the interpupillary distance parameter and the current ocular distance may be similar to the distance to be adjusted described above.
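For completeness, a sketch of the analogous computation for the third motor is given below; as before, the effective pitch and the direction that widens the eyepiece spacing are assumptions rather than values from the patent.

```python
# Illustrative sketch for the third motor: the difference between the pupil
# distance parameter and the current eyepiece distance plays the role of the
# distance to be adjusted. The pitch and handedness below are assumptions.

THIRD_PITCH_MM = 0.5  # assumed effective pitch of the screw assembly

def third_motor_command(pupil_distance_mm: float, current_eyepiece_mm: float,
                        pitch_mm: float = THIRD_PITCH_MM):
    """Return (rotation_direction, number_of_rotations) for the third motor."""
    delta_mm = pupil_distance_mm - current_eyepiece_mm
    rotations = abs(delta_mm) / pitch_mm
    direction = "clockwise" if delta_mm > 0 else "counterclockwise"  # widening assumed clockwise
    return direction, rotations
```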
Therefore, the head display provided by the embodiment of the application can determine the rotation direction and the rotation number of the motor according to the accurate pupil distance parameter input by the user or the accurate pupil distance parameter corresponding to the identification information of the user stored in the memory, and further adjust the positions of the two eyepieces, so that the positions of the adjusted eyepieces can be adapted to the pupil distance of the current user, and the visual experience of the user is improved.
Further, the head display 300 provided by the embodiment of the present application may further include an identification input component 39 for inputting identification information of the user.
The identification information of a user can be used to uniquely identify that user. For example, the identification information of the user may include at least one of character information, voice information, fingerprint information, iris information, and facial information corresponding to the user. When the identification information of the user is character information, the identification input component 39 may be a touch panel, an input panel with keys, or the like; when the identification information of the user is voice information, the identification input component 39 may be a microphone; when the identification information of the user is fingerprint information, the identification input component 39 may specifically be a fingerprint recognition component; when the identification information of the user is iris information, the identification input component 39 may be an iris recognition component; and when the identification information of the user is facial information, the identification input component 39 may be a face recognition component. The iris recognition component and the face recognition component may specifically include cameras.
In the second possible implementation, when the head display 300 detects the identification information of the user through the identification input component 39, it may perform the adjustment operation according to the vision information stored in the memory 36 that matches the identification information of the user.
Moreover, in the embodiment of the present application, when the user starts to use the head display 300, for example, when the head display 300 is turned on, the head display 300 may prompt the user to input the identification information through a visual prompt and/or an audible prompt. Illustratively, the head display 300 may prompt the user by voice to "please enter identification information". Alternatively, for example, referring to fig. 12, when the head display 300 includes the touch panel 40, the head display 300 may prompt the user to "please input identification information" through the touch panel 40.
Further, after adjusting the position of the display screen or the eyepiece spacing, the head display 300 may also trigger prompt information to prompt the user that the position of the display screen or the eyepiece spacing has been adjusted. The prompt information may include a visual prompt and/or an audible prompt; other types of prompts are also possible and are not specifically limited herein.
Illustratively, referring to fig. 13, after adjusting the position of the display screen, the head display 300 may display visual prompt information such as "position of the display screen adjusted", "adjustment completed", or "ok" on the display screen (e.g., the first display screen 312). Illustratively, referring to fig. 14, after adjusting the eyepiece spacing, the head display 300 may display a visual prompt "eyepiece spacing adjusted" on at least one display screen. For example, after adjusting the position of the display screen, the head display 300 may display on the display screen the actual distance between the adjusted position of the display screen and the position of the human eye. Illustratively, after adjusting the eyepiece spacing, the head display 300 may display the adjusted eyepiece spacing on at least one display screen. Illustratively, after adjusting the position of the display screen or the eyepiece spacing, the head display 300 may issue an "adjustment completed" voice prompt. For example, referring to fig. 15, after adjusting the position of the first display screen 312, the head display 300 may issue a voice prompt "position of the first display screen adjusted". Illustratively, after adjusting the position of the display screen or the eyepiece spacing, the head display 300 may emit a "tick" audible prompt. For example, after adjusting the position of the display screen or the eyepiece spacing, the head display 300 may simultaneously display the prompt message "adjustment completed" on the display screen and emit a "tick" sound. Illustratively, after adjusting the position of the display screen or the eyepiece spacing, the head display 300 may prompt the user by vibrating that the adjustment has been completed. Other possible prompting manners are not enumerated here.
Furthermore, after the user wears the head display 300, the actual position of the human eye may differ slightly from the human-eye reference position because of differences in wearing tightness and individual differences, or the user's vision parameters may have changed slightly. As a result, even after the head display 300 adjusts the position of the display screen according to the user's accurate vision information, the image observed by the user on the display screen may not be particularly clear. In this case, the user may further fine-tune the position of the display screen by triggering indication information.
In one possible implementation, the head display 300 provided by the embodiment of the present application may further include a wearing detection component 310 and an indication input component 38. The wear detection part 310 may be used to detect whether the terminal 30 has been worn by the user. The indication input part 38 may be configured to detect first indication information input by the user for indicating adjustment of the vision parameter for the left eye and/or the vision parameter for the right eye when the wearing detection part 310 detects that the terminal 30 has been worn by the user.
The microprocessor 37 may be further configured to determine a first adjustment value and/or a second adjustment value according to the first indication information, where the first adjustment value is an adjustment value of the first screen-eye distance, and the second adjustment value is an adjustment value of the second screen-eye distance; and to determine a fourth rotation direction and a fourth number of rotations of the first motor 331 according to the first adjustment value, and/or a fifth rotation direction and a fifth number of rotations of the second motor 334 according to the second adjustment value.
The first adjustment member 33 may also be used to adjust the position of the first display screen 312 according to a fourth rotational direction and a fourth number of rotations of the first motor 331 and/or to adjust the position of the second display screen 322 according to a fifth rotational direction and a fifth number of rotations of the second motor 334.
The adjustment range of the vision parameter indicated by the first indication information is generally small. Specifically, the indication input component 38 may include at least one of a voice indication input component and a posture indication input component. The voice indication input component may specifically include a microphone for detecting a voice indication of the user; the posture indication input component may specifically include a camera for detecting a posture indication of the user, for example posture indication information such as the user's eye movements and head movements that indicates adjustment of the vision parameter of the left eye and/or the vision parameter of the right eye. Further, the indication input component 38 may also include various sensors for detecting indication information of the user.
Illustratively, referring to fig. 16, the user may indicate by voice that the left eye is adjusted up by 1 degree for near vision, that the right eye is adjusted down by 3 degrees for far vision, etc. Alternatively, for example, a preset adjustment step size may be stored in the memory 36, for example, the vision parameter is 1 degree, and the user may instruct to increase the myopia degree of the left eye by voice, which is equivalent to instructing to increase the myopia degree of the left eye by 1 degree. Or, for example, the user may turn the head to the left to indicate that the vision parameters of the left eye need to be adjusted, and turn the head upward to indicate that the vision parameters of the user are increased; the user can turn the head to the right to indicate that the vision parameters of the right eye need to be adjusted, and turn the head downward to indicate that the vision parameters of the user are adjusted downward. Alternatively, for example, the user may move the hand to the left to indicate that the vision parameter of the left eye needs to be adjusted, see fig. 17, and indicate that the vision parameter of the left eye needs to be adjusted up by pointing the hand upwards; the user can move the hand to the right to indicate that the vision parameters for the right eye need to be adjusted, see fig. 18, by pointing the hand down to indicate that the vision parameters for the right eye need to be adjusted small. Alternatively, the user may trigger the first indication information by combining voice and body posture, for example. For example, the user may vocally instruct the left eye and turn the head upward to instruct to turn up the vision parameters of the left eye by a preset step size.
The microprocessor 37 may determine the adjusted vision parameter according to the adjustment value of the vision parameter indicated by the first indication information, and determine the adjustment value of the screen-eye distance according to the adjusted vision parameter, expression one, and the actual value of the current screen-eye distance.
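As one possible reading of this fine-tuning step (a sketch under assumptions, not the patent's implementation), the adjusted vision parameter can be fed back through expression one and compared with the current screen-eye distance; here distances are treated as the magnitude of expression one, the constant k is the same placeholder as in the earlier sketch, and only the myopia case is shown.

```python
# Hypothetical fine-tuning sketch: re-evaluate expression one with the
# adjusted myopia degree and compare against the current screen-eye distance.
# K is the same placeholder constant as before; signs follow the convention
# described for Table 2 below (myopia up -> screen moves toward the eye).

K = 0.05  # metres, assumed placeholder for the product constant k

def screen_eye_reference_mm(myopia_degrees: float, k: float = K) -> float:
    """Magnitude of expression one in millimetres for a myopic eye (D_i = -degrees/100)."""
    d_i = -myopia_degrees / 100.0
    return abs(1.0 / (d_i - 1.0 / k)) * 1000.0

def fine_tune_adjustment_mm(current_myopia_degrees: float, delta_degrees: float,
                            current_screen_eye_mm: float) -> float:
    """Signed adjustment value of the screen-eye distance (negative: move toward the eye)."""
    adjusted_degrees = current_myopia_degrees + delta_degrees
    return screen_eye_reference_mm(adjusted_degrees) - current_screen_eye_mm

# Increasing the myopia degree yields a negative adjustment value, i.e. the
# display screen moves toward the eye, consistent with the description below.
assert fine_tune_adjustment_mm(300.0, 1.0, screen_eye_reference_mm(300.0)) < 0
```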
Alternatively, the memory 36 may further store a lookup table similar to Table 1 for representing the correspondence between adjustment values of the vision parameter and adjustment values of the screen-eye distance. The microprocessor 37 may determine, by looking up this table according to the adjustment value of the vision parameter indicated by the first indication information, the corresponding adjustment value of the screen-eye distance. Taking myopia as an example, this lookup table may be as shown in Table 2 below. The adjustment values of the vision parameter in Table 2 are small.
TABLE 2
Myopia parameter adjustment value | Screen-eye distance adjustment value
... | ...
-1.5 degrees | Adjustment value 1
-1 degree | Adjustment value 2
-0.5 degrees | Adjustment value 3
0.5 degrees | Adjustment value 4
1 degree | Adjustment value 5
1.5 degrees | Adjustment value 6
... | ...
The adjustment values of the myopia parameter in Table 2 may be positive or negative, for example, positive when the parameter is increased and negative when it is decreased. Correspondingly, the screen-eye distance adjustment value may also be positive or negative. For example, when the myopia parameter adjustment value is positive, the myopia parameter is increased; the screen-eye distance adjustment value may then be negative, the screen-eye distance needs to be decreased, and the display screen needs to move toward the human eye. When the myopia parameter adjustment value is negative, the myopia parameter is decreased; the screen-eye distance adjustment value may then be positive, the screen-eye distance needs to be increased, and the display screen needs to move away from the human eye.
It should be noted that the above table 2 is described by taking near vision as an example, and the far vision situation is similar to the near vision situation, and is not described herein again.
The adjustment value of the screen eye distance may be similar to the distance to be adjusted, and similar to the method for determining the first rotation direction and the first rotation number according to the distance to be adjusted, the microprocessor 37 may determine the fourth rotation direction and the fourth rotation number according to the adjustment value of the first screen eye distance, and determine the fifth rotation direction and the fifth rotation number according to the adjustment value of the second screen eye distance, so as to fine-tune the positions of the first display screen 312 and the second display screen 322, and the specific process is not described herein again.
In another possible embodiment, the indication input part 38 may be further configured to detect second indication information of the user, when the wearing detection part 310 detects that the terminal 30 has been worn by the user, the second indication information being used to indicate that the first screen eye distance and/or the second screen eye distance is increased or decreased, or the second indication information being used to indicate that the position of the first display 312 and/or the second display 322 is adjusted forward or backward. That is, the second indication information is used to directly indicate the position of the display screen.
The second indication information may specifically be voice information and/or posture information. Illustratively, the user may indicate by voice to increase the first screen-eye distance by 2 mm, or to move the position of the first display screen 312 backward by 2 mm. As another example, the user may turn the head to the right to indicate an adjustment of the second display screen 322, and turn the head upward to indicate an increase in the second screen-eye distance, or to indicate a backward adjustment of the position of the second display screen 322 by a preset step size (e.g., 0.5 mm). As yet another example, the user may turn the head to the right to indicate an adjustment of the second display screen 322, move the hand forward to indicate a decrease in the second screen-eye distance, or move the hand backward to indicate an increase in the second screen-eye distance.
The microprocessor 37 may also be configured to determine a sixth direction of rotation and a sixth number of rotations of the first motor 331 based on the second indication and/or a seventh direction of rotation and a seventh number of rotations of the second motor 334 based on the second indication. The first adjustment member 33 may also be used to adjust the position of the first display screen 312 based on a sixth rotational direction and a sixth number of rotations of the first motor 331 and/or to adjust the position of the second display screen 322 based on a seventh rotational direction and a seventh number of rotations of the second motor 334.
That is, the microprocessor 37 may determine a sixth rotation direction and a sixth rotation number according to the adjustment value of the first screen-to-eye distance indicated by the second indication information, and further fine-tune the position of the first display screen 312 through the first motor 331; alternatively, the microprocessor 37 may determine the seventh rotation direction and the seventh rotation number according to the adjusted value of the second eye distance indicated by the second indication information, and then finely adjust the position of the second display screen 322 by the second motor 334.
It can be seen that, after the user wears the head display 300, if the image of the display screen is not clear, the user can input the first indication information in the form of voice or posture through the indication input component 38 to perform fine adjustment on the visual parameters, so as to perform fine adjustment on the position of the display screen.
Thus, the head display 300 provided by the embodiment of the present application can adjust the position of the display screen according to the accurate vision information input by the user or stored in the memory 36 before being worn by the user; after the user wears the head display, the position of the display screen can be fine-tuned according to the indication information triggered by the user by voice, posture, or the like, so that the user can observe a clearer image through the adjusted display screen.
Compared with the prior-art manner of manually adjusting the focal length, the head display 300 provided in the embodiment of the present application does not affect the user's field of view, and does not require the user to make manual adjustments while vision is blocked after the head display is worn; the adjustment is therefore not hindered by blocked vision, the adjustment manner is more convenient, and the user experience is improved.
There is also a prior-art manner of adjusting the focal length directly by voice; this manner requires repeatedly adjusting the position of the lens before the image becomes reasonably clear, so the adjustment process is long and the adjustment speed is slow. The head display 300 provided in the embodiment of the present application can first coarsely adjust the position of the display screen according to the accurate vision parameters input by the user, so that the display screen reaches a relatively accurate position and the image on the display screen is clearer; it can then fine-tune the position of the display screen according to the indication information triggered by the user by voice, posture, or the like, so that the image on the adjusted display screen is clearer still, and the adjustment process is simple and fast, further improving the user experience.
Further, the head display 300 may also fine-tune the eyepiece spacing, similar to fine-tuning the position of the display screen.
Specifically, the instruction input section 38 may be further configured to detect third instruction information of the user, the third instruction information being used to instruct to increase or decrease the interpupillary distance parameter or the ocular distance, when the wearing detection section 310 detects that the terminal 30 has been worn by the user. The microprocessor 37 can also be used to determine an eighth rotational direction and an eighth number of rotations of the third motor 343 based on the third indication. The second adjustment member 34 is also used to adjust the eyepiece spacing according to the eighth rotational direction and the eighth number of rotations of the third motor 343. The third indication information may specifically be voice and/or posture information. Illustratively, referring to FIG. 19, the user may indicate a need to decrease the eyepiece spacing by a pinch gesture. Illustratively, the user may indicate "close to two eyepieces" by voice.
The pupil distance parameter or the eyepiece distance that is increased or decreased and indicated by the third indication information may be similar to the distance to be adjusted, and as for the process that the microprocessor 37 is configured to determine the eighth rotation direction and the eighth rotation number of the third motor 343 according to the third indication information, reference may be specifically made to the process that the microprocessor 37 determines the first rotation direction and the first rotation number of the first motor 331 according to the distance to be adjusted, which is not described herein again.
Thus, the head display 300 provided by the embodiment of the application can adjust the eyepiece spacing according to the accurate pupil distance parameter input by the user before the head display is worn; after the user wears the head display, the eyepiece spacing can be fine-tuned according to indication information triggered by the user through non-manual means such as voice and posture, so that the adjusted eyepiece spacing better matches the actual pupil distances of different users, the head display is more comfortable to wear, and the visual experience of the user is improved.
In addition, after the position of the display screen/the distance between the eyepieces is finely adjusted, the head display 300 can also trigger prompt information, which is not described herein again.
Specifically, the instruction input unit 38 and the visual input unit 35 in the embodiment of the present application may be the same unit or may be different units.
In addition, when using the head display 300, after observing a clear image the user can trigger the head display 300 to display, on the display screen, the currently adjusted value of the screen-eye distance and the value of the eyepiece spacing.
It should be noted that, in the embodiment of the present application, the vision parameter stored in the memory 36 may be a vision parameter adjusted by the head display 300 according to the first indication information; the numerical value of the interpupillary distance stored in the memory 36 may be a numerical value of the interpupillary distance adjusted based on the third instruction information. Thus, when the head display 300 adjusts the position of the display screen or the distance between the eyepieces according to the vision information stored in the memory, the adjustment can be performed more accurately and efficiently according to the vision information adjusted in the last use.
For example, the vision parameters stored in the memory 36 may be the vision parameters as adjusted by voice or posture indication during the user's last use. For example, if the user inputs a left-eye myopia parameter of 100 degrees through the vision input component 35 and then instructs, through the indication input component 38, to increase the left-eye myopia parameter by 3 degrees, the left-eye vision parameter of the user stored in the memory 36 is 103 degrees of myopia. Therefore, the next time the user uses the device, the position of the display screen and the eyepiece spacing can be adjusted directly, accurately and efficiently according to the vision parameters corresponding to the user's identification information stored in the memory 36, without requiring the user to input the vision parameters again.
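A per-user record of this kind might be kept along the following lines; the data structure, field names, and default values are purely illustrative assumptions and not the patent's storage format.

```python
# Illustrative per-user storage of vision information keyed by identification
# information; the layout and field names are assumptions.

from dataclasses import dataclass

@dataclass
class VisionProfile:
    left_eye_myopia_degrees: float = 0.0
    right_eye_myopia_degrees: float = 0.0
    pupil_distance_mm: float = 63.0  # assumed placeholder default

profiles: dict[str, VisionProfile] = {}

def save_adjusted_profile(user_id: str, profile: VisionProfile) -> None:
    """Store the parameters as adjusted this session so the next session reuses them."""
    profiles[user_id] = profile

def load_profile(user_id: str):
    return profiles.get(user_id)

# Example mirroring the text: input 100 degrees, then indicate +3 degrees.
p = VisionProfile(left_eye_myopia_degrees=100.0)
p.left_eye_myopia_degrees += 3.0     # stored value becomes 103 degrees
save_adjusted_profile("user-A", p)
```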
Alternatively, in the embodiment of the present application, the vision parameters stored in the memory 36 may also be the vision parameters input by the user detected by the vision input unit 35; the value of the eye distance stored in the memory 36 may also be the value of the eye distance before fine adjustment; the numerical value of the interpupillary distance stored in the memory 36 may be an interpupillary distance parameter input by the user and detected by the visual input unit 35.
In addition, in this embodiment of the application, in a possible case, after the user finishes using this time, the head display 300 may restore the position of the display screen and the distance between the eyepieces to preset values. In this case, when the user uses the display device again, the head display 300 may adjust the position of the display screen and the distance between the eyepieces according to the adjustment information corresponding to the identification information of the user.
In another possible case, after the user finishes the use, the head display 300 may keep the position of the display screen and the distance between the eyepieces adjusted this time. In this case, if the user does not use the head display 300 before reusing the head display, the position of the display screen and the distance between the eyepieces in the head display 300 are still the values adjusted by the user when using the head display last time, and when the user reuses the head display, the user does not need to adjust the position of the display screen and the distance between the eyepieces, and can directly observe clear images. If the user uses the head display 300 before the user uses the head display again, the position of the display screen and the distance between the eyepieces in the head display 300 may be changed, and when the user uses the head display again, the position of the display screen and the distance between the eyepieces may be accurately and efficiently adjusted according to the visual information corresponding to the identification information of the user stored in the memory 36.
Another embodiment of the present application further provides a terminal 60, referring to fig. 20, including: a vision input component 61 for detecting vision information input by the user, the vision information including a vision parameter of the left eye and/or a vision parameter of the right eye; a memory 62 for storing vision information corresponding to the identification information of the user; and a microprocessor 63 configured to determine a screen-eye distance reference value according to the vision information detected by the vision input component 61 or the vision information stored in the memory 62, where the screen-eye distance reference value includes a first screen-eye distance reference value and/or a second screen-eye distance reference value, the first screen-eye distance is the distance between the first display screen and the left-eye reference position, and the second screen-eye distance is the distance between the second display screen and the right-eye reference position; and to adjust the position of the display screen according to the screen-eye distance reference value, so that the distance between the adjusted position of the first display screen and the left-eye reference position is equal to the first screen-eye distance reference value, or the distance between the adjusted position of the second display screen and the right-eye reference position is equal to the second screen-eye distance reference value.
The embodiment of the present application further provides a position adjustment method, which may be applied to the terminal 30 shown in fig. 4 to 19 and the terminal 60 shown in fig. 20, and referring to fig. 21, the method may include:
101. the terminal determines the reference value of the screen-eye distance according to the vision information of the user, wherein the vision information comprises the vision information input by the user or the vision information which is stored in the terminal and corresponds to the identification information of the user.
The vision information comprises vision parameters of a left eye and/or vision parameters of a right eye, the reference value of the screen eye distance comprises a first screen eye distance reference value and/or a second screen eye distance reference value, the first screen eye distance is the distance between the first display screen and the reference position of the left eye, and the second screen eye distance is the distance between the second display screen and the reference position of the right eye.
102. The terminal adjusts the position of the display screen according to the screen-eye distance reference value, so that the distance between the adjusted position of the first display screen and the left-eye reference position is equal to the first screen-eye distance reference value, or the distance between the adjusted position of the second display screen and the right-eye reference position is equal to the second screen-eye distance reference value.
Therefore, for users with vision defects, the method provided by the embodiment of the application can separately adjust the position of the first display screen corresponding to the left eye and the position of the second display screen corresponding to the right eye so that images are clear, according to the accurate vision information input by the user or the stored vision information corresponding to the user, and it does not narrow the user's field of view as the prior art does, so the user experience can be improved. In addition, the method provided by the embodiment of the application adjusts the position of the display screen through the rotation of a motor, without requiring the user to adjust the position of the display screen manually, so the adjustment manner is more convenient.
Further, the vision information may further include an interpupillary distance parameter, which is a reference value of an ocular distance, which is a distance between the first ocular and the second ocular. The first eyepiece includes a first display screen and the second eyepiece includes a second display screen, see fig. 22, the method may further comprise:
103. The terminal adjusts the eyepiece spacing according to the pupil distance parameter, so that the adjusted eyepiece spacing is equal to the pupil distance parameter.
Therefore, the method provided by the embodiment of the application can adjust the distance between the eyepieces according to the pupil distance parameters so as to adapt to the pupil distance conditions of different users and improve the visual experience of the users. Moreover, the adjustment mode is more convenient because manual adjustment is not needed.
Further, the method may further include:
104. after the terminal is worn by the user, the terminal receives indication information of the user.
105. The terminal, in response to the indication information, adjusts at least one of the position of the first display screen, the position of the second display screen, and the eyepiece spacing.
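Bringing steps 101 to 105 together, a high-level control flow for the terminal might look like the following sketch; the terminal object and all method names here are illustrative assumptions that stand in for the components described above.

```python
# High-level sketch of the position adjustment method (steps 101-105).
# The `terminal` object and its methods are hypothetical stand-ins for the
# vision input component, memory, microprocessor and adjusting members.

def adjust_before_wearing(terminal, vision_info):
    """Steps 101-103: coarse adjustment from the vision information."""
    ref_left = terminal.screen_eye_reference(vision_info.left_eye)      # step 101
    ref_right = terminal.screen_eye_reference(vision_info.right_eye)
    terminal.move_first_display_to(ref_left)                            # step 102
    terminal.move_second_display_to(ref_right)
    if vision_info.pupil_distance_mm is not None:
        terminal.set_eyepiece_spacing(vision_info.pupil_distance_mm)    # step 103

def fine_tune_after_wearing(terminal):
    """Steps 104-105: fine adjustment from indication information."""
    while terminal.is_worn():
        indication = terminal.wait_for_indication()                     # step 104
        if indication is None:
            break
        terminal.apply_indication(indication)                           # step 105
```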
In this way, the terminal can coarsely adjust the position of the display screen or the eyepiece spacing according to the vision information before being worn by the user, and fine-tune the position of the display screen or the eyepiece spacing after being worn by the user, so that the image is clearer and the user experience is further improved.
Another embodiment of the present application further provides a terminal 70, referring to fig. 23, where the terminal 70 may include: a processor 71, a memory 72, a bus 73, and a communication interface 74. The memory 72 is used for storing computer-executable instructions, the processor 71 is connected to the memory 72 through the bus 73, and when the terminal 70 runs, the processor 71 executes the computer-executable instructions stored in the memory 72 to make the terminal 70 perform the position adjustment method described above.
While the present application has been described in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a review of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely exemplary of the present application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the present application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (22)

1. A terminal (30), characterized in that it comprises a wearable device, said terminal comprising a first eyepiece (31), a second eyepiece (32) and a first adjustment member (33);
the first eyepiece (31) comprises a first lens barrel (311) and a first display screen (312) arranged in the first lens barrel (311), and the first display screen (312) is connected with the first lens barrel (311) in a sliding manner;
the second eyepiece (32) comprises a second lens barrel (321) and a second display screen (322) arranged in the second lens barrel (321), and the second display screen (322) is connected with the second lens barrel (321) in a sliding manner;
the first adjusting member (33) comprises a first motor (331), a first rotating shaft (332) and a first rotating drum (333) which is arranged perpendicular to the first display screen (312), the first rotating drum (333) is connected with the first motor (331), one end of the first rotating shaft (332) is matched with the first rotating drum (333) through threads, and the other end of the first rotating shaft (332) is fixedly connected with the first display screen (312);
the first adjusting member (33) further comprises a second motor (334), a second rotating shaft (335) and a second rotating drum (336) which is perpendicular to the second display screen (322), the second rotating drum (336) is connected with the second motor (334), one end of the second rotating shaft (335) is in threaded fit with the second rotating drum (336), and the other end of the second rotating shaft (335) is fixedly connected with the second display screen (322);
the terminal (30) further comprises a microprocessor (37); the microprocessor (37) determines a reference value of the screen-eye distance according to vision information, wherein the vision information comprises at least one of a vision parameter and a pupil distance parameter, the vision parameter comprises a left-eye vision parameter and/or a right-eye vision parameter, the reference value of the screen-eye distance comprises a first reference value of the screen-eye distance and/or a second reference value of the screen-eye distance, the first screen-eye distance is the distance between the first display screen and the reference position of the left eye, and the second screen-eye distance is the distance between the second display screen and the reference position of the right eye; and determining a first direction of rotation and a first number of turns of the first motor (331) based on the first eye reference value, and/or determining a second direction of rotation and a second number of turns of the second motor (334) based on the second eye reference value; the microprocessor (37) is used for controlling the first adjusting component (33) to adjust the position of the first display screen (312) according to the first rotating direction and the first rotating number of turns of the first motor (331) according to the vision information, so that the distance between the adjusted position of the first display screen (312) and the left-eye reference position is equal to the first screen-to-eye distance reference value; and/or the microprocessor (37) is used for controlling the first adjusting component (33) to adjust the position of the second display screen (322) according to the second rotating direction and the second rotating number of turns of the second motor (334) according to the vision information, so that the distance between the adjusted position of the second display screen (322) and the right eye reference position is equal to the second screen-to-eye distance reference value;
wherein the microprocessor (37) for determining the first and/or second eye distance reference values from the vision information comprises:
determining the first screen eye distance reference value according to the vision parameter of the left eye and the first expression;
determining the second screen eye distance reference value according to the vision parameter of the right eye and the first expression;
the first expression is as follows:
u_i = 1/(D_i - 1/k)
wherein k represents a constant; when u_i represents said first screen-eye distance reference value, D_i represents the focal power of the left eye; when u_i represents said second screen-eye distance reference value, D_i represents the focal power of the right eye.
2. A terminal (30) according to claim 1, wherein the terminal (30) further comprises a second adjustment member (34), the second adjustment member (34) comprising a third motor (343), a transmission assembly (344) and a screw assembly (345), the screw assembly (345) being in threaded connection with the first barrel (311) and the second barrel (321);
the third motor (343) drives the screw assembly (345) to rotate through the transmission assembly (344) so as to enable the first lens barrel (311) and the second lens barrel (321) to approach or move away from each other.
3. The terminal (30) of claim 2, wherein the transmission assembly (344) includes an output shaft (41) of the third motor (343) and a worm gear (42), the output shaft (41) is a worm, one end of the worm gear (42) is engaged with the worm, and the other end of the worm gear (42) is connected to the screw assembly (345).
4. A terminal (30) according to claim 3, wherein the screw assembly (345) comprises a first screw (51), a second screw (52) and a third screw (53), one end of the worm gear (42) is engaged with the thread on the surface of the first screw (51), two ends of the first screw (51) are respectively provided with gears, one ends of the second screw (52) and the third screw (53) are respectively provided with gears, two ends of the first screw (51) are respectively engaged with the second screw (52) and the third screw (53) through gears, and one ends of the second screw (52) and the third screw (53) which are not provided with gears are respectively in threaded connection with the first lens barrel (311) and the second lens barrel (321).
5. The terminal (30) according to claim 2, wherein the screw assembly (345) comprises a fourth screw (54) and a fifth screw (55), one end of the fourth screw (54) and one end of the fifth screw (55) are respectively in threaded connection with the first barrel (311) and the second barrel (321), the transmission assembly (344) comprises an output shaft (41) of the third motor (343), a connecting rod (43), a first bevel gear (44), a second bevel gear (45), a third bevel gear (46), a fourth bevel gear (47) and a fifth bevel gear (48), the first bevel gear (44) is sleeved on the output shaft (41), the second bevel gear (45) and the third bevel gear (46) are arranged at two ends of the connecting rod (43), the first bevel gear (44) is engaged with the second bevel gear (45), the fourth bevel gear (47) is connected with the other end of the fourth screw (54), the fifth bevel gear (48) is connected with the other end of the fifth screw (55), and the third bevel gear (46) is meshed with the fourth bevel gear (47) and the fifth bevel gear (48) respectively.
6. A terminal (30) according to any of claims 1-5, characterized in that the inner wall of the first barrel (311) is provided with at least one first runner (313) and the inner wall of the second barrel (321) is provided with at least one second runner (323);
the first display screen (312) is connected with the first lens barrel (311) in a sliding mode through the at least one first sliding chute (313), and the second display screen (322) is connected with the second lens barrel (321) in a sliding mode through the at least one second sliding chute (323).
7. The terminal (30) of claim 6, wherein the terminal (30) further comprises a vision input component (35);
the vision input component (35) is used for detecting vision information input by a user.
8. The terminal (30) of claim 7, wherein the terminal (30) further comprises a memory (36) for storing the vision information corresponding to the user's identification information.
9. The terminal (30) according to claim 7, wherein the microprocessor (37) is configured to control the second adjusting means (34) to perform an adjusting operation according to the vision information, and specifically comprises:
the microprocessor (37) determines a third rotation direction and a third rotation number of a third motor (343) according to the pupil distance parameter;
the second adjusting member (34) is used for adjusting the distance between the first ocular (31) and the second ocular (32) according to the third rotating direction and the third rotating number of turns of the third motor (343), so that the adjusted distance between the first ocular (31) and the second ocular (32) is equal to the interpupillary distance parameter.
10. The terminal (30) of claim 7, wherein the vision input component (35) includes at least one of a voice input unit and a manual input unit.
11. The terminal (30) of claim 7, wherein the terminal (30) further comprises:
-wear detection means (310) for detecting whether the terminal (30) has been worn by a user.
12. The terminal (30) of claim 11, wherein the terminal (30) further comprises:
indication input means (38) for detecting first indication information input by a user when the wearing detection means (310) detects that the terminal (30) has been worn by the user, the first indication information being for indicating adjustment of a vision parameter for a left eye and/or a vision parameter for a right eye;
the microprocessor (37) is further configured to determine an adjustment value of the first eye distance and/or an adjustment value of the second eye distance according to the first indication information;
determining a fourth rotation direction and a fourth number of rotations of the first motor (331) according to the adjusted value of the first eye distance, and/or determining a fifth rotation direction and a fifth number of rotations of the second motor (334) according to the adjusted value of the second eye distance;
the first adjustment member (33) is further configured to adjust the position of the first display (312) according to the fourth rotational direction and the fourth number of rotations of the first motor (331) and/or to adjust the position of the second display (322) according to the fifth rotational direction and the fifth number of rotations of the second motor (334).
13. The terminal (30) of claim 11, wherein the terminal (30) further comprises:
an indication input component (38) for detecting second indication information input by the user when the wear detection component (310) detects that the terminal (30) has been worn by the user, the second indication information being used for indicating an increase or decrease of the first screen-eye distance and/or the second screen-eye distance, or for indicating that the position of the first display screen (312) and/or the second display screen (322) is to be adjusted forward or backward;
the microprocessor (37) is further configured to determine a sixth rotation direction and a sixth number of rotations of the first motor (331) according to the second indication information, and/or determine a seventh rotation direction and a seventh number of rotations of the second motor (334) according to the second indication information;
the first adjusting member (33) is further configured to adjust the position of the first display screen (312) according to the sixth rotation direction and the sixth number of rotations of the first motor (331), and/or to adjust the position of the second display screen (322) according to the seventh rotation direction and the seventh number of rotations of the second motor (334).
14. The terminal (30) of claim 11, wherein the terminal (30) further comprises:
an indication input component (38) for detecting third indication information input by the user when the wear detection component (310) detects that the terminal (30) has been worn by the user, the third indication information being used for indicating an increase or decrease of the interpupillary distance parameter or of the eyepiece distance;
the microprocessor (37) is further configured to determine an eighth rotation direction and an eighth number of rotations of the third motor (343) according to the third indication information;
the second adjusting member (34) is further configured to adjust the eyepiece distance according to the eighth rotation direction and the eighth number of rotations of the third motor (343).
15. The terminal (30) according to any one of claims 12-14, wherein the indication input component (38) comprises at least one of a voice indication input component and a posture indication input component;
wherein the voice indication input component comprises a microphone, and the posture indication input component comprises a camera.
16. The terminal (30) of claim 15, wherein the indication input component (38) is the same component as the vision input component (35).
17. The terminal (30) of claim 16, wherein the terminal (30) further comprises:
an identification input component (39) for detecting identification information input by the user, the identification information of the user comprising at least one of text information, voice information, fingerprint information, iris information and facial information corresponding to the user.
18. A terminal (60), characterized in that it comprises a wearable device, said terminal comprising:
a vision input component (61) for detecting vision information input by a user, the vision information comprising a vision parameter of a left eye and/or a vision parameter of a right eye;
a memory (62) for storing the vision information corresponding to identification information of the user;
a microprocessor (63) for determining a screen-eye distance reference value according to the vision information detected by the vision input component (61) or the vision information stored in the memory (62), wherein the screen-eye distance reference value comprises a first screen-eye distance reference value and/or a second screen-eye distance reference value, the first screen-eye distance is a distance between a first display screen and a reference position of the left eye, and the second screen-eye distance is a distance between a second display screen and a reference position of the right eye;
the microprocessor (63) is further configured to determine a first rotation direction and a first number of rotations of a first motor according to the first screen-eye distance reference value, and/or determine a second rotation direction and a second number of rotations of a second motor according to the second screen-eye distance reference value;
the microprocessor (63) is further configured to control, according to the vision information, a first adjusting member to adjust the position of the first display screen according to the first rotation direction and the first number of rotations of the first motor, so that the distance between the adjusted position of the first display screen and the left-eye reference position is equal to the first screen-eye distance reference value; and/or to control, according to the vision information, the first adjusting member to adjust the position of the second display screen according to the second rotation direction and the second number of rotations of the second motor, so that the distance between the adjusted position of the second display screen and the right-eye reference position is equal to the second screen-eye distance reference value;
wherein the microprocessor (63) being configured to determine the screen-eye distance reference value according to the vision information detected by the vision input component (61) or the vision information stored in the memory (62) comprises:
determining the first screen-eye distance reference value according to the vision parameter of the left eye and a first expression;
determining the second screen-eye distance reference value according to the vision parameter of the right eye and the first expression;
the first expression being as follows:
u_i = 1/(D_i - 1/k)
wherein k represents a constant; when u_i represents the first screen-eye distance reference value, D_i represents the focal power of the left eye; when u_i represents the second screen-eye distance reference value, D_i represents the focal power of the right eye.
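As a numeric reading of the first expression, the sketch below evaluates u_i = 1/(D_i - 1/k) under the assumption that D_i is the eye's refractive error in dioptres (negative for myopia), k is a length in metres treated here as the eyepiece focal length, and the magnitude of the result is read as the screen-eye distance reference value; these units and the k = 0.05 m figure are interpretations, not values given in the claim. The method claim that follows uses the same expression, so the same evaluation applies there.

```python
def screen_eye_distance_m(d_i_diopters: float, k_m: float = 0.05) -> float:
    """Evaluate the first expression, u_i = 1/(D_i - 1/k).

    Assumed units (not stated in the claim): D_i in dioptres, k in metres,
    and the magnitude of u_i is taken as the screen-eye distance reference value."""
    u_i = 1.0 / (d_i_diopters - 1.0 / k_m)
    return abs(u_i)

# With the assumed k = 0.05 m:
print(round(screen_eye_distance_m(0.0), 4))   # 0.05   -> an emmetropic eye keeps the screen at the focal plane
print(round(screen_eye_distance_m(-3.0), 4))  # 0.0435 -> a -3.00 D myopic eye moves the screen slightly closer
```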
19. A position adjusting method, applied to a wearable device, the method comprising:
the terminal determines a screen-eye distance reference value according to vision information of a user, wherein the vision information comprises vision information input by the user or vision information which corresponds to identification information of the user and is stored in the terminal, the vision information comprises a vision parameter of a left eye and/or a vision parameter of a right eye, the screen-eye distance reference value comprises a first screen-eye distance reference value and/or a second screen-eye distance reference value, the first screen-eye distance is a distance between a first display screen and a reference position of the left eye, and the second screen-eye distance is a distance between a second display screen and a reference position of the right eye;
the terminal determines a first rotation direction and a first number of rotations of a first motor according to the first screen-eye distance reference value, and/or determines a second rotation direction and a second number of rotations of a second motor according to the second screen-eye distance reference value;
the terminal controls, according to the vision information, a first adjusting member to adjust the position of the first display screen according to the first rotation direction and the first number of rotations of the first motor, so that the distance between the adjusted position of the first display screen and the left-eye reference position is equal to the first screen-eye distance reference value; and/or controls, according to the vision information, the first adjusting member to adjust the position of the second display screen according to the second rotation direction and the second number of rotations of the second motor, so that the distance between the adjusted position of the second display screen and the right-eye reference position is equal to the second screen-eye distance reference value;
wherein the terminal determining the screen-eye distance reference value according to the vision information of the user comprises:
determining the first screen-eye distance reference value according to the vision parameter of the left eye and a first expression;
determining the second screen-eye distance reference value according to the vision parameter of the right eye and the first expression;
the first expression being as follows:
u_i = 1/(D_i - 1/k)
wherein k represents a constant; when u_i represents the first screen-eye distance reference value, D_i represents the focal power of the left eye; when u_i represents the second screen-eye distance reference value, D_i represents the focal power of the right eye.
20. The position adjusting method according to claim 19, wherein the vision information further comprises an interpupillary distance parameter, the interpupillary distance parameter being a reference value of an eyepiece distance, the eyepiece distance being a distance between a first eyepiece and a second eyepiece, the first eyepiece comprising the first display screen, the second eyepiece comprising the second display screen, and the method further comprises:
the terminal adjusts the eyepiece distance according to the interpupillary distance parameter, so that the adjusted eyepiece distance is equal to the interpupillary distance parameter.
21. The position adjusting method according to claim 19 or 20, characterized in that the method further comprises:
after the terminal is worn by a user, the terminal receives indication information of the user;
in response to the indication information, the terminal adjusts at least one of the position of the first display screen, the position of the second display screen, and the eyepiece distance.
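Claims 19-21 together describe an in-wear adjustment loop: once the device is worn, a user indication selects which of the three quantities to move. The sketch below is a hypothetical dispatch of that indication; the method names on 'terminal' and the indication strings are inventions for illustration and do not come from the patent.

```python
def handle_indication(indication: str, terminal) -> None:
    """Route a post-wear user indication to one of the three adjustments
    named in claim 21. 'terminal' is assumed to expose the three methods
    called below; none of these names or strings come from the patent."""
    if indication in ("first screen forward", "first screen backward"):
        terminal.adjust_first_display_screen(forward=indication.endswith("forward"))
    elif indication in ("second screen forward", "second screen backward"):
        terminal.adjust_second_display_screen(forward=indication.endswith("forward"))
    elif indication in ("widen eyepieces", "narrow eyepieces"):
        terminal.adjust_eyepiece_distance(increase=indication.startswith("widen"))
    else:
        raise ValueError(f"unrecognised indication: {indication!r}")
```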
22. A terminal (70), characterized in that it comprises a wearable device, the terminal comprising: a processor (71) and a memory (72); the memory (72) is configured to store computer-executable instructions, and when the terminal (70) runs, the processor (71) executes the computer-executable instructions stored in the memory (72), so as to cause the terminal (70) to perform the position adjusting method according to any one of claims 19-21.
CN201780011872.XA 2016-12-26 2017-07-11 Position adjusting method and terminal Active CN108700745B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN201611219941X 2016-12-26
CN201611219941 2016-12-26
CN201710370609 2017-05-23
CN2017103706091 2017-05-23
PCT/CN2017/092541 WO2018120751A1 (en) 2016-12-26 2017-07-11 Position adjusting method and terminal

Publications (2)

Publication Number Publication Date
CN108700745A CN108700745A (en) 2018-10-23
CN108700745B true CN108700745B (en) 2020-10-09

Family

ID=62706855

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780011872.XA Active CN108700745B (en) 2016-12-26 2017-07-11 Position adjusting method and terminal

Country Status (3)

Country Link
JP (1) JP6997193B2 (en)
CN (1) CN108700745B (en)
WO (1) WO2018120751A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110308560B (en) * 2019-07-03 2022-09-30 南京玛克威信息科技有限公司 Control method of VR equipment
CN110579878B (en) * 2019-07-05 2021-09-14 华为技术有限公司 Augmented reality AR glasses
CN113316738A (en) * 2019-08-06 2021-08-27 松下知识产权经营株式会社 Display device
CN111830712B (en) * 2020-07-07 2023-03-31 Oppo广东移动通信有限公司 Intelligent glasses, assembled glasses, control method and computer storage medium
CN114675417A (en) * 2020-12-24 2022-06-28 华为技术有限公司 Display module and virtual image position adjusting method and device
CN113204120A (en) * 2021-05-26 2021-08-03 怀智悦 Intelligent glasses of multi-functional rehabilitation training
CN113655588A (en) * 2021-07-13 2021-11-16 深圳远见睿视科技有限公司 Adaptive lens expansion control method, device, equipment and storage medium
CN113974253B (en) * 2021-09-22 2023-06-23 天津(滨海)人工智能军民融合创新中心 Multidirectional adjusting and fixing device of helmet head display mechanism
CN116482857A (en) * 2022-01-13 2023-07-25 北京字跳网络技术有限公司 Head-mounted electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103487940A (en) * 2013-09-04 2014-01-01 西安Tcl软件开发有限公司 Video glasses and method for adjusting focal length of video glasses
CN205003364U (en) * 2015-08-27 2016-01-27 王集森 Display device is worn to interpupillary distance regulation formula
CN205826969U (en) * 2016-06-25 2016-12-21 深圳市虚拟现实科技有限公司 A kind of self adaptation nearly eye display device

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3298977B2 (en) * 1992-06-22 2002-07-08 ソニー株式会社 Glasses type display device
JPH0882762A (en) * 1994-07-15 1996-03-26 Sega Enterp Ltd On-head type video display device and video display system using it
JPH08111834A (en) * 1994-10-12 1996-04-30 Olympus Optical Co Ltd Head mounted video image display device
JPH08130695A (en) * 1994-11-01 1996-05-21 Olympus Optical Co Ltd Head-mounted video display device
JPH08160344A (en) * 1994-12-05 1996-06-21 Olympus Optical Co Ltd Head mounted video display device
JP3433558B2 (en) * 1995-02-28 2003-08-04 ソニー株式会社 Display device
JPH08286144A (en) * 1995-04-14 1996-11-01 Canon Inc Picture observation device and observation equipment using the device
JP4965800B2 (en) 2004-10-01 2012-07-04 キヤノン株式会社 Image display system
CN101452117B (en) * 2004-11-24 2011-01-26 寇平公司 Binocular display system and method thereof
JP2012194501A (en) 2011-03-18 2012-10-11 Brother Ind Ltd Head-mounted display and virtual image presentation method
JP5682417B2 (en) 2011-03-31 2015-03-11 ブラザー工業株式会社 Head mounted display and its brightness adjustment method
JP5953963B2 (en) 2012-06-13 2016-07-20 ソニー株式会社 Head-mounted image display device
US9529194B2 (en) 2013-11-21 2016-12-27 Samsung Electronics Co., Ltd. Head-mounted display apparatus
US9274340B2 (en) * 2014-02-18 2016-03-01 Merge Labs, Inc. Soft head mounted display goggles for use with mobile computing devices
CN106569338B (en) * 2014-12-31 2019-04-12 青岛歌尔声学科技有限公司 A kind of Worn type display
CN104898282B (en) * 2015-05-12 2017-09-12 北京小鸟看看科技有限公司 A kind of method of head mounted display and its Diopter accommodation
CN104932103B (en) * 2015-06-02 2017-10-10 青岛歌尔声学科技有限公司 A kind of adjustable head-worn display
CN205067870U (en) * 2015-11-03 2016-03-02 上海乐相科技有限公司 Wear -type virtual reality equipment
CN205333966U (en) * 2015-12-11 2016-06-22 深圳纳德光学有限公司 Head -mounted display
CN105954875A (en) * 2016-05-19 2016-09-21 华为技术有限公司 VR (Virtual Reality) glasses and adjustment method thereof
CN106019590A (en) * 2016-07-05 2016-10-12 上海乐相科技有限公司 Virtual reality device spacing adjustment method
CN106199967A (en) * 2016-08-13 2016-12-07 华勤通讯技术有限公司 Image reconstructor and head-mounted display

Also Published As

Publication number Publication date
JP6997193B2 (en) 2022-01-17
WO2018120751A1 (en) 2018-07-05
JP2020504839A (en) 2020-02-13
CN108700745A (en) 2018-10-23

Similar Documents

Publication Publication Date Title
CN108700745B (en) Position adjusting method and terminal
US10816752B2 (en) Virtual reality helmet and control method thereof
US10271722B2 (en) Imaging to facilitate object observation
JP6507241B2 (en) Head-mounted display device and vision assistance method using the same
TWI697692B (en) Near eye display system and operation method thereof
US11413211B2 (en) Vision training device
CN106233328B (en) Apparatus and method for improving, enhancing or augmenting vision
US9465237B2 (en) Automatic focus prescription lens eyeglasses
TWI619967B (en) Adjustable virtual reality device capable of adjusting display modules
CN106309089B (en) VR vision correction procedure and device
CN207799241U (en) A kind of optical system of focal length continuously adjustable for headset equipment
US9961257B2 (en) Imaging to facilitate object gaze
JP6461425B2 (en) Glasses with video projection function
JP2010540005A (en) Methods and devices for prevention and treatment of myopia and fatigue
WO2018191846A1 (en) Head-mounted display device and adaptive diopter adjustment method
JP6422954B2 (en) Adjusting the focal length
CN104090371A (en) 3D glasses and 3D display system
JP2020500689A (en) Vision training device for cognitive correction
JP2011145358A (en) Multi-focus electronic spectacles
KR20190049186A (en) Visual enhancement training device
JP2023006148A (en) Controller, spectacle lens device, glasses, method for control, and program
CN208822742U (en) A kind of subjective refraction instrument of simple judgment myopia of student
WO2016002296A1 (en) Optical control device and optical control method
CN209911643U (en) Vision aid
JP2023040827A (en) Wearable apparatus and glasses

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant