CN107544661B - Information processing method and electronic equipment - Google Patents


Publication number: CN107544661B
Application number: CN201610476251.6A
Authority: CN (China)
Prior art keywords: lens, rendering, astigmatism, parameters, user
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Other versions: CN107544661A
Inventor: 杨大业
Current and original assignee: Lenovo Beijing Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Application filed by Lenovo Beijing Ltd
Publication of application: CN107544661A; patent granted and published as CN107544661B


Abstract

The invention discloses an information processing method and electronic equipment. The method includes: projecting light rays of a light source through a lens to the eyeballs of a user, so that the user watches a display interface formed by the light rays; acquiring a gaze parameter, the gaze parameter being used to characterize position data of the eyeball; acquiring an astigmatism parameter of the lens; determining a rendering depth according to the gaze parameter and the astigmatism parameter of the lens; and rendering the display interface to be displayed according to the rendering depth.

Description

Information processing method and electronic equipment
Technical Field
The present invention relates to information processing technologies, and in particular, to an information processing method and an electronic device.
Background
With the growing popularity of Augmented Reality (AR) and Virtual Reality (VR) technologies, AR/VR applications have rapidly expanded from professional industrial use to consumer entertainment. Usage scenarios, once confined to relatively fixed locations such as design studios and laboratories, have spread into places of daily life, and mobile AR/VR application scenarios, such as games and education, are becoming increasingly rich. Because the usage scenarios and technical foundations of AR/VR devices differ greatly from those of traditional terminals such as notebook computers (PCs) and mobile phones, conventional input devices such as the mouse and keyboard cannot be applied to AR/VR equipment. Head-mounted eye tracking is a technique well suited to mobile AR/VR applications.
Display on a head-mounted VR device requires a large amount of computation. To achieve a comfortable visual experience, the frame rate and latency place high demands on rendering computation, so how to reduce computational cost as much as possible while guaranteeing picture quality is a problem that needs to be solved.
Disclosure of Invention
In order to solve the above technical problem, embodiments of the present invention provide an information processing method and an electronic device.
The information processing method provided by the embodiment of the invention comprises the following steps:
projecting light rays of a light source to eyeballs of a user through a lens so that the user can watch a display interface formed by the light rays;
acquiring a gaze parameter, the gaze parameter being used to characterize position data of an eyeball;
acquiring an astigmatism parameter of the lens;
determining a rendering depth according to the gaze parameter and the astigmatism parameter of the lens;
and rendering the display interface to be displayed according to the rendering depth.
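The steps above can be sketched as a small foveated-rendering pipeline. The following Python sketch is illustrative only: all function names and the Gaussian and radial falloff models are assumptions made for the example, not part of the patent.

```python
import math

def gaze_weight(x, y, gaze_x, gaze_y, sigma=0.25):
    # Gaussian falloff around the gaze point: ~1.0 at the gaze, ~0 far away.
    # (Illustrative model; the patent does not specify a falloff function.)
    d2 = (x - gaze_x) ** 2 + (y - gaze_y) ** 2
    return math.exp(-d2 / (2 * sigma ** 2))

def astigmatism_weight(x, y, cx=0.5, cy=0.5):
    # Lens-correction falloff: highest at the optical centre, lower at the rim,
    # mirroring the radius-dependent astigmatism the description mentions.
    r = math.hypot(x - cx, y - cy)
    return max(0.0, 1.0 - r / math.hypot(cx, cy))

def rendering_depth_map(width, height, gaze_x, gaze_y):
    # Combine gaze and astigmatism weights into one per-pixel depth map;
    # a simple product stands in here for the smooth-interpolation step.
    return [[gaze_weight((i + 0.5) / width, (j + 0.5) / height, gaze_x, gaze_y)
             * astigmatism_weight((i + 0.5) / width, (j + 0.5) / height)
             for i in range(width)] for j in range(height)]
```

With a centred gaze, `rendering_depth_map(8, 8, 0.5, 0.5)` yields its highest values near the middle of the map and low values at the corners, so distant regions can be rendered at reduced cost.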
In an embodiment of the present invention, the acquiring of the gaze parameter includes:
illuminating the eyeball area with light of a preset waveband, and reflecting the light of the preset waveband to a camera device through a reflecting device;
and determining the gaze parameters according to the image acquired by the camera device.
In an embodiment of the present invention, the determining a rendering depth according to the gaze parameter and the astigmatism parameter of the lens includes:
obtaining a gaze view according to the gaze parameters;
obtaining an astigmatism correction chart according to the astigmatism parameters of the lens;
and performing smooth interpolation processing on the gaze view and the astigmatism correction map to obtain a rendering map.
In the embodiment of the present invention, rendering the display interface to be displayed according to the rendering depth includes:
and projecting the light rays corresponding to the rendering map to the eyeballs of the user.
In the embodiment of the invention, the lighter a color in the rendering map, the higher the corresponding rendering depth; the darker the color, the lower the corresponding rendering depth.
The electronic device provided by the embodiment of the invention comprises:
the display unit is used for projecting light rays of the light source to eyeballs of a user through the lens so that the user can watch a display interface formed by the light rays;
the eye tracking unit is used for acquiring a gaze parameter, and the gaze parameter is used for representing position data of eyeballs;
the processing unit is used for acquiring an astigmatism parameter of the lens, and for determining a rendering depth according to the gaze parameter and the astigmatism parameter of the lens;
the display unit is further used for rendering the display interface to be displayed according to the rendering depth.
In an embodiment of the present invention, the eye tracking unit includes: an illumination device, a reflection device, and an imaging device, wherein:
the illumination device illuminates the eyeball area with light of a preset waveband and reflects the light of the preset waveband to the camera device through the reflection device;
the processing unit is further configured to determine the gaze parameter according to the image acquired by the camera.
In the embodiment of the present invention, the processing unit is further configured to obtain a gaze view according to the gaze parameter, obtain an astigmatism correction map according to the astigmatism parameter of the lens, and perform smooth interpolation processing on the gaze view and the astigmatism correction map to obtain a rendering map.
In the embodiment of the present invention, the display unit is further configured to project the light corresponding to the rendering map to the eyeball of the user.
In the embodiment of the invention, the lighter a color in the rendering map, the higher the corresponding rendering depth; the darker the color, the lower the corresponding rendering depth.
In the technical scheme of the embodiments of the invention, light from a light source is projected through a lens to the eyeball of the user, so that the user watches a display interface formed by the light; a gaze parameter characterizing position data of the eyeball is acquired; an astigmatism parameter of the lens is acquired; a rendering depth is determined according to the gaze parameter and the astigmatism parameter of the lens; and the display interface to be displayed is rendered according to the rendering depth. By combining the gaze parameter with the astigmatism parameter of the lens to determine the rendering depth, the method reduces the overall rendering computation requirement while guaranteeing picture quality, and can improve the visual effect without increasing the rendering computation requirement.
Drawings
Fig. 1 is a schematic flowchart of an information processing method according to a first embodiment of the present invention;
Fig. 2 is a schematic flowchart of an information processing method according to a second embodiment of the present invention;
Fig. 3 is a schematic flowchart of an information processing method according to a third embodiment of the present invention;
Fig. 4 is a schematic view of a head-mounted device according to an embodiment of the present invention;
Fig. 5 is a schematic view of various images according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of eye tracking according to an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention;
Fig. 8 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention.
Detailed Description
So that the features and aspects of the embodiments of the present invention can be understood in detail, a more particular description of the embodiments, briefly summarized above, is given below with reference to the embodiments, some of which are illustrated in the appended drawings.
Fig. 1 is a flowchart illustrating an information processing method according to a first embodiment of the present invention, where the information processing method in this example is applied to an electronic device, and as shown in fig. 1, the information processing method includes the following steps:
step 101: the light of the light source is projected to the eyeball of the user through the lens, so that the user can watch the display interface formed by the light.
In the embodiment of the present invention, the electronic device is a head-mounted device, such as smart glasses, a smart helmet, and the like. The electronic device is capable of implementing a VR display, for which purpose the electronic device has display means, such as a display screen and/or a projector.
Taking the case where the display device of the electronic equipment is a display screen as an example: the display screen is arranged on one side of the helmet, and when the user wears the helmet, the display screen lies within the user's field of view, so that the virtual picture displayed by the display screen is watched by the user through the lens, realizing the VR display.
Taking the case where the display device of the electronic equipment is a projector as an example: the projector is arranged on one side of the smart glasses, and when the user wears the smart glasses, the projector projects light through the lens to the user's eyes, so that the user sees a virtual picture formed by the light, which likewise realizes a VR display. The picture displayed by a display screen is a real image, whereas the picture displayed by a projector is a virtual image.
In the embodiment of the invention, the display device is provided with a light source that emits the light rays forming the picture; the display device projects these light rays through the lens to the eyeball of the user, so that the user watches the display interface formed by the light, where the display interface is a VR display.
Step 102: a gaze parameter is obtained, the gaze parameter being used to characterize position data of an eyeball.
In the embodiment of the present invention, the electronic device has an eye tracking apparatus, whose operating principle is as follows: invisible infrared light is emitted toward the user, and two built-in cameras then locate and capture the glint on the eyeball and the reflection from the retina; the direction in which the eyeball is gazing, that is, the position data of the eyeball, can be determined from the images captured by the cameras.
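As a toy illustration of this principle, the offset between the pupil centre and the corneal glint in a camera image can be mapped to a screen position with a per-axis linear calibration. The function names and the linear model below are assumptions made for the example; real eye trackers use considerably more elaborate models.

```python
def calibrate(samples):
    """Fit a per-axis linear map from pupil-glint offset to screen position.
    samples: list of ((dx, dy), (sx, sy)) calibration pairs.
    Simple 1-D least squares per axis (illustrative, not the patent's method)."""
    def fit(offsets, targets):
        n = len(offsets)
        mo = sum(offsets) / n
        mt = sum(targets) / n
        var = sum((o - mo) ** 2 for o in offsets)
        slope = sum((o - mo) * (t - mt) for o, t in zip(offsets, targets)) / var
        return slope, mt - slope * mo
    ax = fit([s[0][0] for s in samples], [s[1][0] for s in samples])
    ay = fit([s[0][1] for s in samples], [s[1][1] for s in samples])
    return ax, ay

def gaze_point(offset, calib):
    # Apply the fitted linear map to a new pupil-glint offset.
    (kx, bx), (ky, by) = calib
    return kx * offset[0] + bx, ky * offset[1] + by
```

This also shows why the per-user calibration step mentioned below is needed: the slopes and intercepts depend on each user's eye geometry and must be fitted from known fixation targets before gaze positions can be reported.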
Referring to fig. 4, fig. 4 is a schematic view of a head-mounted device. The user wears the head-mounted device on the head, and an eye tracking device is disposed on it. Because different users have different eye and head structures, after the eye tracking device is turned on it must first be calibrated individually for the user; once calibrated, the eye tracking device can accurately acquire the position data of the user's eyeballs.
Step 103: the astigmatism parameters of the lens are acquired.
In the embodiment of the present invention, the lens is an optical lens. Regarding the astigmatism parameters of the optical lens, it should be noted that the central thickness and the edge thickness of the optical lens are not the same: the thickness changes gradually as the radius increases, so the resulting astigmatism parameter changes as well, while the amount of astigmatism can be considered uniform at the same radius. At a given radius of the optical lens, the refractive index decreases as the wavelength of the incident light increases. When light of wavelength λ1 and light of wavelength λ2 enter the optical lens, then because λ1 and λ2 differ, the two are refracted to different degrees after passing through the lens, and astigmatism occurs. The astigmatism phenomenon is thus that light of different wavelengths, transmitted through the optical lens, is deflected by different amounts. In practice an image is formed by light of many wavelengths, which generates astigmatism when transmitted through the optical lens.
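The wavelength dependence described here can be illustrated numerically with the Cauchy approximation for the refractive index together with Snell's law. The Cauchy coefficients below are typical crown-glass values assumed for the example, not parameters from the patent.

```python
import math

def cauchy_index(wavelength_um, A=1.5046, B=0.00420):
    # Cauchy approximation n(lambda) = A + B / lambda^2 (lambda in micrometres):
    # the refractive index falls as the wavelength grows, as the text states.
    return A + B / wavelength_um ** 2

def refraction_angle(incidence_deg, wavelength_um):
    # Snell's law n1*sin(theta1) = n2*sin(theta2), going from air (n1 = 1)
    # into the lens material; returns the refraction angle in degrees.
    n = cauchy_index(wavelength_um)
    return math.degrees(math.asin(math.sin(math.radians(incidence_deg)) / n))
```

For a 30-degree incidence angle, blue light (about 0.45 µm) is bent more strongly than red light (about 0.65 µm), which is exactly the differing deflection of λ1 and λ2 that produces the astigmatism phenomenon described above.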
Based on this, once the physical dimensions of a specific lens are fixed, its astigmatism parameters are fixed as well; the astigmatism parameters of the lens can therefore be acquired directly.
Step 104: determining a rendering depth according to the gaze parameter and the astigmatism parameter of the lens; and rendering the display interface to be displayed according to the rendering depth.
In an embodiment of the present invention, the gaze parameter is combined with the astigmatism parameter of the lens to determine the rendering depth. Referring to fig. 5, A represents the astigmatism correction map of the lens, determined based on the astigmatism parameter of the lens; B represents the gaze view of the user, determined based on the gaze parameter; and C combines the astigmatism correction map of the lens with the gaze view. Here, the gaze view and the astigmatism correction map of the lens are processed by smooth interpolation to obtain a rendering map.
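One plausible reading of this smooth-interpolation step, sketched under assumptions (the blend weight and the 3x3 box kernel are illustrative choices, not the patent's stated method): blend the gaze view B with the astigmatism correction map A per pixel, then smooth the result so the rendering map has no hard level boundaries.

```python
def blend_maps(gaze_map, astig_map, alpha=0.5):
    # Per-pixel linear interpolation between the gaze view and the
    # astigmatism correction map; alpha is an assumed blend weight.
    return [[alpha * g + (1 - alpha) * a for g, a in zip(grow, arow)]
            for grow, arow in zip(gaze_map, astig_map)]

def box_smooth(grid):
    # 3x3 box filter (clamped at the edges) so the combined rendering map
    # transitions smoothly between high- and low-depth regions.
    h, w = len(grid), len(grid[0])
    out = []
    for j in range(h):
        row = []
        for i in range(w):
            vals = [grid[jj][ii]
                    for jj in range(max(0, j - 1), min(h, j + 2))
                    for ii in range(max(0, i - 1), min(w, i + 2))]
            row.append(sum(vals) / len(vals))
        out.append(row)
    return out
```

Applied in sequence, `box_smooth(blend_maps(B, A))` plays the role of the C map: a single rendering map whose values vary smoothly instead of jumping between the two source maps.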
In the embodiment of the invention, the lighter a color in the rendering map, the higher the corresponding rendering depth; the darker the color, the lower the corresponding rendering depth.
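This light-to-dark mapping can be made concrete by converting a rendering-map value into a per-region shading sample count; the maximum sample count and the rounding rule below are illustrative assumptions, not values from the patent.

```python
def samples_per_pixel(depth_value, max_samples=4):
    # Lighter (higher) rendering-map values get more shading samples;
    # darker regions are rendered at a reduced rate to save computation.
    # depth_value is assumed normalized to [0, 1].
    return max(1, round(depth_value * max_samples))
```

A renderer could then loop over the rendering map and spend `samples_per_pixel(v)` shading samples on each region, concentrating computation where the map is light.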
According to the technical scheme of the embodiment of the invention, the rendering depth is determined by combining the gaze parameter with the astigmatism parameter of the lens, which reduces the overall rendering computation requirement while guaranteeing picture quality, and can improve the visual effect without increasing the rendering computation requirement.
Fig. 2 is a flowchart illustrating an information processing method according to a second embodiment of the present invention, where the information processing method in this example is applied to an electronic device, and as shown in fig. 2, the information processing method includes the following steps:
step 201: the light of the light source is projected to the eyeball of the user through the lens, so that the user can watch the display interface formed by the light.
In the embodiment of the present invention, the electronic device is a head-mounted device, such as smart glasses, a smart helmet, and the like. The electronic device is capable of implementing a VR display, for which purpose the electronic device has display means, such as a display screen and/or a projector.
Taking the case where the display device of the electronic equipment is a display screen as an example: the display screen is arranged on one side of the helmet, and when the user wears the helmet, the display screen lies within the user's field of view, so that the virtual picture displayed by the display screen is watched by the user through the lens, realizing the VR display.
Taking the case where the display device of the electronic equipment is a projector as an example: the projector is arranged on one side of the smart glasses, and when the user wears the smart glasses, the projector projects light through the lens to the user's eyes, so that the user sees a virtual picture formed by the light, which likewise realizes a VR display. The picture displayed by a display screen is a real image, whereas the picture displayed by a projector is a virtual image.
In the embodiment of the invention, the display device is provided with a light source that emits the light rays forming the picture; the display device projects these light rays through the lens to the eyeball of the user, so that the user watches the display interface formed by the light, where the display interface is a VR display.
Step 202: illuminating the eyeball area with light of a preset waveband, and reflecting the light of the preset waveband to a camera device through a reflecting device; and determining the gaze parameter according to the images acquired by the camera device, the gaze parameter being used to characterize the position data of the eyeball.
In the embodiment of the present invention, the electronic device has an eye tracking apparatus. Referring to fig. 6, its operating principle is as follows: invisible infrared light is emitted toward the user, and two built-in cameras then locate and capture the glint on the eyeball and the reflection from the retina; the direction in which the eyeball is gazing, that is, the position data of the eyeball, can be determined from the images captured by the cameras.
Here, the display screen is located behind the lens and projects the light of the light source through the lens to the eyeball of the user, so that the user watches the display interface formed by the light.
Referring to fig. 4, fig. 4 is a schematic view of a head-mounted device. The user wears the head-mounted device on the head, and an eye tracking device is disposed on it. Because different users have different eye and head structures, after the eye tracking device is turned on it must first be calibrated individually for the user; once calibrated, the eye tracking device can accurately acquire the position data of the user's eyeballs.
Step 203: the astigmatism parameters of the lens are acquired.
In the embodiment of the present invention, the lens is an optical lens. Regarding the astigmatism parameters of the optical lens, it should be noted that the central thickness and the edge thickness of the optical lens are not the same: the thickness changes gradually as the radius increases, so the resulting astigmatism parameter changes as well, while the amount of astigmatism can be considered uniform at the same radius. At a given radius of the optical lens, the refractive index decreases as the wavelength of the incident light increases. When light of wavelength λ1 and light of wavelength λ2 enter the optical lens, then because λ1 and λ2 differ, the two are refracted to different degrees after passing through the lens, and astigmatism occurs. The astigmatism phenomenon is thus that light of different wavelengths, transmitted through the optical lens, is deflected by different amounts. In practice an image is formed by light of many wavelengths, which generates astigmatism when transmitted through the optical lens.
Based on this, once the physical dimensions of a specific lens are fixed, its astigmatism parameters are fixed as well; the astigmatism parameters of the lens can therefore be acquired directly.
Step 204: determining a rendering depth according to the gaze parameter and the astigmatism parameter of the lens; and rendering the display interface to be displayed according to the rendering depth.
In an embodiment of the present invention, the gaze parameter is combined with the astigmatism parameter of the lens to determine the rendering depth. Referring to fig. 5, A represents the astigmatism correction map of the lens, determined based on the astigmatism parameter of the lens; B represents the gaze view of the user, determined based on the gaze parameter; and C combines the astigmatism correction map of the lens with the gaze view. Here, the gaze view and the astigmatism correction map of the lens are processed by smooth interpolation to obtain a rendering map.
In the embodiment of the invention, the lighter a color in the rendering map, the higher the corresponding rendering depth; the darker the color, the lower the corresponding rendering depth.
According to the technical scheme of the embodiment of the invention, the rendering depth is determined by combining the gaze parameter with the astigmatism parameter of the lens, which reduces the overall rendering computation requirement while guaranteeing picture quality, and can improve the visual effect without increasing the rendering computation requirement.
Fig. 3 is a flowchart illustrating an information processing method according to a third embodiment of the present invention, where the information processing method in this example is applied to an electronic device, and as shown in fig. 3, the information processing method includes the following steps:
step 301: the light of the light source is projected to the eyeball of the user through the lens, so that the user can watch the display interface formed by the light.
In the embodiment of the present invention, the electronic device is a head-mounted device, such as smart glasses, a smart helmet, and the like. The electronic device is capable of implementing a VR display, for which purpose the electronic device has display means, such as a display screen and/or a projector.
Taking the case where the display device of the electronic equipment is a display screen as an example: the display screen is arranged on one side of the helmet, and when the user wears the helmet, the display screen lies within the user's field of view, so that the virtual picture displayed by the display screen is watched by the user through the lens, realizing the VR display.
Taking the case where the display device of the electronic equipment is a projector as an example: the projector is arranged on one side of the smart glasses, and when the user wears the smart glasses, the projector projects light through the lens to the user's eyes, so that the user sees a virtual picture formed by the light, which likewise realizes a VR display. The picture displayed by a display screen is a real image, whereas the picture displayed by a projector is a virtual image.
In the embodiment of the invention, the display device is provided with a light source that emits the light rays forming the picture; the display device projects these light rays through the lens to the eyeball of the user, so that the user watches the display interface formed by the light, where the display interface is a VR display.
Step 302: illuminating the eyeball area with light of a preset waveband, and reflecting the light of the preset waveband to a camera device through a reflecting device; and determining the gaze parameter according to the images acquired by the camera device, the gaze parameter being used to characterize the position data of the eyeball.
In the embodiment of the present invention, the electronic device has an eye tracking apparatus. Referring to fig. 6, its operating principle is as follows: invisible infrared light is emitted toward the user, and two built-in cameras then locate and capture the glint on the eyeball and the reflection from the retina; the direction in which the eyeball is gazing, that is, the position data of the eyeball, can be determined from the images captured by the cameras.
Here, the display screen is located behind the lens and projects the light of the light source through the lens to the eyeball of the user, so that the user watches the display interface formed by the light.
Referring to fig. 4, fig. 4 is a schematic view of a head-mounted device. The user wears the head-mounted device on the head, and an eye tracking device is disposed on it. Because different users have different eye and head structures, after the eye tracking device is turned on it must first be calibrated individually for the user; once calibrated, the eye tracking device can accurately acquire the position data of the user's eyeballs.
Step 303: the astigmatism parameters of the lens are acquired.
In the embodiment of the present invention, the lens is an optical lens. Regarding the astigmatism parameters of the optical lens, it should be noted that the central thickness and the edge thickness of the optical lens are not the same: the thickness changes gradually as the radius increases, so the resulting astigmatism parameter changes as well, while the amount of astigmatism can be considered uniform at the same radius. At a given radius of the optical lens, the refractive index decreases as the wavelength of the incident light increases. When light of wavelength λ1 and light of wavelength λ2 enter the optical lens, then because λ1 and λ2 differ, the two are refracted to different degrees after passing through the lens, and astigmatism occurs. The astigmatism phenomenon is thus that light of different wavelengths, transmitted through the optical lens, is deflected by different amounts. In practice an image is formed by light of many wavelengths, which generates astigmatism when transmitted through the optical lens.
Based on this, once the physical dimensions of a specific lens are fixed, its astigmatism parameters are fixed as well; the astigmatism parameters of the lens can therefore be acquired directly.
Step 304: obtaining a gaze view according to the gaze parameter; obtaining an astigmatism correction map according to the astigmatism parameter of the lens; performing smooth interpolation processing on the gaze view and the astigmatism correction map to obtain a rendering map; and projecting the light rays corresponding to the rendering map to the eyeballs of the user.
In an embodiment of the present invention, the gaze parameter is combined with the astigmatism parameter of the lens to determine the rendering depth. Referring to fig. 5, A represents the astigmatism correction map of the lens, determined based on the astigmatism parameter of the lens; B represents the gaze view of the user, determined based on the gaze parameter; and C combines the astigmatism correction map of the lens with the gaze view. Here, the gaze view and the astigmatism correction map of the lens are processed by smooth interpolation to obtain a rendering map.
In the embodiment of the invention, the lighter a color in the rendering map, the higher the corresponding rendering depth; the darker the color, the lower the corresponding rendering depth.
According to the technical scheme of the embodiment of the invention, the rendering depth is determined by combining the gaze parameter with the astigmatism parameter of the lens, which reduces the overall rendering computation requirement while guaranteeing picture quality, and can improve the visual effect without increasing the rendering computation requirement.
Fig. 7 is a schematic structural composition diagram of an electronic device according to a fourth embodiment of the present invention, and as shown in fig. 7, the electronic device includes:
the display unit 71 is used for projecting light rays of the light source to eyeballs of a user through the lens so that the user can watch a display interface formed by the light rays;
an eye tracking unit 72 for obtaining gaze parameters characterizing position data of an eyeball;
a processing unit 73, configured to acquire an astigmatism parameter of the lens and determine a rendering depth according to the gaze parameter and the astigmatism parameter of the lens;
the display unit 71 is further configured to render the display interface to be displayed according to the rendering depth.
Those skilled in the art will understand that the implementation functions of each unit in the electronic device shown in fig. 7 can be understood by referring to the related description of the information processing method.
Fig. 8 is a schematic structural composition diagram of an electronic device according to a fifth embodiment of the present invention, and as shown in fig. 8, the electronic device includes:
the display unit 81 is used for projecting light rays of the light source to eyeballs of a user through the lens so that the user can watch a display interface formed by the light rays;
an eye tracking unit 82 for obtaining gaze parameters characterizing position data of an eyeball;
a processing unit 83, configured to acquire an astigmatism parameter of the lens and determine a rendering depth according to the gaze parameter and the astigmatism parameter of the lens;
the display unit 81 is further configured to render the display interface to be displayed according to the rendering depth.
The eye tracking unit 82 includes: an illumination device 821, a reflection device 822, and a camera 823, wherein:
the illumination device 821 illuminates the eyeball area with light of a preset waveband and reflects the light of the preset waveband to the camera 823 through the reflection device 822;
the processing unit 83 is further configured to determine the gaze parameter according to the image acquired by the image capturing device.
The processing unit 83 is further configured to obtain a gaze view according to the gaze parameter, obtain an astigmatism correction map according to the astigmatism parameter of the lens, and perform smooth interpolation processing on the gaze view and the astigmatism correction map to obtain a rendering map.
The display unit 81 is further configured to project light corresponding to the rendering map to an eyeball of a user.
In the embodiment of the invention, the lighter a color in the rendering map, the higher the rendering depth corresponding to that color; the darker the color, the lower the corresponding rendering depth.
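The lighter-color-to-higher-depth rule can be sketched as a simple quantization of a grayscale value into discrete depth levels (the function name and `max_depth` level count are illustrative assumptions, not values from the patent):

```python
def rendering_depth(color_value, max_depth=8):
    """Map a rendering-map color to a rendering depth level.

    color_value: grayscale intensity in [0, 1], where 1 is lightest.
    Lighter colors map to a higher rendering depth (more detail),
    darker colors to a lower one, per the embodiment's rule.
    """
    level = int(round(color_value * (max_depth - 1)))
    return max(0, min(max_depth - 1, level))   # clamp to valid range

# Lightest color gets the deepest rendering, darkest the shallowest.
print(rendering_depth(1.0), rendering_depth(0.0))  # → 7 0
```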
Those skilled in the art will understand that the functions implemented by each unit of the electronic device shown in fig. 8 can be understood with reference to the foregoing description of the information processing method.
The technical schemes described in the embodiments of the present invention can be combined arbitrarily without conflict.
In the embodiments provided by the present invention, it should be understood that the disclosed method and intelligent device may be implemented in other ways. The device embodiments described above are merely illustrative. For example, the division into units is only a logical functional division; in actual implementation there may be other divisions, for example: multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be through interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; that is, they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present invention may all be integrated into one processing unit, each unit may serve as a separate unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
The above description covers only specific embodiments of the present invention, but the scope of the present invention is not limited thereto. Any changes or substitutions that a person skilled in the art can readily conceive within the technical scope disclosed by the present invention shall fall within the scope of the present invention.

Claims (8)

1. An information processing method, the method comprising:
projecting light rays of a light source to eyeballs of a user through a lens so that the user can watch a display interface formed by the light rays;
acquiring a gaze parameter, the gaze parameter being used to characterize position data of an eyeball;
acquiring an astigmatism parameter of the lens;
determining a rendering depth according to the gaze parameter and the astigmatism parameter of the lens;
rendering the display interface to be displayed according to the rendering depth;
wherein the determining a rendering depth according to the gaze parameter and an astigmatism parameter of the lens comprises:
obtaining a gaze view according to the gaze parameter;
obtaining an astigmatism correction map according to the astigmatism parameter of the lens;
and performing smooth interpolation processing on the gaze view and the astigmatism correction map to obtain a rendering map.
2. The information processing method according to claim 1, the acquiring of the gaze parameter, comprising:
illuminating an eyeball area with light of a preset waveband, and reflecting the light of the preset waveband to a camera device through a reflection device;
and determining the gaze parameters according to the image acquired by the camera device.
3. The information processing method according to claim 1, wherein rendering the display interface to be displayed according to the rendering depth includes:
and projecting the light rays corresponding to the rendering map to the eyeballs of the user.
4. The information processing method according to claim 1,
the lighter a color, the higher the rendering depth corresponding to the color;
the darker the color, the lower the rendering depth corresponding to the color.
5. An electronic device, comprising:
the display unit is used for projecting light rays of the light source to eyeballs of a user through the lens so that the user can watch a display interface formed by the light rays;
the eye tracking unit is used for acquiring a gaze parameter, and the gaze parameter is used for representing position data of eyeballs;
the processing unit is used for acquiring an astigmatism parameter of the lens, and determining a rendering depth according to the gaze parameter and the astigmatism parameter of the lens;
the display unit is further used for rendering the display interface to be displayed according to the rendering depth;
the processing unit is further used for obtaining a gaze view according to the gaze parameter; obtaining an astigmatism correction map according to the astigmatism parameter of the lens; and performing smooth interpolation processing on the gaze view and the astigmatism correction map to obtain a rendering map.
6. The electronic device of claim 5, the eye tracking unit comprising: an illumination device, a reflection device, and a camera device; wherein:
the illumination device illuminates an eyeball area with light of a preset waveband, and the light of the preset waveband is reflected to the camera device through the reflection device;
the processing unit is further configured to determine the gaze parameter according to the image acquired by the camera.
7. The electronic device of claim 5, wherein the display unit is further configured to project light corresponding to the rendering map to an eyeball of a user.
8. The electronic device of claim 5,
the lighter a color, the higher the rendering depth corresponding to the color;
the darker the color, the lower the rendering depth corresponding to the color.
CN201610476251.6A 2016-06-24 2016-06-24 Information processing method and electronic equipment Active CN107544661B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610476251.6A CN107544661B (en) 2016-06-24 2016-06-24 Information processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610476251.6A CN107544661B (en) 2016-06-24 2016-06-24 Information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN107544661A CN107544661A (en) 2018-01-05
CN107544661B true CN107544661B (en) 2020-06-23

Family

ID=60960157

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610476251.6A Active CN107544661B (en) 2016-06-24 2016-06-24 Information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN107544661B (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120127790A (en) * 2011-05-16 2012-11-26 경북대학교 산학협력단 Eye tracking system and method the same
CN103946732B (en) * 2011-09-26 2019-06-14 微软技术许可有限责任公司 Video based on the sensor input to perspective, near-eye display shows modification
US9380287B2 (en) * 2012-09-03 2016-06-28 Sensomotoric Instruments Gesellschaft Fur Innovative Sensorik Mbh Head mounted system and method to compute and render a stream of digital images using a head mounted display

Also Published As

Publication number Publication date
CN107544661A (en) 2018-01-05

Similar Documents

Publication Publication Date Title
Itoh et al. Towards indistinguishable augmented reality: A survey on optical see-through head-mounted displays
CN113168007B (en) System and method for augmented reality
US9967555B2 (en) Simulation device
US8576276B2 (en) Head-mounted display device which provides surround video
US9076033B1 (en) Hand-triggered head-mounted photography
US9165381B2 (en) Augmented books in a mixed reality environment
RU2672502C1 (en) Device and method for forming cornea image
WO2013166362A2 (en) Collaboration environment using see through displays
US20170053446A1 (en) Communication System
CN107238929A (en) Wearable system
US9934583B2 (en) Expectation maximization to determine position of ambient glints
WO2016101861A1 (en) Head-worn display device
US11543655B1 (en) Rendering for multi-focus display systems
WO2023082980A1 (en) Display method and electronic device
CN107544661B (en) Information processing method and electronic equipment
WO2018149266A1 (en) Information processing method and device based on augmented reality
US20220095123A1 (en) Connection assessment system
CN103837989A (en) Head-mounted imaging device and head-mounted intelligent terminal
CN209859042U (en) Wearable control device and virtual/augmented reality system
CN203745730U (en) Head-mounted video device and head-mounted intelligent terminal
US20240012246A1 (en) Methods, Apparatuses And Computer Program Products For Providing An Eye Tracking System Based On Flexible Around The Lens Or Frame Illumination Sources
WO2021057420A1 (en) Method for displaying control interface and head-mounted display
WO2024021250A1 (en) Identity information acquisition method and apparatus, and electronic device and storage medium
WO2023035911A1 (en) Display method and electronic device
CN111025644A (en) Projection screen device of double-free-form-surface reflective AR glasses

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant