CN109725417B - Method for deeply processing digital optical signal based on optical waveguide and near-eye display device - Google Patents

Method for deeply processing digital optical signal based on optical waveguide and near-eye display device

Info

Publication number
CN109725417B
Authority
CN
China
Prior art keywords
image
light
difference
ideal
ambient light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711023535.0A
Other languages
Chinese (zh)
Other versions
CN109725417A (en)
Inventor
杜晶
陈清甫
范懿文
张弦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Hongxu Desheng Technology Co ltd
Original Assignee
Jiangsu Hongxu Desheng Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Hongxu Desheng Technology Co ltd filed Critical Jiangsu Hongxu Desheng Technology Co ltd
Priority to CN201711023535.0A
Publication of CN109725417A
Application granted
Publication of CN109725417B
Legal status: Active


Abstract

The technical scheme of the invention provides a method for deep processing of digital optical signals based on an optical waveguide, and a near-to-eye display device. An optical waveguide light receiving and transmitting device acquires the current ambient light, and a light analysis component analyzes the ambient light to obtain light data. An optical coupling assembly couples the image to be projected according to the light data to obtain a coupling image, which is sent to the optical waveguide light receiving and transmitting device. The optical waveguide light transceiver receives the coupling image and projects it onto a display medium for display. The advantage of the invention is that, by analyzing the light of the actual environment, the color image, light path, and other attributes of the virtual image to be projected are coupled against their deviation from the actual environment, so that the deviation between the virtual image and the actual environment is eliminated, the degree of integration between the virtual image displayed on the AR/MR head-mounted display device and the actual environment is improved, the display position of the virtual image is more accurate, and the fusion with the actual environment is good.

Description

Method for deeply processing digital optical signal based on optical waveguide and near-eye display device
Technical Field
The invention relates to the field of light processing, in particular to a method for deeply processing digital light signals based on an optical waveguide and a near-eye display device.
Background
AR/MR head-mounted displays are applied in fields such as military, scientific research, education, training, trade shows, medical treatment, and entertainment. With the rapid development of virtual reality and augmented reality technology in recent years, the demand for near-eye display devices keeps growing, and so do the requirements on the presentation quality of images.
The AR/MR head-mounted display mainly serves different purposes in fields such as military training, teaching, and medical treatment through the superposition of and interaction between the real and the virtual. For example, in military training, training effects are improved by immersing the trainee in the scene; in teaching, the virtual scene depicted by the knowledge is superimposed on the real scene so that students can experience it intuitively, deepening the learning impression; in medical treatment, the differences between a lesion and normal human anatomy are acquired during surgery to help the doctor operate more accurately; and so on.
However, light inevitably passes through different media during transmission, producing interference such as refraction and dispersion. When the current AR/MR head-mounted display device acquires the current ambient light, this interference makes it difficult or even impossible to locate the superposition position when virtual images are superimposed, so the deviation is large; how to resolve this deviation is a problem to be solved.
Disclosure of Invention
The invention provides a method and a device for deep processing of digital optical signals based on an optical waveguide, which are used to solve the display deviation produced when an AR/MR head-mounted display device superimposes and displays virtual images.
In order to achieve the above object, the present invention provides a method for deep processing of a digital optical signal based on an optical waveguide, the method comprising: an ambient light receiver acquires the current ambient light, and a light analysis component analyzes the ambient light to obtain light data; an optical coupling component couples the image to be projected according to the light data to obtain a coupling image, and sends the coupling image to the imaging device; after receiving the coupling image, the imaging device projects it onto a display medium for display, and if there is no ambient light, the imaging device directly projects the image to be projected according to a preset illuminance, wherein the image to be projected is the virtual image to be displayed.
As a preference of the above technical solution, the method further includes: the ambient light receiver receives the ambient light and sends it to the light analysis component; the light analysis component calculates the included angle between the received ambient light and the ideal light to obtain an included angle calculation result; and the light analysis component analyzes the ambient light to obtain at least the following light data: contrast and color image.
As a preferred aspect of the above technical solution, the method includes: the optical coupling assembly receives the light data and compares it with the ideal light data to obtain an adjustment scheme; the image to be projected is then adjusted according to the adjustment scheme and the included angle calculation result to obtain the coupling image, which is sent to the imaging device.
As a preference of the above technical solution, the method further includes: the optical coupling component computes the differences between the contrast, the color image, and the color temperature and the corresponding ideal contrast, color image, and color temperature, obtaining a contrast difference value, a chromatic aberration, and a color temperature difference value. These are compared with an ideal contrast difference threshold, an ideal chromatic aberration threshold, and an ideal color temperature difference threshold, respectively, to obtain an adjustment scheme. Specifically, if the contrast difference is lower than the ideal contrast difference threshold, the contrast needs to be adjusted up; otherwise it is adjusted down, until the contrast finally falls within the preset contrast threshold. The chromatic aberration and the color temperature difference are processed in the same way, and the adjustment scheme is finally obtained.
As a preference of the above technical solution, the method further includes: the contrast and the color image of the image to be projected are adjusted according to the adjustment scheme, the light path of the image to be projected is coupled according to the included angle calculation result to obtain the coupling image, and the coupling image is sent to the imaging device.
The technical scheme of the invention also provides a near-eye display device for deep processing of digital optical signals based on an optical waveguide, which comprises: an ambient light receiver, configured to acquire the current ambient light and then send it to the light analysis component; a light analysis component, configured to analyze the received ambient light to obtain light data; an optical coupling assembly, configured to receive the light data sent by the light analysis component, couple the image to be projected according to the light data to obtain a coupling image, and send the coupling image to the imaging device; and an imaging device, which, after receiving the coupling image, projects it onto a display medium in the display device for display, and which directly projects the image to be projected according to a preset illuminance if the ambient light receiver cannot receive ambient light. The image to be projected is the virtual image to be displayed.
As a preferable aspect of the foregoing technical solution, the light analysis component includes a receiving device, a data processing device, and a light analysis device. The receiving device is configured to receive the ambient light sent by the ambient light receiver. The data processing device is configured to calculate the included angle between the ambient light received by the receiving device and the ideal light to obtain an included angle calculation result. The light analysis device is configured to analyze the ambient light received by the receiving device to obtain at least the following light data: contrast and color image.
As a preferred aspect of the foregoing disclosure, the optical coupling assembly includes: an optical data comparison unit, configured to compare the light data obtained by the light analysis device with the ideal light data to obtain an adjustment scheme; and an angle unit, configured to adjust the image to be projected according to the adjustment scheme obtained by the optical data comparison unit and the included angle calculation result obtained by the data processing device.
As a preferable aspect of the foregoing disclosure, the optical data comparison unit is specifically configured to: compute the differences between the contrast and the color image and the ideal contrast and color image to obtain a contrast difference value and a color image difference value; then compare the contrast difference value and the chromatic aberration value with the ideal contrast difference threshold and the ideal chromatic aberration threshold, respectively, to obtain an adjustment scheme. Specifically, if the contrast difference value is lower than the ideal contrast difference threshold, the contrast needs to be adjusted up; otherwise it is adjusted down, until the contrast finally falls within the preset contrast threshold. The chromatic aberration value is processed in the same way to obtain the adjustment scheme.
As a preferred aspect of the foregoing disclosure, the optical coupling assembly includes: a light adjustment coupling unit, configured to adjust the contrast and the color image of the image to be projected according to the adjustment scheme obtained by the optical data comparison unit, and further configured to couple the light path of the image to be projected according to the included angle calculation result obtained by the angle unit to obtain a coupling image; and an output unit, configured to send the coupling image to the imaging device.
The technical scheme of the invention provides a method for deep processing of digital optical signals based on an optical waveguide: an ambient light receiver acquires the current ambient light, and a light analysis component analyzes the ambient light to obtain light data; an optical coupling assembly couples the image to be projected according to the light data to obtain a coupling image and sends it to the imaging device; the imaging device receives the coupling image and projects it onto a display medium for display. The technical scheme of the invention also provides a device for deep processing of digital optical signals based on an optical waveguide, which comprises: an ambient light receiver, configured to acquire the current ambient light and then send it to the light analysis component; a light analysis component, configured to analyze the received ambient light to obtain light data; an optical coupling assembly, configured to receive the light data, couple the image to be projected according to the light data to obtain a coupling image, and send the coupling image to the imaging device; and an imaging device, which receives the coupling image and projects it onto a display medium in the display device for display. The advantage of the invention is that, by analyzing the light of the actual environment, the color image, light path, and other attributes of the virtual image to be projected are coupled against their deviation from the actual environment, so that the deviation between the virtual image and the actual environment is eliminated, the degree of integration between the virtual image displayed on the AR/MR head-mounted display device and the actual environment is improved, the display position of the virtual image is more accurate, and the fusion with the actual environment is good.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, a brief description will be given below of the drawings required for the embodiments or the prior art descriptions, and it is obvious that the drawings in the following description are some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a first embodiment of a method for deep processing a digital optical signal based on an optical waveguide according to the present invention;
FIG. 2 is a flowchart of another embodiment of a method for optical waveguide-based deep processing of digital optical signals according to the present disclosure;
FIG. 3 is a first optical path diagram of another embodiment of the present invention;
FIG. 4 is a second optical path diagram of another embodiment of the present invention;
FIG. 5 is a third optical path diagram of another embodiment of the present invention;
FIG. 6 is a schematic diagram of a near-to-eye display device based on an optical waveguide for deep processing of digital optical signals according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of the light analysis component 62 of FIG. 6;
fig. 8 is a schematic diagram of the optical coupling assembly 63 of fig. 6.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Fig. 1 is a schematic flow chart of a first embodiment of a method for deep processing a digital optical signal based on an optical waveguide according to the present invention, as shown in fig. 1, including:
step 101, an ambient light receiver acquires current ambient light, and a light analysis component analyzes the ambient light to obtain light data.
Specifically, the ambient light receiver receives the ambient light and sends it to the light analysis component. The ambient light includes light information such as the ambient brightness around the operating AR/MR head-mounted display device (hereinafter referred to as the head display device), the color image of surrounding objects, and the light path.
After this information is received by the receiver, it is sent to the light analysis component, which calculates the included angle between the received ambient light and the ideal light to obtain an included angle calculation result.
During light transmission, since the receiver is installed in the head display device, incident light passes through various media, and because of the differences in refractive index, the incidence angle of the light actually received by the receiver deviates to some extent from the ideal incidence angle. The ideal light in the present invention refers to incident light that is not refracted by any of the media it passes through on the way to the receiver, i.e. light with no refraction angle. It follows that there is an angular deviation between the actual incident light and the ideal light, and the included angle calculation result refers to this angular deviation.
The light analysis component also analyzes the ambient light to obtain at least the following light data: contrast and color image, which are then sent to the optical coupling component of step 102.
It should be noted that the light analysis component may be installed in the head display device, with data transmitted over a wired connection, or the data may be processed and transmitted in the cloud over a network.
And 102, coupling the image to be projected by the optical coupling component according to the optical data to obtain a coupling image, and transmitting the coupling image to the imaging device.
The optical coupling assembly receives the light data and compares it with the ideal light data to obtain an adjustment scheme. The ideal light data refers to the ideal contrast and the ideal color image of the ideal light described in step 101.
Specifically, the optical coupling component computes the differences between the contrast and the color image of the ambient light and those of the ideal light, obtaining a contrast difference value and a color image difference value between the ambient light and the ideal light. These differences are compared with the ideal contrast difference threshold and the ideal color image threshold, respectively, to judge whether they fall within the thresholds. The ideal contrast difference threshold and the ideal color image threshold are preset adjustment ranges used to judge how the light data should be adjusted.
If the contrast difference is lower than the ideal contrast difference threshold, the contrast needs to be adjusted up; otherwise it is adjusted down, until the contrast finally falls within the preset contrast threshold. The chromatic aberration value is processed in the same way, and the adjustment scheme is thus obtained.
The optical coupling component adjusts the contrast and the color image of the image to be projected according to the adjustment scheme to obtain an adjustment result, then couples the image to be projected according to the included angle calculation result and the adjustment result to obtain the coupling image, and sends the coupling image to the imaging device.
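Purely for illustration (this is not part of the patent disclosure), the comparison-and-adjustment logic described above could be organized as in the following Python sketch; the attribute names, numeric values, and dictionary layout are assumptions introduced here, and the patent does not prescribe any concrete implementation.

```python
# Illustrative sketch only: derive an adjustment scheme from ambient-light data.
# Attribute names, numbers, and the dict layout are assumptions; the text above
# only specifies "difference versus threshold -> adjust up or down".

def build_adjustment_scheme(ambient, ideal, ideal_diff_thresholds):
    """Compare ambient-light data with ideal light data (step 102).

    ambient / ideal: dicts with numeric 'contrast' and 'color_image' entries.
    ideal_diff_thresholds: the ideal difference threshold for each attribute.
    Returns a dict mapping each attribute to 'up' or 'down'.
    """
    scheme = {}
    for key in ("contrast", "color_image"):
        difference = ambient[key] - ideal[key]        # e.g. the contrast difference value
        if difference < ideal_diff_thresholds[key]:   # below the ideal difference threshold
            scheme[key] = "up"                        # the attribute needs to be adjusted up
        else:
            scheme[key] = "down"                      # otherwise it is adjusted down
    return scheme


# Example with assumed numbers: ambient contrast is well below the ideal value,
# so the scheme asks for the contrast of the image to be projected to go up.
ambient = {"contrast": 0.35, "color_image": 0.80}
ideal = {"contrast": 0.70, "color_image": 0.55}
thresholds = {"contrast": 0.10, "color_image": 0.10}
print(build_adjustment_scheme(ambient, ideal, thresholds))
# -> {'contrast': 'up', 'color_image': 'down'}
```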
And step 103, after the imaging device receives the coupling image, projecting the coupling image to a display medium for display.
If the current ambient light is not obtained by the ambient light receiver in step 101, the imaging device directly outputs the image to be projected according to the preset illuminance, and does not perform any processing on the image.
The display medium may be the display screen of the head display device, or the coupling image may be projected directly onto the user's retina.
In the technical scheme of the invention, the light analysis component and the optical coupling component may both be arranged in the head display device, or they may process and transmit data in the cloud; arranging them in the head display device ensures the stability of data processing and transmission.
This embodiment provides a method for deep processing of digital optical signals based on an optical waveguide: the ambient light receiver acquires the current ambient light, and the light analysis component analyzes the ambient light to obtain light data; the optical coupling component couples the image to be projected according to the light data to obtain a coupling image and sends it to the imaging device; the imaging device receives the coupling image and projects it onto a display medium for display. The advantage of the invention is that, by analyzing the light of the actual environment, the color image, light path, and other attributes of the virtual image to be projected are coupled against their deviation from the actual environment, so that the deviation between the virtual image and the actual environment is eliminated, the degree of integration between the virtual image displayed on the AR/MR head-mounted display device and the actual environment is improved, the display position of the virtual image is more accurate, and the fusion with the actual environment is good.
In this embodiment, the light analysis component and the optical coupling component are described as being located in the head display device, without limiting their positions, and the image to be projected is illustrated as a virtual image to be superimposed on a real scene, as shown in fig. 2:
step 200, the environmental light receiver senses whether the environmental light is received, if yes, step 201a is executed, otherwise step 201b is executed.
In step 201a, the ambient light receiver receives the ambient light and sends the ambient light to the light analysis component.
The specific incident light path is shown in fig. 3, the first optical path diagram of another embodiment of the present invention.
Ambient light includes, but is not limited to: ambient brightness, the color image of surrounding objects, and the light path. The ideal light, by contrast, is a preset reference with ideal brightness, an ideal light path, an ideal color image, ideal contrast, and so on.
Step 201b, the imaging device projects the image to be projected according to the preset illuminance.
Specifically, since the ambient light receiver (which includes a light sensor) in step 200 does not receive the current ambient light, the imaging device projects the image to be projected at a preset low brightness so that the image is clearly displayed on the head display device; correspondingly, if the ambient light received by the ambient light receiver is very strong, the imaging device can also project the image to be projected at the preset low brightness, so as to keep the image clearly displayed.
The preset illuminance refers to the brightness of the projected image, determined with reference to a natural light source.
And 202, calculating the incident angles of the environment light path and the ideal light path by the light analysis component to obtain an included angle calculation result.
Specifically, the light analysis component obtains the incidence angle of the ambient light path and subtracts from it the incidence angle of the ideal light path; this difference is the included angle between the two.
The included angle between the ambient light path and the ideal light path is shown in fig. 4, the second optical path diagram of another embodiment of the present invention.
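As an illustrative sketch only (not taken from the patent), the deviation can be pictured as follows; Snell's law and the example refractive indices are standard optics added here to produce an example "actual" angle, while the patent itself only specifies subtracting the ideal incidence angle from the ambient one.

```python
import math

# Illustrative sketch of the included-angle calculation in step 202. Snell's law
# (n1*sin(theta1) = n2*sin(theta2)) and the refractive indices are standard
# optics used only to produce an example refracted angle; the calculation the
# text describes is simply the difference of the two incidence angles.

def refracted_angle(theta_deg, n1, n2):
    """Angle of a ray after entering a denser medium, by Snell's law."""
    return math.degrees(math.asin(n1 * math.sin(math.radians(theta_deg)) / n2))

def included_angle(ambient_deg, ideal_deg):
    """Included angle = ambient-path incidence angle minus ideal-path incidence angle."""
    return ambient_deg - ideal_deg

ideal = 30.0                               # ideal light: assumed to arrive unrefracted
actual = refracted_angle(ideal, 1.0, 1.5)  # light entering a cover glass (n ~ 1.5) from air
print(round(included_angle(actual, ideal), 2))  # about -10.53 degrees of angular deviation
```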
And 203, analyzing the ambient light to obtain ambient light data.
After the light analysis component analyzes the ambient light, the obtained ambient light data at least comprises the following contents: contrast, color image.
And 204, comparing the ambient light data with ideal light data by the optical coupling component to obtain a difference result.
Specifically, the optical coupling component computes numerical differences between the ambient light contrast, the ambient light color image, and the ambient light color temperature and the ideal contrast, the ideal color image, and the ideal color temperature, respectively, obtaining a contrast difference value, a color image difference value, and a color temperature difference value; these constitute the difference result.
Step 205, comparing the above differences with the ideal threshold, determining whether the differences are within the threshold, if yes, executing step 206, otherwise executing step 207.
The ideal threshold includes at least: an ideal contrast difference threshold, an ideal chromatic aberration threshold, and an ideal color temperature difference threshold.
Step 206, adjusting the difference value so that the difference value is within the threshold value range, and obtaining an adjustment result. Step 207 is then performed.
Specifically, the contrast difference value is further compared with the ideal contrast difference threshold, taking the natural number 0 as the boundary: if the contrast difference value is lower than the ideal contrast difference threshold, the comparison result is less than 0 and the contrast value is adjusted up; otherwise the comparison result is greater than 0 and the contrast value is adjusted down. The chromatic aberration and the color temperature are adjusted in the same way.
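A minimal sketch of this step, assuming a fixed adjustment step size that the patent does not specify: each difference value is nudged, with the sign of the comparison against 0 deciding the direction, until it falls within its ideal threshold.

```python
# Minimal sketch of step 206. The step size, key names, and dict layout are
# illustrative assumptions; only "compare against 0, adjust up or down until the
# value falls within the threshold" comes from the text above.

def adjust_into_range(differences, ideal_thresholds, step=0.05, max_iters=1000):
    """Return the adjustment applied to each difference value (contrast,
    chromatic aberration, color temperature) to bring it within its threshold."""
    adjustments = {}
    for key, diff in differences.items():
        delta = 0.0
        for _ in range(max_iters):
            current = diff + delta
            if abs(current) <= ideal_thresholds[key] + 1e-9:  # within the threshold range
                break
            if current - ideal_thresholds[key] < 0:  # comparison result < 0: adjust up
                delta += step
            else:                                    # comparison result > 0: adjust down
                delta -= step
        adjustments[key] = round(delta, 4)
    return adjustments

differences = {"contrast": -0.30, "chromatic_aberration": 0.20, "color_temperature": 0.40}
thresholds = {"contrast": 0.10, "chromatic_aberration": 0.10, "color_temperature": 0.15}
print(adjust_into_range(differences, thresholds))  # per-channel adjustment amounts
```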
Step 207, the output virtual image is coupled according to the included angle calculation result and the adjustment result to obtain a coupled image.
Step 208, the imaging device projects the coupled image.
The imaging device sends the coupling image, which travels along a reflected optical path to a reflective medium (including but not limited to a lens laminated with an optical waveguide grating, a prism-group semi-transparent/fully transparent display optical component, a free-form-surface optical prism semi-transparent/fully transparent display component, a waveguide-optics semi-transparent/fully transparent display lens component, and the like) and, after reflection, is projected directly into the human eye. This optical path is shown in fig. 5, the third optical path diagram of another embodiment of the present invention.
The head display device is further provided with an environment sensor, which senses the current ambient light intensity during projection and adaptively adjusts the brightness of the coupled image according to the sensing result. Specifically, if the environment is bright (brightness being judged against a preset threshold), the brightness of the coupled image is increased; otherwise the coupled image is dimmed.
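A minimal sketch of this adaptive behaviour, assuming an illuminance threshold and a fixed brightness step that the patent does not specify:

```python
# Minimal sketch of the adaptive brightness adjustment during projection.
# The lux threshold and the brightness step are illustrative assumptions.

BRIGHT_ENVIRONMENT_LUX = 500.0   # assumed preset threshold for judging "bright"

def adapt_brightness(current, ambient_lux, step=0.1, lo=0.05, hi=1.0):
    """Brighten the coupled image in a bright environment, dim it otherwise."""
    if ambient_lux > BRIGHT_ENVIRONMENT_LUX:
        return min(hi, current + step)   # bright surroundings: raise image brightness
    return max(lo, current - step)       # dim surroundings: lower image brightness

print(adapt_brightness(0.5, 800.0))  # 0.6 in a bright environment
print(adapt_brightness(0.5, 120.0))  # 0.4 in a dim environment
```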
This embodiment provides a method for deep processing of digital optical signals based on an optical waveguide: the ambient light receiver acquires the current ambient light, and the light analysis component analyzes the ambient light to obtain light data; the optical coupling component couples the image to be projected according to the light data to obtain a coupling image and sends it to the imaging device; the imaging device receives the coupling image and projects it onto a display medium for display. The advantage of the invention is that, by analyzing the light of the actual environment, the color image, light path, and other attributes of the virtual image to be projected are coupled against their deviation from the actual environment, so that the deviation between the virtual image and the actual environment is eliminated, the degree of integration between the virtual image displayed on the AR/MR head-mounted display device and the actual environment is improved, the display position of the virtual image is more accurate, and the fusion with the actual environment is good.
The embodiment of the invention also provides a near-eye display device for deeply processing digital optical signals based on the optical waveguide, the structure schematic diagram of which is shown in fig. 6, comprising:
the ambient light receiver 61 is configured to obtain current ambient light, and then send the ambient light to the light analysis component.
The light analysis component 62 analyzes the ambient light sent by the ambient light receiver 61 to obtain light data.
The optical coupling assembly 63 is configured to receive the light data sent by the light analysis component 62, couple the image to be projected according to the light data to obtain a coupling image, and send the coupling image to the imaging device 64.
After receiving the coupling image, the imaging device 64 projects the coupling image to a display medium in a display device for display. The image to be projected is a virtual image to be displayed.
Specifically, if the ambient light receiver 61 does not receive the current ambient light, the imaging device 64 directly projects the image to be projected according to the preset illuminance.
As shown in fig. 7, the light analysis component 62 includes:
a receiving device 71, a data processing device 72, and a light analysis device 73:
and a receiving device 71 for receiving the ambient light transmitted from the ambient light receiver 61.
The data processing device 72 is configured to calculate an included angle between the ambient light received by the receiving device 71 and the ideal light, so as to obtain an included angle calculation result.
The light analysis device 73 is configured to analyze the ambient light received by the receiving device 71, and obtain at least the following light data: contrast, color image, color temperature.
As shown in fig. 8, the optical coupling assembly 63 includes:
the optical data comparing unit 81 is configured to compare the optical data analyzed by the optical analysis device 73 with ideal optical data to obtain an adjustment scheme.
And an angle unit 82 for adjusting the image to be projected according to the adjustment scheme obtained by the optical data comparing unit 81 and the calculation result of the included angle obtained by the data processing device 72.
The optical data comparing unit 81 is specifically configured to:
and respectively carrying out difference between the contrast, the color image and the ideal contrast, and the color image to obtain a contrast difference value and a color image difference value.
And comparing the contrast difference value of the actual ambient light with an ideal contrast difference threshold value and an ideal chromatic aberration threshold value respectively to obtain an adjustment scheme.
Specifically, if the contrast difference is lower than the ideal contrast difference threshold, the contrast is required to be adjusted up, otherwise, the contrast is adjusted down, and finally the contrast is enabled to fall in the preset contrast threshold, and the chromatic aberration value is processed in the same way, so that the adjustment scheme is obtained.
As shown in fig. 8, the optical coupling assembly 63 further includes:
the light adjusting coupling unit 83 is configured to adjust the contrast, color image, and color temperature of the image to be projected according to the adjustment scheme obtained by the light data comparing unit 81, and is further configured to couple the image to be projected according to the calculation result of the included angle obtained by the angle unit 82, so as to obtain a coupled image.
An output unit 84 for transmitting the coupled image to the imaging device 64.
In the above technical scheme of the invention, when a large amount of light data is generated and needs to be processed, in order to project the image as quickly as possible, not only a CPU+GPU processing mode but also an FPGA (Field-Programmable Gate Array) can be used to accelerate the computation, further improving the accuracy and timeliness of correcting the light path deviation.
The technical scheme of the invention provides a device for deep processing of digital optical signals based on an optical waveguide, which comprises: an ambient light receiver, configured to acquire the current ambient light and then send it to the light analysis component; a light analysis component, configured to analyze the received ambient light to obtain light data; an optical coupling assembly, configured to receive the light data, couple the image to be projected according to the light data to obtain a coupling image, and send the coupling image to the imaging device; and an imaging device, which, after receiving the coupling image, projects it onto a display medium in the display device for display. The advantage of the invention is that, by analyzing the light of the actual environment, the color image, light path, and other attributes of the virtual image to be projected are coupled against their deviation from the actual environment, so that the deviation between the virtual image and the actual environment is eliminated, the degree of integration between the virtual image displayed on the AR/MR head-mounted display device and the actual environment is improved, the display position of the virtual image is more accurate, and the fusion with the actual environment is good.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (1)

1. A method of optical waveguide based deep processing of a digital optical signal, the method comprising:
step 200, the environmental light receiver senses whether the environmental light is received, if yes, step 201a is executed, and if not, step 201b is executed;
step 201a, an ambient light receiver receives ambient light and sends the ambient light to a light analysis component;
step 201b, the imaging device projects the image to be projected according to a preset illuminance;
when the ambient light receiver in step 200 does not receive the current ambient light, in order to display the image to be projected on the head display device, the imaging device may enable the image to be clearly displayed on the head display device according to the preset low brightness;
if the ambient light received by the ambient light receiver is very strong, the imaging device projects the image to be projected according to the preset low brightness; the preset illuminance refers to the brightness of the projected image, and the brightness is determined according to a natural light source;
step 202, a light analysis component calculates the incidence angles of an environment light path and an ideal light path to obtain an included angle calculation result;
the light analysis component obtains the incidence angle of the ambient light path and subtracts from it the incidence angle of the ideal light path, the difference being the included angle between the two;
step 203, analyzing the ambient light to obtain ambient light data;
after the light analysis component analyzes the ambient light, the obtained ambient light data at least comprises the following contents: contrast, color image;
step 204, the optical coupling component compares the ambient light data with ideal light data to obtain a difference result;
the optical coupling component computes numerical differences between the ambient light contrast, the ambient light color image, and the ambient light color temperature and the ideal contrast, the ideal color image, and the ideal color temperature, respectively, obtaining a contrast difference value, a color image difference value, and a color temperature difference value, which constitute the difference result;
step 205, comparing the difference values with ideal threshold values, judging whether the difference values are in the threshold values, if yes, executing step 206, otherwise executing step 207;
the ideal threshold includes at least: an ideal contrast difference threshold, an ideal chromatic aberration threshold, and an ideal color temperature difference threshold;
step 206, adjusting the difference value so that the difference value is within a threshold value range, and executing step 207 after obtaining an adjustment result;
the contrast difference value is further compared with the ideal contrast difference threshold, taking the natural number 0 as the boundary: if the contrast difference value is lower than the ideal contrast difference threshold, the comparison result is less than 0 and the contrast value is adjusted up; otherwise the comparison result is greater than 0 and the contrast value is adjusted down;
the chromatic aberration is further compared with the ideal chromatic aberration threshold, taking the natural number 0 as the boundary: if the chromatic aberration is lower than the ideal chromatic aberration threshold, the comparison result is less than 0 and the chromatic aberration value is adjusted up; otherwise the comparison result is greater than 0 and the chromatic aberration value is adjusted down;
the color temperature difference value is further compared with the ideal color temperature difference threshold, taking the natural number 0 as the boundary: if the color temperature difference value is lower than the ideal color temperature difference threshold, the comparison result is less than 0 and the color temperature value is adjusted up; otherwise the comparison result is greater than 0 and the color temperature value is adjusted down;
step 207, coupling the output virtual image according to the included angle calculation result and the adjustment result to obtain a coupled image;
step 208, the imaging device projects the coupling image;
the imaging device sends the coupling image, which is reflected along the light path in the device to the reflecting medium and, after reflection, is projected directly into the human eye.
CN201711023535.0A 2017-10-27 2017-10-27 Method for deeply processing digital optical signal based on optical waveguide and near-eye display device Active CN109725417B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711023535.0A CN109725417B (en) 2017-10-27 2017-10-27 Method for deeply processing digital optical signal based on optical waveguide and near-eye display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711023535.0A CN109725417B (en) 2017-10-27 2017-10-27 Method for deeply processing digital optical signal based on optical waveguide and near-eye display device

Publications (2)

Publication Number Publication Date
CN109725417A CN109725417A (en) 2019-05-07
CN109725417B (en) 2024-04-09

Family

ID=66291654

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711023535.0A Active CN109725417B (en) 2017-10-27 2017-10-27 Method for deeply processing digital optical signal based on optical waveguide and near-eye display device

Country Status (1)

Country Link
CN (1) CN109725417B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101455077A (en) * 2006-05-26 2009-06-10 伊斯曼柯达公司 Digital cinema projection system with increased etendue
CN106373197A (en) * 2016-09-06 2017-02-01 广州视源电子科技股份有限公司 Augmented reality method and augmented reality device
CN106501938A (en) * 2016-11-21 2017-03-15 苏州苏大维格光电科技股份有限公司 A kind of wear-type augmented reality three-dimensional display apparatus
CN106662678A (en) * 2014-08-07 2017-05-10 微软技术许可有限责任公司 Spherical lens having decoupled aspheric surface
CN106940897A (en) * 2017-03-02 2017-07-11 苏州蜗牛数字科技股份有限公司 A kind of method that real shadow is intervened in AR scenes
CN107250882A (en) * 2014-11-07 2017-10-13 奥斯特豪特集团有限公司 The power management calculated for wear-type

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102008049407A1 (en) * 2008-09-29 2010-04-01 Carl Zeiss Ag Display device and display method
WO2010062481A1 (en) * 2008-11-02 2010-06-03 David Chaum Near to eye display system and appliance
JP2012252091A (en) * 2011-06-01 2012-12-20 Sony Corp Display apparatus
IL213727A (en) * 2011-06-22 2015-01-29 Elbit Systems Ltd Helmet mounted display system adjustable for bright ambient light conditions
US9448407B2 (en) * 2012-12-13 2016-09-20 Seiko Epson Corporation Head-mounted display device, control method for head-mounted display device, and work supporting system
US9639985B2 (en) * 2013-06-24 2017-05-02 Microsoft Technology Licensing, Llc Active binocular alignment for near eye displays
DE102014115341B4 (en) * 2014-10-21 2016-11-03 Carl Zeiss Smart Optics Gmbh Imaging optics and data glasses
US9347828B1 (en) * 2014-11-27 2016-05-24 Hui Zhao Method for detecting ambient light brightness and apparatus for achieving the method


Also Published As

Publication number Publication date
CN109725417A (en) 2019-05-07

Similar Documents

Publication Publication Date Title
US10929997B1 (en) Selective propagation of depth measurements using stereoimaging
US9898075B2 (en) Visual stabilization system for head-mounted displays
US10852817B1 (en) Eye tracking combiner having multiple perspectives
US9922464B2 (en) Occluded virtual image display
CN108225734B (en) Error calibration system based on HUD system and error calibration method thereof
US10529113B1 (en) Generating graphical representation of facial expressions of a user wearing a head mounted display accounting for previously captured images of the user's facial expressions
WO2016077508A1 (en) System for automatic eye tracking calibration of head mounted display device
US10237544B2 (en) Open head mount display device and display method thereof
US20120154277A1 (en) Optimized focal area for augmented reality displays
US20130007668A1 (en) Multi-visor: managing applications in head mounted displays
US10725302B1 (en) Stereo imaging with Fresnel facets and Fresnel reflections
KR20160123346A (en) Stereoscopic display responsive to focal-point shift
US20150293586A1 (en) Eye gaze direction indicator
US11604315B1 (en) Multiplexing optical assembly with a high resolution inset
CN111710050A (en) Image processing method and device for virtual reality equipment
US11579683B2 (en) Wearable device and control method therefor
CN104865701A (en) Head-mounted display device
US11307654B1 (en) Ambient light eye illumination for eye-tracking in near-eye display
US20160110883A1 (en) Expectation Maximization to Determine Position of Ambient Glints
US20180246332A1 (en) Optical characterization system for lenses
CN109725417B (en) Method for deeply processing digital optical signal based on optical waveguide and near-eye display device
US20130321608A1 (en) Eye direction detecting apparatus and eye direction detecting method
WO2022240707A1 (en) Adaptive backlight activation for low-persistence liquid crystal displays
US10495882B1 (en) Positioning cameras in a head mounted display to capture images of portions of a face of a user
CN108760246B (en) Method for detecting eye movement range in head-up display system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20240313

Address after: 212310 Xinghu Road, Danyang Development Zone, Zhenjiang City, Jiangsu Province

Applicant after: JIANGSU HONGXU DESHENG TECHNOLOGY Co.,Ltd.

Country or region after: China

Address before: No. A-79, 2nd Floor, No. 48 Haidian West Street, Haidian District, Beijing, 100085

Applicant before: MAGICAST TECHNOLOGY CO.,LTD.

Country or region before: China

GR01 Patent grant