CN110095870B - Optical display system, display control device and augmented reality equipment

Optical display system, display control device and augmented reality equipment

Info

Publication number
CN110095870B
CN110095870B
Authority
CN
China
Prior art keywords
imaging
mirror
half mirror
image
display system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910449962.8A
Other languages
Chinese (zh)
Other versions
CN110095870A
Inventor
张洪术
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd, Beijing BOE Display Technology Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN201910449962.8A
Publication of CN110095870A
Application granted
Publication of CN110095870B

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type

Abstract

The invention provides an optical display system, a display control device and an augmented reality device. The system comprises at least two groups of imaging assemblies, each group comprising: a display source for emitting imaging light according to the image of the viewing angle corresponding to a single eye; a first half mirror for reflecting the imaging light emitted by the display source and transmitting ambient light into the human eye; and a second half mirror for reflecting the imaging light reflected by the first half mirror into the human eye so as to present an image at the corresponding depth of field, and for transmitting ambient light to the first half mirror. By arranging at least two groups of imaging assemblies and adjusting the brightness of the imaging light emitted by each display source, the imaging plane obtained by brightness fitting corresponds to the image depth of the binocular stereoscopic image, so that the vergence-accommodation conflict is avoided, the user does not suffer fatigue and discomfort, and the image clarity is improved.

Description

Optical display system, display control device and augmented reality equipment
Technical Field
The invention relates to the technical field of display control, in particular to an optical display system, a display control device and augmented reality equipment.
Background
When viewing objects at different distances, the human visual system performs two kinds of adjustment: vergence accommodation (the visual axes of both eyes rotate inward when looking at near objects and diverge when looking at distant objects) and focus accommodation (the crystalline lens adjusts so that light is focused on the retina). In real life these two adjustments occur together, and the human visual system is accustomed to this coupling.
In an augmented reality system, the virtual scene is presented on a display screen. The light from the screen carries no depth information and the eyes focus at a fixed screen distance, so the focus accommodation of the eyes does not match the perceived depth of the scene, causing a vergence-accommodation conflict.
Specifically, as shown in fig. 1, when a person views a real object in the real world, the distance 1 corresponding to vergence adjustment and the distance 2 corresponding to focus adjustment are equal, and the visual perception of scenery at different depths differs accordingly; as shown in the left diagram of fig. 1, the region indicated by the dashed line is what is viewed, with the left and right edges blurred and the middle clear. In a virtual display scene, as shown in the right diagram of fig. 1, when a person views the scenery through a head-mounted device, the distance 3 corresponding to vergence adjustment and the distance 4 corresponding to focus adjustment are not equal. This vergence-accommodation conflict, shown in the right diagram of fig. 1, is contrary to everyday physiological habit and causes fatigue and dizziness of the human visual system.
In existing augmented reality systems, the imaging distance of the optical system is fixed, i.e., the focus position of the human eye is fixed, while the displayed images make the eyes converge at different distances to produce a 3D sense of depth. The focus-adjustment distance and the vergence-adjustment distance are then unequal, i.e., focus accommodation and vergence accommodation are inconsistent, which causes a vergence-accommodation conflict, blurs the image, and leads to visual fatigue and dizziness after wearing the augmented reality device.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, a first objective of the present invention is to provide an optical display system in which at least two groups of imaging assemblies are arranged and the brightness of the imaging light emitted by each display source is adjusted, so that the imaging plane obtained by brightness fitting of the virtual images corresponds to the image depth of the binocular stereoscopic image, thereby avoiding the vergence-accommodation conflict and preventing user fatigue and discomfort.
A second object of the present invention is to provide a display control apparatus.
A third object of the present invention is to provide an imaging method.
A fourth object of the present invention is to provide an augmented reality device.
To achieve the above object, an embodiment of a first aspect of the present invention provides an optical display system, including:
at least two groups of imaging assemblies, each group of imaging assemblies comprising:
a display source for emitting imaging light according to an image of a set viewing angle;
a first half mirror for reflecting the imaging light emitted by the display source and for transmitting ambient light into the human eye;
a second half mirror for reflecting the imaging light reflected by the first half mirror into the human eye and for transmitting ambient light to the first half mirror.
In order to achieve the above object, an embodiment of a second aspect of the present invention provides a display control device electrically connected to the display sources in the optical display system according to the first aspect, for controlling the brightness of the imaging light emitted by the display source of each imaging assembly.
To achieve the above object, an embodiment of a third aspect of the present invention provides an imaging method applied to the optical display system according to the first aspect, the method including:
controlling each imaging assembly in the optical display system to image at a corresponding depth of field;
and adjusting the imaging brightness of each imaging component according to the image depth of the three-dimensional image to be presented.
To achieve the above object, a fourth aspect of the present invention provides an augmented reality device, including an optical display system as described in the first aspect of the present invention.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
the optical display system comprises at least two groups of imaging assemblies, each group comprising a display source, a first half mirror and a second half mirror. The display source emits imaging light according to an image of a set viewing angle; the first half mirror reflects the imaging light emitted by the display source and transmits ambient light into the human eye; the second half mirror reflects the imaging light reflected by the first half mirror into the human eye so as to present an image at the corresponding depth of field and transmits ambient light to the first half mirror; and the brightness of the imaging light emitted by the display source corresponds to the image depth of the stereoscopic image presented to the two eyes. By arranging at least two groups of imaging assemblies and adjusting the brightness of the imaging light emitted by each display source, the imaging plane obtained by brightness fitting corresponds to the image depth of the binocular stereoscopic image, so that the vergence-accommodation conflict is avoided and the user does not suffer fatigue and discomfort.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic view of convergence adjustment and focus adjustment;
fig. 2 is a schematic structural diagram of an optical display system according to an embodiment of the present invention;
FIG. 3 is a schematic view of depth fusion;
FIG. 4 is a schematic structural diagram of another optical display system according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of another optical display system according to an embodiment of the present invention; and
fig. 6 is a schematic flowchart of an imaging method according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
An optical display system, a display control apparatus, and an augmented reality device of an embodiment of the present invention are described below with reference to the drawings.
The optical display system provided by the embodiment of the invention is used for monocular imaging in binocular stereo imaging.
An optical display system comprises at least two groups of imaging assemblies, each group comprising a display source, a first half mirror and a second half mirror. The optical display system of the embodiment of the present invention may be applied to an augmented reality device, such as augmented reality glasses or a helmet, which is not limited in this embodiment.
The display source emits imaging light according to an image of a set viewing angle, and the brightness of the imaging light it emits is adjustable. The image of the set viewing angle is determined according to the viewing angle of the single eye to which the optical display system corresponds.
The first half mirror reflects the imaging light emitted by the display source and transmits ambient light into the human eye.
The second half mirror reflects the imaging light reflected by the first half mirror into the human eye, so that an image is presented at the corresponding depth of field, and transmits ambient light to the first half mirror.
The brightness of the imaging light emitted by each display source is adjustable, so that the depth of the imaging plane obtained by fusing the images formed by the imaging assemblies corresponds to the image depth of the binocular stereoscopic image. In other words, by adjusting the brightness of the imaging light, the focus distance and the vergence distance of the human eye can be made equal, so the vergence-accommodation conflict is avoided, the user does not suffer fatigue and discomfort, and the image clarity is improved.
As a possible implementation manner, the imaging assemblies can share the same second half-mirror, so that the cost and the volume of the optical display system are reduced.
Based on the above embodiment, the present embodiment specifically describes an optical display system including two sets of imaging components. Fig. 2 is a schematic structural diagram of an optical display system according to an embodiment of the present invention.
As shown in fig. 2, the display system 100 includes two imaging assemblies that share the same second half mirror 13, which reduces the cost and volume of the optical display system. For ease of distinction they are referred to as the first imaging assembly and the second imaging assembly. Besides the shared second half mirror 13, the first imaging assembly comprises a display source 11 and a first half mirror 12, and the second imaging assembly comprises a display source 21 and a first half mirror 22.
The first half mirror 12 of the first imaging assembly is a half-mirror plane mirror and is located between the second half mirror 13 and the first half mirror 22 of the second imaging assembly. The second half mirror 13 is a half-mirror curved mirror, and the first half mirror 22 of the second imaging assembly is a half-mirror plane mirror.
The first half mirror 12 of the first imaging assembly and the first half mirror 22 of the second imaging assembly are connected to the lower end of the second half mirror 13. A first included angle 31 is formed between the optical axes of the first half mirror 12 of the first imaging assembly and the second half mirror 13, and a second included angle 32 is formed between the display plane of the display source 11 of the first imaging assembly and the optical axis of the second half mirror 13. A third included angle 33 is formed between the optical axes of the first half mirror 22 of the second imaging assembly and the second half mirror 13, and a fourth included angle 34 is formed between the display plane of the display source 21 of the second imaging assembly and the optical axis of the second half mirror 13. As a possible implementation, the fourth included angle 34 is determined by the difference between twice the third included angle 33 and 90 degrees: if the third included angle 33 is θ, the fourth included angle 34 is 2 × θ - 90°. The first included angle 31 ranges from 50 to 60 degrees, and the third included angle 33 ranges from 40 to 50 degrees. This ensures that the light projected by the display source 21 and the light projected by the display source 11 are staggered with each other: the angle between the light from the display source 21 and the first half mirror 22 equals the angle between the light from the display source 11 and the first half mirror 12, but because the two beams strike the mirrors at different positions (the black solid rays and the gray rays in fig. 2), after reflection they enter the human eye in parallel. In other words, the imaging planes generated by the two imaging assemblies are parallel to each other and form images at different depths of field, as illustrated by the sketch below.
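As an illustration only, the following minimal sketch computes the fourth included angle from the third included angle according to the relation stated above; the function name and the sample angles are assumptions for demonstration, not values taken from the patent beyond the stated ranges.

```python
# Minimal sketch of the angle relation described above (illustrative only).
# theta3 is the third included angle 33 between the first half mirror 22 of the
# second imaging assembly and the optical axis of the second half mirror 13.

def fourth_angle_deg(theta3_deg: float) -> float:
    """Fourth included angle 34 = 2 * theta3 - 90 degrees, per the stated relation."""
    if not (40.0 <= theta3_deg <= 50.0):
        raise ValueError("the third included angle is stated to lie between 40 and 50 degrees")
    return 2.0 * theta3_deg - 90.0

if __name__ == "__main__":
    for theta3 in (45.0, 48.0, 50.0):
        print(f"theta3 = {theta3:.0f} deg -> fourth included angle = {fourth_angle_deg(theta3):.0f} deg")
```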
As a possible implementation, as shown in fig. 2, the upper end of the first half mirror 22 of the second imaging assembly is higher than the upper end of the first half mirror 12 of the first imaging assembly, so that the light emitted by the display source 21 reaches the first half mirror 22 through the gap above the first half mirror 12. This prevents the light emitted by the display source 21 from striking the first half mirror 12 first and then partially striking the first half mirror 22 again, which would generate more stray light, form higher-order virtual-image reflections at the human eye and degrade the user's visual experience.
In the optical display system provided by this embodiment, the brightness of the imaging light emitted by the display sources is made to correspond to the image depth of the binocular stereoscopic image, so the vergence-accommodation conflict is avoided, the user does not suffer fatigue and discomfort, and the image clarity is improved, as described in detail below.
Specifically, the display source 11 of the first imaging assembly is placed horizontally above the first half mirror 12 and emits imaging light according to the image of the viewing angle corresponding to the single eye; for ease of distinction this is called the first imaging light. The first imaging light is reflected toward the second half mirror 13 by the first half mirror 12, which forms the first included angle 31 with the optical axis of the second half mirror 13. The second half mirror 13 is a half-mirror concave mirror and therefore magnifies the light; after being magnified and reflected by the second half mirror 13, part of the light passes through the first half mirror 12 and the first half mirror 22 and enters the human eye. From the distance h1 between the display source 11 and the first half mirror 12 and the distance b1 between the first half mirror 12 and the second half mirror 13, the object distance from the display source 11 to the second half mirror 13 is calculated as U1 = h1 + b1. With the focal length of the second half mirror 13 denoted f, the distance V1 from the magnified virtual image generated by the first imaging assembly, i.e., the first imaging plane 41, to the second half mirror 13 can be calculated according to the lens imaging formula.
Meanwhile, the display source 21 of the second imaging assembly emits imaging light according to the image of the viewing angle corresponding to the single eye; for ease of distinction this is called the second imaging light. The third included angle 33 between the optical axes of the first half mirror 22 and the second half mirror 13 is denoted θ. The display source 21 is placed obliquely above the display source 11, and its display plane forms the fourth included angle 34 with the optical axis of the second half mirror 13, where the fourth included angle 34 equals 2 × θ - 90°. The second imaging light is reflected by the first half mirror 22 toward the second half mirror 13; after being magnified and reflected by the second half mirror 13, part of the light passes through the first half mirror 12 and the first half mirror 22 and enters the human eye. From the distance h2 between the display source 21 and the first half mirror 22 and the distance b2 between the first half mirror 22 and the second half mirror 13, the object distance from the display source 21 to the second half mirror 13 is calculated as U2 = h2 + b2. With the focal length of the second half mirror 13 denoted f, the distance V2 from the magnified virtual image generated by the second imaging assembly, i.e., the second imaging plane 42, to the second half mirror 13 can be calculated according to the lens imaging formula, as sketched below.
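For illustration only, the following sketch computes the positions of the two virtual imaging planes from the stated object distances U1 = h1 + b1 and U2 = h2 + b2, assuming the standard Gaussian mirror formula 1/f = 1/U + 1/V with a real-is-positive sign convention (a negative V denotes a virtual image behind the mirror). All numeric values and the helper name are hypothetical, not taken from the patent.

```python
# Illustrative sketch (not from the patent): compute the virtual-image distance
# produced by the half-mirror concave mirror 13 for each imaging assembly,
# using the Gaussian mirror formula 1/f = 1/U + 1/V.
# Assumed convention: U > 0 for a real object; V < 0 means a virtual image,
# and |V| is the distance of the virtual imaging plane from the mirror.

def virtual_image_distance(h: float, b: float, f: float) -> float:
    """Object distance U = h + b (display source -> first half mirror -> mirror 13).
    Returns |V|, the distance of the magnified virtual image from mirror 13."""
    U = h + b
    if U >= f:
        raise ValueError("a magnified virtual image requires the object inside the focal length")
    V = 1.0 / (1.0 / f - 1.0 / U)   # from 1/f = 1/U + 1/V; V is negative when U < f
    return abs(V)

if __name__ == "__main__":
    f = 60.0                                            # focal length of mirror 13 in mm (hypothetical)
    V1 = virtual_image_distance(h=20.0, b=25.0, f=f)    # first assembly:  U1 = 45 mm
    V2 = virtual_image_distance(h=25.0, b=30.0, f=f)    # second assembly: U2 = 55 mm
    print(f"first imaging plane 41 at about {V1:.0f} mm from mirror 13")
    print(f"second imaging plane 42 at about {V2:.0f} mm from mirror 13")
```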
The human eye still sees the outside world: ambient light passes through the second half mirror 13 and then through the first half mirrors 12 and 22 into the eye. At the same time the eye sees the images formed by the first imaging light and the second imaging light, i.e., the two virtual images 41 and 42 formed at distances V1 and V2 in fig. 2. A continuous range of depth-of-field images between V1 and V2 can then be fitted with a depth-of-field fusion algorithm, so the position on which the human eye focuses is no longer fixed and can match the depth positions of the stereoscopic images of different viewing angles received by the two eyes from the real scene.
As one possible implementation, with reference to fig. 3, the principle of the depth-of-field fusion algorithm according to the embodiment of the present invention is described as follows:
when the brightness of the imaging light emitted by the display source 11 differs from that emitted by the display source 21, the brightness of the first imaging plane 41 and of the second imaging plane 42 differ, and the fused image obtained by applying the fusion algorithm to the two planes is a continuous-depth image between them. Specifically, in the embodiment of the present invention the brightness of the first imaging plane 41 generated by the first imaging assembly is denoted I_n, with
I_n = I_s × (D_f - D_s) / (D_f - D_n),
and the brightness of the second imaging plane 42 generated by the second imaging assembly is denoted I_f, with
I_f = I_s × (D_s - D_n) / (D_f - D_n),
where D_n represents the distance from the human eye to the first imaging plane 41, D_f represents the distance from the human eye to the second imaging plane 42, D_s represents the position of the fitted imaging plane obtained by fitting through the depth-of-field fusion algorithm, and I_s represents the brightness of the fitted imaging plane 43 obtained by the depth-of-field fusion algorithm. According to the above formulas, when the brightness I_n of the first imaging plane 41 is increased, the fused fitted imaging plane 43 moves toward the first imaging plane 41, i.e., toward the human eye; when the brightness I_f of the second imaging plane 42 is increased, the fused fitted imaging plane 43 moves toward the second imaging plane 42, i.e., away from the human eye. Thus, by adjusting the brightness of the first imaging plane 41 or of the second imaging plane 42, the fused fitted imaging plane 43 can be moved anywhere between the first imaging plane 41 and the second imaging plane 42. The position on which the human eye focuses is therefore no longer fixed but moves between the two imaging planes, so that it corresponds to the image depth of the stereoscopic image obtained by the two eyes converging on the viewed scene. The vergence-accommodation conflict is thereby avoided, the user does not suffer fatigue and discomfort, and the image clarity is improved.
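A minimal sketch of this brightness-weighted depth fusion is given below, assuming the two formulas above; the distances, total brightness and function names are hypothetical illustration values, not taken from the patent.

```python
# Illustrative sketch of the depth-of-field fusion (brightness-weighted) rule
# described above; distances and the total brightness I_s are hypothetical values.

def plane_brightness(D_n: float, D_f: float, D_s: float, I_s: float) -> tuple[float, float]:
    """Split the total brightness I_s between the near plane (I_n) and the far
    plane (I_f) so that the fitted imaging plane appears at depth D_s."""
    if not (D_n <= D_s <= D_f):
        raise ValueError("the fitted plane must lie between the two imaging planes")
    I_n = I_s * (D_f - D_s) / (D_f - D_n)   # brightness of first imaging plane 41
    I_f = I_s * (D_s - D_n) / (D_f - D_n)   # brightness of second imaging plane 42
    return I_n, I_f

def fused_depth(D_n: float, D_f: float, I_n: float, I_f: float) -> float:
    """Inverse relation: the depth of the fused fitted imaging plane 43 is the
    brightness-weighted average of the two plane depths."""
    return (I_n * D_n + I_f * D_f) / (I_n + I_f)

if __name__ == "__main__":
    D_n, D_f = 0.5, 3.0                                   # metres from the eye (hypothetical)
    I_n, I_f = plane_brightness(D_n, D_f, D_s=1.2, I_s=1.0)
    print(f"I_n = {I_n:.3f}, I_f = {I_f:.3f}")
    print(f"fused depth = {fused_depth(D_n, D_f, I_n, I_f):.2f} m")   # about 1.2 m
```

With these weights, raising I_n pulls the fitted plane toward D_n and raising I_f pushes it toward D_f, matching the qualitative behaviour described above.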
It should be understood that, when depth-of-field fusion is performed on the imaging planes at multiple depths, the principle is the same, and the details are not described here.
In the optical display system provided by the embodiment of the invention, two groups of imaging assemblies are arranged and the brightness of the imaging light emitted by the display sources is adjusted. After the brightness-adjustable imaging light is projected into the human eye by the imaging assemblies, the depth of the fitted imaging plane obtained according to the fusion principle corresponds to the image depth of the stereoscopic image formed by the two eyes, so the vergence-accommodation conflict is avoided, the user does not suffer fatigue and discomfort, and the image clarity is improved.
Based on the above embodiments, the embodiment of the present invention further provides another possible structure of the optical display system. Fig. 4 is a schematic structural diagram of another optical display system provided by the embodiment of the present invention; as shown in fig. 4, the optical display system 100 further includes a light absorbing plate 14.
The lower edge of the light absorbing plate 14 is flush with the lower edge of the first half mirror 22 of the second imaging assembly, and the light absorbing plate 14 absorbs stray light transmitted through the first half mirror 22 of the second imaging assembly. Meanwhile, on the premise that it does not affect the imaging light emitted by the display sources 11 and 21, the light absorbing plate 14 is placed as close to the first half mirror 22 as possible, and the included angle between the light absorbing plate 14 and the horizontal plane can be set to 15 to 35 degrees to increase the absorption efficiency of stray light.
In order to implement the above embodiments, the optical display system 100 of the present invention further includes a display control device 110.
As shown in fig. 5, the display control device 110 is electrically connected to the display source 11 and the display source 21 in the optical display system 100, and is configured to control the brightness of the imaging light emitted by the display source of each imaging assembly, so that the images projected into the human eye by the imaging assemblies at different depths have different brightness.
It should be noted that, in this embodiment, only a schematic structural diagram that the optical display system 100 includes 2 groups of imaging components is shown, and the optical display system 100 may further include more groups of imaging components, which have the same principle and are not described herein again.
Optionally, the display control device is further configured to fuse, with a depth-of-field fusion algorithm, the imaging planes presented by the imaging assemblies at different scene depths to obtain a fused fitted imaging plane. The depth of the fused fitted imaging plane can lie anywhere between the depths of the imaging planes generated by the imaging assemblies, i.e., its depth information is variable, so it can be made to correspond to the image depth of the stereoscopic image obtained by binocular convergence on the actual scene. The specific fusion algorithm is the same as in the previous embodiment and is not repeated here.
In the optical display system of the embodiment of the invention, the light absorbing plate absorbs the stray light transmitted by the first half mirror, two groups of imaging assemblies are arranged, and the brightness of the imaging light emitted by the display sources is adjusted. After the brightness-adjustable imaging light is projected into the human eye by the imaging assemblies, the depth of the fitted imaging plane obtained according to the fusion principle corresponds to the image depth of the stereoscopic image formed by the two eyes, so the vergence-accommodation conflict is avoided, the user does not suffer fatigue and discomfort, the image clarity is improved and dispersion is avoided.
Based on the foregoing embodiments, an imaging method is further provided in an embodiment of the present invention, and fig. 6 is a schematic flow chart of the imaging method provided in the embodiment of the present invention, as shown in fig. 6, the method includes the following steps:
step 601, controlling each imaging component in the optical display system to image at a position corresponding to the depth of field.
The optical display system comprises at least two groups of imaging components, and each imaging component comprises a display source, a first half-transmitting half-reflecting mirror and a second half-transmitting half-reflecting mirror. For convenience of explanation, in the present embodiment, 2 sets of imaging assemblies are taken as an example for explanation, and are respectively referred to as a first imaging assembly and a second imaging assembly. The imaging component, for example, may be an augmented reality AR imaging component, and is used for imaging in an augmented reality device.
Specifically, the display source of the first imaging assembly is controlled to emit imaging light according to the image of the viewing angle corresponding to the single eye. The imaging light is reflected by the first half mirror, and the reflected light is further reflected and magnified by the second half mirror, which is a half-mirror concave mirror, before entering the human eye through the first half mirror, so that the eye sees a magnified virtual image at the corresponding depth of field, for example the virtual image 41 at depth V1 in fig. 2. At the same time, the second imaging assembly is controlled to image in the same way, so that the eye sees a magnified virtual image at its corresponding depth of field, for example the virtual image 42 at depth V2 in fig. 2.
Step 602, adjusting the imaging brightness of each imaging component according to the image depth of the three-dimensional image to be presented.
Specifically, the first half mirror and the second half mirror also allow ambient light to pass through, so that images of the real scene at different viewing angles are formed in the human eye. The brightness of the imaging light emitted by the display sources of the first and second imaging assemblies is adjusted according to the image depth of the stereoscopic image to be presented, and the two imaging planes of different brightness are fused according to the depth-of-field fusion algorithm to obtain a fitted imaging plane. By adjusting the brightness of the imaging light emitted by the display sources of the two imaging assemblies, the depth of the fitted imaging plane is made to correspond to the image depth at which the stereoscopic image needs to be displayed, so that the vergence-accommodation conflict is avoided, the user does not suffer fatigue and discomfort, and the image clarity is improved. A sketch of such a brightness-control step is given below.
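As an illustration of step 602 only, the following sketch drives two display sources so that the fitted imaging plane appears at a requested depth; the DisplaySource class and its set_brightness method are hypothetical stand-ins, not an interface defined by the patent.

```python
# Hypothetical sketch of step 602: drive the two display sources so that the
# fitted imaging plane appears at the image depth of the stereoscopic content.

class DisplaySource:
    """Stand-in for a physical display source with adjustable brightness."""
    def __init__(self, name: str):
        self.name = name
        self.brightness = 0.0

    def set_brightness(self, value: float) -> None:
        self.brightness = value
        print(f"{self.name}: brightness set to {value:.3f}")

def present_at_depth(src_near: DisplaySource, src_far: DisplaySource,
                     D_n: float, D_f: float, target_depth: float,
                     total_brightness: float = 1.0) -> None:
    # Clamp the requested depth to the range covered by the two imaging planes.
    D_s = min(max(target_depth, D_n), D_f)
    w_near = (D_f - D_s) / (D_f - D_n)      # weight of the near imaging plane
    src_near.set_brightness(total_brightness * w_near)
    src_far.set_brightness(total_brightness * (1.0 - w_near))

if __name__ == "__main__":
    near, far = DisplaySource("display source 11"), DisplaySource("display source 21")
    present_at_depth(near, far, D_n=0.5, D_f=3.0, target_depth=1.2)
```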
It should be noted that the above explanation of the embodiment of the optical display system can also be applied to the method of the embodiment, and is not repeated here.
In the imaging method of the embodiment of the invention, the brightness of the imaging light emitted by the display sources is controlled by the display control device, and the imaging planes are fused with the depth-of-field fusion algorithm, so the depth information of the fused fitted imaging plane is variable and corresponds to the image depth of the stereoscopic image obtained by the two eyes converging on the outside scene. The vergence-accommodation conflict is thus avoided, the user does not suffer fatigue and discomfort, and the image clarity is improved.
Based on the above embodiments, an embodiment of the present invention further provides an augmented reality device, including the optical display system 100 described in the foregoing embodiments, where the augmented reality device is, for example, augmented reality glasses, a helmet, and the like.
Optionally, a gray-scale filter may be added to the ambient light incident surface of the optical display system 100 of the augmented reality device to reduce the incident ambient light brightness and increase the contrast between the displayed virtual image and the real ambient image.
The augmented reality device according to the embodiment of the present invention may specifically be augmented reality glasses, which include two optical display systems 100 corresponding to the two eyes, i.e., each eye has its corresponding optical display system 100.
It should be noted that the above explanation of the optical display system is also applicable to the augmented reality device of the embodiment, and the principle is the same, and is not repeated here.
In the augmented reality device provided by the embodiment of the invention, the brightness of the imaging light emitted by the display sources is controlled by the display control device, and the imaging planes are fused with the depth-of-field fusion algorithm, so the depth information of the fused fitted imaging plane is variable and corresponds to the image depth of the stereoscopic image obtained by the two eyes converging on the outside scene. The vergence-accommodation conflict is thus avoided, the user does not suffer fatigue and discomfort, and the image clarity is improved.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (10)

1. An optical display system comprising a first imaging assembly and a second imaging assembly, each imaging assembly comprising:
a display source for emitting imaging light according to an image of a set viewing angle;
a first half mirror for reflecting the imaging light emitted by the display source and for transmitting ambient light into the human eye;
a second half mirror for reflecting the imaging light reflected by the first half mirror into a human eye and for transmitting ambient light to the first half mirror;
the imaging assemblies share the same second half mirror, and the first half mirror of the first imaging assembly is a half-mirror plane mirror and is positioned between the second half mirror and the first half mirror of the second imaging assembly; the second half mirror is a half-mirror curved mirror, and the first half mirror of the second imaging assembly is a half-mirror plane mirror;
the optical display system further comprises a display control device, the display control device being electrically connected with the display sources and being configured to control the brightness of the imaging light emitted by the display source of each imaging assembly, wherein the brightness of the first imaging plane generated by the first imaging assembly is I_n, with
I_n = I_s × (D_f - D_s) / (D_f - D_n),
and the brightness of the second imaging plane generated by the second imaging assembly is I_f, with
I_f = I_s × (D_s - D_n) / (D_f - D_n),
wherein D_n represents the distance from the human eye to the first imaging plane, D_f represents the distance from the human eye to the second imaging plane, D_s represents the position of the fitted imaging plane obtained by fitting through the depth-of-field fusion algorithm, and I_s represents the brightness of the fitted imaging plane obtained by the depth-of-field fusion algorithm;
the first half mirror of first formation of image subassembly the first half mirror of second formation of image subassembly with the second half mirror lower extreme is connected, the upper end of the first half mirror of second formation of image subassembly is higher than the upper end of the first half mirror of first formation of image subassembly.
2. The optical display system of claim 1,
a first included angle is formed between the optical axes of the first half mirror and the second half mirror of the first imaging assembly; a second included angle is formed between the display plane of the display source of the first imaging component and the optical axis of the second half mirror;
a third included angle is formed between the optical axes of the first half mirror and the second half mirror of the second imaging assembly; a fourth included angle is formed between the display plane of the display source of the second imaging assembly and the optical axis of the second half mirror;
the first included angle is larger than the third included angle, and the second included angle is larger than the fourth included angle.
3. The optical display system of claim 2, further comprising a light absorbing plate;
the lower end of the light absorption plate is connected with the lower end of the first half mirror of the second imaging component, and the light absorption plate is used for absorbing stray light transmitted by the first half mirror of the second imaging component.
4. The optical display system according to claim 2,
the value range of the first included angle is 50-60 degrees;
the third included angle ranges from 40 degrees to 50 degrees.
5. The optical display system according to claim 2,
the fourth angle is determined based on the difference between twice the third angle and 90 degrees.
6. An optical display system as claimed in claim 1, characterized in that the second half-mirror is a half-mirror concave mirror.
7. An imaging method applied to the optical display system according to any one of claims 1 to 6, the method comprising:
controlling each imaging assembly in the optical display system to image at a corresponding depth of field;
and adjusting the imaging brightness of each imaging component according to the image depth of the three-dimensional image to be presented.
8. An augmented reality device comprising an optical display system as claimed in any one of claims 1 to 6.
9. The augmented reality device of claim 8, further comprising a grayscale filter disposed opposite an ambient light entrance side of the second half mirror of the optical display system.
10. The augmented reality device of claim 8, wherein the augmented reality device is augmented reality glasses;
the augmented reality glasses comprise two optical display systems corresponding to double purposes.
CN201910449962.8A 2019-05-28 2019-05-28 Optical display system, display control device and augmented reality equipment Active CN110095870B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910449962.8A CN110095870B (en) 2019-05-28 2019-05-28 Optical display system, display control device and augmented reality equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910449962.8A CN110095870B (en) 2019-05-28 2019-05-28 Optical display system, display control device and augmented reality equipment

Publications (2)

Publication Number Publication Date
CN110095870A CN110095870A (en) 2019-08-06
CN110095870B true CN110095870B (en) 2022-04-19

Family

ID=67449419

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910449962.8A Active CN110095870B (en) 2019-05-28 2019-05-28 Optical display system, display control device and augmented reality equipment

Country Status (1)

Country Link
CN (1) CN110095870B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111338081A (en) * 2020-03-12 2020-06-26 京东方科技集团股份有限公司 AR optical system and AR display device
CN111290127B (en) * 2020-03-31 2022-11-15 优奈柯恩(北京)科技有限公司 Head-mounted display device
CN111649696B (en) * 2020-06-12 2021-07-06 珠海博明传感器技术有限公司 High-precision calibration method for structured light measurement system
CN113835135B (en) * 2020-06-23 2022-11-22 同方威视技术股份有限公司 Terahertz security inspection robot
US11320668B2 (en) * 2020-08-21 2022-05-03 Brelyon, Inc. Methods, systems, apparatuses, and devices for facilitating light field optical fusion
CN112526763B (en) * 2020-11-20 2022-09-27 亿信科技发展有限公司 Light field 3D display device and driving method thereof
CN113341567B (en) * 2021-05-12 2022-12-02 北京理工大学 Double-focal-plane optical waveguide near-to-eye display optical system
CN113124821B (en) * 2021-06-17 2021-09-10 中国空气动力研究与发展中心低速空气动力研究所 Structure measurement method based on curved mirror and plane mirror
CN115686181A (en) * 2021-07-21 2023-02-03 华为技术有限公司 Display method and electronic equipment
CN114089537B (en) * 2021-11-30 2023-09-26 京东方科技集团股份有限公司 Zoom device, AR equipment and myopia correction method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103969838A (en) * 2014-05-27 2014-08-06 清华大学 Three-dimensional stereoscopic imaging method and device
CN104570367A (en) * 2015-01-28 2015-04-29 深圳市安华光电技术有限公司 Stereo display device
CN206178246U (en) * 2016-09-05 2017-05-17 浙江舜通智能科技有限公司 Display device and wear -type display system
CN109001910A (en) * 2018-10-15 2018-12-14 浙江水晶光电科技股份有限公司 Head-up display and automobile
CN109188691A (en) * 2018-09-21 2019-01-11 歌尔科技有限公司 A kind of optical system and VR equipment

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10088683B2 (en) * 2014-10-24 2018-10-02 Tapuyihai (Shanghai) Intelligent Technology Co., Ltd. Head worn displaying device employing mobile phone
CN106154553A (en) * 2016-08-01 2016-11-23 全球能源互联网研究院 A kind of electric inspection process intelligent helmet Binocular displays system and its implementation
CN107300777A (en) * 2017-08-18 2017-10-27 深圳惠牛科技有限公司 A kind of imaging system reflected based on double free form surfaces
CN207133516U (en) * 2017-09-19 2018-03-23 歌尔科技有限公司 A kind of AR display devices
CN108873345A (en) * 2018-07-09 2018-11-23 杭州光粒科技有限公司 The wearable light field augmented reality glasses of big field angle, more display depths
CN109683320A (en) * 2019-02-20 2019-04-26 京东方科技集团股份有限公司 Display device and display methods
CN109709676A (en) * 2019-03-07 2019-05-03 浙江水晶光电科技股份有限公司 A kind of augmented reality optics module and augmented reality device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103969838A (en) * 2014-05-27 2014-08-06 清华大学 Three-dimensional stereoscopic imaging method and device
CN104570367A (en) * 2015-01-28 2015-04-29 深圳市安华光电技术有限公司 Stereo display device
CN206178246U (en) * 2016-09-05 2017-05-17 浙江舜通智能科技有限公司 Display device and wear -type display system
CN109188691A (en) * 2018-09-21 2019-01-11 歌尔科技有限公司 A kind of optical system and VR equipment
CN109001910A (en) * 2018-10-15 2018-12-14 浙江水晶光电科技股份有限公司 Head-up display and automobile

Also Published As

Publication number Publication date
CN110095870A (en) 2019-08-06

Similar Documents

Publication Publication Date Title
CN110095870B (en) Optical display system, display control device and augmented reality equipment
JP7369507B2 (en) Wearable 3D augmented reality display with variable focus and/or object recognition
CN107003734B (en) Device, method and system for coupling visual accommodation and visual convergence to the same plane at any depth of an object of interest
CN110187506B (en) Optical display system and augmented reality device
JP3177518B2 (en) Technology for observing images in deep visual fields with improved clarity and contrast
US10382699B2 (en) Imaging system and method of producing images for display apparatus
US20200081530A1 (en) Method and system for registering between an external scene and a virtual image
CN107076984A (en) Virtual image maker
CN110376737B (en) Optical display system, display control device and augmented reality equipment
IL276021B1 (en) Virtual display system with addressable focus cues
CN109803133B (en) Image processing method and device and display device
CN102566248B (en) Stereoscopic imaging apparatus
CA2559920A1 (en) A stereoscopic display
US10852546B2 (en) Head mounted display and multiple depth imaging apparatus
CN110794582A (en) Head-mounted display and multi-depth imaging device
Fisker et al. Automatic Convergence Adjustment for Stereoscopy using Eye Tracking.
JPH08146348A (en) Single eye observation perspective sensation adjustment type display device
CN114578554B (en) Display equipment for realizing virtual-real fusion
KR102556895B1 (en) Optical arrangement for generating virtual reality stereoscopic images
TWI832308B (en) Optic system for head wearable devices
US20240118550A1 (en) System and method for multi-instances emission for retina scanning based near eye display
EP4083683A1 (en) Display apparatus including free-formed surface and operating method thereof
KR20180108314A (en) Method and apparatus for displaying a 3-dimensional image adapting user interaction information
CN113296347A (en) Naked eye display device
Gebel et al. Deep learning approach for creating the natural vergence-accommodation conditions in virtual and mixed reality systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant