CN108111837B - Image fusion method and device for binocular near-eye display - Google Patents

Image fusion method and device for binocular near-eye display

Info

Publication number
CN108111837B
CN108111837B (application CN201711207847.7A)
Authority
CN
China
Prior art keywords: content data, picture, rotation angle, display, picture content
Prior art date
Legal status
Active
Application number
CN201711207847.7A
Other languages
Chinese (zh)
Other versions
CN108111837A (en)
Inventor
王耀彰
郑昱
Current Assignee
Journey Technology Ltd
Original Assignee
Journey Technology Ltd
Priority date
Filing date
Publication date
Application filed by Journey Technology Ltd
Priority to CN201711207847.7A
Publication of CN108111837A
Application granted
Publication of CN108111837B
Legal status: Active
Anticipated expiration


Abstract

The invention discloses an image fusion method and device for binocular near-eye display, and relates to the technical field of optics. The method comprises the following steps: acquiring picture content data; acquiring view field control information corresponding to the picture content data; determining first picture content data and a first rotation angle of a left display unit, and second picture content data and a second rotation angle of a right display unit, according to the view field control information; and controlling the left display unit and the right display unit to rotate and display pictures according to the first picture content data and the first rotation angle, and the second picture content data and the second rotation angle. The invention can dynamically adjust the size of the field angle in a near-eye display system while matching it with a dynamically fused binocular image, thereby achieving a better image display effect.

Description

Image fusion method and device for binocular near-eye display
Technical Field
The invention relates to the technical field of optics, in particular to a binocular near-eye display image fusion method and device.
Background
In the prior art, binocular display image splicing can be used to expand the view field of a binocular display. Because binocular fusion is, in practice, mainly aimed at stereoscopic display, an ordinary planar picture can only be shown to a single eye. Therefore, in the prior art, part of the picture is fused binocularly and part is displayed monocularly: the content displayed to the two eyes partially overlaps, and the view field is expanded by means of the non-overlapping parts.
The prior art can only display images with a fixed view field, and cannot adjust the size of the view field according to the requirements of the displayed content.
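In angular terms, the expansion described above amounts to a simple decomposition (an illustrative summary, not a formula taken from the cited prior art): the total horizontal view field is the overlapped binocular portion plus the two monocular extensions.

```latex
F_{\text{total}} = F_{\text{overlap}} + F_{\text{left-only}} + F_{\text{right-only}}
```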
Disclosure of Invention
The embodiments of the invention provide an image fusion method and device for binocular near-eye display, aiming to solve the problem that the size of the view field cannot be adjusted. The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview and is intended neither to identify key or critical elements nor to delineate the scope of such embodiments. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
According to a first aspect of the embodiments of the present invention, there is provided a binocular near-eye display image fusion method, including:
acquiring picture content data;
acquiring view field control information corresponding to the picture content data;
determining first picture content data and a first rotation angle of a left display unit and second picture content data and a second rotation angle of a right display unit according to the view field control information;
and controlling the left display unit and the right display unit to rotate and display pictures according to the first picture content data and the first rotation angle, and the second picture content data and the second rotation angle.
Optionally, the first picture content data includes left picture content data and overlapping picture content data, and the second picture content data includes right picture content data and overlapping picture content data.
Optionally, when the left picture display width is the same as the right picture display width, the view field control information includes a picture splicing parameter P, where P is a ratio of the left picture display width to the overlapping picture display width;
determining the first rotation angle θL and the second rotation angle θR according to the view field control information comprises:
[formula image]
wherein
[formula image]
FW is the angle of view.
Optionally, the view field control information further includes a picture deflection parameter η characterizing a deflection angle of the center of the overlapped picture from the center of the line of sight;
determining the first rotation angle θL and the second rotation angle θR according to the view field control information comprises:
[formula image]
[formula image]
note that η is a positive value when the center of the superimposed screen is deflected leftward with respect to the center of line of sight.
Optionally, the field control information further includes an adjustment time t;
controlling the left display unit and the right display unit to rotate according to the first rotation angle and the second rotation angle comprises:
and controlling the left display unit to rotate to a first rotation angle at a constant speed within the adjusting time t, and controlling the right display unit to rotate to a second rotation angle at a constant speed within the adjusting time t.
According to a second aspect of the embodiments of the present invention, there is provided an image fusion apparatus for binocular near-eye display, including:
a receiver for acquiring picture content data and field control information corresponding to the acquired picture content data;
a processor for determining first picture content data and a first rotation angle of the left display, and second picture content data and a second rotation angle of the right display according to the view field control information received by the receiver;
the left display is used for rotating to a first rotating angle determined by the processor and displaying first picture content data;
and the right display is used for rotating to the second rotation angle determined by the processor and displaying the second picture content data.
Optionally, the left display is further configured to display left picture content data and overlapping picture content data;
and the right display is further configured to display the right picture content data and the overlapped picture content data.
Optionally, when the left picture display width is the same as the right picture display width, the field control information received by the receiver includes a picture splicing parameter P, where P is the ratio of the left picture display width to the overlapping picture display width;
the processor is further configured to determine the first rotation angle θL and the second rotation angle θR:
[formula image]
wherein
[formula image]
FW is the angle of view.
Optionally, the field-of-view control information received by the receiver further includes a picture deflection parameter η characterizing a deflection angle of the center of the overlapping picture from the center of the line of sight;
the processor is further configured to determine the first rotation angle θL and the second rotation angle θR:
[formula image]
[formula image]
Note that η is a positive value when the center of the superimposed screen is deflected leftward with respect to the center of line of sight.
Optionally, the field-of-view control information received by the receiver further includes an adjustment time t;
the left display is further used for rotating to a first rotating angle at a constant speed within the adjusting time t;
the right display is also used for rotating to a second rotation angle at a constant speed within the adjusting time t.
The technical solution disclosed in the embodiments of the invention can dynamically adjust the size of the field angle in a near-eye display system, particularly an augmented reality display system, while matching it with a dynamically fused binocular image, thereby achieving a better image display effect.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a flowchart of an image fusion method for binocular near-eye display according to an embodiment of the present invention;
FIG. 2 is a diagram of a display screen according to an embodiment of the present invention;
FIG. 3 is a diagram of another display screen disclosed in the embodiments of the present invention;
FIG. 4 is a diagram of another display screen disclosed in the embodiments of the present invention;
fig. 5 is a schematic diagram of another display screen disclosed in the embodiment of the present invention.
FIG. 6 is a diagram of another display screen disclosed in the embodiments of the present invention;
fig. 7 is a schematic diagram of an image fusion apparatus for binocular near-eye display according to an embodiment of the present invention.
Detailed Description
The following description and the drawings sufficiently illustrate specific embodiments of the invention to enable those skilled in the art to practice them. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. The scope of embodiments of the invention encompasses the full ambit of the claims, as well as all available equivalents of the claims. Embodiments may be referred to herein, individually or collectively, by the term "invention" merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed. The embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the structures, products and the like disclosed by the embodiments, the description is relatively simple because the structures, the products and the like correspond to the parts disclosed by the embodiments, and the relevant parts can be just described by referring to the method part.
The embodiment of the invention discloses a binocular near-eye display image fusion method, as shown in figure 1, comprising the following steps:
s101, acquiring picture content data;
s102, acquiring view field control information corresponding to the picture content data;
s103, determining first picture content data and a first rotation angle of the left display unit and second picture content data and a second rotation angle of the right display unit according to the view field control information;
and S104, controlling the left display unit and the right display unit to rotate and display images according to the first image content data and the first rotation angle, and the second image content data and the second rotation angle.
In S102, the view field control information corresponding to the picture content data may be acquired together with the picture content data or acquired separately. When they are acquired together, the view field control information can be determined at the time the picture content data is produced, so that both can later be obtained at the same time. When they are acquired separately, the view field control information can be determined automatically by a preset algorithm that analyzes the picture content data, or set freely by the user during actual use; whether it is determined by an algorithm, set manually, or obtained in some other way is not limited by the invention.
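To make this concrete, the following Python sketch packages the view field control information described in this embodiment — the splicing parameter P, the deflection parameter η, and the adjustment time t introduced below — together with the two acquisition paths just mentioned. All class, field, and "source" names are illustrative assumptions, not part of the disclosed method.

```python
from dataclasses import dataclass

@dataclass
class ViewFieldControlInfo:
    """View field control information accompanying one batch of picture content data."""
    splice_p: float            # picture splicing parameter P (left width / overlap width)
    deflection_eta_deg: float  # picture deflection parameter eta, positive = deflected left
    adjust_time_s: float       # adjustment time t in seconds

def acquire_content_and_control(content_source, control_source=None):
    """Acquire picture content data and its view field control information.

    If a separate control source is provided, the control information produced
    together with the content is read from it; otherwise a preset analysis
    routine (placeholder below) derives it from the content, or the user may
    set it freely.
    """
    frame = content_source.read()
    if control_source is not None:
        info = control_source.read()
    else:
        # placeholder: analyze the frame with a preset algorithm
        info = ViewFieldControlInfo(splice_p=0.0, deflection_eta_deg=0.0, adjust_time_s=0.0)
    return frame, info
```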
In S103, for example, when the two eyes look straight ahead in parallel, the viewing direction is defined as the line-of-sight center. For convenience of description, the horizontal pixel count of the left and right display units is denoted W, the vertical pixel count H, the horizontal field angle FW, the vertical field angle FH, the picture splicing parameter P, and the picture deflection parameter η, wherein:
The image displayed by the left display unit is the first picture content data, comprising left picture content data and overlapped picture content data, and the image displayed by the right display unit is the second picture content data, comprising right picture content data and overlapped picture content data. The overlapped picture content data is the partial picture displayed by the left display unit and the right display unit together, the left picture content data is the left-eye partial picture displayed additionally by the left display unit, and the right picture content data is the right-eye partial picture displayed additionally by the right display unit, as shown in fig. 2.
Optionally, the view field control information may include a picture splicing parameter P, where P is equal to the ratio of the left picture display width to the overlapped picture display width; when the left picture display width is the same as the right picture display width, P is also equal to the ratio of the right picture display width to the overlapped picture display width. When P = 0, the binocular pictures overlap completely; when P > 0, the binocular pictures overlap only partially and picture splicing occurs. The horizontal pixel width of the overlapped part is W/(1+P), and the pixel width of the extended monocular part on each side is PW/(1+P), as shown in FIG. 3.
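The two pixel widths quoted above follow directly from the definition of P; with W denoting the horizontal pixel count of one display unit, Wo the overlapped width, and Wm the extended monocular width:

```latex
P = \frac{W_m}{W_o}, \qquad W_m + W_o = W
\;\;\Longrightarrow\;\;
W_o = \frac{W}{1+P}, \qquad W_m = \frac{P\,W}{1+P}
```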
Specifically, let the deflection angles of the left display unit and the right display unit with respect to the line-of-sight center be θL and θR. As shown in fig. 4, the field angle width of the overlapped picture content is:
[formula image]
the angular width of the field of view of the left picture content is:
[formula image]
and the angular width of the field of view of the right picture content is:
[formula image]
Suppose that the left line-of-sight center divides the screen into left and right pixel widths WLL and WLR; then:
[formula image]
[formula image]
[formula image]
Denoting
[formula image]
there are:
[formula image]
[formula image]
Similarly, for the right side, suppose that the right line-of-sight center divides the screen into left and right pixel widths WRL and WRR, and let y = tan θR; then:
[formula image]
[formula image]
[formula image]
When θL = θR,
[formula image]
Further, when P = 0,
x = 0,
and when P > 0,
[formula image]
When P approaches 0, we obtain:
[formula image]
Optionally, the view field control information may further include a picture deflection parameter η, where η represents the deflection angle between the center of the overlapped picture and the line-of-sight center. When η = 0, the center of the overlapped picture is aligned with the line-of-sight center; when η ≠ 0, the center of the overlapped picture is deflected by the angle η. For the purpose of description, η is taken as positive when the center of the overlapped picture is deflected to the left with respect to the line-of-sight center; then:
[formula image]
[formula image]
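Because the formulas in this derivation survive only as image references, the exact closed form cannot be restored here. The sketch below implements one plausible model consistent with the surrounding text — flat panels whose pixel positions map to angle through the tangent of the half field angle, with the overlapped picture regions of the two rotated panels required to cover the same angular range for both eyes — and applies the deflection parameter η as a simple bias of the two angles (positive η deflecting the overlapped picture to the left). Both the formula and the function names are assumptions for illustration, not the patent's own equations.

```python
import math

def rotation_angles(p: float, fov_w_deg: float, eta_deg: float = 0.0):
    """Return (theta_L, theta_R) in degrees for splicing parameter P and deflection eta.

    Assumed model (not the patent's exact formula): flat display panels with a
    tangent pixel-to-angle mapping, and the overlapping picture regions of the
    two rotated panels constrained to span the same angular range for both eyes.
    """
    if p < 0:
        raise ValueError("splicing parameter P must be non-negative")
    half = math.radians(fov_w_deg) / 2.0
    # Overlap-alignment condition solved for the common rotation angle theta:
    theta = 0.5 * (half + math.atan(math.tan(half) * (p - 1.0) / (p + 1.0)))
    theta_deg = math.degrees(theta)
    # Deflection eta biases both angles in the same direction (positive = left).
    return theta_deg + eta_deg, theta_deg - eta_deg

# P = 0 gives fully overlapped binocular pictures (theta = 0); increasing P
# splices the pictures and widens the total view field toward 2 * FW.
print(rotation_angles(0.0, 30.0))   # -> approximately (0.0, 0.0)
print(rotation_angles(1.0, 30.0))   # -> approximately (7.5, 7.5)
```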
optionally, the viewing field control information may further include an adjustment time t representing a time required for the left and right display units to rotate from the current angle to the first and second rotation angles, and exemplarily, when the adjustment time t is 1s, the left and right display units need to rotate from the current angle to the first and second rotation angles within 1s, and further optionally, the left and right display units may be arranged in a position corresponding to the adjustment time tAnd (5) rotating at a constant speed within the adjusting time t. One special case is that if the current angle of the left display unit and the right display unit is thetaL=θRThe first and second rotation angles are also θ 5L=θRThe adjustment time t at this time indicates that the left and right display units should maintain the current angle for the time t.
The technical solution disclosed in the embodiments of the invention can dynamically adjust the size of the field angle in a near-eye display system, particularly an augmented reality display system, while matching it with a dynamically fused binocular image, thereby achieving a better image display effect.
Illustratively, the animation picture displayed jointly by the left display unit and the right display unit at time T0 is as shown in fig. 5. Picture content data for a time T1 after T0 is then received, comprising the animation picture and the related information to be displayed on its left and right sides respectively, together with the view field control information for time T1, including the first rotation angle and the second rotation angle θL = θR = 5.768° and an adjustment time t = 2 s; then:
The left display unit and the right display unit each rotate at a constant speed to the first rotation angle and the second rotation angle within the 2 s before time T1, and the display picture at time T1 is as shown in fig. 6. When the horizontal field angle of the picture at time T0 is 30°, the total horizontal field angle at time T1 is 41.537°.
The embodiment of the present invention further discloses a binocular near-eye display image fusion device 20, as shown in fig. 7, including:
a receiver 201 for acquiring picture content data and field control information corresponding to the acquired picture content data;
a processor 202 for determining a first picture content data and a first rotation angle of the left display 203 and a second picture content data and a second rotation angle of the right display 204 according to the field of view control information received by the receiver 201;
a left display 203, which is used for rotating to the first rotation angle determined by the processor and displaying the first picture content data;
and a right display 204 for rotating to the second rotation angle determined by the processor and displaying the second picture content data.
Optionally, the left display 203 may be further configured to display left picture content data and overlapping picture content data;
the right display 204 may also be used to display right picture content data and overlay picture content data.
Specifically, the left display 203 and the right display 204 may include a mechanical adjusting device that adjusts the optical-axis direction of the displays by means of a stepping motor or similar components; the adjustment may include changing the orientation of an optical module, changing the position of the displays, and the like, which is not limited by the present invention.
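For readers who prefer code to block diagrams, the following sketch mirrors the structure of fig. 7 — a receiver, a processor-like update step, and two rotatable displays — using illustrative class and method names that are not part of the disclosure; the stepper object stands in for the kind of mechanical adjusting device described above, and its move_to call is an assumed driver API.

```python
class RotatableDisplay:
    """One display unit whose optical axis can be re-aimed by a stepping motor."""
    def __init__(self, stepper):
        self.stepper = stepper          # illustrative motor-driver object
        self.angle_deg = 0.0

    def rotate_to(self, target_deg: float, adjust_time_s: float) -> None:
        self.stepper.move_to(target_deg, duration_s=adjust_time_s)  # assumed driver API
        self.angle_deg = target_deg

    def show(self, frame) -> None:
        pass  # push pixel data to the panel (hardware-specific)


class BinocularFusionDevice:
    """Sketch of device 20: receiver 201, processor role 202, displays 203/204."""
    def __init__(self, receiver, left: RotatableDisplay, right: RotatableDisplay):
        self.receiver, self.left, self.right = receiver, left, right

    def update(self) -> None:
        frame, info = self.receiver.acquire()                        # S101 / S102
        (left_frame, theta_l), (right_frame, theta_r) = self.split(frame, info)  # S103
        self.left.rotate_to(theta_l, info.adjust_time_s)             # S104
        self.right.rotate_to(theta_r, info.adjust_time_s)
        self.left.show(left_frame)
        self.right.show(right_frame)

    def split(self, frame, info):
        raise NotImplementedError  # picture splitting / angle computation per the method above
```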
Optionally, when the left picture display width is the same as the right picture display width, the view field control information received by the receiver 201 may include a picture splicing parameter P, where P is the ratio of the left picture display width to the overlapped picture display width;
the processor 202 may also be configured to determine the first rotation angle θL and the second rotation angle θR:
[formula image]
wherein
[formula image]
FW is the angle of view.
Optionally, the field-of-view control information received by the receiver 201 may further include a picture deflection parameter η, which represents a deflection angle between the center of the overlapped picture and the center of the line of sight;
the processor 202 may also be configured to determine the first rotation angle θL and the second rotation angle θR:
[formula image]
[formula image]
Note that η is a positive value when the center of the superimposed screen is deflected leftward with respect to the center of line of sight.
Optionally, the field-of-view control information received by the receiver 201 may further include an adjustment time t;
the left display 203 can also be used for rotating to a first rotation angle at a constant speed within the adjustment time t;
the right display 204 may also be configured to rotate at a constant speed to a second angle of rotation within the adjustment time t.
The image fusion apparatus 20 for binocular near-eye display disclosed in this embodiment may be used to perform the image fusion method for binocular near-eye display disclosed in the foregoing embodiments; the related concepts and derivations have already been described in detail above and are not repeated here.
It is to be understood that the present invention is not limited to the procedures and structures described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (6)

1. An image fusion method for binocular near-eye display comprises the following steps:
acquiring picture content data;
acquiring view field control information corresponding to the picture content data;
determining first picture content data and a first rotation angle of a left display unit and second picture content data and a second rotation angle of a right display unit according to the view field control information;
controlling a left display unit and a right display unit to rotate and display images according to the first image content data and the first rotation angle, and the second image content data and the second rotation angle;
the first picture content data comprises left picture content data and overlapping picture content data, and the second picture content data comprises right picture content data and the overlapping picture content data;
when the display width of the left picture is the same as that of the right picture, the view field control information comprises a picture splicing parameter P, wherein P is the ratio of the display width of the left picture to the display width of the overlapped picture;
determining the first rotation angle θL and the second rotation angle θR according to the view field control information comprises:
[formula image]
wherein
[formula image]
FW is the angle of view.
2. The method of claim 1, wherein the field-of-view control information further comprises a picture deflection parameter η, the η characterizing a deflection angle of an overlapping picture center from a line-of-sight center;
determining the first rotation angle θL and the second rotation angle θR according to the view field control information comprises:
[formula image]
[formula image]
note that η is a positive value when the center of the superimposed screen is deflected leftward with respect to the center of line of sight.
3. The method of claim 1, wherein the field-of-view control information further comprises an adjustment time t;
according to the first rotation angle and the second rotation angle, controlling the left display unit and the right display unit to rotate comprises the following steps:
and controlling the left display unit to rotate to the first rotation angle at a constant speed within the adjusting time t, and controlling the right display unit to rotate to the second rotation angle at a constant speed within the adjusting time t.
4. An image fusion apparatus for binocular near-eye display, comprising:
a receiver for acquiring picture content data and for acquiring field control information corresponding to said picture content data;
a processor for determining a first image content data and a first rotation angle of a left display and a second image content data and a second rotation angle of a right display according to the view field control information received by the receiver;
the left display is used for rotating to the first rotation angle determined by the processor and displaying first picture content data;
the right display is used for rotating to the second rotation angle determined by the processor and displaying second picture content data;
the left display is also used for displaying left picture content data and overlapped picture content data;
the right display is further used for displaying right picture content data and the overlapped picture content data;
when the left picture display width is the same as the right picture display width, the view field control information received by the receiver comprises a picture splicing parameter P, wherein P is the ratio of the left picture display width to the overlapped picture display width;
the processor is further configured to determine the first rotation angle θL and the second rotation angle θR:
[formula image]
wherein
[formula image]
FW is the angle of view.
5. The apparatus of claim 4, wherein the field-of-view control information received by the receiver further comprises a picture deflection parameter η, the η characterizing a deflection angle of an overlapping picture center from a line-of-sight center;
the processor is further configured to determine the first rotation angle θL and the second rotation angle θR:
[formula image]
[formula image]
Note that η is a positive value when the center of the superimposed screen is deflected leftward with respect to the center of line of sight.
6. The apparatus of claim 4, wherein the field of view control information received by the receiver further comprises an adjustment time t;
the left display is further used for rotating to the first rotating angle at a constant speed within the adjusting time t;
and the right display is also used for rotating to the second rotation angle at a constant speed within the adjusting time t.
CN201711207847.7A 2017-11-27 2017-11-27 Image fusion method and device for binocular near-eye display Active CN108111837B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711207847.7A CN108111837B (en) 2017-11-27 2017-11-27 Image fusion method and device for binocular near-eye display


Publications (2)

Publication Number Publication Date
CN108111837A CN108111837A (en) 2018-06-01
CN108111837B (en) 2020-08-07

Family

ID=62208584

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711207847.7A Active CN108111837B (en) 2017-11-27 2017-11-27 Image fusion method and device for binocular near-eye display

Country Status (1)

Country Link
CN (1) CN108111837B (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012172719A1 (en) * 2011-06-16 2012-12-20 パナソニック株式会社 Head-mounted display and misalignment correction method thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101241249A (en) * 2007-02-07 2008-08-13 鸿富锦精密工业(深圳)有限公司 Liquid crystal display panel
CN102375235A (en) * 2010-08-18 2012-03-14 索尼公司 Display apparatus
CN102928979A (en) * 2011-08-30 2013-02-13 微软公司 Adjustment of a mixed reality display for inter-pupillary distance alignment
CN108474952A (en) * 2016-01-06 2018-08-31 三星电子株式会社 Wear-type electronic equipment

Also Published As

Publication number Publication date
CN108111837A (en) 2018-06-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant