CN116703717A - Binocular image fusion method, binocular image fusion system and terminal equipment - Google Patents

Binocular image fusion method, binocular image fusion system and terminal equipment

Info

Publication number
CN116703717A
Authority
CN
China
Prior art keywords
content data
picture content
optical system
move
object plane
Prior art date
Legal status
Pending
Application number
CN202210189392.5A
Other languages
Chinese (zh)
Inventor
王耀彰
Current Assignee
Journey Technology Ltd
Original Assignee
Journey Technology Ltd
Priority date
Filing date
Publication date
Application filed by Journey Technology Ltd
Priority to CN202210189392.5A
Publication of CN116703717A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1446 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display, the display being composed of modules, e.g. video walls
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/32 Indexing scheme for image data processing or generation, in general involving image mosaicing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides a binocular image fusion method, a binocular image fusion system and terminal equipment. In the binocular image fusion method provided by the application, the positions of the first picture content data and the second picture content data can be dynamically adjusted according to the splicing control information, so that the size of the angle of view is dynamically adjusted and a dynamic binocular fusion picture is matched. In addition, the two optical systems that project the first picture content data and the second picture content data have the same design parameters, each set of picture content data moves within the object plane of its corresponding optical system, and the two sets move the same distance in opposite directions. This design ensures that the overlap region of the first picture content data and the second picture content data undergoes the same distortion, thereby solving the problem that distortion prevents the binocular overlap region from fusing when the angle of view is expanded by splicing binocular pictures.

Description

Binocular image fusion method, binocular image fusion system and terminal equipment
Technical Field
The application belongs to the technical field of near-to-eye display, and particularly relates to a binocular image fusion method, a binocular image fusion system and terminal equipment.
Background
With the development of science and technology, near-eye imaging display technology is receiving more and more attention. It is widely applied in scientific research, military, industrial, gaming, video, education and other fields. In a near-eye imaging display system, the difficulty of designing the optical system increases with the angle of view, so a binocular partial field-of-view splicing technique is sometimes adopted to realize a larger display area.
Some prior-art solutions realize spliced display of partial binocular fields by calculating and adjusting the rotation angles of the left-eye and right-eye display devices. However, this scheme places strict requirements on image distortion parameters and achieves a good splicing effect only when distortion is small. For a visual system with a larger angle of view, reducing distortion increases design difficulty and production cost, and can affect other parameters and ultimately degrade the display effect.
Disclosure of Invention
The embodiments of the application provide a binocular image fusion method, a binocular image fusion system and terminal equipment, which solve the problem that distortion prevents the binocular overlap region from fusing when the angle of view is expanded by splicing binocular pictures.
In a first aspect, an embodiment of the present application provides a binocular image fusion method, including:
acquiring splicing control information;
determining a first displacement parameter of the first picture content data and a second displacement parameter of the second picture content data according to the splicing control information;
controlling the first picture content data to move to a first position in an object plane of a first optical system according to a first displacement parameter of the first picture content data, and controlling the second picture content data to move to a second position in the object plane of a second optical system according to a second displacement parameter of the second picture content data, wherein the first displacement parameter and the second displacement parameter both comprise a moving distance and a moving direction, the moving distance of the first picture content data is the same as the moving distance of the second picture content data, and the moving direction of the first picture content data is opposite to the moving direction of the second picture content data;
and projecting the first picture content data by using a first optical system and projecting the second picture content data by using a second optical system to realize the display of a fusion picture, wherein the design parameters of the first optical system and the second optical system are the same.
In the binocular image fusion method provided by the application, the positions of the first picture content data and the second picture content data can be dynamically adjusted according to the splicing control information, so that the size of the angle of view is dynamically adjusted and a dynamic binocular fusion picture is matched. In addition, the two optical systems that project the first picture content data and the second picture content data have the same design parameters, each set of picture content data moves within the object plane of its corresponding optical system, and the two sets move the same distance in opposite directions. This design ensures that the overlap region of the first picture content data and the second picture content data undergoes the same distortion, thereby solving the problem that distortion prevents the binocular overlap region from fusing when the angle of view is expanded by splicing binocular pictures.
In one embodiment, the splicing control information includes an adjustment time t;
the controlling the first picture content data to move to the first position in the object plane of the first optical system according to the first displacement parameter of the first picture content data, and controlling the second picture content data to move to the second position in the object plane of the second optical system according to the second displacement parameter of the second picture content data includes:
controlling the first picture content data to move to a first position at a constant speed within the adjustment time t;
and controlling the second picture content data to move to a second position at a constant speed within the adjustment time t.
In one embodiment, the method comprises the steps of:
controlling the first picture content data to move to a first position in an object plane of a first optical system by adjusting the physical position of a first image source;
and controlling the second picture content data to move to a second position in the object plane of the second optical system by adjusting the physical position of the second image source.
In one embodiment, the method comprises the steps of:
controlling the first picture content data to move to a first position in an object plane of a first optical system by adjusting the position of an effective light emitting area of a first image source;
and controlling the second picture content data to move to a second position in the object plane of the second optical system by adjusting the position of the effective light emitting area of the second image source.
In one embodiment, the method comprises the steps of:
and determining design parameters of the first optical system and the second optical system according to the preset adjustable position range of the first picture content data or the second picture content data.
In a second aspect, an embodiment of the present application provides a binocular image fusion system, including:
the acquisition module is used for acquiring splicing control information;
the calculation module is used for determining a first displacement parameter of the first picture content data and a second displacement parameter of the second picture content data according to the splicing control information;
the adjusting module is used for controlling the first picture content data to move to a first position in an object plane of a first optical system according to a first displacement parameter of the first picture content data, and controlling the second picture content data to move to a second position in the object plane of a second optical system according to a second displacement parameter of the second picture content data, wherein the first displacement parameter and the second displacement parameter both comprise a moving distance and a moving direction, the moving distance of the first picture content data is the same as the moving distance of the second picture content data, and the moving direction of the first picture content data is opposite to the moving direction of the second picture content data;
a first optical system for projecting the first picture content data;
and the second optical system is used for projecting the second picture content data so as to realize the display of a fusion picture, wherein the design parameters of the first optical system and the second optical system are the same.
In one embodiment, the calculating module is further configured to determine design parameters of the first optical system and the second optical system according to a preset adjustable position range of the first screen content data or the second screen content data.
In one embodiment, the adjustment module controls the first picture content data to move to a first position in an object plane of the first optical system by adjusting a physical position of the first image source;
the adjusting module controls the second picture content data to move to a second position in the object plane of the second optical system by adjusting the physical position of the second image source.
In one embodiment, the adjusting module controls the first picture content data to move to a first position in the object plane of the first optical system by adjusting the position of the effective light emitting area of the first image source;
the adjusting module controls the second picture content data to move to a second position in the object plane of the second optical system by adjusting the position of the effective light emitting area of the second image source.
In a third aspect, an embodiment of the present application provides a terminal device, including: a processor and a memory for storing a computer program, the processor for calling and running the computer program from the memory, causing the apparatus to perform the method of any of the first aspects.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having a computer program stored therein, which when executed by a processor, causes the processor to perform the method of any of the first aspects.
In a fifth aspect, embodiments of the present application provide a computer program product comprising: computer program code which, when run by a computer, causes the computer to perform the method of any of the first aspects.
It will be appreciated that the advantages of the second to fifth aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a diagram of the distortion of two images in a prior-art image fusion process;
FIG. 2 is a schematic flow chart of a binocular image fusion method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a multi-display screen according to an embodiment of the application;
FIG. 4 is a schematic diagram of a display area of an image source according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a binocular image fusion system according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
Referring to fig. 2, in a first aspect, an embodiment of the present application provides a binocular image fusion method, including:
s10, acquiring splicing control information;
s20, determining a first displacement parameter of the first picture content data and a second displacement parameter of the second picture content data according to the splicing control information;
s30, controlling the first picture content data to move to a first position in the object plane of a first optical system according to the first displacement parameter of the first picture content data, and controlling the second picture content data to move to a second position in the object plane of a second optical system according to the second displacement parameter of the second picture content data; wherein the first displacement parameter and the second displacement parameter both comprise a moving distance and a moving direction, the moving distance of the first picture content data is the same as the moving distance of the second picture content data, and the moving direction of the first picture content data is opposite to the moving direction of the second picture content data;
s40, projecting the first picture content data by using a first optical system, and projecting the second picture content data by using a second optical system so as to realize the display of a fusion picture; wherein design parameters of the first optical system and the second optical system are the same.
In step S10, in one embodiment, the splicing control information may be determined according to the relationship between the preset final fused display picture and the first picture content data and the second picture content data. The splicing control information may include a splicing parameter, equal to the ratio of the width displayed by one eye alone (i.e., outside the overlap region) to the display width of the overlap region. For example, if the display width of the final fused display picture is 3d and the final display widths of the first picture content data and the second picture content data are both 2d, then the display width of the overlap region is 2d + 2d - 3d = d, and each eye displays a width of d alone, so the splicing parameter is 1. In another embodiment, the splicing control information may also be determined according to a preset angle of view.
In step S20, the first image source outputs the first picture content data and the second image source outputs the second picture content data. The first picture content data includes the left picture content data and the overlapping picture content data; the second picture content data includes the right picture content data and the overlapping picture content data. Once the splicing control information is determined, the first displacement parameter of the first picture content data and the second displacement parameter of the second picture content data can be determined from the geometric relationship, as in the sketch below.
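As an illustration of steps S10 and S20, the following minimal sketch derives the equal-and-opposite displacement parameters from the splicing control information, assuming the one-dimensional geometry of the worked example above and that both pictures start fully overlapped and centered; the function name and starting condition are assumptions for illustration, not taken from the patent text.

```python
def displacement_parameters(w_eye: float, w_fused: float):
    """Return the splicing parameter and (first, second) displacements.

    w_eye:   display width projected by one optical system (e.g. 2*d)
    w_fused: target width of the fused display picture (e.g. 3*d)
    """
    w_overlap = 2 * w_eye - w_fused          # width displayed by both eyes
    if w_overlap <= 0:
        raise ValueError("pictures would no longer overlap")
    # Splicing parameter: width shown by one eye alone / overlap width.
    splice = (w_eye - w_overlap) / w_overlap
    # From the fully overlapped start (fused width == w_eye), each picture
    # moves half of the added width, in opposite directions.
    shift = (w_fused - w_eye) / 2
    return splice, (-shift, +shift)

d = 1.0
splice, (s1, s2) = displacement_parameters(w_eye=2 * d, w_fused=3 * d)
print(splice)   # 1.0  -> matches the worked example above
print(s1, s2)   # -0.5 +0.5: same distance, opposite directions
```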
In step S30, the first picture content data moves to a first position in the object plane of the first optical system, and the second picture content data moves to a second position in the object plane of the second optical system; equivalently, the first image source and the second image source occupy different off-axis positions within the object planes of their respective imaging optical systems. Because the design parameters of the first optical system and the second optical system are the same, the two optical systems have the same distortion characteristics, which in turn guarantees that the two sets of picture content data experience the same distortion during imaging.
In particular, distortion can be seen as a one-to-one mapping f from pixels in object space to pixels in image space:

f: S_object → S_image

In the present case, the regions of the left-eye and right-eye screens (the image sources corresponding to the left and right eyes) are subsets of the object-plane region of the optical system, so the left and right screen-to-picture relations both conform to the mapping f:

f: S_left screen → S_left image and f: S_right screen → S_right image

In the spliced picture, for each pixel P ∈ S_left screen ∩ S_right screen of the overlap region, the one-to-one property of the mapping guarantees a unique P_image ∈ S_left image ∩ S_right image such that P_image = f(P).
That is, the pixel P undergoes the same distortion after passing through the optical systems of the left and right eyes, which solves the problem that the first picture content data and the second picture content data cannot be fused when the left and right picture content data are spliced. Fig. 3 shows a display picture finally generated by the binocular image fusion method of the present application.
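The identical-distortion argument can be checked numerically. In the toy sketch below, a simple radial polynomial stands in for the mapping f; the patent does not specify any distortion model, so the function form and its coefficient are assumptions. Because both eyes share the same mapping, an overlap pixel lands on the same image point through either optical system.

```python
import numpy as np

def f(p: np.ndarray, k1: float = -0.08) -> np.ndarray:
    """One-to-one object-plane -> image-plane map with radial distortion."""
    r2 = float(np.dot(p, p))
    return p * (1.0 + k1 * r2)

# A pixel P in the overlap region S_left screen ∩ S_right screen, expressed
# in the shared object-plane coordinates of the identically designed optics.
P = np.array([0.3, 0.1])

left_image_point = f(P)    # through the left-eye optical system
right_image_point = f(P)   # through the right-eye optical system
assert np.allclose(left_image_point, right_image_point)  # same distortion
```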
The structures of the first optical system and the second optical system are not particularly limited, as long as both optical systems have object planes large enough to accommodate the movement of the picture content data within them. In one embodiment, the design parameters of the first optical system and the second optical system are determined according to a preset adjustable position range of the first picture content data or the second picture content data, as illustrated below.
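For instance, assuming purely horizontal movement, the required object-plane width follows directly from the adjustable position range; this one-line relation illustrates the dependency and is an assumption for illustration, not a design formula from the patent.

```python
def required_object_plane_width(picture_width: float, max_shift: float) -> float:
    # The picture may sit up to +/- max_shift from center, so the object
    # plane must span the picture at both extremes.
    return picture_width + 2 * max_shift

print(required_object_plane_width(picture_width=2.0, max_shift=0.5))  # 3.0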
In an alternative embodiment, the first optical system and the second optical system may each comprise an optical lead-in element and an optical projection device. The optical lead-in element is used for coupling the picture content data into the optical projection device, and the optical projection device is used for projecting the picture content data.
In the binocular image fusion method provided by the application, the positions of the first picture content data and the second picture content data can be dynamically adjusted according to the splicing control information, so that the size of the angle of view is dynamically adjusted and a dynamic binocular fusion picture is matched. In addition, the two optical systems that project the first picture content data and the second picture content data have the same design parameters, each set of picture content data moves within the object plane of its corresponding optical system, and the two sets move the same distance in opposite directions. This design ensures that the overlap region of the first picture content data and the second picture content data undergoes the same distortion, thereby solving the problem that distortion prevents the binocular overlap region from fusing when the angle of view is expanded by splicing binocular pictures.
In one embodiment, the splicing control information includes an adjustment time t. The adjustment time t is the time required for the first picture content data and the second picture content data to move from their current positions to the first position and the second position, respectively (see the sketch after the following steps).
At this time, step S30 may include:
controlling the first picture content data to move to a first position at a constant speed within the adjustment time t;
and controlling the second picture content data to move to a second position at a constant speed within the adjustment time t.
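A minimal sketch of this constant-speed adjustment follows, assuming positions are resampled at a fixed frame interval; the frame rate and start/end coordinates are illustrative assumptions.

```python
import numpy as np

def position_at(elapsed: float, t: float,
                start: np.ndarray, end: np.ndarray) -> np.ndarray:
    """Point on the straight line from start to end at constant speed."""
    a = min(max(elapsed / t, 0.0), 1.0)   # clamp progress to [0, 1]
    return (1.0 - a) * start + a * end

t = 2.0                                   # adjustment time in seconds
first_end = np.array([-0.5, 0.0])         # first position
second_end = np.array([+0.5, 0.0])        # second position
start = np.zeros(2)                       # both pictures start centered

for frame in range(5):                    # e.g. one update every 0.5 s
    elapsed = frame * 0.5
    p1 = position_at(elapsed, t, start, first_end)
    p2 = position_at(elapsed, t, start, second_end)
    assert np.allclose(p1, -p2)           # mirrored at every instant
```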
In one embodiment, step S30 includes:
controlling the first picture content data to move to a first position in an object plane of a first optical system by adjusting the physical position of a first image source;
and controlling the second picture content data to move to a second position in the object plane of the second optical system by adjusting the physical position of the second image source.
Alternatively, automated moving platforms may be provided, with each image source placed on its own platform. The platform drives the image source to the corresponding position in the object plane of the optical system. In one embodiment, the platform can move in at least one of the horizontal, vertical and rotational directions at a preset frequency, thereby moving the picture content data in at least one of the horizontal, vertical and rotational directions.
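The sketch below illustrates this embodiment. The MovingPlatform class and its move_to() method are hypothetical stand-ins for a real stage controller; only the control pattern (two platforms driven to equal-and-opposite offsets at a preset frequency) follows the text.

```python
import time

class MovingPlatform:
    """Hypothetical stage with horizontal x, vertical y and rotation theta."""
    def __init__(self) -> None:
        self.x = self.y = self.theta = 0.0

    def move_to(self, x: float, y: float, theta: float) -> None:
        self.x, self.y, self.theta = x, y, theta  # real hardware I/O omitted

def adjust(first: MovingPlatform, second: MovingPlatform,
           target=(0.5, 0.0, 0.0), steps: int = 10,
           frequency_hz: float = 50.0) -> None:
    """Step both image sources to mirrored positions at a preset frequency."""
    for i in range(1, steps + 1):
        frac = i / steps
        dx, dy, dth = (frac * c for c in target)
        first.move_to(-dx, -dy, -dth)    # first image source
        second.move_to(+dx, +dy, +dth)   # second image source, mirrored
        time.sleep(1.0 / frequency_hz)   # preset update frequency

adjust(MovingPlatform(), MovingPlatform())
```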
In one embodiment, step S30 includes:
controlling the first picture content data to move to a first position in an object plane of a first optical system by adjusting the position of an effective light emitting area of a first image source;
and controlling the second picture content data to move to a second position in the object plane of the second optical system by adjusting the position of the effective light emitting area of the second image source.
In this case, the positions of the two image sources may be fixed, and only their effective light-emitting areas are controlled to move to the corresponding positions in the object plane of the optical system. Referring to fig. 4, the display region of an image source may include an effective light-emitting area and a redundant area. The pixels of the effective light-emitting area participate in generating the picture content data, while the pixels of the redundant area do not. Changing the position of the effective light-emitting area therefore changes the position of the picture content data. In one embodiment, the position of the effective light-emitting area of the image source is adjusted in at least one of the horizontal, vertical and rotational directions at a preset frequency, thereby moving the picture content data in at least one of the horizontal, vertical and rotational directions.
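The sketch below illustrates this fixed-panel variant with NumPy: the panel stays put and only the effective light-emitting window (the lit sub-region) moves, while the redundant area stays dark. Panel and content sizes are assumptions for illustration.

```python
import numpy as np

def render_with_window(content: np.ndarray, panel_shape: tuple,
                       offset: tuple) -> np.ndarray:
    """Light `content` at `offset` (row, col) inside an otherwise dark panel."""
    panel = np.zeros(panel_shape, dtype=content.dtype)  # redundant area: off
    r, c = offset
    h, w = content.shape
    panel[r:r + h, c:c + w] = content    # effective light-emitting area
    return panel

content = np.ones((4, 6))                # picture content data (4 x 6 pixels)
# Centered placement would be column 3; shift 3 columns left and right.
left_panel = render_with_window(content, (8, 12), offset=(2, 0))
right_panel = render_with_window(content, (8, 12), offset=(2, 6))
# Same shift distance (3 columns), opposite directions, sources unmoved.
```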
Based on the same inventive concept, please refer to fig. 5, an embodiment of the present application provides a binocular image fusion system, which includes:
the acquisition module is used for acquiring splicing control information;
the calculation module is used for determining a first displacement parameter of the first picture content data and a second displacement parameter of the second picture content data according to the splicing control information;
the adjusting module is used for controlling the first picture content data to move to a first position in an object plane of a first optical system according to a first displacement parameter of the first picture content data, and controlling the second picture content data to move to a second position in the object plane of a second optical system according to a second displacement parameter of the second picture content data, wherein the first displacement parameter and the second displacement parameter both comprise a moving distance and a moving direction, the moving distance of the first picture content data is the same as the moving distance of the second picture content data, and the moving direction of the first picture content data is opposite to the moving direction of the second picture content data;
a first optical system for projecting the first picture content data;
and the second optical system is used for projecting the second picture content data so as to realize the display of a fusion picture, wherein the design parameters of the first optical system and the second optical system are the same.
It is understood that the binocular image fusion system is used to implement the binocular image fusion method. The structures of the acquisition module, the calculation module, the adjustment module, the first optical system and the second optical system are not particularly limited, as long as the binocular image fusion method can be realized.
In one embodiment, the calculating module is further configured to determine design parameters of the first optical system and the second optical system according to a preset adjustable position range of the first screen content data or the second screen content data.
In one embodiment, the adjustment module controls the first picture content data to move to a first position in an object plane of the first optical system by adjusting a physical position of the first image source;
the adjusting module controls the second picture content data to move to a second position in the object plane of the second optical system by adjusting the physical position of the second image source.
In one embodiment, the adjusting module controls the first picture content data to move to a first position in the object plane of the first optical system by adjusting the position of the effective light emitting area of the first image source;
the adjusting module controls the second picture content data to move to a second position in the object plane of the second optical system by adjusting the position of the effective light emitting area of the second image source.
Based on the same inventive concept, as shown in fig. 6, an embodiment of the present application further provides a terminal device. The terminal device 300 may be a projection device, an augmented reality (Augmented Reality, AR) device, or another product oriented toward future technologies.
As shown in fig. 6, the terminal device 300 of this embodiment includes: a processor 301, a memory 302 and a computer program 303 stored in the memory 302 and executable on the processor 301. The computer program 303 may be executed by the processor 301 to generate instructions, according to which the processor 301 can implement the steps of the various embodiments of the binocular image fusion method described above. Alternatively, when executing the computer program 303, the processor 301 performs the functions of the modules/units in the above-described system embodiments.
By way of example, the computer program 303 may be divided into one or more modules/units, which are stored in the memory 302 and executed by the processor 301 to accomplish the present application. One or more of the modules/units may be a series of computer program instruction segments capable of performing a specific function for describing the execution of the computer program 303 in the terminal device 300.
It will be appreciated by those skilled in the art that fig. 6 is merely an example of terminal device 300 and is not limiting of terminal device 300, and may include more or fewer components than shown, or may combine certain components, or different components, e.g., terminal device 300 may also include input and output devices, network access devices, buses, etc.
The processor 301 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 302 may be an internal storage unit of the terminal device 300, such as a hard disk or a memory of the terminal device 300. The memory 302 may also be an external storage device of the terminal device 300, such as a plug-in hard disk provided on the terminal device 300, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like. Further, the memory 302 may also include both an internal storage unit and an external storage device of the terminal device 300. The memory 302 is used to store computer programs and other programs and data required for the terminal device 300. The memory 302 may also be used to temporarily store data that has been output or is to be output.
The terminal device provided in this embodiment may execute the above method embodiments; its implementation principle and technical effects are similar and are not described here again.
The embodiment of the application also provides a computer readable storage medium, on which a computer program is stored, which when being executed by a processor, implements the method of the above-mentioned method embodiment.
The embodiment of the application also provides a computer program product which, when run on a terminal device, causes the terminal device to execute the method of the embodiment of the method.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on this understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing the relevant hardware through a computer program, which may be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable storage medium may include at least: any entity or device capable of carrying the computer program code to a photographing device/terminal apparatus, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, or a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk or an optical disk.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In the description of the present application, it should be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature.
Furthermore, in the present application, unless explicitly specified and limited otherwise, the terms "connected," "coupled," and the like are to be construed broadly: for example, a connection may be mechanical or electrical, and may be direct or indirect through an intermediary. For a person of ordinary skill in the art, the specific meaning of these terms in the present application can be understood according to the specific situation.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.

Claims (10)

1. A binocular image fusion method, comprising:
acquiring splicing control information;
determining a first displacement parameter of the first picture content data and a second displacement parameter of the second picture content data according to the splicing control information;
controlling the first picture content data to move to a first position in an object plane of a first optical system according to a first displacement parameter of the first picture content data, and controlling the second picture content data to move to a second position in the object plane of a second optical system according to a second displacement parameter of the second picture content data, wherein the first displacement parameter and the second displacement parameter both comprise a moving distance and a moving direction, the moving distance of the first picture content data is the same as the moving distance of the second picture content data, and the moving direction of the first picture content data is opposite to the moving direction of the second picture content data;
and projecting the first picture content data by using the first optical system and projecting the second picture content data by using the second optical system to realize the display of a fusion picture, wherein the design parameters of the first optical system and the second optical system are the same.
2. The binocular image fusion method of claim 1, wherein the splicing control information includes an adjustment time t;
the controlling the first picture content data to move to the first position in the object plane of the first optical system according to the first displacement parameter of the first picture content data, and controlling the second picture content data to move to the second position in the object plane of the second optical system according to the second displacement parameter of the second picture content data includes:
controlling the first picture content data to move to the first position at a constant speed within the adjustment time t;
and controlling the second picture content data to move to the second position at a constant speed within the adjustment time t.
3. The binocular image fusion method of claim 1, comprising:
controlling the first picture content data to move to the first position in the object plane of the first optical system by adjusting the physical position of a first image source;
and controlling the second picture content data to move to the second position in the object plane of the second optical system by adjusting the physical position of the second image source.
4. The binocular image fusion method of claim 1, comprising:
controlling the first picture content data to move to the first position in the object plane of the first optical system by adjusting the position of the effective light emitting area of the first image source;
and controlling the second picture content data to move to the second position in the object plane of the second optical system by adjusting the position of the effective light emitting area of the second image source.
5. The binocular image fusion method of claim 1, comprising:
and determining design parameters of the first optical system and the second optical system according to the preset adjustable position range of the first picture content data or the second picture content data.
6. A binocular image fusion system, comprising:
the acquisition module is used for acquiring splicing control information;
the calculation module is used for determining a first displacement parameter of the first picture content data and a second displacement parameter of the second picture content data according to the splicing control information;
the adjusting module is used for controlling the first picture content data to move to a first position in an object plane of a first optical system according to a first displacement parameter of the first picture content data, and controlling the second picture content data to move to a second position in the object plane of a second optical system according to a second displacement parameter of the second picture content data, wherein the first displacement parameter and the second displacement parameter both comprise a moving distance and a moving direction, the moving distance of the first picture content data is the same as the moving distance of the second picture content data, and the moving direction of the first picture content data is opposite to the moving direction of the second picture content data;
the first optical system is used for projecting the first picture content data;
and the second optical system is used for projecting the second picture content data so as to realize the display of a fusion picture, wherein the design parameters of the first optical system and the second optical system are the same.
7. The binocular image fusion system of claim 6, wherein the computing module is further configured to determine design parameters of the first optical system and the second optical system based on a preset adjustable location range of the first picture content data or the second picture content data.
8. The binocular image fusion system of claim 6, wherein the adjustment module controls the first picture content data to move to a first location within the object plane of the first optical system by adjusting a physical location of the first image source;
the adjusting module controls the second picture content data to move to a second position in the object plane of the second optical system by adjusting the physical position of the second image source.
9. The binocular image fusion system of claim 6, wherein the adjustment module controls the first picture content data to move to a first location within the object plane of the first optical system by adjusting the location of the effective light emitting area of the first image source;
the adjusting module controls the second picture content data to move to a second position in the object plane of the second optical system by adjusting the position of the effective light emitting area of the second image source.
10. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 5 when executing the computer program.
CN202210189392.5A 2022-02-28 2022-02-28 Binocular image fusion method, binocular image fusion system and terminal equipment Pending CN116703717A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210189392.5A CN116703717A (en) 2022-02-28 2022-02-28 Binocular image fusion method, binocular image fusion system and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210189392.5A CN116703717A (en) 2022-02-28 2022-02-28 Binocular image fusion method, binocular image fusion system and terminal equipment

Publications (1)

Publication Number Publication Date
CN116703717A 2023-09-05

Family

ID=87832690

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210189392.5A Pending CN116703717A (en) 2022-02-28 2022-02-28 Binocular image fusion method, binocular image fusion system and terminal equipment

Country Status (1)

Country Link
CN (1) CN116703717A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160353078A1 (en) * 2015-05-30 2016-12-01 Beijing Zhigu Rui Tuo Tech Co., Ltd Video display control methods and apparatuses and display devices
WO2018129234A1 (en) * 2017-01-05 2018-07-12 Lang Philipp K Improved accuracy of displayed virtual data with optical head mount displays for mixed reality
US20210165222A1 (en) * 2017-12-21 2021-06-03 Nokia Technologies Oy Display Apparatus and Method
US20200018968A1 (en) * 2018-07-13 2020-01-16 Magic Leap, Inc. Systems and methods for display binocular deformation compensation
CN109991746A (en) * 2019-03-08 2019-07-09 成都理想境界科技有限公司 Image source mould group and near-eye display system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination