CN115457200B - Method, device, equipment and storage medium for automatic true stereo display of 2.5-dimensional image - Google Patents

Method, device, equipment and storage medium for automatic true stereo display of 2.5-dimensional image

Info

Publication number
CN115457200B
CN115457200B (application CN202211063896.9A)
Authority
CN
China
Prior art keywords
matrix
projection
offset
observation
image space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211063896.9A
Other languages
Chinese (zh)
Other versions
CN115457200A (en)
Inventor
Fu Yongfeng (付永锋)
Sun Jianping (孙建平)
Wu Yuhua (吴玉华)
Fu Xiaofeng (付小峰)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhongtu Kechuang Information Technology Co ltd
Beijing Geo Vision Tech Co ltd
Original Assignee
Beijing Zhongtu Kechuang Information Technology Co ltd
Beijing Geo Vision Tech Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhongtu Kechuang Information Technology Co ltd, Beijing Geo Vision Tech Co ltd filed Critical Beijing Zhongtu Kechuang Information Technology Co ltd
Priority to CN202211063896.9A priority Critical patent/CN115457200B/en
Publication of CN115457200A publication Critical patent/CN115457200A/en
Application granted granted Critical
Publication of CN115457200B publication Critical patent/CN115457200B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10: Complex mathematical operations
    • G06F17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/005: General purpose rendering architectures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Geometry (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The application relates to a method, a device, equipment and a storage medium for automatic true stereo display of a 2.5-dimensional image, applied in the technical field of image processing. The method comprises the following steps: acquiring the binocular eye distance, the stereo protrusion ratio and the viewing distance; generating a projection offset matrix based on the binocular eye distance and the stereo protrusion ratio; generating an observation offset matrix based on the binocular eye distance, the viewing distance and the stereo protrusion ratio; acquiring a projection matrix, an observation matrix and an image space point position; and generating true stereo imaging of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix and the image space point position. The method achieves automatic true stereo display of the 2.5-dimensional image.

Description

Method, device, equipment and storage medium for automatic true stereo display of 2.5-dimensional image
Technical Field
The present application relates to the field of image processing technologies, and in particular to a method, an apparatus, a device and a storage medium for automatic true stereo display of a 2.5-dimensional image.
Background
Currently, two schemes are generally used to implement true stereo display of a 2.5-dimensional image scene: in the first, the 2.5-dimensional program itself must support true stereo display; in the second, the 2.5-dimensional image scene data is exported and then imported into another program that supports true stereo display.
However, most existing professional programs on the market do not natively support true stereo display. Exporting the image scene data to another program that does support it suffers from poor real-time performance, loss of program functions, and loss of or differences in attribute information and rendering effects caused by the export and import; moreover, exporting the finished scene data of a large-scale model to third-party software involves an enormous workload. A technology for automatic true stereo display of 2.5-dimensional images is therefore urgently needed.
Disclosure of Invention
In order to realize automatic true stereo display of a 2.5-dimensional image, the present application provides a method, a device, equipment and a storage medium for automatic true stereo display of a 2.5-dimensional image.
In a first aspect, the present application provides a method for automatic true stereo display of a 2.5-dimensional image, which adopts the following technical solution:
A method for automatic true stereo display of a 2.5-dimensional image, comprising the following steps:
acquiring the binocular eye distance, the stereo protrusion ratio and the viewing distance;
generating a projection offset matrix based on the binocular eye distance and the stereo protrusion ratio;
generating an observation offset matrix based on the binocular eye distance, the viewing distance and the stereo protrusion ratio;
acquiring a projection matrix, an observation matrix and an image space point position;
and generating image space point true stereo imaging based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix and the image space point position.
By adopting this technical solution, the processing for true stereo display is completed directly in the graphics card, so that the image is displayed in true stereo without importing or exporting image data; the graphics card thus realizes automatic true stereo display of the 2.5-dimensional image.
Optionally, the generating a projection offset matrix based on the binocular eye distance and the stereo protrusion ratio includes:
calculating a monocular eye distance, and calculating a projection view cone offset based on the monocular eye distance and the stereo protrusion ratio;
and acquiring an original identity matrix, and generating a projection offset matrix based on the original identity matrix and the projection view cone offset.
Optionally, the generating an observation offset matrix based on the binocular eye distance, the viewing distance and the stereo protrusion ratio includes:
acquiring an offset parameter of the image space point;
calculating an offset value of the image space point on a horizontal axis based on the offset parameter, the binocular eye distance, the stereo protrusion ratio and the viewing distance;
and acquiring an original identity matrix, and generating an observation offset matrix based on the original identity matrix and the offset value.
Optionally, the generating true stereo imaging of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix and the image space point position includes:
calculating the left-eye display position of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix and the image space point position;
calculating the right-eye display position of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix and the image space point position;
and generating the true stereo imaging of the image space point based on the left-eye display position and the right-eye display position.
Optionally, the calculating the left-eye display position of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix, and the image space point position includes:
negating the projection offset matrix and the observation offset matrix to generate a negative projection offset matrix and a negative observation offset matrix;
calculating the product of the negative projection offset matrix and the projection matrix to generate a left-eye projection matrix;
calculating the product of the negative observation offset matrix and the observation matrix to generate a left-eye observation matrix;
and calculating the product of the left-eye projection matrix, the left-eye observation matrix and the position of the image space point to generate a left-eye display position.
Optionally, the calculating the right-eye display position of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix, and the image space point position includes:
applying a positive sign to the projection offset matrix and the observation offset matrix to generate a positive projection offset matrix and a positive observation offset matrix;
calculating the product of the positive projection offset matrix and the projection matrix to generate a right-eye projection matrix;
calculating the product of the positive observation offset matrix and the observation matrix to generate a right-eye observation matrix;
and calculating the product of the right eye projection matrix, the right eye observation matrix and the position of the image space point to generate a right eye display position.
In a second aspect, the present application provides a 2.5-dimensional image autostereoscopic display apparatus, which adopts the following technical solution:
a 2.5-dimensional image autostereoscopic display apparatus comprising:
a data acquisition module for acquiring the binocular eye distance, the stereo protrusion ratio and the viewing distance;
a projection calculation module for generating a projection offset matrix based on the binocular eye distance and the stereo protrusion ratio;
an observation calculation module for generating an observation offset matrix based on the binocular eye distance, the viewing distance and the stereo protrusion ratio;
the position acquisition module is used for acquiring a projection matrix, an observation matrix and an image space point position;
and the display generation module is used for generating image space point true stereo imaging based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix and the image space point position.
By adopting this technical solution, the processing for true stereo display is completed directly in the graphics card, so that the image is displayed in true stereo without importing or exporting image data; the graphics card thus realizes automatic true stereo display of the 2.5-dimensional image.
In a third aspect, the present application provides an electronic device, which adopts the following technical solutions:
an electronic device comprising a processor, the processor coupled with a memory;
the processor is configured to execute the computer program stored in the memory to cause the electronic device to execute the 2.5-dimensional image autostereoscopic display method according to any one of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, which adopts the following technical solutions:
a computer-readable storage medium storing a computer program that can be loaded by a processor to execute the 2.5-dimensional image autostereoscopic display method according to any one of the first aspects.
Drawings
Fig. 1 is a schematic flowchart of a method for automatic true stereo display of a 2.5-dimensional image according to an embodiment of the present application.
Fig. 2 is a block diagram of a 2.5-dimensional image autostereoscopic display apparatus according to an embodiment of the present application.
Fig. 3 is a block diagram of a structure of an electronic device according to an embodiment of the present application.
Detailed Description
The present application is described in further detail below with reference to the attached drawings.
The embodiment of the application provides a method for automatic true stereo display of a 2.5-dimensional image. The method can be executed by an electronic device, which may be a server or a terminal device. The server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing cloud services. The terminal device may be, but is not limited to, a smart phone, a tablet computer, a desktop computer, and the like.
In this embodiment, the method is independent of the third-party program itself: it operates directly on the bottom layer of the graphics card and completes its work directly in the card's 3D rendering stream. Whenever any program finishes drawing, its rendering stream is necessarily pushed into the graphics card's GPU, and the method performs true stereo drawing of the ordinary 2.5-dimensional rendering stream inside the GPU, that is, double rendering for the left and right eyes.
At present, every 2.5-dimensional program necessarily calls the system d3d9.dll or opengl32.dll library file when rendering images. By rewriting the system d3d9.dll and opengl32.dll library files, custom d3d9.dll and opengl32.dll library files are generated to replace the system ones, so that a third-party program directly calls the custom library files for display. In this way any program's rendering stream can be obtained, and the GPU code of the custom d3d9.dll and opengl32.dll library files is dispatched along with the third-party stream, so that any third-party rendering stream is processed and controlled in real time inside the GPU, realizing automatic true stereo display of 2.5-dimensional images.
Fig. 1 is a schematic flowchart of a method for automatic true stereo display of a 2.5-dimensional image according to an embodiment of the present application.
As shown in fig. 1, the main flow of the method is described as follows (steps S101 to S105):
Step S101, acquiring the binocular eye distance, the stereo protrusion ratio and the viewing distance;
Step S102, generating a projection offset matrix based on the binocular eye distance and the stereo protrusion ratio;
For step S102, the monocular eye distance is calculated, and the projection view cone offset is calculated based on the monocular eye distance and the stereo protrusion ratio; an original identity matrix is acquired, and the projection offset matrix is generated based on the original identity matrix and the projection view cone offset.
In this embodiment, the binocular eye distance is expressed by eyeLen, the stereo protrusion ratio by sd, the viewing distance by distance, and the original identity matrix by Matrix. The original identity matrix is a 4×4 matrix composed of 16 floating-point numbers and is constructed as follows:
Matrix(1.0,0.0,0.0,0.0,
0.0,1.0,0.0,0.0,
0.0,0.0,1.0,0.0,
0.0,0.0,0.0,1.0).
When calculating the projection view cone offset, the offset applies to a single eye, so the binocular eye distance must be halved to obtain the monocular eye distance. The projection view cone offset is then calculated from the monocular eye distance and the stereo protrusion ratio, i.e., eyeLen/(2.0*sd). Combining it with the original identity matrix yields the projection offset matrix offsetProjMatrix, constructed as follows:
offsetProjMatrix = Matrix(1.0,0.0,0.0,0.0,
0.0,1.0,0.0,0.0,
eyeLen /(2.0*sd),0.0,1.0,0.0,
0.0,0.0,0.0,1.0).
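As an illustrative sketch (not the patent's GPU code; the function name and the sample values for eyeLen and sd are assumptions), offsetProjMatrix can be built as follows:

```python
# Sketch of building the projection offset matrix offsetProjMatrix.
# The per-eye offset eyeLen/(2.0*sd) occupies the third row, first
# column, so with the row-vector convention the layout above suggests,
# it shears x by depth: x' = x + z * eyeLen/(2.0*sd).

def make_offset_proj_matrix(eye_len, sd):
    e = eye_len / (2.0 * sd)  # projection view cone offset for one eye
    return [
        [1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [e,   0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ]

# eyeLen = 0.065 (65 mm) and sd = 1.0 are illustrative values.
offset_proj = make_offset_proj_matrix(0.065, 1.0)
```

This depth-proportional X-shear is what skews the viewing frustum horizontally for one eye, the standard geometry of off-axis stereo projection.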
Step S103, generating an observation offset matrix based on the binocular eye distance, the viewing distance and the stereo protrusion ratio;
For step S103, an offset parameter of the image space point is obtained; the offset value of the image space point on the horizontal axis is calculated based on the offset parameter, the binocular eye distance, the stereo protrusion ratio and the viewing distance; and an original identity matrix is acquired, and the observation offset matrix is generated based on the original identity matrix and the offset value.
In this embodiment, when calculating the offset value, the ratio of the viewing distance to the stereo protrusion ratio is first calculated, and the offset parameter, the binocular eye distance and this ratio are multiplied together; the product is the offset value of the image space point on the horizontal (X) axis, namely 0.5f*eyeLen*(distance/sd). Combining it with the original identity matrix yields the observation offset matrix offsetViewMatrix, constructed as follows:
offsetViewMatrix = Matrix(1.0,0.0,0.0,0.0,
0.0,1.0,0.0,0.0,
0.0,0.0,1.0,0.0,
0.5f*eyeLen*(distance/sd),0.0,0.0,1.0).
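A matching sketch for offsetViewMatrix (again illustrative Python with assumed names and values, not the patent's GPU code):

```python
# Sketch of building the observation offset matrix offsetViewMatrix.
# The offset value 0.5*eyeLen*(distance/sd) sits in the bottom row,
# first column: with row vectors, that is a pure translation of the
# observation along the X axis.

def make_offset_view_matrix(eye_len, distance, sd):
    t = 0.5 * eye_len * (distance / sd)  # X-axis offset value
    return [
        [1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
        [t,   0.0, 0.0, 1.0],
    ]

# eyeLen = 0.065, distance = 2.0, sd = 1.0 are illustrative values.
offset_view = make_offset_view_matrix(0.065, 2.0, 1.0)
```

Together, the frustum shear and this viewpoint translation form the usual shear-plus-shift pair of off-axis stereo rendering.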
and step S104, acquiring a projection matrix, an observation matrix and an image space point position.
In this embodiment, the projection matrix ProjMatrix, the viewing matrix ViewMatrix, and the image space point location Postimion are obtained within the GPU in real-time from a unique d3d9.Dll library file and a unique opengl32.Dll library file.
Step S105, generating true stereo imaging of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix and the image space point position.
For step S105, the left-eye display position of the image space point is calculated based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix and the image space point position; the right-eye display position is calculated from the same inputs; and the true stereo imaging of the image space point is generated based on the left-eye display position and the right-eye display position.
In this embodiment, since true stereo display requires that both the left and right eyes perceive the stereo effect, the display positions of both eyes must be calculated, and the stereo imaging of the image space point is generated from these two display positions. Note that the display positions of both eyes must be calculated for every image space point, and the stereo imaging of the entire image is generated from the binocular display positions of all image space points.
Further, the projection offset matrix and the observation offset matrix are negated to generate a negative projection offset matrix and a negative observation offset matrix; the product of the negative projection offset matrix and the projection matrix is calculated to generate a left-eye projection matrix; the product of the negative observation offset matrix and the observation matrix is calculated to generate a left-eye observation matrix; and the product of the left-eye projection matrix, the left-eye observation matrix and the image space point position is calculated to generate the left-eye display position.
In this embodiment, when calculating the left-eye display position, the image space point itself serves as the origin of coordinates and the left eye lies on the negative side of the X axis. The projection offset matrix and the observation offset matrix are therefore negated to generate a negative projection offset matrix -offsetProjMatrix and a negative observation offset matrix -offsetViewMatrix. The negative projection offset matrix is multiplied by the projection matrix ProjMatrix to generate the left-eye projection matrix LProjMatrix_new, i.e., LProjMatrix_new = (-offsetProjMatrix)*ProjMatrix; the negative observation offset matrix is multiplied by the observation matrix ViewMatrix to generate the left-eye observation matrix LViewMatrix_new, i.e., LViewMatrix_new = (-offsetViewMatrix)*ViewMatrix; and the product of the left-eye projection matrix, the left-eye observation matrix and the image space point position Position is calculated to generate the left-eye display position, i.e., LPosition_new = LProjMatrix_new*LViewMatrix_new*Position.
Further, a positive sign is applied to the projection offset matrix and the observation offset matrix to generate a positive projection offset matrix and a positive observation offset matrix; the product of the positive projection offset matrix and the projection matrix is calculated to generate a right-eye projection matrix; the product of the positive observation offset matrix and the observation matrix is calculated to generate a right-eye observation matrix; and the product of the right-eye projection matrix, the right-eye observation matrix and the image space point position is calculated to generate the right-eye display position.
In this embodiment, by the same principle as for the left eye, when calculating the right-eye display position the image space point itself serves as the origin of coordinates and the right eye lies on the positive side of the X axis. The projection offset matrix and the observation offset matrix therefore keep a positive sign, giving a positive projection offset matrix +offsetProjMatrix and a positive observation offset matrix +offsetViewMatrix. The positive projection offset matrix is multiplied by the projection matrix ProjMatrix to generate the right-eye projection matrix RProjMatrix_new, i.e., RProjMatrix_new = (+offsetProjMatrix)*ProjMatrix; the positive observation offset matrix is multiplied by the observation matrix ViewMatrix to generate the right-eye observation matrix RViewMatrix_new, i.e., RViewMatrix_new = (+offsetViewMatrix)*ViewMatrix; and the product of the right-eye projection matrix, the right-eye observation matrix and the image space point position Position is calculated to generate the right-eye display position, i.e., RPosition_new = RProjMatrix_new*RViewMatrix_new*Position.
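The left/right-eye computation of steps S102 to S105 can be sketched end to end as follows. This is a minimal, self-contained illustration, not the patent's GPU code: it assumes the D3D-style row-vector convention implied by the matrix layouts above (offsets in the first column of the third and bottom rows), uses identity placeholders for ProjMatrix/ViewMatrix (the real ones are captured from the rendering stream), and reads "negating"/"positive sign" as flipping the sign of the X-offset entries.

```python
def matmul(a, b):
    """4x4 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def vecmat(v, m):
    """Row vector times 4x4 matrix."""
    return [sum(v[k] * m[k][j] for k in range(4)) for j in range(4)]

def identity():
    return [[float(i == j) for j in range(4)] for i in range(4)]

def offset_matrices(eye_len, distance, sd, sign):
    """Offset matrices for one eye; sign is -1 (left) or +1 (right)."""
    off_proj = identity()
    off_proj[2][0] = sign * eye_len / (2.0 * sd)             # frustum X-shear
    off_view = identity()
    off_view[3][0] = sign * 0.5 * eye_len * (distance / sd)  # X translation
    return off_proj, off_view

def eye_position(pos, proj, view, eye_len, distance, sd, sign):
    off_proj, off_view = offset_matrices(eye_len, distance, sd, sign)
    eye_proj = matmul(off_proj, proj)   # per-eye projection matrix
    eye_view = matmul(off_view, view)   # per-eye observation matrix
    return vecmat(vecmat(pos, eye_view), eye_proj)

# One image space point in homogeneous coordinates (x, y, z, w);
# eyeLen = 0.065, distance = 2.0, sd = 1.0 are illustrative values.
pos = [0.0, 0.0, 1.0, 1.0]
left  = eye_position(pos, identity(), identity(), 0.065, 2.0, 1.0, -1)
right = eye_position(pos, identity(), identity(), 0.065, 2.0, 1.0, +1)
```

Running this for a point on the view axis yields left and right positions displaced symmetrically about X, which is exactly the binocular disparity the true stereo display relies on; in the patent's scheme this computation runs per point inside the GPU.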
Fig. 2 is a block diagram of a 2.5-dimensional image autostereoscopic display apparatus 200 according to an embodiment of the present application.
As shown in fig. 2, the 2.5-dimensional image autostereoscopic display apparatus 200 mainly includes:
a data acquisition module 201, configured to acquire the binocular eye distance, the stereo protrusion ratio and the viewing distance;
a projection calculation module 202, configured to generate a projection offset matrix based on the binocular eye distance and the stereo protrusion ratio;
an observation calculation module 203, configured to generate an observation offset matrix based on the binocular eye distance, the viewing distance and the stereo protrusion ratio;
a position obtaining module 204, configured to obtain a projection matrix, an observation matrix, and an image spatial point position;
and the display generation module 205 is used for generating the true stereo imaging of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix and the image space point position.
As an optional implementation of this embodiment, the projection calculation module 202 is specifically configured to calculate the monocular eye distance and calculate the projection view cone offset based on the monocular eye distance and the stereo protrusion ratio; and to acquire an original identity matrix and generate the projection offset matrix based on the original identity matrix and the projection view cone offset.
As an optional implementation of this embodiment, the observation calculation module 203 is specifically configured to obtain the offset parameter of an image space point; calculate the offset value of the image space point on the horizontal axis based on the offset parameter, the binocular eye distance, the stereo protrusion ratio and the viewing distance; and acquire an original identity matrix and generate the observation offset matrix based on the original identity matrix and the offset value.
As an optional implementation manner of this embodiment, the display generating module 205 includes:
the left eye calculation module is used for calculating the left eye display position of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix and the image space point position;
the right eye calculation module is used for calculating the right eye display position of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix and the image space point position;
and the space generation module is used for generating image space point stereoscopic imaging based on the left eye display position and the right eye display position.
In this optional embodiment, the left-eye calculation module is specifically configured to generate a negative projection offset matrix and a negative observation offset matrix by negating the projection offset matrix and the observation offset matrix; calculating the product of the negative projection offset matrix and the projection matrix to generate a left-eye projection matrix; calculating the product of the negative observation offset matrix and the observation matrix to generate a left-eye observation matrix; and calculating the product of the left eye projection matrix, the left eye observation matrix and the position of the image space point to generate a left eye display position.
In this optional embodiment, the right-eye calculation module is specifically configured to apply a positive sign to the projection offset matrix and the observation offset matrix to generate a positive projection offset matrix and a positive observation offset matrix; calculate the product of the positive projection offset matrix and the projection matrix to generate a right-eye projection matrix; calculate the product of the positive observation offset matrix and the observation matrix to generate a right-eye observation matrix; and calculate the product of the right-eye projection matrix, the right-eye observation matrix and the image space point position to generate the right-eye display position.
In one example, the modules in any of the above apparatus may be one or more integrated circuits configured to implement the above methods, such as: one or more Application Specific Integrated Circuits (ASICs), or one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs), or a combination of at least two of these integrated circuit forms.
For another example, when a module in a device can be implemented in the form of a processing element scheduler, the processing element can be a general-purpose processor, such as a Central Processing Unit (CPU) or other processor capable of calling programs. As another example, these modules may be integrated together, implemented in the form of a system-on-a-chip (SOC).
It can be clearly understood by those skilled in the art that, for convenience and simplicity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Fig. 3 is a block diagram of an electronic device 300 according to an embodiment of the present disclosure.
As shown in FIG. 3, electronic device 300 includes a processor 301 and memory 302, and may further include an information input/information output (I/O) interface 303, one or more of a communications component 304, and a communications bus 305.
The processor 301 is configured to control the overall operation of the electronic device 300 to complete all or part of the steps of the above 2.5-dimensional image autostereoscopic display method. The memory 302 is used to store various types of data to support operation at the electronic device 300; such data can include, for example, instructions for any application or method operating on the electronic device 300 as well as application-related data. The memory 302 may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as one or more of Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
The I/O interface 303 provides an interface between the processor 301 and other interface modules, such as a keyboard, a mouse, or buttons; these buttons may be virtual or physical. The communication component 304 is used for wired or wireless communication between the electronic device 300 and other devices. Wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G or 4G, or a combination of one or more of them, so the corresponding communication component 304 may include a Wi-Fi component, a Bluetooth component and an NFC component.
The electronic device 300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, and is configured to perform the 2.5-dimensional image autostereoscopic display method of the above embodiments.
The communication bus 305 may include a path that carries information between the aforementioned components. The communication bus 305 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The communication bus 305 may be divided into an address bus, a data bus, a control bus, and the like.
The electronic device 300 may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet), a PMP (portable multimedia player), a vehicle-mounted terminal (e.g., a car navigation terminal), etc., and a stationary terminal such as a digital TV, a desktop computer, etc., and may also be a server, etc.
The present application further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the steps of the above 2.5-dimensional image automatic true stereo display method.
The computer-readable storage medium may include: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
The terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. Those skilled in the art will appreciate that the scope of the application is not limited to embodiments formed by the particular combinations of features described above, but also encompasses other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the application, for example, a technical solution formed by replacing the above features with technical features of similar functions disclosed in (but not limited to) the present application.

Claims (4)

1. A method for automatic true stereo display of a 2.5-dimensional image, characterized by comprising the following steps:
obtaining a binocular eye distance, a stereo prominence ratio, and a viewing distance;
generating a projection offset matrix based on the binocular eye distance and the stereo prominence ratio;
generating an observation offset matrix based on the binocular eye distance, the viewing distance, and the stereo prominence ratio;
obtaining a projection matrix, an observation matrix, and an image space point position;
generating true stereo imaging of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix, and the image space point position;
wherein the generating a projection offset matrix based on the binocular eye distance and the stereo prominence ratio comprises:
calculating a monocular eye distance, and calculating a projection view-cone offset based on the monocular eye distance and the stereo prominence ratio;
obtaining an original identity matrix, and generating the projection offset matrix based on the original identity matrix and the projection view-cone offset;
wherein the generating an observation offset matrix based on the binocular eye distance, the viewing distance, and the stereo prominence ratio comprises:
obtaining an offset parameter of the image space point;
calculating an offset value of the image space point on a horizontal axis based on the offset parameter, the binocular eye distance, the stereo prominence ratio, and the viewing distance;
obtaining an original matrix, and generating the observation offset matrix based on the original matrix and the offset value;
wherein the generating true stereo imaging of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix, and the image space point position comprises:
calculating a left-eye display position of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix, and the image space point position;
calculating a right-eye display position of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix, and the image space point position;
generating the true stereo imaging of the image space point based on the left-eye display position and the right-eye display position;
wherein the calculating a left-eye display position of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix, and the image space point position comprises:
taking negatives of the projection offset matrix and the observation offset matrix to generate a negative projection offset matrix and a negative observation offset matrix;
calculating a product of the negative projection offset matrix and the projection matrix to generate a left-eye projection matrix;
calculating a product of the negative observation offset matrix and the observation matrix to generate a left-eye observation matrix;
calculating a product of the left-eye projection matrix, the left-eye observation matrix, and the image space point position to generate the left-eye display position;
wherein the calculating a right-eye display position of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix, and the image space point position comprises:
taking positives of the projection offset matrix and the observation offset matrix to generate a positive projection offset matrix and a positive observation offset matrix;
calculating a product of the positive projection offset matrix and the projection matrix to generate a right-eye projection matrix;
calculating a product of the positive observation offset matrix and the observation matrix to generate a right-eye observation matrix;
and calculating a product of the right-eye projection matrix, the right-eye observation matrix, and the image space point position to generate the right-eye display position.
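The matrix pipeline recited in claim 1 can be sketched in a few lines of Python with NumPy. The claim does not disclose the concrete formulas for the projection view-cone offset or the horizontal offset value, nor where those offsets sit inside the offset matrices, so the forms below (offsets proportional to the monocular eye distance and the stereo prominence ratio, stored as a homogeneous horizontal translation) and the names `offset_matrix`, `eye_display_position`, and `stereo_pair` are illustrative assumptions, not the patented implementation:

```python
import numpy as np

def offset_matrix(t):
    # 4x4 identity carrying a horizontal offset t; negating t yields the
    # "negative" matrix used for the left eye (the placement of the offset
    # inside the matrix is an assumption)
    m = np.eye(4)
    m[0, 3] = t
    return m

def eye_display_position(sign, cone_offset, view_offset,
                         projection, observation, point):
    # sign = -1 gives the negative offset matrices (left eye),
    # sign = +1 the positive ones (right eye)
    eye_projection = offset_matrix(sign * cone_offset) @ projection
    eye_observation = offset_matrix(sign * view_offset) @ observation
    clip = eye_projection @ eye_observation @ point
    return clip / clip[3]                      # perspective divide

def stereo_pair(eye_distance, prominence_ratio, viewing_distance,
                projection, observation, point):
    half_eye = eye_distance / 2.0              # monocular eye distance
    cone_offset = half_eye * prominence_ratio  # assumed form of the view-cone offset
    view_offset = half_eye * prominence_ratio / viewing_distance  # assumed form
    left = eye_display_position(-1.0, cone_offset, view_offset,
                                projection, observation, point)
    right = eye_display_position(+1.0, cone_offset, view_offset,
                                 projection, observation, point)
    return left, right

# a simple symmetric perspective projection and an identity observation matrix
n, f = 0.1, 100.0
projection = np.array([[1.0, 0.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0, 0.0],
                       [0.0, 0.0, -(f + n) / (f - n), -2.0 * f * n / (f - n)],
                       [0.0, 0.0, -1.0, 0.0]])
observation = np.eye(4)
point = np.array([0.0, 0.0, -2.0, 1.0])        # a point two units in front of the viewer
left, right = stereo_pair(0.065, 0.5, 0.6, projection, observation, point)
```

With these assumed offsets, the two display positions come out mirrored about the vertical axis (the left-eye x is negative, the right-eye x positive, at equal depth), which is exactly the horizontal parallax that the final true stereo imaging step fuses into depth.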
2. An apparatus for automatic true stereo display of a 2.5-dimensional image, characterized by comprising:
a data acquisition module, configured to obtain a binocular eye distance, a stereo prominence ratio, and a viewing distance;
a projection calculation module, configured to generate a projection offset matrix based on the binocular eye distance and the stereo prominence ratio;
an observation calculation module, configured to generate an observation offset matrix based on the binocular eye distance, the viewing distance, and the stereo prominence ratio;
a position acquisition module, configured to obtain a projection matrix, an observation matrix, and an image space point position;
a display generation module, configured to generate true stereo imaging of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix, and the image space point position;
wherein the projection calculation module is specifically configured to calculate a monocular eye distance, calculate a projection view-cone offset based on the monocular eye distance and the stereo prominence ratio, obtain an original identity matrix, and generate the projection offset matrix based on the original identity matrix and the projection view-cone offset;
the observation calculation module is specifically configured to obtain an offset parameter of the image space point, calculate an offset value of the image space point on a horizontal axis based on the offset parameter, the binocular eye distance, the stereo prominence ratio, and the viewing distance, obtain an original matrix, and generate the observation offset matrix based on the original matrix and the offset value;
the display generation module comprises:
a left-eye calculation module, configured to calculate a left-eye display position of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix, and the image space point position;
a right-eye calculation module, configured to calculate a right-eye display position of the image space point based on the projection offset matrix, the observation offset matrix, the projection matrix, the observation matrix, and the image space point position;
a space generation module, configured to generate the true stereo imaging of the image space point based on the left-eye display position and the right-eye display position;
wherein the left-eye calculation module is specifically configured to take negatives of the projection offset matrix and the observation offset matrix to generate a negative projection offset matrix and a negative observation offset matrix, calculate a product of the negative projection offset matrix and the projection matrix to generate a left-eye projection matrix, calculate a product of the negative observation offset matrix and the observation matrix to generate a left-eye observation matrix, and calculate a product of the left-eye projection matrix, the left-eye observation matrix, and the image space point position to generate the left-eye display position;
the right-eye calculation module is specifically configured to take positives of the projection offset matrix and the observation offset matrix to generate a positive projection offset matrix and a positive observation offset matrix, calculate a product of the positive projection offset matrix and the projection matrix to generate a right-eye projection matrix, calculate a product of the positive observation offset matrix and the observation matrix to generate a right-eye observation matrix, and calculate a product of the right-eye projection matrix, the right-eye observation matrix, and the image space point position to generate the right-eye display position.
3. An electronic device comprising a processor, the processor coupled with a memory;
the processor is configured to execute the computer program stored in the memory to cause the electronic device to perform the method of claim 1.
4. A computer-readable storage medium comprising a computer program or instructions which, when run on a computer, cause the computer to carry out the method of claim 1.
CN202211063896.9A 2022-08-31 2022-08-31 Method, device, equipment and storage medium for automatic true stereo display of 2.5-dimensional image Active CN115457200B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211063896.9A CN115457200B (en) 2022-08-31 2022-08-31 Method, device, equipment and storage medium for automatic true stereo display of 2.5-dimensional image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211063896.9A CN115457200B (en) 2022-08-31 2022-08-31 Method, device, equipment and storage medium for automatic true stereo display of 2.5-dimensional image

Publications (2)

Publication Number Publication Date
CN115457200A CN115457200A (en) 2022-12-09
CN115457200B true CN115457200B (en) 2023-04-14

Family

ID=84300591

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211063896.9A Active CN115457200B (en) 2022-08-31 2022-08-31 Method, device, equipment and storage medium for automatic true stereo display of 2.5-dimensional image

Country Status (1)

Country Link
CN (1) CN115457200B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101635061A (en) * 2009-09-08 2010-01-27 南京师范大学 Adaptive three-dimensional rendering method based on mechanism of human-eye stereoscopic vision
CN103444190A (en) * 2011-03-14 2013-12-11 高通股份有限公司 Run-time conversion of native monoscopic 3D into stereoscopic 3D
CN109640070A (en) * 2018-12-29 2019-04-16 上海曼恒数字技术股份有限公司 A kind of stereo display method, device, equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11240487B2 (en) * 2016-12-05 2022-02-01 Sung-Yang Wu Method of stereo image display and related device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101635061A (en) * 2009-09-08 2010-01-27 南京师范大学 Adaptive three-dimensional rendering method based on mechanism of human-eye stereoscopic vision
CN103444190A (en) * 2011-03-14 2013-12-11 高通股份有限公司 Run-time conversion of native monoscopic 3D into stereoscopic 3D
CN109640070A (en) * 2018-12-29 2019-04-16 上海曼恒数字技术股份有限公司 A kind of stereo display method, device, equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on a stereo image generation technique based on OpenGL; Cheng Lei; Liu Haiyan; Science & Technology Information, No. 15, p. 449 *

Also Published As

Publication number Publication date
CN115457200A (en) 2022-12-09

Similar Documents

Publication Publication Date Title
CN109961406B (en) Image processing method and device and terminal equipment
CN103444190B (en) Conversion when primary list is as the operation of 3D to three-dimensional 3D
CN107223270B (en) Display data processing method and device
CN113574863A (en) Method and system for rendering 3D image using depth information
US20170186243A1 (en) Video Image Processing Method and Electronic Device Based on the Virtual Reality
CN108833877B (en) Image processing method and device, computer device and readable storage medium
WO2013085513A1 (en) Graphics rendering technique for autostereoscopic three dimensional display
CN109920043B (en) Stereoscopic rendering of virtual 3D objects
CN106598250A (en) VR display method and apparatus, and electronic device
Yang et al. Dynamic 3D scene depth reconstruction via optical flow field rectification
CN107635132B (en) Display control method and device of naked eye 3D display terminal and display terminal
CN114782648A (en) Image processing method, image processing device, electronic equipment and storage medium
CN114531553B (en) Method, device, electronic equipment and storage medium for generating special effect video
CN114742703A (en) Method, device and equipment for generating binocular stereoscopic panoramic image and storage medium
CN115170740A (en) Special effect processing method and device, electronic equipment and storage medium
WO2022000266A1 (en) Method for creating depth map for stereo moving image and electronic device
CN115457200B (en) Method, device, equipment and storage medium for automatic true stereo display of 2.5-dimensional image
CN109816791B (en) Method and apparatus for generating information
CN115131507B (en) Image processing method, image processing device and meta space three-dimensional reconstruction method
US20150215602A1 (en) Method for ajdusting stereo image and image processing device using the same
CN115131471A (en) Animation generation method, device and equipment based on image and storage medium
US20120281067A1 (en) Image processing method, image processing apparatus, and display apparatus
CN113963103A (en) Rendering method of three-dimensional model and related device
CN113592990A (en) Three-dimensional effect generation method, device, equipment and medium for two-dimensional image
CN104270627A (en) Information processing method and first electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant