CN114566132A - Parameter processing method and device, electronic equipment and computer readable storage medium - Google Patents
- Publication number
- CN114566132A (application number CN202210187986.2A)
- Authority
- CN
- China
- Prior art keywords
- angle
- display
- display device
- parameter
- parameters
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/42—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of patterns using a display memory without fixed position correspondence between the display memory contents and the display position on the screen
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/22—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of characters or indicia using display control signals derived from coded signals representing the characters or indicia, e.g. with a character-code memory
- G09G5/32—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of characters or indicia using display control signals derived from coded signals representing the characters or indicia, e.g. with a character-code memory with means for controlling the display position
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
Abstract
The embodiment of the application provides a parameter processing method and device, an electronic device, and a computer-readable storage medium, relating to the field of computer technologies. In the embodiment of the application, a first angle of a first display device and a second angle corresponding to the first display device are obtained; the display parameters of the first display device are corrected according to the first angle and the second angle to obtain correction parameters, and a three-dimensional view is displayed based on the correction parameters. The angle of the first display device can thus be adjusted at any time and, provided the user's viewing position and viewing angle remain unchanged, the first display device can present a good three-dimensional viewing effect to the user at any first angle.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to a parameter processing method and apparatus, an electronic device, and a computer-readable storage medium.
Background
With the continuous development of display technology, naked-eye three-dimensional (3D) display is becoming increasingly mature and provides viewers with a good 3D experience.
However, in the related art, the angle of the display screen of a naked-eye 3D display device generally cannot be adjusted. The reason is that, for a viewer, adjusting the inclination angle of the display screen is only intended to improve viewing comfort; after the adjustment, the viewer's position and line of sight do not change. A display system with a fixed algorithm, however, cannot distinguish whether the display screen's angle was adjusted or the viewer's position changed. As a result, after the inclination angle of the display screen changes, the display system adjusts the angle of the 3D picture, and the user no longer obtains a good 3D picture effect. To avoid the problem that the angle of the entire 3D picture changes after the inclination angle is adjusted, display screens in the related art are therefore generally manufactured with a fixed angle. A fixed-angle display screen, however, cannot meet viewers' usage requirements.
Disclosure of Invention
The purpose of the present application is to solve at least one of the above technical defects, in particular the defect that the angle of a naked-eye 3D display screen cannot be adjusted.
According to an aspect of the present application, there is provided a parameter processing method, including:
acquiring a first angle of first display equipment and a second angle corresponding to the first display equipment;
wherein the first angle comprises a tilt angle of the first display device; the second angle comprises a preset reference angle corresponding to the first display device;
according to the first angle and the second angle, correcting the display parameters of the first display device to obtain correction parameters, and displaying a three-dimensional view based on the correction parameters; wherein the display parameters include a position parameter of a reference with respect to the first display device.
Optionally, the obtaining the first angle of the first display device includes:
acquiring angle parameters measured by a preset angle sensor, and determining the first angle according to the angle parameters; the preset angle sensor is arranged at a preset position of the first display device.
Optionally, the method further includes:
acquiring first position information of a reference object in a coordinate system of a plane where the first display device is located;
and determining the display parameters according to the first position information.
Optionally, the correcting the display parameter of the first display device according to the first angle and the second angle to obtain a corrected parameter includes:
determining an offset angle between the first angle and the second angle;
and correcting the display parameters according to the offset angle to obtain the correction parameters.
Optionally, the display parameter includes first position information of the reference object in a coordinate system of a plane where the first display device is located,
the correcting the display parameter according to the offset angle to obtain the correction parameter includes:
determining the correction parameter according to the offset angle, the display parameter and a first data relation;
the first data relationship includes:
wherein (x1, y1, z1) represents first position information of the reference object in a coordinate system of a plane in which the first display device is located,
(x2, y2, z2) represents the correction parameter, θ1Representing the component of said offset angle in the direction of the X-axis, theta2Representing the component of said offset angle in the direction of the Y-axis, theta3A component of the offset angle in the Z-axis direction.
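As an illustrative sketch of this correction (the patent describes the relationship only through (x1, y1, z1), (x2, y2, z2) and the offset-angle components, so the composition order of the three axis rotations below is an assumption), the corrected position can be computed with standard rotation matrices:

```python
import math

def rotation_x(t):
    # Standard rotation matrix about the X axis by angle t (radians).
    c, s = math.cos(t), math.sin(t)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rotation_y(t):
    # Standard rotation matrix about the Y axis.
    c, s = math.cos(t), math.sin(t)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rotation_z(t):
    # Standard rotation matrix about the Z axis.
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def mat_vec(m, v):
    # 3x3 matrix times 3-vector.
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def correct_position(p1, theta1, theta2, theta3):
    """Rotate the eye position p1 = (x1, y1, z1) by the offset-angle
    components (degrees) about X, Y and Z to obtain (x2, y2, z2)."""
    v = mat_vec(rotation_x(math.radians(theta1)), list(p1))
    v = mat_vec(rotation_y(math.radians(theta2)), v)
    v = mat_vec(rotation_z(math.radians(theta3)), v)
    return v
```

For a screen tilted only about its horizontal axis, θ2 and θ3 are zero and the correction reduces to a single rotation about the X axis.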
Optionally, the obtaining first position information of the reference object in a coordinate system of a plane where the first display device is located includes:
acquiring a user image through preset image acquisition equipment;
and determining the first position information according to the user image.
Optionally, the method further includes:
determining a target position of a virtual camera device based on the correction parameters so that the virtual camera device acquires the three-dimensional view based on the target position;
and displaying the three-dimensional view.
According to another aspect of the present application, there is provided a parameter processing apparatus, the apparatus including:
the device comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring a first angle of first display equipment and a second angle corresponding to the first display equipment;
wherein the first angle comprises a tilt angle of the first display device; the second angle comprises a preset reference angle corresponding to the first display device;
the correction module is used for correcting the display parameters of the first display device according to the first angle and the second angle to obtain correction parameters so as to display a three-dimensional view based on the correction parameters; wherein the display parameters include a position parameter of a reference with respect to the first display device.
According to another aspect of the present application, there is provided an electronic device including:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors to perform the parameter processing method of any of the first aspects of the present application.
For example, in a third aspect of the present application, there is provided a computing device including a processor, a memory, a communication interface and a communication bus, where the processor, the memory and the communication interface communicate with one another through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction enables the processor to execute the corresponding operation of the parameter processing method as shown in the first aspect of the application.
According to yet another aspect of the present application, there is provided a computer-readable storage medium storing a computer program which, when executed by a processor, implements the parameter processing method of any one of the first aspects of the present application.
For example, in a fourth aspect of the embodiments of the present application, a computer-readable storage medium is provided, on which a computer program is stored, and the computer program, when executed by a processor, implements the parameter processing method shown in the first aspect of the present application.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the method provided in the various alternative implementations of the first aspect described above.
The technical solutions provided by the present application bring the following beneficial effects:
In the embodiment of the application, a first angle of a first display device and a second angle corresponding to the first display device are obtained; the display parameters of the first display device are corrected according to the first angle and the second angle to obtain correction parameters, and a three-dimensional view is displayed based on the correction parameters. The angle of the first display device can thus be adjusted at any time and, provided the user's viewing position and viewing angle remain unchanged, the first display device can present a good three-dimensional viewing effect to the user at any first angle.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments of the present application will be briefly described below.
Fig. 1 is a system architecture diagram of a parameter processing method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a parameter processing method according to an embodiment of the present application;
fig. 3 is a schematic view of an application scenario of a parameter processing method according to an embodiment of the present application;
fig. 4 is a schematic view of an application scenario of a parameter processing method according to an embodiment of the present application;
fig. 5 is a schematic view of an application scenario of a parameter processing method according to an embodiment of the present application;
fig. 6 is a schematic view of an application scenario of a parameter processing method according to an embodiment of the present application;
fig. 7 is a system framework diagram of a parameter processing method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a parameter processing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device for parameter processing according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below in conjunction with the drawings in the present application. It should be understood that the embodiments set forth below in connection with the drawings are exemplary descriptions for explaining technical solutions of the embodiments of the present application, and do not limit the technical solutions of the embodiments of the present application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be further understood that the terms "comprises" and/or "comprising", when used in this specification in connection with embodiments of the present application, specify the presence of stated features, information, data, steps, operations, elements and/or components, but do not preclude the presence or addition of other features, information, data, steps, operations, elements, components and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. The term "and/or" as used herein indicates at least one of the items it joins; for example, "A and/or B" may be implemented as "A", as "B", or as "A and B".
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The terms referred to in this application will first be introduced and explained:
three-dimensional stereoscopic display, which is 3D display, is a display technique for reproducing depth information such as the distance of an object by generating parallax between the left and right eyes of a human by using a series of optical methods.
Naked eye 3D is a generic term for a technology that realizes a stereoscopic effect without using external tools such as polarized glasses.
The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
The system architecture of the embodiment of the present application is described below with reference to fig. 1, which is a schematic diagram of a system architecture provided in an embodiment of the present application. As shown in fig. 1, the system may include a server 10a and a user terminal cluster, and the user terminal cluster may include: user terminal 10b, user terminal 10c, …, and user terminal 10d. Communication connections may exist within the user terminal cluster; for example, there may be a communication connection between user terminal 10b and user terminal 10c, and between user terminal 10b and user terminal 10d. Meanwhile, any user terminal in the cluster may have a communication connection with the server 10a; for example, communication connections exist between user terminal 10b and the server 10a and between user terminal 10c and the server 10a. The communication connection is not limited to a particular connection manner: it may be a direct or indirect wired connection, a direct or indirect wireless connection, or another manner, which is not limited herein.
The server 10a provides services for the user terminal cluster through these communication connections. The server 10a may be a background server for parameter processing, and user terminal 10b, user terminal 10c, …, and user terminal 10d may all be connected to the server through the communication connections. The network over which the above communication takes place may be a wide area network, a local area network, or a combination of the two.
In the embodiment of the present application, the parameter processing method may be implemented by the server 10a, or may be implemented by the terminal device.
It is understood that the method provided by the embodiments of the present application can be executed by a computer device, including but not limited to a terminal (also including the user terminal described above) or a server (also including the server 10a described above). The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a network service, cloud communication, middleware service, a domain name service, a security service, a CDN, a big data and artificial intelligence platform, and the like. The terminal may be, but is not limited to, a smart phone, a tablet computer, a laptop computer, a desktop computer, a smart speaker, a smart watch, and the like. The terminal and the server may be directly or indirectly connected through wired or wireless communication, and the application is not limited herein.
The server 10a, the user terminal 10b, the user terminal 10c and the user terminal 10d in fig. 1 may be devices such as a mobile phone, a tablet computer, a notebook computer or a palm computer.
The embodiment of the application provides a parameter processing method. The execution body of the method may be any of various terminal devices or server devices with parameter processing capability, or an apparatus or chip integrated on such devices. As shown in fig. 2, which is a schematic flow chart of a parameter processing method provided in an embodiment of the present application, the method includes the following steps:
s201: the method comprises the steps of obtaining a first angle of first display equipment and a second angle corresponding to the first display equipment.
Wherein the first angle comprises a tilt angle of the first display device; the second angle comprises a preset reference angle corresponding to the first display device.
Optionally, the embodiment of the present application may be applied to the field of three-dimensional (3-dimensional, 3D) stereoscopic view display, for example, the present application may be specifically applied to a naked-eye 3D display scene.
Wherein the first display device may comprise a three-dimensional view display device. Alternatively, the first display device may be a display screen displaying a three-dimensional view. For example, in a naked eye 3D display scene, the first display device may be a naked eye 3D display screen.
In the embodiment of the application, the angle between the first display device and the reference surface can be adjusted. For example, the first display device may be disposed at an angle of 90 degrees with respect to the horizontal plane, i.e., the first display device is disposed perpendicular to the horizontal plane; alternatively, the first display device may be arranged at an angle of 120 degrees to the horizontal, i.e. the first display device is arranged inclined to the horizontal, etc.
In the embodiment of the application, the first angle is an angle between the first display device and the reference plane when the three-dimensional view is displayed. Wherein, the reference plane may include a horizontal plane, a vertical plane, and the like; alternatively, in the embodiment of the present application, for convenience of description, the reference plane may be a horizontal plane. It will be appreciated that the first angle is variable with adjustment of the first display device.
In order to obtain a first angle between a first display device and a reference plane (horizontal plane) in real time, a preset angle sensor may be provided at a predetermined position of the first display device, and the first angle may be determined by an angle parameter measured by the preset angle sensor. Alternatively, the preset angle sensor may include an angle sensor such as a gyroscope or a rotary slide rheostat.
In addition, the second angle is a preset reference angle corresponding to the first display device. Specifically, when the first display device is disposed at the second angle, the position coordinates of the reference object in the coordinate system of the plane in which the first display device lies coincide with its position coordinates in the coordinate system of the vertical plane. That is, when the first display device is set at the second angle, the three-dimensional view seen by the user is optimal. In this embodiment of the application, the reference object may be the eyes of the user viewing the three-dimensional view (hereinafter referred to simply as the human eye).
S202: according to the first angle and the second angle, correcting the display parameters of the first display device to obtain correction parameters, and displaying a three-dimensional view based on the correction parameters; wherein the display parameters include a position parameter of a reference with respect to the first display device.
In particular, the display parameters may include related parameters for displaying the three-dimensional view. Optionally, in this embodiment of the application, the display parameter may be a position parameter of the reference object with respect to the first display device. Wherein, the reference object can be human eyes. Alternatively, the position parameter of the human eye relative to the first display device (hereinafter, may be referred to as a first position parameter) may be a coordinate of the human eye in a coordinate system of a plane in which the first display device is located.
Further, the correction parameters include a position parameter of the virtual camera device when the first display device is set at the first angle. The virtual camera device is used to acquire the three-dimensional view.
In a naked eye 3D display scene, the position of the virtual camera equipment can be determined according to the position of human eyes. When the first display device is arranged at the second angle, the position coordinates of the human eyes in the real space are the position coordinates of the virtual camera shooting device in the virtual space. In this way, when the virtual camera device acquires the three-dimensional view at the position, a better three-dimensional view effect can be presented to the user.
In the embodiment of the present application, after the first angle is obtained, the display parameter of the first display device may be corrected according to the first angle and the second angle to obtain a correction parameter, so as to display the three-dimensional view based on the correction parameter. Therefore, under the condition that the watching position of the user is not changed, the first display device can present a better watching effect to the user at the first angle.
Specifically, the display parameter may be corrected according to an offset angle between a first angle and a second angle, so as to obtain the correction parameter, and a three-dimensional view may be displayed based on the correction parameter.
For example, in an actual scenario, the preset reference angle (i.e., the second angle) corresponding to the first display device is 90 degrees, and when the user actually views the three-dimensional view, the inclination angle (i.e., the first angle) of the first display device is 120 degrees. In this case, the coordinates of the human eye in the coordinate system of the plane in which the first display device lies (i.e., the display parameters) may be corrected according to the offset angle between the first angle and the second angle, i.e., 120 − 90 = 30 degrees, to obtain corrected coordinates (i.e., the correction parameters); the virtual camera device may then be set at the coordinate position corresponding to the correction parameters to acquire the three-dimensional view and display it to the user.
It should be noted that, because the first angle of the first display device can be adjusted at any time, in the embodiment of the present application, the obtaining of the first angle in the above step, and the correction processing performed on the display parameter of the first display device according to the first angle and the second angle can be performed in real time, so as to present a better viewing effect to the user.
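The real-time pipeline described above — poll the tilt sensor, compute the offset angle against the reference, correct the eye position, and reposition the virtual camera — might be sketched as follows. The function names `read_tilt_angle`, `read_eye_position` and `render_view` are hypothetical placeholders, and the single-axis rotation assumes the screen tilts only about its horizontal (X) axis:

```python
import math
import time

REFERENCE_ANGLE = 90.0  # the preset second angle, in degrees (assumed)

def correct_eye_position(eye, offset_deg):
    # Rotate the tracked eye position (x, y, z) about the screen's X axis
    # by the offset between the measured tilt and the reference angle.
    t = math.radians(offset_deg)
    x, y, z = eye
    return (x,
            y * math.cos(t) - z * math.sin(t),
            y * math.sin(t) + z * math.cos(t))

def run_loop(read_tilt_angle, read_eye_position, render_view, period=1 / 60):
    # Hypothetical real-time loop: both sensors are polled every frame, so
    # the correction tracks any adjustment of the screen immediately.
    while True:
        offset = read_tilt_angle() - REFERENCE_ANGLE
        camera_pos = correct_eye_position(read_eye_position(), offset)
        render_view(camera_pos)  # place the virtual camera and draw
        time.sleep(period)
```

Because the loop recomputes the offset on every frame, the user can tilt the screen at any moment and the virtual camera is repositioned without any explicit mode switch.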
In the embodiment of the application, a first angle of a first display device and a second angle corresponding to the first display device are obtained; the display parameters of the first display device are corrected according to the first angle and the second angle to obtain correction parameters, and a three-dimensional view is displayed based on the correction parameters. The angle of the first display device can thus be adjusted at any time and, provided the user's viewing position and viewing angle remain unchanged, the first display device can present a good three-dimensional viewing effect to the user at any first angle.
In an embodiment of the application, the obtaining the first angle of the first display device includes:
acquiring an angle parameter measured by a preset angle sensor, and determining the first angle according to the angle parameter; the preset angle sensor is arranged at a preset position of the first display device.
In the embodiment of the application, the inclination angle (i.e. the first angle) of the first display device can be freely adjusted, so that the user can obtain a comfortable viewing experience by adjusting the first angle of the first display device.
Specifically, in the embodiment of the present application, the first angle may be determined from an angle parameter measured by a preset angle sensor, where the preset angle sensor may be disposed at a predetermined position of the first display device. Optionally, the preset angle sensor may include an angle sensor such as a gyroscope or a rotary slide rheostat.
By way of example, embodiments of the present application are described with reference to fig. 3:
the naked-eye 3D device shown in fig. 3 includes a base, a display screen (i.e., the first display device of the embodiment of the present application), and a preset angle sensor.
The base is used for supporting the display screen; the display screen can rotate around the base, so that the angle between the display screen and the horizontal plane can be freely adjusted.
The display screen comprises a display module, a lens grating or a barrier grating and other components; the display screen is used to present a three-dimensional view.
The preset angle sensor is used for acquiring the angle between the display screen and the horizontal plane. The preset angle sensor may include a gyroscope, a rotary slide rheostat, or the like.
The preset angle sensor can be arranged in two ways. In one case, the preset angle sensor is a gyroscope, which may be disposed inside the display screen; the angle between the display screen and the horizontal plane can then be obtained in real time from the gyroscope. In the other case, the preset angle sensor is a rotary slide rheostat. As shown in fig. 3, the rotary slide rheostat can be arranged at the joint between the display screen and the base; it is understood that the rotary slide rheostat is movably connected with the base. When the display screen rotates relative to the base, the change in the resistance value (or current value) of the rotary slide rheostat is converted into a change in angle, so that the first angle between the display screen and the horizontal plane can be measured.
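For the rheostat case, converting the resistance reading into a tilt angle might look like the following sketch. The linear resistance-to-rotation mapping and the endpoint values are illustrative assumptions, not figures from the patent:

```python
def resistance_to_angle(resistance_ohm,
                        r_min=0.0, r_max=10_000.0,
                        angle_min=0.0, angle_max=180.0):
    """Map a rotary slide rheostat reading (ohms) to a tilt angle (degrees).

    Assumes the rheostat's resistance varies linearly with rotation over
    its travel; r_min/r_max and the angle range are example values only.
    """
    if not r_min <= resistance_ohm <= r_max:
        raise ValueError("reading outside the rheostat's travel")
    fraction = (resistance_ohm - r_min) / (r_max - r_min)
    return angle_min + fraction * (angle_max - angle_min)
```

In practice the endpoints would be calibrated once against known screen positions, after which the display system can poll the resistance and derive the first angle continuously.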
In one embodiment of the present application, the method further comprises:
acquiring first position information of a reference object in a coordinate system of a plane where the first display device is located;
and determining the display parameters according to the first position information.
Specifically, the display parameter may be a position parameter of the reference object with respect to the first display device. Wherein, the reference object can be human eyes. Optionally, the position parameter of the human eye relative to the first display device may be a coordinate of the human eye in a coordinate system of a plane in which the first display device is located.
As an example, the coordinate system of the plane in which the first display device is located (hereinafter referred to as the eye tracking coordinate system) is described with reference to fig. 4:
As shown in fig. 4, which is a schematic diagram of the eye tracking coordinate system, the coordinate system takes the center of the display screen (i.e., the first display device) as the origin and the display screen plane as the XY plane; the positive X-axis direction points horizontally to the right of the display screen, the positive Y-axis direction points upward along the display screen plane, and the positive Z-axis direction points directly out of the front of the display screen.
In the embodiment of the present application, the display parameter is the coordinates (i.e., the first position information) of the human eye in the eye tracking coordinate system.
In an embodiment of the application, the acquiring first position information of the reference object in a coordinate system of a plane where the first display device is located includes:
acquiring a user image through preset image acquisition equipment;
and determining the first position information according to the user image.
In the embodiment of the application, the reference object may be a human eye, and when first position information of the human eye in a coordinate system of a plane where the first display device is located is determined, a user image may be acquired through a preset image acquisition device; and determining the first position information according to the user image.
The preset image acquisition device may be an eye tracking device comprising a camera and an associated chip. The camera may be a camera with depth recognition capability, such as a Time of Flight (TOF) camera. The camera is fixed relative to the display screen (i.e., the first display device), so that when the display screen rotates, the eye tracking device rotates about the same rotation center.
Specifically, the user image can be acquired through the eye tracking camera, and the user's eye coordinates are obtained by recognizing and analyzing the user image.
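The patent does not disclose the recognition algorithm itself. As a hedged sketch, a detected eye pixel plus a TOF depth reading could be back-projected with the standard pinhole camera model; the intrinsics fx, fy, cx, cy and the function name are illustrative assumptions, not values from the patent.

```python
def eye_position_from_pixel(u: float, v: float, depth_mm: float,
                            fx: float = 600.0, fy: float = 600.0,
                            cx: float = 320.0, cy: float = 240.0):
    """Back-project a detected eye pixel (u, v) with a TOF depth reading
    into camera-space coordinates via the pinhole camera model.

    fx, fy, cx, cy are illustrative intrinsics; a real system would use
    the calibrated values of its eye-tracking camera. Because the camera
    is fixed relative to the display screen, a constant rigid transform
    (omitted here) would map camera space into the eye tracking
    coordinate system of fig. 4.
    """
    x = (u - cx) * depth_mm / fx
    y = (v - cy) * depth_mm / fy
    z = depth_mm
    return (x, y, z)

# An eye imaged at the principal point lies on the camera's optical axis.
print(eye_position_from_pixel(320.0, 240.0, 500.0))  # → (0.0, 0.0, 500.0)
```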
In an embodiment of the application, the correcting the display parameter of the first display device according to the first angle and the second angle to obtain a corrected parameter includes:
determining an offset angle between the first angle and the second angle;
and correcting the display parameters according to the offset angle to obtain the correction parameters.
In particular, the offset angle is used to characterize the angular difference between the first angle and the second angle. According to the embodiment of the application, the display parameters can be corrected according to the angle difference to obtain the correction parameters. The correction parameter is a position parameter of the virtual camera device when the first display device is at the first angle.
It can be understood that the display parameters are corrected based on the offset angle to obtain the position parameters of the virtual camera device when the first display device is at the first angle, so that the virtual camera device acquires the three-dimensional view based on the position parameters, and thus, a better three-dimensional view playing effect can be presented to a user.
In one embodiment of the application, the display parameters comprise first position information of the reference object in a coordinate system of a plane in which the first display device is located,
the correcting the display parameter according to the offset angle to obtain the correction parameter includes:
determining the correction parameter according to the offset angle, the display parameter and a first data relation;
the first data relationship includes:
wherein (x1, y1, z1) represents first position information of the reference object in a coordinate system of a plane in which the first display device is located,
(x2, y2, z2) represents the correction parameter, θ1Representing the component of said offset angle in the direction of the X-axis, theta2Representing the component of said offset angle in the direction of the Y-axis, theta3A component of the offset angle in the Z-axis direction.
Specifically, in the embodiment of the present application, the reference object is the human eye. By way of example, with reference to fig. 5 and 6, the preset reference angle (second angle) is taken as 90 degrees from the horizontal plane, and only an offset in the X-axis direction exists between the first angle and the second angle, that is, θ2 = 0 and θ3 = 0:
as shown in fig. 5, fig. 5 is a schematic view illustrating a coordinate system of a plane (hereinafter referred to as a real space coordinate system) where a predetermined reference angle is located. The real space coordinate system takes the center of the display screen (namely, the first display device) as an original point, when a viewer looks at the display screen, the right of the viewer is the positive direction of an X axis, the negative direction of gravity is the positive direction of a Y axis, and the direction from the display screen to the viewer is the positive direction of a Z axis.
As shown in fig. 6, in the example of the embodiment of the present application, the first display device is tilted at an angle ψ from the horizontal plane, that is, the first angle of the first display device is ψ. The first display device can be understood as being reclined relative to the second angle, which corresponds to rotating the display device around the X-axis of the real-space coordinate system by ψ - 90°; that is, the offset angle between the first angle and the second angle is θ1 = ψ - 90° (with θ2 = 0 and θ3 = 0). In other words, the first angle corresponds to a rotation by the angle θ1 about the X-axis relative to the second angle.
Since the first angle corresponds to a rotation by the angle θ1 about the X-axis relative to the second angle, correcting the display parameters means converting the coordinates of the human eye in the coordinate system of the plane in which the first display device is located into the coordinates of the human eye in the real-space coordinate system.
Because the rotation from the second angle to the first angle is a linear transformation, the coordinates of the human eye in the real-space coordinate system when the first display device is at the first angle can be determined by a rotation-matrix transformation. In the embodiment of the present application, these coordinates of the human eye in the real-space coordinate system are the correction parameters.
Wherein, the correction parameters are specifically:

x2 = x1
y2 = y1 · cos θ1 - z1 · sin θ1
z2 = y1 · sin θ1 + z1 · cos θ1

since θ2 = 0 and θ3 = 0 in this example. Here (x1, y1, z1) represents the first position information of the human eye in the coordinate system of the plane in which the first display device is located, (x2, y2, z2) represents the correction parameter, and θ1, θ2 and θ3 represent the components of the offset angle in the X-axis, Y-axis and Z-axis directions respectively.
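The X-axis-only correction of this example can be written directly from the relations above; this is a minimal sketch and the function name is illustrative.

```python
import math

def correct_display_parameter(x1, y1, z1, theta1_deg):
    """Rotate the eye coordinates about the X axis by theta1, converting
    them from the coordinate system of the screen plane into the
    real-space coordinate system (the theta2 = theta3 = 0 case)."""
    t = math.radians(theta1_deg)
    x2 = x1
    y2 = y1 * math.cos(t) - z1 * math.sin(t)
    z2 = y1 * math.sin(t) + z1 * math.cos(t)
    return (x2, y2, z2)

# A screen reclined to psi = 120 degrees gives theta1 = psi - 90 = 30 degrees.
x2, y2, z2 = correct_display_parameter(10.0, 0.0, 600.0, 30.0)
print(round(x2, 1), round(y2, 1), round(z2, 1))  # → 10.0 -300.0 519.6
```

Generalising to non-zero θ2 and θ3 would compose the corresponding Y-axis and Z-axis rotation matrices in the same way.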
In one embodiment of the present application, the method further comprises:
determining a target position of the virtual camera device based on the correction parameters so that the virtual camera device acquires the three-dimensional view based on the target position;
displaying the three-dimensional view.
Specifically, after the correction parameter is determined, the virtual camera device may be set at a target position corresponding to the correction parameter, so that the virtual camera device acquires the three-dimensional view based on the target position, and then displays the three-dimensional view.
The following describes an overall architecture of a naked eye 3D display system according to an embodiment of the present application with reference to fig. 7:
the naked eye 3D display system comprises display equipment, human eye tracking equipment and a host control center.
The display device may include a base, a display screen (i.e., the first display device in the embodiment of the present application), and a preset angle sensor. The base supports the display screen, and the display screen can rotate around the base, allowing free adjustment of the included angle between the screen and the desktop. The preset angle sensor measures the tilt angle of the display screen (i.e., the first angle in the embodiment of the present application); the preset angle sensor may be a gyroscope or a rotary slide rheostat, etc.
The eye tracking device comprises a camera and a relevant chip thereof and is used for acquiring images of a user in real time.
The host control center comprises 3D software and an eye tracking algorithm.
The human eye tracking algorithm is used for identifying and analyzing the image acquired by the human eye tracking camera and acquiring the human eye coordinates of the user in real time; and correcting the coordinates of the human eyes to obtain the correction parameters of the embodiment of the application.
The 3D software is used to synthesize 3D images and transmit the 3D images to a display device, so that a user can see a good 3D picture.
In the embodiment of the application, a first angle of a first display device and a second angle corresponding to the first display device are obtained; the display parameters of the first display device are corrected according to the first angle and the second angle to obtain correction parameters, and a three-dimensional view is displayed based on the correction parameters. In this way the angle of the first display device can be adjusted at any time, and after adjustment, with the viewing position and viewing angle of the user unchanged, the first display device presents a good three-dimensional viewing effect to the user at any first angle.
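Putting the steps together, one frame of the pipeline described above could be sketched as follows. All function names are hypothetical stand-ins for the sensor read, eye detection and rendering steps; the second angle is taken as the 90-degree reference of the worked example.

```python
import math

def render_frame(read_tilt_deg, detect_eye, set_virtual_camera,
                 reference_deg=90.0):
    """One frame of the correction pipeline: measure the first angle,
    take the eye position in the screen-plane coordinate system, rotate
    it by the offset angle about the X axis, and place the virtual
    camera at the corrected position."""
    theta1 = math.radians(read_tilt_deg() - reference_deg)  # offset angle
    x1, y1, z1 = detect_eye()
    corrected = (x1,
                 y1 * math.cos(theta1) - z1 * math.sin(theta1),
                 y1 * math.sin(theta1) + z1 * math.cos(theta1))
    set_virtual_camera(corrected)
    return corrected

# With the screen already at the reference angle, the offset is zero and
# the eye position passes through unchanged.
out = render_frame(lambda: 90.0, lambda: (0.0, 10.0, 500.0), lambda p: None)
print(out)  # → (0.0, 10.0, 500.0)
```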
An embodiment of the present application provides a parameter processing apparatus, as shown in fig. 8, the parameter processing apparatus 80 may include: an acquisition module 801, a correction module 802, wherein,
an obtaining module 801, configured to obtain a first angle of a first display device and a second angle corresponding to the first display device;
wherein the first angle comprises a tilt angle of the first display device; the second angle comprises a preset reference angle corresponding to the first display device;
a correction module 802, configured to perform correction processing on the display parameter of the first display device according to the first angle and the second angle to obtain a correction parameter, so as to display a three-dimensional view based on the correction parameter; wherein the display parameters include a position parameter of a reference object with respect to the first display device.
In an embodiment of the present application, the obtaining module is specifically configured to:
acquiring an angle parameter measured by a preset angle sensor, and determining the first angle according to the angle parameter; the preset angle sensor is arranged at a preset position of the first display device.
In one embodiment of the present application, the apparatus further comprises:
the determining module is used for acquiring first position information of a reference object in a coordinate system of a plane where the first display device is located;
and determining the display parameters according to the first position information.
In an embodiment of the present application, the correction module is specifically configured to:
determining an offset angle between the first angle and the second angle;
and correcting the display parameters according to the offset angle to obtain the correction parameters.
In one embodiment of the application, the display parameters comprise first position information of the reference object in a coordinate system of a plane in which the first display device is located,
the correction module is specifically configured to: determining the correction parameter according to the offset angle, the display parameter and a first data relation;
the first data relationship includes:

(x2, y2, z2) = R(θ1, θ2, θ3) · (x1, y1, z1)

wherein R(θ1, θ2, θ3) is the rotation matrix composed of the rotations by θ1, θ2 and θ3 about the X, Y and Z axes respectively; (x1, y1, z1) represents the first position information of the reference object in the coordinate system of the plane in which the first display device is located; (x2, y2, z2) represents the correction parameter; θ1 represents the component of the offset angle in the X-axis direction, θ2 the component of the offset angle in the Y-axis direction, and θ3 the component of the offset angle in the Z-axis direction.
In an embodiment of the present application, the determining module is specifically configured to:
acquiring a user image through preset image acquisition equipment;
and determining the first position information according to the user image.
In one embodiment of the present application, the apparatus further comprises:
a display module, configured to determine a target position of a virtual camera device based on the correction parameter, so that the virtual camera device acquires the three-dimensional view based on the target position;
displaying the three-dimensional view.
The apparatus of the embodiment of the present application may execute the method provided by the embodiment of the present application, and its implementation principle is similar; the actions executed by the modules in the apparatus correspond to the steps in the method, and for a detailed functional description of the modules reference may be made to the description of the corresponding method shown above, which is not repeated here.
In the embodiment of the application, a first angle of a first display device and a second angle corresponding to the first display device are obtained; the display parameters of the first display device are corrected according to the first angle and the second angle to obtain correction parameters, and a three-dimensional view is displayed based on the correction parameters. In this way the angle of the first display device can be adjusted at any time, and after adjustment, with the viewing position and viewing angle of the user unchanged, the first display device presents a good three-dimensional viewing effect to the user at any first angle.
An embodiment of the present application provides an electronic device, including a memory and a processor; at least one program is stored in the memory and, when executed by the processor, implements the following: obtaining a first angle of a first display device and a second angle corresponding to the first display device; correcting the display parameters of the first display device according to the first angle and the second angle to obtain correction parameters, and displaying a three-dimensional view based on the correction parameters. In this way the angle of the first display device can be adjusted at any time, and after adjustment, with the viewing position and viewing angle of the user unchanged, the first display device presents a good three-dimensional viewing effect to the user at any first angle.
In an alternative embodiment, an electronic device is provided. As shown in fig. 9, the electronic device 4000 comprises a processor 4001 and a memory 4003, coupled, for example, via a bus 4002. Optionally, the electronic device 4000 may further include a transceiver 4004, which may be used for data interaction between this electronic device and other electronic devices, such as transmitting and/or receiving data. In practical applications the number of transceivers 4004 is not limited to one, and the structure of the electronic device 4000 does not constitute a limitation on the embodiments of the present application.
The Processor 4001 may be a CPU (Central Processing Unit), a general-purpose Processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or other Programmable logic device, transistor logic device, hardware component, or any combination thereof. Which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. The processor 4001 may also be a combination that performs a computing function, e.g., comprising one or more microprocessors, a combination of DSPs and microprocessors, etc.
The Memory 4003 may be a ROM (Read Only Memory) or other types of static storage devices that can store static information and instructions, a RAM (Random Access Memory) or other types of dynamic storage devices that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read Only Memory), a CD-ROM (Compact Disc Read Only Memory) or other optical Disc storage, optical Disc storage (including Compact Disc, laser Disc, optical Disc, digital versatile Disc, blu-ray Disc, etc.), a magnetic Disc storage medium or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to these.
The memory 4003 is used for storing application program codes (computer programs) for executing the present scheme, and execution is controlled by the processor 4001. Processor 4001 is configured to execute application code stored in memory 4003 to implement what is shown in the foregoing method embodiments.
Among them, electronic devices include but are not limited to: mobile phones, notebook computers, multimedia players, desktop computers, and the like.
The present application provides a computer-readable storage medium, on which a computer program is stored, which, when running on a computer, enables the computer to execute the corresponding content in the foregoing method embodiments.
In the embodiment of the application, a first angle of a first display device and a second angle corresponding to the first display device are obtained; the display parameters of the first display device are corrected according to the first angle and the second angle to obtain correction parameters, and a three-dimensional view is displayed based on the correction parameters. In this way the angle of the first display device can be adjusted at any time, and after adjustment, with the viewing position and viewing angle of the user unchanged, the first display device presents a good three-dimensional viewing effect to the user at any first angle.
The terms "first," "second," "third," "fourth," "1," "2," and the like in the description and in the claims of the present application and in the above-described drawings (if any) are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used are interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in other sequences than illustrated or otherwise described herein.
It should be understood that, although each operation step is indicated by an arrow in the flowchart of the embodiment of the present application, the implementation order of the steps is not limited to the order indicated by the arrow. In some implementation scenarios of the embodiments of the present application, the implementation steps in the flowcharts may be performed in other sequences as desired, unless explicitly stated otherwise herein. In addition, some or all of the steps in each flowchart may include multiple sub-steps or multiple stages based on an actual implementation scenario. Some or all of these sub-steps or stages may be performed at the same time, or each of these sub-steps or stages may be performed at different times, respectively. In a scenario where execution times are different, an execution sequence of the sub-steps or the phases may be flexibly configured according to requirements, which is not limited in the embodiment of the present application.
The foregoing is only an optional implementation manner of a part of implementation scenarios in this application, and it should be noted that, for those skilled in the art, other similar implementation means based on the technical idea of this application are also within the protection scope of the embodiments of this application without departing from the technical idea of this application.
Claims (10)
1. A method for processing parameters, comprising:
acquiring a first angle of first display equipment and a second angle corresponding to the first display equipment;
wherein the first angle comprises a tilt angle of the first display device; the second angle comprises a preset reference angle corresponding to the first display device;
according to the first angle and the second angle, correcting the display parameters of the first display device to obtain correction parameters, and displaying a three-dimensional view based on the correction parameters; wherein the display parameters include a position parameter of a reference object with respect to the first display device.
2. The parameter processing method of claim 1, wherein the obtaining the first angle of the first display device comprises:
acquiring an angle parameter measured by a preset angle sensor, and determining the first angle according to the angle parameter; the preset angle sensor is arranged at a preset position of the first display device.
3. The parameter processing method according to claim 1, further comprising:
acquiring first position information of a reference object in a coordinate system of a plane where the first display device is located;
and determining the display parameters according to the first position information.
4. The method according to claim 1, wherein the correcting the display parameter of the first display device according to the first angle and the second angle to obtain a corrected parameter comprises:
determining an offset angle between the first angle and the second angle;
and correcting the display parameters according to the offset angle to obtain the correction parameters.
5. The method according to claim 4, wherein the display parameters include first position information of the reference object in a coordinate system of a plane in which the first display device is located,
the correcting the display parameter according to the offset angle to obtain the correction parameter includes:
determining the correction parameter according to the offset angle, the display parameter and a first data relation;
the first data relationship includes:

(x2, y2, z2) = R(θ1, θ2, θ3) · (x1, y1, z1)

wherein R(θ1, θ2, θ3) is the rotation matrix composed of the rotations by θ1, θ2 and θ3 about the X, Y and Z axes respectively; (x1, y1, z1) represents the first position information of the reference object in the coordinate system of the plane in which the first display device is located; (x2, y2, z2) represents the correction parameter; θ1 represents the component of the offset angle in the X-axis direction, θ2 the component of the offset angle in the Y-axis direction, and θ3 the component of the offset angle in the Z-axis direction.
6. The parameter processing method according to claim 3, wherein the acquiring first position information of the reference object in the coordinate system of the plane in which the first display device is located comprises:
acquiring a user image through preset image acquisition equipment;
and determining the first position information according to the user image.
7. The parameter processing method according to claim 1, further comprising:
determining a target position of a virtual camera device based on the correction parameters so that the virtual camera device acquires the three-dimensional view based on the target position;
displaying the three-dimensional view.
8. A parameter processing apparatus, comprising:
the device comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring a first angle of first display equipment and a second angle corresponding to the first display equipment;
wherein the first angle comprises a tilt angle of the first display device; the second angle comprises a preset reference angle corresponding to the first display device;
the correction module is used for correcting the display parameters of the first display device according to the first angle and the second angle to obtain correction parameters so as to display a three-dimensional view based on the correction parameters; wherein the display parameters include a position parameter of a reference object with respect to the first display device.
9. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors to perform the parameter processing method according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the parameter processing method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210187986.2A CN114566132A (en) | 2022-02-28 | 2022-02-28 | Parameter processing method and device, electronic equipment and computer readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114566132A true CN114566132A (en) | 2022-05-31 |
Family
ID=81715443
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210187986.2A Pending CN114566132A (en) | 2022-02-28 | 2022-02-28 | Parameter processing method and device, electronic equipment and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114566132A (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008089984A (en) * | 2006-10-02 | 2008-04-17 | Pioneer Electronic Corp | Image display device |
US20120069002A1 (en) * | 2010-09-22 | 2012-03-22 | Nikon Corporation | Image display apparatus and imaging apparatus |
WO2013039491A1 (en) * | 2011-09-14 | 2013-03-21 | Hewlett-Packard Development Company, L.P. | Image-viewing systems with an integrated light-steering panel |
CN106569611A (en) * | 2016-11-11 | 2017-04-19 | 努比亚技术有限公司 | Apparatus and method for adjusting display interface, and terminal |
CN108476316A (en) * | 2016-09-30 | 2018-08-31 | 华为技术有限公司 | A kind of 3D display method and user terminal |
JP2018190196A (en) * | 2017-05-08 | 2018-11-29 | 株式会社コロプラ | Information processing method, information processing device, program causing computer to execute information processing method |
CN111710047A (en) * | 2020-06-05 | 2020-09-25 | 北京有竹居网络技术有限公司 | Information display method and device and electronic equipment |
CN113411574A (en) * | 2021-06-17 | 2021-09-17 | 纵深视觉科技(南京)有限责任公司 | Method, device, medium and system for evaluating naked eye 3D display effect |
CN113438465A (en) * | 2021-06-22 | 2021-09-24 | 纵深视觉科技(南京)有限责任公司 | Display adjusting method, device, equipment and medium |
CN113870213A (en) * | 2021-09-24 | 2021-12-31 | 深圳市火乐科技发展有限公司 | Image display method, image display device, storage medium, and electronic apparatus |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9286718B2 (en) | Method using 3D geometry data for virtual reality image presentation and control in 3D space | |
EP3058451B1 (en) | Techniques for navigation among multiple images | |
CN108989785B (en) | Naked eye 3D display method, device, terminal and medium based on human eye tracking | |
WO2019076348A1 (en) | Virtual reality (vr) interface generation method and apparatus | |
US11922568B2 (en) | Finite aperture omni-directional stereo light transport | |
EP4186033A2 (en) | Map for augmented reality | |
CN107635132B (en) | Display control method and device of naked eye 3D display terminal and display terminal | |
CN114742703A (en) | Method, device and equipment for generating binocular stereoscopic panoramic image and storage medium | |
WO2022267694A1 (en) | Display adjustment method and apparatus, device, and medium | |
US20170069133A1 (en) | Methods and Systems for Light Field Augmented Reality/Virtual Reality on Mobile Devices | |
CN109978945B (en) | Augmented reality information processing method and device | |
KR20150058733A (en) | A method using 3d geometry data for virtual reality image presentation and control in 3d space | |
CN110548289B (en) | Method and device for displaying three-dimensional control | |
CN109816791B (en) | Method and apparatus for generating information | |
CN115131507B (en) | Image processing method, image processing device and meta space three-dimensional reconstruction method | |
CN114566132A (en) | Parameter processing method and device, electronic equipment and computer readable storage medium | |
KR102534449B1 (en) | Image processing method, device, electronic device and computer readable storage medium | |
US8755819B1 (en) | Device location determination using images | |
CN108399638B (en) | Augmented reality interaction method and device based on mark and electronic equipment | |
CN106990838B (en) | Method and system for locking display content in virtual reality mode | |
CN112132909A (en) | Parameter acquisition method and device, media data processing method and storage medium | |
US20120162199A1 (en) | Apparatus and method for displaying three-dimensional augmented reality | |
CN112837424B (en) | Image processing method, apparatus, device and computer readable storage medium | |
CN115457200B (en) | Method, device, equipment and storage medium for automatic true stereo display of 2.5-dimensional image | |
CN117409175B (en) | Video recording method, system, electronic equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||