WO2021238564A1 - Display device and method, apparatus, system and storage medium for determining distortion parameters thereof - Google Patents

Display device and method, apparatus, system and storage medium for determining distortion parameters thereof

Info

Publication number
WO2021238564A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
target
point
display
distortion parameter
Prior art date
Application number
PCT/CN2021/090760
Other languages
English (en)
French (fr)
Inventor
白家荣
董瑞君
王晨如
栗可
武玉龙
韩娜
张浩
陈丽莉
Original Assignee
京东方科技集团股份有限公司
北京京东方光电科技有限公司
Priority date
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司, 北京京东方光电科技有限公司
Publication of WO2021238564A1

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/327 Calibration thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 2013/0074 Stereoscopic image analysis

Definitions

  • the present disclosure relates to the field of display technology, and in particular to a display device and a method, device, system and storage medium for determining distortion parameters thereof.
  • a virtual reality (VR) device is a display device that can form a visual effect of virtual reality.
  • the VR image presented to the user by the VR device will inevitably be distorted, that is, the VR image is prone to distortion and deformation.
  • the present disclosure provides a display device and its distortion parameter determination method, device, system, and storage medium.
  • the technical solutions are as follows:
  • A method for determining a distortion parameter of a display device is provided, where the display device is a virtual reality device and the virtual reality device includes a display screen. The method includes: acquiring a VR image obtained by an image acquisition device shooting, with target shooting parameters, a target image displayed on the display screen; acquiring a reference image obtained by the image acquisition device shooting a reference image with the target shooting parameters; determining an actual image height between a first target point and a second target point in the VR image according to the reference image; and determining the distortion parameter of the virtual reality device according to the actual image height.
  • Optionally, determining the actual image height between the first target point and the second target point in the VR image according to the reference image includes: determining a first reference point and a second reference point from a plurality of reference points included in the reference image; calculating a target distance between the first reference point and the second reference point according to the size of the reference image; and determining the target distance as the actual image height between the first target point and the second target point in the VR image; where the position of the first reference point in the reference image is the same as the position of the first target point in the VR image, and the position of the second reference point in the reference image is the same as the position of the second target point in the VR image.
  • Optionally, determining the actual image height between the first target point and the second target point in the VR image according to the reference image includes: displaying the VR image and the reference image in superposition, and receiving the actual image height between the first target point and the second target point in the VR image input by a user based on the superimposed images.
  • Optionally, determining the distortion parameter of the virtual reality device according to the actual image height includes: acquiring an actual object height between a first display point and a second display point in the display screen; and determining the distortion parameter of the virtual reality device according to the actual object height and the actual image height; where the position of the first display point in the display screen is the same as the position of the first target point in the VR image, and the position of the second display point in the display screen is the same as the position of the second target point in the VR image.
  • Optionally, acquiring the actual object height between the first display point and the second display point in the display screen includes: acquiring a size parameter of the display screen; and calculating the actual object height between the first display point and the second display point in the display screen according to the size parameter.
  • Optionally, determining the distortion parameter of the virtual reality device according to the actual object height and the actual image height includes: determining an ideal image height between the first target point and the second target point according to the actual object height; and determining the distortion parameter of the virtual reality device according to the ideal image height and the actual image height, the distortion parameter being positively correlated with the actual image height.
  • Optionally, the distortion parameter q satisfies q = (d1 - d2) / d2, where d1 is the actual image height and d2 is the ideal image height.
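  • For illustration, a small numeric instance of this relation (the two heights below are assumptions chosen for the example, not values from the disclosure):

```python
d1 = 10.5  # actual image height between the two target points, e.g. in millimetres
d2 = 10.0  # ideal (undistorted) image height for the same pair of points
q = (d1 - d2) / d2
print(q)   # 0.05, i.e. 5% distortion at this image height
```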
  • the target image and the reference image are both regular grid images or regular dot images.
  • the size of the reference image is the same as the size of the target image.
  • Optionally, acquiring the VR image obtained by the image acquisition device shooting the target image displayed on the display screen with the target shooting parameters includes: acquiring the VR image obtained by the image acquisition device, located at a first distance from the imaging surface of the virtual reality device, shooting the target image displayed on the display screen with the target shooting parameters;
  • acquiring the reference image obtained by the image acquisition device shooting the reference image with the target shooting parameters includes: acquiring the reference image obtained by the image acquisition device, located at a second distance from the reference image, shooting the reference image with the target shooting parameters;
  • the first distance and the second distance are equal.
  • the target image is the reference image displayed on the display screen of the virtual reality device.
  • A display device is provided. The display device is a virtual reality device, and the distortion parameter of the virtual reality device is determined by the method for determining the distortion parameter of a display device described in the foregoing aspect.
  • a device for determining distortion parameters of a display device is provided, the display device is a virtual reality device, the virtual reality device includes a display screen, and the device includes:
  • the first acquisition module is configured to acquire the VR image obtained by the image acquisition device using the target shooting parameters to shoot the target image displayed on the display screen;
  • a second acquisition module configured to acquire a reference image obtained by the image acquisition device using the target shooting parameter to shoot a reference image
  • the first determining module is configured to determine the actual image height between the first target point and the second target point in the VR image based on the reference image;
  • the second determining module is configured to determine the distortion parameter of the virtual reality device according to the actual image height.
  • A device for determining a distortion parameter of a display device is provided, including: a processor; and a memory configured to store executable instructions of the processor; where the processor is configured to execute the method for determining the distortion parameter of the display device described in the above aspect.
  • a system for determining a distortion parameter of a display device includes: a virtual reality device, an image acquisition device, and a distortion parameter determining device.
  • The distortion parameter determining device includes the device for determining a distortion parameter described in the above aspect, and the image acquisition device establishes a communication connection with the device for determining a distortion parameter;
  • the image acquisition device is configured to: use target shooting parameters to shoot a target image displayed on the display screen of the virtual reality device to obtain a VR image, use the target shooting parameters to shoot a reference image to obtain a reference image, and send the VR image and the reference image to the distortion parameter determination device;
  • the distortion parameter determining device is configured to determine the distortion parameter of the virtual reality device based on the VR image and the reference image.
  • the distance from the imaging surface of the virtual reality device when the image capture device captures the target image is equal to the distance from the reference image when the image capture device captures the reference image.
  • When the image capture device captures the reference image, the image coverage of the reference image in the image capture device is not less than the image coverage of the target image in the image capture device when the image capture device captures the target image.
  • the target shooting parameters include: at least one of the focal length, aperture size, shutter duration, contrast, and color saturation of the image acquisition device.
  • the image acquisition device is integrated in the distortion parameter determination device.
  • A non-volatile computer-readable storage medium is provided, with instructions stored in the computer-readable storage medium. When the instructions in the computer-readable storage medium are run on a computer, the computer is caused to execute the method for determining the distortion parameter of the display device described in the above aspects.
  • FIG. 1 is a schematic diagram of an implementation environment involved in a method for determining distortion parameters of a display device according to an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of a scene of shooting a target image provided by an embodiment of the present disclosure
  • FIG. 3 is a schematic diagram of a scene for shooting a reference image provided by an embodiment of the present disclosure
  • Fig. 4 is a schematic diagram including a reference image and a VR image provided by an embodiment of the present disclosure
  • FIG. 5 is a flowchart of a method for determining distortion parameters of a display device according to an embodiment of the present disclosure
  • FIG. 6 is a flowchart of another method for determining distortion parameters of a display device according to an embodiment of the present disclosure
  • FIG. 7 is a flowchart of a method for determining actual image height according to an embodiment of the present disclosure
  • FIG. 8 is a flowchart of another method for determining actual image height provided by an embodiment of the present disclosure.
  • FIG. 9 is another schematic diagram including a reference image and a VR image provided by an embodiment of the present disclosure.
  • FIG. 10 is a schematic structural diagram of a display device provided by an embodiment of the present disclosure.
  • FIG. 11 is a block diagram of a device for determining distortion parameters of a display device according to an embodiment of the present disclosure
  • FIG. 12 is a block diagram of a second determining module provided by an embodiment of the present disclosure.
  • FIG. 13 is a schematic structural diagram of another device for determining distortion parameters provided by an embodiment of the present disclosure.
  • FIG. 14 is a hardware structure diagram of another device for determining distortion parameters provided by an embodiment of the present disclosure.
  • FIG. 1 is a schematic diagram of an implementation environment involved in a method for determining a distortion parameter of a display device provided by an embodiment of the present disclosure, that is, a schematic diagram of a system for determining a distortion parameter.
  • the system may include: a VR device 01, an image acquisition device 02 (also called a photographing device), and a distortion parameter determination device 03 (also called a processing device).
  • the image acquisition device 02 may establish a communication connection with the distortion parameter determination device 03 through a wired or wireless network.
  • the VR device 01 may include a display screen.
  • When determining the distortion parameter, a target image can first be displayed on the display screen of the VR device 01, and the image acquisition device 02 can shoot the target image with the target shooting parameters to obtain a VR image of good clarity, and then shoot the reference image with the same target shooting parameters to obtain a reference image.
  • the image acquisition device 02 can send the captured VR image and reference image to the distortion parameter determination device 03, so that the distortion parameter determination device 03 determines the distortion parameter of the VR device 01 based on the VR image and the reference image.
  • the reference image may be a physical image or a virtual image (also referred to as an electronic image) displayed on the display screen of an electronic device.
  • the electronic device may be a device with a display screen, such as a tablet computer or a smart phone.
  • If the reference image is a physical image, the clarity of the captured reference image may be affected by external ambient light. Therefore, in order to keep the clarity of the captured VR image and of the captured reference image as consistent as possible, an electronic image is generally selected as the reference image.
  • the target shooting parameters may include: at least one of the focal length, aperture size, shutter duration, contrast, and color saturation of the image acquisition device 02.
  • any parameter that may affect the shooting effect can be used as the target shooting parameter.
  • When the image acquisition device 02 shoots the target image, the distance L1 between the image acquisition device 02 and the imaging surface of the VR device (i.e., the VR virtual image surface shown in Figure 2) can be equal to the distance L2 between the image acquisition device 02 and the reference image when the reference image is shot. That is, the distance L2 may be the sum of the VR virtual image distance and the distance from the VR device 01 to the image acquisition device 02. The VR virtual image distance refers to the distance from the display screen of the VR device 01 to the VR virtual image surface, and can be provided by the manufacturer of the VR device 01 or measured.
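  • As a rough sketch of this distance bookkeeping (the two input distances below are illustrative assumptions, not values from the disclosure):

```python
# All distances in millimetres.
vr_virtual_image_distance = 1000.0  # display screen of VR device 01 to the VR virtual image surface
vr_device_to_camera = 50.0          # VR device 01 to the image acquisition device 02

# L1: camera to the VR virtual image surface when the target image is shot.
L1 = vr_virtual_image_distance + vr_device_to_camera
# L2: camera to the reference image when the reference image is shot; kept equal to L1.
L2 = L1
print(f"Place the reference image {L2:.0f} mm from the image acquisition device.")
```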
  • For example, take the case where the target image is shot first and the reference image is shot afterwards: the VR device 01 is kept still while the image acquisition device 02 is moved from its preset position to another location, and it is ensured that the distance L2 from the image acquisition device 02 to the reference image at that other location is equal to the distance L1 from the image acquisition device 02 to the imaging surface of the VR device when the target image was shot.
  • the content of the reference image and the target image displayed on the display screen of the VR device 01 can be selected according to actual needs.
  • the target image displayed on the display screen of the VR device 01 may be a dot image.
  • the reference image may be a grid image.
  • When the image acquisition device 02 shoots the reference image, the image coverage of the reference image in the image acquisition device 02 may be not less than the image coverage of the target image in the image acquisition device 02 when the image acquisition device 02 shoots the target image displayed on the display screen of the VR device 01.
  • Take the case where the reference image is located at the position of the imaging surface of the VR device 01 as an example. Since the VR device is generally a symmetrical optical system, the size of the reference image may cover only half of the field of view of the display screen of the VR device 01; that is, when the reference image is shot, only one half of the reference image may enter the shooting lens of the image acquisition device 02. Alternatively, the size of the reference image may cover only half a diagonal of the display screen of the VR device 01; that is, when the reference image is shot, only one quarter of the reference image may enter the shooting lens of the image acquisition device 02.
  • the VR device 01 may be any form of VR device, such as head-mounted VR glasses.
  • the image acquisition device 02 can be any device capable of capturing images, such as a single-lens reflex camera, a smart phone or a tablet computer.
  • the distortion parameter determination device 03 may be a mobile terminal, such as an external computer device.
  • the image acquisition device 02 may be integrated into the distortion parameter determination device 03, that is, the distortion parameter determination device 03 may be equipped with a photographing component (ie, the image acquisition device 02).
  • the distortion parameter determination device 03 may be used to capture the target image to obtain a VR image through the included shooting components, capture the reference image to obtain a reference image, and determine the distortion parameter of the VR device based on the VR image and the reference image. That is, the distortion parameter determining device 03 can directly acquire the image itself, without receiving the VR image and the reference image sent by another independent external image acquisition device 02.
  • After the distortion parameter determination device 03 obtains the VR image and the reference image, it can reliably determine the distortion parameter of the VR device 01 by stacking the two images. Since two images taken under the same target shooting parameters are compared to determine the distortion parameters of the VR device 01, there is no need to eliminate the distortion of the image acquisition device 02 itself, that is, no need to calibrate the image acquisition device 02 or correct its distortion. Determining the distortion parameters of the VR device 01 through the system shown in Figure 1 therefore not only has high reliability, but also has a simple principle and is convenient to operate.
  • The system can determine the distortion parameters based on the VR image before distortion correction, that is, before anti-distortion, or based on the VR image after distortion correction, that is, after anti-distortion, which is not limited in the embodiments of the present disclosure.
  • FIG. 5 is a flowchart of a method for determining distortion parameters of a display device according to an embodiment of the present disclosure, where the display device may be the VR device 01 including a display screen shown in FIG. 1. This method can be applied to the distortion parameter determination device 03 shown in FIG. 1. As shown in Figure 5, the method may include:
  • Step 501 Obtain a VR image obtained by the image capture device using the target shooting parameters to shoot the target image displayed on the display screen.
  • the VR image may be an image obtained by the image capture device using the target shooting parameters to shoot the target image displayed on the display screen of the VR device. That is, the distortion parameter determination device can receive the VR image sent by the image acquisition device.
  • Step 502 Obtain a reference image obtained by the image capture device using the target shooting parameter to shoot the reference image.
  • the reference image may be an image obtained by an image capture device using target shooting parameters to shoot a reference image. That is, the distortion parameter determination device can receive the reference image sent by the image acquisition device.
  • Step 503 Determine the actual image height between the first target point and the second target point in the VR image according to the reference image.
  • the distortion parameter determination device may automatically calculate the actual image height between the first target point and the second target point in the VR image based on the size of the reference image.
  • the distortion parameter determination device may display the VR image and the reference image to the user, and receive the actual image height between the first target point and the second target point in the VR image input by the user based on the displayed two images.
  • Step 504 Determine the distortion parameter of the VR device according to the actual image height.
  • the distortion parameter determination device may automatically calculate based on the determined actual image height to determine the distortion parameter of the VR device.
  • the distortion parameter determination device may also receive the distortion parameter of the VR device input by the user based on the actual image height.
  • The embodiments of the present disclosure provide a method for determining distortion parameters of a display device. Since the method obtains both the VR image of the content displayed by the VR device and the reference image captured with the same target shooting parameters, the actual image height between any two target points in the VR image can be flexibly determined based on the reference image, and the distortion parameters of the VR device can be reliably determined from that actual image height. Therefore, compared with the related technique of obtaining the distortion parameters from the manufacturer, the distortion parameters determined by this method are more accurate, that is, the reliability of the determination is better.
  • FIG. 6 is a flowchart of another method for determining a distortion parameter of a display device provided by an embodiment of the present disclosure, which can be applied to the distortion parameter determining device 03 shown in FIG. 1. As shown in Figure 6, the method may include:
  • Step 601 Obtain a VR image obtained by the image capture device using the target shooting parameters to shoot the target image displayed on the display screen.
  • the distortion parameter determination device may receive the VR image sent by the image acquisition device. That is, the VR image may be an image obtained by using the target shooting parameter to shoot the target image displayed on the display screen by the image acquisition device at the first distance from the imaging surface of the virtual reality device. It should be noted that the VR image acquired by the embodiment of the present disclosure only refers to the image displayed on the display screen of the VR device, and does not refer to the stereoscopic image viewed by the user through the VR device.
  • Step 602 Obtain a reference image obtained by the image capture device using the target shooting parameter to shoot the reference image.
  • The distortion parameter determination device may also receive the reference image sent by the image acquisition device. That is, the reference image may be an image obtained by the image acquisition device, located at a second distance from the reference image, shooting the reference image with the target shooting parameters. The first distance from the imaging surface of the VR device when the target image is shot and the second distance may be equal.
  • the selectable types of the target shooting parameters and the optional implementation manners of the VR images and reference images captured by the image acquisition device can be referred to the records in the foregoing embodiments, and details are not described herein again.
  • the target image captured in step 601 may be a reference image displayed on the display screen, that is, only one reference image may be provided.
  • the reference image can be displayed on the display screen of the VR device as the target image, and the target image (that is, the reference image displayed on the display screen of the VR device) can be photographed by the image acquisition device to obtain the VR image.
  • an image acquisition device can be used to directly shoot the reference image to obtain the reference image.
  • both the VR image and the reference image captured by the image acquisition device may be distorted images.
  • Since both images carry the same distortion introduced by the image acquisition device, the distortion parameters of the VR device can be determined directly based on the reference image, without first using the distorted reference image to perform distortion correction on the image acquisition device, and the method is relatively simple.
  • Step 603 Determine the actual image height between the first target point and the second target point in the VR image according to the reference image.
  • the first target point and the second target point may be any two points in the VR image.
  • The VR image may be composed of a plurality of pixels arranged in a matrix, and each target point in the VR image may include only one pixel, or each target point in the VR image may include two or more pixels.
  • the actual image height between the first target point and the second target point may refer to the shortest distance between the first target point and the second target point (e.g., straight-line distance).
  • the distortion parameter determination device may directly determine the positions of the two target points based on the size of the reference image by using an image recognition algorithm, and calculate the actual image height between the two target points.
  • the distortion parameter determination device may also display the two acquired images to a user (for example, an operator), and receive the actual image height input by the user based on the displayed image.
  • FIG. 7 shows a flow chart of a method for determining the actual image height by taking the direct calculation of the actual image height by the distortion parameter determining device as an example. As shown in Figure 7, the method may include:
  • Step 6031A Determine the first reference point and the second reference point from the multiple reference points included in the reference image.
  • the first reference point and the second reference point may be any two points in the reference image.
  • The reference image may also be composed of a plurality of pixels arranged in a matrix. Each reference point in the reference image may include only one pixel, or each reference point may include two or more pixels.
  • The position of the first reference point in the reference image is the same as the position of the first target point in the VR image, and the position of the second reference point in the reference image is the same as the position of the second target point in the VR image. That is, if the VR image and the reference image are stacked, the first reference point overlaps the first target point and the second reference point overlaps the second target point. Furthermore, the number of pixels included in the first reference point may be the same as the number of pixels included in the first target point, and the number of pixels included in the second reference point may be the same as the number of pixels included in the second target point.
  • The coordinates of a reference point or target point in the image to which it belongs can be used to express its position in that image. That is, the statement that the position of the first reference point in the reference image is the same as the position of the first target point in the VR image may mean that the coordinates of the first reference point in the reference image are the same as the coordinates of the first target point in the VR image; the same applies to the second reference point and the second target point.
  • After the distortion parameter determination device acquires the reference image and the VR image, it may first compare the two images to determine the first reference point and the second reference point among the multiple reference points included in the reference image, that is, find the two points in the reference image located at the same positions as the corresponding points in the VR image.
  • Step 6032A Calculate the target distance between the first reference point and the second reference point according to the size of the reference image.
  • the distortion parameter determination device may calculate the size of the reference image after acquiring the reference image.
  • the distortion parameter determining device can directly receive the size of the reference image input by the user.
  • the distortion parameter determination device needs to obtain the arrangement rule of the reference image (for example, the ratio of length to width) in advance. That is, the distortion parameter determination device has either stored the size of the reference image or the arrangement rule of the reference image.
  • the reference image may also be referred to as an image with a known internal size or an image with a known internal arrangement rule.
  • After the distortion parameter determination device determines the first reference point and the second reference point, it can calculate the target distance between the first reference point and the second reference point based on the determined size of the reference image.
  • the target distance may be the shortest distance between the first reference point and the second reference point.
  • the reference image may be a regular image.
  • the reference image may be a regular grid image as shown in FIG. 2.
  • Step 6033A Determine the target distance as the actual image height between the first target point and the second target point in the VR image.
  • The distortion parameter determination device can directly determine the shortest distance between the first reference point and the second reference point (i.e., the target distance) as the shortest distance between the first target point and the second target point. Furthermore, since the actual image height between the first target point and the second target point refers to the shortest distance between them, the target distance between the first reference point and the second reference point can be determined as the actual image height between the first target point and the second target point. That is, the reference image can be used as a reference coordinate system: the coordinates of each target point of the VR image can be located in that coordinate system, and the distance between target points (that is, the actual image height) can then be calculated.
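  • A minimal sketch of using the reference image as the coordinate reference in this way, assuming a grid reference image whose cell size is known (the grid pitch, pixel pitch and coordinates below are illustrative assumptions):

```python
import math

grid_pitch_mm = 5.0   # known physical side length of one grid cell in the reference image
grid_pitch_px = 42.0  # measured pixel size of one grid cell in the captured reference image
mm_per_px = grid_pitch_mm / grid_pitch_px  # scale factor derived from the reference image

# Pixel coordinates of the first and second target points in the VR image
# (the reference points at the same positions share these coordinates).
p1 = (512.0, 384.0)
p2 = (640.0, 480.0)

pixel_distance = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
actual_image_height = pixel_distance * mm_per_px  # the target distance, taken as d1
print(f"actual image height d1 = {actual_image_height:.2f} mm")
```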
  • After acquiring the VR image and the reference image, the distortion parameter determination device may directly execute the above steps 6031A to 6033A; alternatively, it may execute steps 6031A to 6033A in sequence only after receiving a distortion parameter determination instruction.
  • the distortion parameter determination instruction may be generated by the user by triggering a certain control displayed on the distortion parameter determination device.
  • Since the above steps 6031A to 6033A are executed automatically by the distortion parameter determination device, the distortion parameter determination device does not need to layer and display the VR image and the reference image as shown in FIG. 4.
  • FIG. 8 shows a flow chart of a method for determining the actual image height by taking the actual image height input by the user by the distortion parameter determining device as an example. As shown in Figure 8, the method may include:
  • Step 6031B Superimpose and display the VR image and the reference image, and set the transparency of whichever of the VR image and the reference image lies closer to the display side to be greater than a transparency threshold.
  • That is, the distortion parameter determination device can superimpose the VR image and the reference image for display, and set the transparency of the image closer to the display side (the upper layer) to be greater than the transparency threshold, so that the upper-layer image is transparent enough for the lower-layer image to remain visible.
  • The transparency threshold may be pre-configured in the distortion parameter determination device, or received as real-time input from the user when the distortion parameter is determined.
  • the pre-configuration can be configured by the developer at the factory, or configured by the operator immediately before the distortion parameters are determined.
  • the distortion parameter determination device may display the VR image and the reference image based on the editing requirements carried by the image editing instruction when the image editing instruction is received.
  • the editing requirement can be a superimposed display of the VR image and the reference image, and the transparency of the image near the display side in the VR image and the reference image is set to be greater than the transparency threshold.
  • Alternatively, the distortion parameter determination device can automatically superimpose and display the VR image and the reference image after receiving the distortion parameter determination instruction or after acquiring the VR image and the reference image, and set the transparency of whichever image lies closer to the display side to be greater than the transparency threshold.
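  • A minimal sketch of such a superimposed display using the Pillow library, assuming the two captured images are stored as image files (the file names and the blend weight are illustrative assumptions):

```python
from PIL import Image

vr_img = Image.open("vr_image.png").convert("RGBA")
ref_img = Image.open("reference_image.png").convert("RGBA")

# Bring the two layers to the same size so that points at the same position overlap.
ref_img = ref_img.resize(vr_img.size)

# A low blend weight for the upper layer corresponds to a high transparency of that layer,
# so the reference grid underneath remains visible through the VR image.
upper_weight = 0.35
overlay = Image.blend(ref_img, vr_img, upper_weight)  # result = (1 - w) * ref + w * vr
overlay.save("overlay.png")
```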
  • the image editing instruction and the distortion parameter determination instruction may also be generated by the user by triggering a certain control displayed on the distortion parameter determination device.
  • Step 6032B Receive the actual image height between the first target point and the second target point in the VR image input by the user based on the superimposed image.
  • the distortion parameter determination device may receive the actual image height between the first target point and the second target point in the VR image input by the user based on the superimposed displayed image.
  • For example, the user may know the size of the reference image, or may know its arrangement rule and calculate the size of the reference image from that rule; alternatively, the user can measure the size of the reference image with a tool.
  • The VR image and the reference image displayed to the user by the distortion parameter determination device can have the same size and different colors, and/or, as illustrated by Figure 2 and Figure 3, one of the VR image and the reference image can be a regular grid image while the other is a regular dot image.
  • Accordingly, the target image initially displayed on the display screen of the VR device and the reference image may have the same size and different colors, and/or one of the target image and the reference image may be a regular grid image while the other is a regular dot image.
  • the target image is similar to the reference image, and can also be an image with a known internal size or an image with a known internal arrangement rule.
  • Figure 9 takes as an example the case where the target image is a regular dot image with a known internal size and the reference image is a regular grid image with a known internal size; that is, the captured VR image is a regular dot image, and the captured reference image is a regular grid image whose internal size is known.
  • Assume that the first reference point in the reference image located at the same position as the first target point in the VR image is point p1, that the second reference point located at the same position as the second target point in the VR image is point p2, and that each small grid cell in the reference image is a square with side length a. The target distance (i.e., the shortest distance) between p1 and p2 can then be determined from the grid, and the actual image height between the first target point and the second target point in the VR image is finally determined as that target distance.
  • If both points are located at grid intersections, the Pythagorean theorem can be used directly for the calculation. If a target point is not located at a grid intersection, the actual image height can be determined by estimation or by bisection.
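  • A small sketch of the grid-based Pythagorean calculation described above (the side length and grid counts are illustrative assumptions, not values from the disclosure):

```python
import math

a = 2.0          # side length of one square grid cell in the reference image, in mm
cols_apart = 3   # horizontal grid cells between p1 and p2
rows_apart = 4   # vertical grid cells between p1 and p2

# Both points sit on grid intersections, so the Pythagorean theorem gives the
# target distance, which is taken as the actual image height d1.
d1 = a * math.hypot(cols_apart, rows_apart)
print(f"actual image height d1 = {d1} mm")  # 2.0 * 5 = 10.0 mm
```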
  • the method for determining the actual image height between any two target points at other positions may refer to the method for determining the actual image height between the first target point and the second target point, which will not be repeated here.
  • Step 604 Obtain the actual object height between the first display point and the second display point in the display screen.
  • the shortest distance between the first display point and the second display point may be the actual object height between the first display point and the second display point.
  • the actual object height can be obtained by the user from the manufacturer or measured by a tool, and input to the distortion parameter determination device. That is, the distortion parameter determination device can receive the actual height of the object between the first display point and the second display point in the display screen input by the user. Alternatively, the user can only input the size-related parameters of the display screen to the distortion parameter determination device, and the distortion parameter determination device calculates the actual object height between any two display points.
  • the VR device may pre-store the size-related parameters of its display screen, the VR device and the distortion parameter determination device may also establish a communication connection, and the VR device may send the size-related parameters of its display screen to the distortion parameter determination device, And the actual object height between any two display points is calculated by the device to determine the distortion parameter.
  • the first display point and the second display point can be any two points on the display screen.
  • the display screen may also include a plurality of pixel points arranged in a matrix, and each display point in the display screen may include only one pixel point, or each display point may include two or more pixel points.
  • the coordinates of the first display point in the display screen and the position of the first target point in the VR image may be the same, and the coordinates of the second display point in the display screen and the position of the second target point in the VR image may be the same.
  • the number of pixels included in the first display point and the number of pixels included in the first target point may be the same, and the number of pixels included in the second display point and the number of pixels included in the second target point may be the same.
  • the same position here can also be understood as the same coordinate.
  • the actual image height between any two target points corresponds to the actual object height between two display points at the same position. Therefore, the corresponding relationship between the actual image height and the actual object height finally obtained can be called the distortion relationship.
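  • Along the same lines, a sketch of deriving the actual object height from the display screen's size parameters, assuming the pixel pitch is known (the pitch and coordinates below are illustrative assumptions):

```python
import math

pixel_pitch_mm = 0.04     # physical size of one display pixel, taken from the screen's size parameters
display_p1 = (600, 800)   # pixel coordinates of the first display point (same position as the first target point)
display_p2 = (900, 1200)  # pixel coordinates of the second display point (same position as the second target point)

dx = (display_p2[0] - display_p1[0]) * pixel_pitch_mm
dy = (display_p2[1] - display_p1[1]) * pixel_pitch_mm
actual_object_height = math.hypot(dx, dy)  # shortest distance between the two display points
print(f"actual object height = {actual_object_height:.1f} mm")
```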
  • Step 605 Determine the ideal image height between the first target point and the second target point according to the actual object height.
  • the distortion parameter determination device may further automatically calculate the ideal image height between the first target point and the second target point based on the actual object height.
  • the distortion parameter determination device can display the actual object height to the user, and the user calculates the ideal image height based on the actual object height and inputs it to the distortion parameter determination device. That is, the distortion parameter determination device can receive the first target point and the second target point input by the user. Ideal image height between target points.
  • Step 606 Determine the distortion parameter of the VR device according to the ideal image height and the actual image height.
  • the distortion parameter determination device can automatically calculate the distortion parameter of the VR device based on the determined ideal image height and actual image height.
  • the distortion parameter determination device can display the ideal image height and the actual image height to the user, and the user calculates the distortion parameters based on the ideal image height and the actual image height and inputs them to the distortion parameter determination device, that is, the distortion parameter determination device can receive the user input Distortion parameters of the VR device.
  • the distortion parameter and the actual image height can be positively correlated, that is, the larger the actual image height, the larger the distortion parameter, and the greater the degree of distortion; the smaller the actual image height, the smaller the distortion parameter, and the smaller the degree of distortion.
  • the distortion parameter determination device may also determine the field angle of the distorted VR device based on the actual image height and the virtual image distance of the VR device, and replace the ideal image height with the actual image height near the central field of view to determine the VR device distortion parameters.
  • the embodiment of the present disclosure does not limit the implementation of determining the distortion parameter based on the actual image height, that is, any required distortion parameter can be determined based on the actual image height.
  • the distortion parameter recorded in the above embodiment refers to any parameter that affects the distortion of the VR image finally presented to the user.
  • the distortion parameter determination device determines the distortion parameters of the VR device based on the actual image height between target points at different distances in the VR image, that is, the VR device may include multiple distortion parameters.
  • the foregoing embodiment only takes a distortion parameter of the VR device determined based on the actual image height between the first target point and the second target point in the VR image as an example for description.
  • the method of determining other distortion parameters based on the actual image height between any two other target points reference may be made to the above-mentioned method, which will not be repeated here.
  • For example, assume that the first target point is the center point of the VR image, and that three different distortion parameters of the VR device are to be determined from the actual image heights between the first target point and a second, a third and a fourth target point, respectively. Then the actual image height between the first target point and the second target point, the actual image height between the first target point and the third target point, and the actual image height between the first target point and the fourth target point can first be obtained, and the three different distortion parameters of the VR device can then be determined from these actual image heights.
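  • A brief sketch of determining several distortion parameters from several target points in this way (the height values are illustrative assumptions, not measured data):

```python
def distortion(d1: float, d2: float) -> float:
    """Distortion parameter q = (d1 - d2) / d2, with d1 the actual and d2 the ideal image height."""
    return (d1 - d2) / d2

# (actual image height, ideal image height) from the center point to the
# second, third and fourth target points, e.g. in millimetres.
measurements = [
    (10.2, 10.0),
    (20.9, 20.0),
    (32.4, 30.0),
]

distortion_parameters = [distortion(d1, d2) for d1, d2 in measurements]
print(distortion_parameters)  # approximately [0.02, 0.045, 0.08] for these assumed heights
```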
  • The distortion parameter determination device may send the distortion parameter to the VR device, so that the VR device can obtain its own distortion parameter. Afterwards, the VR device can perform distortion correction, such as anti-distortion processing, on the VR image displayed to the user based on its own distortion parameters, so as to avoid distortion or deformation of the displayed VR image and thereby effectively improve the user's viewing experience.
  • The order of the steps of the method can be adjusted; for example, step 601 and step 602 can be executed simultaneously. Any variation that a person skilled in the art can readily conceive within the technical scope of the present disclosure should be covered by the protection scope of the present disclosure and is not repeated here.
  • The embodiments of the present disclosure provide a method for determining distortion parameters of a display device. Since the method obtains both the VR image of the content displayed by the VR device and the reference image captured with the same target shooting parameters, the actual image height between any two target points in the VR image can be flexibly determined based on the reference image, and the distortion parameters of the VR device can be reliably determined from that actual image height. Therefore, compared with the related technique of obtaining the distortion parameters from the manufacturer, the distortion parameters determined by this method are more accurate, that is, the reliability of the determination is better.
  • FIG. 10 is a schematic structural diagram of a display device provided by an embodiment of the present disclosure.
  • The display device may be the VR device 01, and the distortion parameter of the VR device 01 may be determined by the method for determining the distortion parameter of a display device shown in FIG. 5 or FIG. 6.
  • FIG. 11 is a block diagram of a device for determining distortion parameters of a display device provided by an embodiment of the present disclosure.
  • the display device is a VR device, and the VR device includes a display screen.
  • the device may include:
  • the first acquisition module 111 is configured to acquire a VR image obtained by the image acquisition device using the target shooting parameters to shoot the target image displayed on the display screen.
  • the second acquisition module 112 is configured to acquire a reference image obtained by the image acquisition device using target shooting parameters to shoot a reference image.
  • the first determining module 113 is configured to determine the actual image height between the first target point and the second target point in the VR image based on the reference image.
  • the second determining module 114 is configured to determine the distortion parameter of the VR device according to the actual image height.
  • the first determining module 113 may be used to:
  • the first reference point and the second reference point are determined from a plurality of reference points included in the reference image.
  • the target distance between the first reference point and the second reference point is calculated.
  • the target distance is determined as the actual image height between the first target point and the second target point in the VR image.
  • the position of the first reference point in the reference image is the same as the position of the first target point in the VR image, and the position of the second reference point in the reference image is the same as the position of the second target point in the VR image.
  • FIG. 12 is a block diagram of a second determining module 114 according to an embodiment of the present disclosure.
  • the second determining module 114 may include:
  • the obtaining sub-module 1141 is used to obtain the actual object height between the first display point and the second display point in the display screen.
  • the third determining sub-module 1142 is used to determine the distortion parameters of the VR device according to the actual object height and the actual image height.
  • the position of the first display point in the display screen is the same as the position of the first target point in the VR image, and the position of the second display point in the display screen is the same as the position of the second target point in the VR image.
  • the third determining submodule 1142 may be used for:
  • the ideal image height between the first target point and the second target point is determined according to the actual object height, and the distortion parameters of the VR device are determined according to the ideal image height and the actual image height. Among them, the distortion parameter is positively correlated with the actual image height.
  • The distortion parameter q may satisfy q = (d1 - d2) / d2, where d1 is the actual image height and d2 is the ideal image height.
  • the target image and the reference image may both be regular grid images or regular dot images.
  • the size of the reference image and the size of the target image may be the same.
  • the first obtaining module 111 may be used to obtain a VR image obtained by shooting the target image displayed on the display screen by the target shooting parameter by the image capturing device at a first distance from the imaging surface of the virtual reality device.
  • the second acquisition module 112 may be configured to acquire a reference image obtained by shooting the reference image by using target shooting parameters by the image acquisition device at a second distance from the reference image. The first distance and the second distance may be equal.
  • the target image may be a reference image displayed on the display screen of the VR device.
  • The embodiments of the present disclosure provide a device for determining distortion parameters of a display device. Since the device can obtain both the VR image of the content displayed by the VR equipment and the reference image captured with the same target shooting parameters, the actual image height between any two target points in the VR image can be flexibly determined based on the reference image, and the distortion parameters of the VR device can be reliably determined from that actual image height. Therefore, compared with the related technique of obtaining the distortion parameters from the manufacturer, the distortion parameters determined by this device are more accurate, that is, the reliability of the determination is better.
  • FIG. 13 is a schematic structural diagram of another device for determining distortion parameters of a display device according to an embodiment of the present disclosure.
  • the apparatus may include: a processor 1301; and a memory 1302 configured to store executable instructions of the processor 1301.
  • the processor 1301 may be configured to execute the method for determining the distortion parameter of the display device as shown in FIG. 5 or FIG. 6.
  • FIG. 14 shows a structural block diagram of an apparatus 1400 for determining a distortion parameter of a display device according to an exemplary embodiment of the present disclosure.
  • the device 1400 may be a terminal such as a smart phone, a tablet computer, a notebook computer, or a desktop computer.
  • the device 1400 may also be called user equipment, portable terminal, laptop terminal, desktop terminal and other names.
  • the device 1400 may also be a server.
  • the device 1400 includes a processor 1401 and a memory 1402.
  • the processor 1401 may include one or more processing cores, such as a 4-core processor, a 14-core processor, and so on.
  • The processor 1401 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1401 may also include a main processor and a coprocessor.
  • The main processor is a processor used to process data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor used to process data in the standby state.
  • the processor 1401 may be integrated with a GPU (Graphics Processing Unit, image processor), and the GPU is used to render and draw content that needs to be displayed on the display screen.
  • the processor 1401 may further include an AI (Artificial Intelligence) processor, which is used to process computing operations related to machine learning.
  • the memory 1402 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 1402 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 1402 is used to store at least one instruction, and the at least one instruction is used to be executed by the processor 1401 to implement the distortion parameter determination method of the embodiment of the present disclosure.
  • the device 1400 may further include: a peripheral device interface 1403 and at least one peripheral device.
  • the processor 1401, the memory 1402, and the peripheral device interface 1403 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1403 through a bus, a signal line, or a circuit board.
  • the peripheral device includes: at least one of a radio frequency circuit 1404, a touch display screen 1405, a camera 1406, an audio circuit 1407, a positioning component 1408, and a power supply 1409.
  • the peripheral device interface 1403 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1401 and the memory 1402.
  • In some embodiments, the processor 1401, the memory 1402, and the peripheral device interface 1403 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1401, the memory 1402, and the peripheral device interface 1403 can be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the radio frequency circuit 1404 is used to receive and transmit RF (Radio Frequency, radio frequency) signals, also called electromagnetic signals.
  • the radio frequency circuit 1404 communicates with a communication network and other communication devices through electromagnetic signals.
  • the radio frequency circuit 1404 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • the radio frequency circuit 1404 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a user identity module card, and so on.
  • the radio frequency circuit 1404 can communicate with other terminals through at least one wireless communication protocol.
  • the wireless communication protocol includes, but is not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks and/or WiFi (Wireless Fidelity, wireless fidelity) networks.
  • the radio frequency circuit 1404 may also include a circuit related to NFC (Near Field Communication), which is not limited in the present disclosure.
  • the display screen 1405 is used to display UI (User Interface).
  • the UI can include graphics, text, icons, videos, and any combination thereof.
  • the display screen 1405 also has the ability to collect touch signals on or above the surface of the display screen 1405.
  • the touch signal may be input to the processor 1401 as a control signal for processing.
  • the display screen 1405 can also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
  • In some embodiments, there may be one display screen 1405, provided on the front panel of the device 1400; in other embodiments, there may be at least two display screens 1405, provided on different surfaces of the device 1400 or in a folded design; in still other embodiments, the display screen 1405 may be a flexible display screen arranged on a curved or folded surface of the device 1400. Furthermore, the display screen 1405 can also be set to a non-rectangular irregular shape, that is, a special-shaped screen.
  • the display screen 1405 may be made of materials such as LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode, organic light-emitting diode).
  • the camera assembly 1406 is used to capture images or videos.
  • the camera assembly 1406 includes a front camera and a rear camera.
  • the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
  • the camera assembly 1406 may also include a flash.
  • The flash can be a single color temperature flash or a dual color temperature flash. A dual color temperature flash refers to a combination of a warm light flash and a cold light flash, and can be used for light compensation under different color temperatures.
  • the audio circuit 1407 may include a microphone and a speaker.
  • the microphone is used to collect sound waves of the user and the environment, and convert the sound waves into electrical signals and input them to the processor 1401 for processing, or input to the radio frequency circuit 1404 to implement voice communication.
  • the microphone can also be an array microphone or an omnidirectional collection microphone.
  • the speaker is used to convert the electrical signal from the processor 1401 or the radio frequency circuit 1404 into sound waves.
  • the speaker can be a traditional thin-film speaker or a piezoelectric ceramic speaker.
  • when the speaker is a piezoelectric ceramic speaker, it can not only convert the electrical signal into sound waves audible to humans, but also convert the electrical signal into sound waves inaudible to humans for purposes such as distance measurement.
  • the audio circuit 1407 may also include a headphone jack.
  • the positioning component 1408 is used to locate the current geographic location of the device 1400 to implement navigation or LBS (Location Based Service, location-based service).
  • the positioning component 1408 may be a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
  • the power supply 1409 is used to supply power to various components in the device 1400.
  • the power source 1409 may be alternating current, direct current, disposable batteries, or rechargeable batteries.
  • the rechargeable battery may support wired charging or wireless charging, and may also be used to support fast charging technology.
  • the device 1400 further includes one or more sensors 1410.
  • the one or more sensors 1410 include, but are not limited to: an acceleration sensor 1411, a gyroscope sensor 1412, a pressure sensor 1413, a fingerprint sensor 1414, an optical sensor 1415, and a proximity sensor 1416.
  • the acceleration sensor 1411 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the device 1400.
  • the acceleration sensor 1411 can be used to detect the components of gravitational acceleration on three coordinate axes.
  • the processor 1401 may control the touch screen 1405 to display the user interface in a horizontal view or a vertical view according to the gravity acceleration signal collected by the acceleration sensor 1411.
  • the acceleration sensor 1411 may also be used for the collection of game or user motion data.
  • the gyroscope sensor 1412 can detect the body direction and rotation angle of the device 1400, and the gyroscope sensor 1412 can cooperate with the acceleration sensor 1411 to collect the user's 3D actions on the device 1400. Based on the data collected by the gyroscope sensor 1412, the processor 1401 can implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 1413 may be arranged on the side frame of the device 1400 and/or the lower layer of the touch screen 1405.
  • when the pressure sensor 1413 is arranged on the side frame of the device 1400, the user's holding signal on the device 1400 can be detected, and the processor 1401 performs left/right-hand recognition or shortcut operations according to the holding signal collected by the pressure sensor 1413.
  • when the pressure sensor 1413 is arranged on the lower layer of the touch display screen 1405, the processor 1401 controls the operable controls on the UI according to the user's pressure operation on the touch display screen 1405.
  • the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
  • the fingerprint sensor 1414 is used to collect the user's fingerprint.
  • the processor 1401 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1414, or the fingerprint sensor 1414 identifies the user's identity according to the collected fingerprint. When it is recognized that the user's identity is a trusted identity, the processor 1401 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings.
  • the fingerprint sensor 1414 may be provided on the front, back, or side of the device 1400. When a physical button or a manufacturer logo is provided on the device 1400, the fingerprint sensor 1414 may be integrated with the physical button or the manufacturer logo.
  • the optical sensor 1415 is used to collect the ambient light intensity.
  • the processor 1401 may control the display brightness of the touch screen 1405 according to the intensity of the ambient light collected by the optical sensor 1415. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1405 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1405 is decreased.
  • the processor 1401 may also dynamically adjust the shooting parameters of the camera assembly 1406 according to the ambient light intensity collected by the optical sensor 1415.
  • the proximity sensor 1416, also called a distance sensor, is usually arranged on the front panel of the device 1400.
  • the proximity sensor 1416 is used to collect the distance between the user and the front of the device 1400.
  • when the proximity sensor 1416 detects that the distance between the user and the front of the device 1400 gradually decreases, the processor 1401 controls the touch display screen 1405 to switch from the bright-screen state to the off-screen state; when the proximity sensor 1416 detects that the distance between the user and the front of the device 1400 gradually increases, the processor 1401 controls the touch display screen 1405 to switch from the off-screen state to the bright-screen state.
  • those skilled in the art can understand that the structure shown in FIG. 14 does not constitute a limitation on the device 1400, which may include more or fewer components than shown in the figure, combine certain components, or adopt a different component arrangement.
  • in an exemplary embodiment, a non-volatile computer-readable storage medium storing instructions, such as a memory including instructions, is also provided.
  • when the instructions in the computer-readable storage medium are run on a computer, the computer can be caused to execute the method for determining the distortion parameter of the display device shown in FIG. 5 or FIG. 6.
  • the non-transitory computer-readable storage medium may be ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure provides a display device and a distortion parameter determination method, apparatus and system therefor, and a storage medium, belonging to the field of display technology. Since the method can acquire a VR image displayed by a virtual reality device and a reference image that are both captured with the same target shooting parameters, the actual image height between any two target points in the VR image can be flexibly determined on the basis of the reference image, and the distortion parameter of the virtual reality device can be reliably determined on the basis of the actual image height. Therefore, compared with the related art in which the distortion parameter is obtained from the manufacturer, the distortion parameter determined by this method has higher accuracy, i.e., better reliability in determining the distortion parameter.

Description

Display device and distortion parameter determination method, apparatus, system and storage medium therefor
The present disclosure claims priority to Chinese Patent Application No. 202010466124.4, filed on May 28, 2020 and entitled "Display device and distortion parameter determination method, apparatus, system and storage medium therefor", the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of display technology, and in particular to a display device and a distortion parameter determination method, apparatus, system and storage medium therefor.
Background
A virtual reality (VR) device is a display device capable of producing a virtual-reality visual effect. However, affected by the inherent characteristics of the optical lenses in a VR device, the VR image presented to the user by the VR device inevitably suffers distortion, i.e., the VR image is prone to being distorted and deformed.
Summary
The present disclosure provides a display device and a distortion parameter determination method, apparatus and system therefor, and a storage medium. The technical solutions are as follows:
In one aspect, a method for determining a distortion parameter of a display device is provided, where the display device is a virtual reality device and the virtual reality device includes a display screen. The method includes:
acquiring a VR image obtained by an image acquisition device shooting, with target shooting parameters, a target image displayed on the display screen;
acquiring a reference image obtained by the image acquisition device shooting a base image with the target shooting parameters;
determining, according to the reference image, an actual image height between a first target point and a second target point in the VR image; and
determining a distortion parameter of the virtual reality device according to the actual image height.
Optionally, the determining, according to the reference image, the actual image height between the first target point and the second target point in the VR image includes:
determining a first reference point and a second reference point from a plurality of reference points included in the reference image;
calculating a target distance between the first reference point and the second reference point according to the size of the base image; and
determining the target distance as the actual image height between the first target point and the second target point in the VR image;
where the position of the first reference point in the reference image is the same as the position of the first target point in the VR image, and the position of the second reference point in the reference image is the same as the position of the second target point in the VR image.
Optionally, the determining, according to the reference image, the actual image height between the first target point and the second target point in the VR image includes:
in response to an image editing instruction, displaying the VR image and the reference image in a superimposed manner, where the transparency of the image, of the VR image and the reference image, located closer to the display side is greater than a transparency threshold; and
receiving the actual image height between the first target point and the second target point in the VR image input on the basis of the superimposed images.
Optionally, the determining the distortion parameter of the virtual reality device according to the actual image height includes: acquiring an actual object height between a first display point and a second display point on the display screen; and
determining the distortion parameter of the virtual reality device according to the actual object height and the actual image height;
where the position of the first display point on the display screen is the same as the position of the first target point in the VR image, and the position of the second display point on the display screen is the same as the position of the second target point in the VR image.
Optionally, the acquiring the actual object height between the first display point and the second display point on the display screen includes:
receiving a size parameter of the display screen; and
calculating the actual object height between the first display point and the second display point on the display screen according to the size parameter.
Optionally, the determining the distortion parameter of the virtual reality device according to the actual object height and the actual image height includes:
determining an ideal image height between the first target point and the second target point according to the actual object height; and
determining the distortion parameter of the virtual reality device according to the ideal image height and the actual image height, where the distortion parameter is positively correlated with the actual image height.
Optionally, the distortion parameter satisfies:
(d1-d2)/d2;
where d1 is the actual image height and d2 is the ideal image height.
Optionally, the target image and the base image are both regular grid images or regular dot-matrix images.
Optionally, the size of the base image is the same as the size of the target image.
Optionally, the acquiring the VR image obtained by the image acquisition device shooting, with the target shooting parameters, the target image displayed on the display screen includes: acquiring a VR image obtained by the image acquisition device shooting, with the target shooting parameters and at a first distance from the imaging plane of the virtual reality device, the target image displayed on the display screen;
the acquiring the reference image obtained by the image acquisition device shooting the base image with the target shooting parameters includes: acquiring a reference image obtained by the image acquisition device shooting the base image with the target shooting parameters at a second distance from the base image;
where the first distance is equal to the second distance.
Optionally, the target image is the base image displayed on the display screen of the virtual reality device.
In another aspect, a display device is provided, where the display device is a virtual reality device, and the distortion parameter of the virtual reality device is determined by the method for determining a distortion parameter of a display device described in the above aspect.
In yet another aspect, an apparatus for determining a distortion parameter of a display device is provided, where the display device is a virtual reality device and the virtual reality device includes a display screen. The apparatus includes:
a first acquisition module, configured to acquire a VR image obtained by an image acquisition device shooting, with target shooting parameters, a target image displayed on the display screen;
a second acquisition module, configured to acquire a reference image obtained by the image acquisition device shooting a base image with the target shooting parameters;
a first determination module, configured to determine, according to the reference image, an actual image height between a first target point and a second target point in the VR image; and
a second determination module, configured to determine a distortion parameter of the virtual reality device according to the actual image height.
In still another aspect, an apparatus for determining a distortion parameter of a display device is provided. The apparatus includes: a processor; and a memory configured to store executable instructions of the processor, where the processor is configured to execute the method for determining a distortion parameter of a display device described in the above aspect.
In still another aspect, a system for determining a distortion parameter of a display device is provided. The system includes: a virtual reality device, an image acquisition device, and a distortion parameter determination device, where the distortion parameter determination device includes the apparatus for determining a distortion parameter of a display device described in the above aspect, and a communication connection is established between the image acquisition device and the distortion parameter determination device;
the image acquisition device is configured to: shoot, with target shooting parameters, a target image displayed on the display screen of the virtual reality device to obtain a VR image, shoot a base image with the target shooting parameters to obtain a reference image, and send the VR image and the reference image to the distortion parameter determination device; and
the distortion parameter determination device is configured to: determine a distortion parameter of the virtual reality device on the basis of the VR image and the reference image.
Optionally, the distance between the image acquisition device and the imaging plane of the virtual reality device when the image acquisition device shoots the target image is equal to the distance between the image acquisition device and the base image when the image acquisition device shoots the base image.
Optionally, when the image acquisition device shoots the base image, the image coverage of the base image in the image acquisition device is not smaller than the image coverage of the target image in the image acquisition device when the image acquisition device shoots the target image.
Optionally, the target shooting parameters include at least one of a focal length, an aperture size, a shutter duration, a contrast, and a color saturation of the image acquisition device.
Optionally, the image acquisition device is integrated in the distortion parameter determination device.
In still another aspect, a non-volatile computer-readable storage medium is provided, where instructions are stored in the computer-readable storage medium, and when the computer-readable storage medium runs on a computer, the computer is caused to execute the method for determining a distortion parameter of a display device described in the above aspect.
Brief Description of the Drawings
In order to more clearly describe the technical solutions in the embodiments of the present disclosure, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present disclosure, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a schematic diagram of an implementation environment involved in a method for determining a distortion parameter of a display device provided by an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a scene of shooting a target image provided by an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a scene of shooting a base image provided by an embodiment of the present disclosure;
FIG. 4 is a schematic diagram including a reference image and a VR image provided by an embodiment of the present disclosure;
FIG. 5 is a flowchart of a method for determining a distortion parameter of a display device provided by an embodiment of the present disclosure;
FIG. 6 is a flowchart of another method for determining a distortion parameter of a display device provided by an embodiment of the present disclosure;
FIG. 7 is a flowchart of a method for determining an actual image height provided by an embodiment of the present disclosure;
FIG. 8 is a flowchart of another method for determining an actual image height provided by an embodiment of the present disclosure;
FIG. 9 is another schematic diagram including a reference image and a VR image provided by an embodiment of the present disclosure;
FIG. 10 is a schematic structural diagram of a display device provided by an embodiment of the present disclosure;
FIG. 11 is a block diagram of an apparatus for determining a distortion parameter of a display device provided by an embodiment of the present disclosure;
FIG. 12 is a block diagram of a second determination module provided by an embodiment of the present disclosure;
FIG. 13 is a schematic structural diagram of another apparatus for determining a distortion parameter provided by an embodiment of the present disclosure;
FIG. 14 is a hardware structure diagram of yet another apparatus for determining a distortion parameter provided by an embodiment of the present disclosure.
Detailed Description
To make the objectives, technical solutions and advantages of the present disclosure clearer, the embodiments of the present disclosure are described in further detail below with reference to the accompanying drawings.
FIG. 1 is a schematic diagram of an implementation environment involved in a method for determining a distortion parameter of a display device provided by an embodiment of the present disclosure, i.e., a schematic diagram of a distortion parameter determination system. As shown in FIG. 1, the system may include: a VR device 01, an image acquisition device 02 (which may also be referred to as a shooting apparatus), and a distortion parameter determination device 03 (which may also be referred to as a processing apparatus). The image acquisition device 02 may establish a communication connection with the distortion parameter determination device 03 through a wired or wireless network. The VR device 01 may include a display screen.
Optionally, when the distortion parameter of the VR device 01 needs to be determined, a target image may first be displayed on the display screen of the VR device 01, and the image acquisition device 02 may first shoot the target image with target shooting parameters to obtain a VR image with good definition. Then, still referring to FIG. 1, a base image may be provided, and the shooting parameters of the image acquisition device 02 are kept unchanged, i.e., the image acquisition device 02 continues to shoot the base image with the target shooting parameters to obtain a reference image with good definition. Of course, the base image may also be shot first to obtain the reference image, and the target image shot afterwards to obtain the VR image; the embodiments of the present disclosure do not limit the shooting order. Finally, the image acquisition device 02 may send the captured VR image and reference image to the distortion parameter determination device 03, so that the distortion parameter determination device 03 determines the distortion parameter of the VR device 01 on the basis of the VR image and the reference image.
Optionally, the base image may be an image of a physical object or a virtual image (which may also be referred to as an electronic image) displayed on the display screen of an electronic device. For example, the electronic device may be a device with a display screen such as a tablet computer or a smartphone. It should be noted that, if the base image is an image of a physical object, the definition of the captured reference image may be affected by ambient light. Therefore, to ensure as far as possible that the captured VR image and reference image have consistent definition, an electronic image is generally chosen as the base image.
Optionally, the target shooting parameters may include at least one of: the focal length, aperture size, shutter duration, contrast, and color saturation of the image acquisition device 02. Of course, the target shooting parameters are not limited to these; any parameter that may affect the shooting result can serve as a target shooting parameter.
Optionally, to ensure consistency between the sizes of the captured VR image and reference image and facilitate subsequent determination of the distortion parameter, with reference to FIG. 2 and FIG. 3, the distance L1 between the image acquisition device 02 and the imaging plane of the VR device (i.e., the VR virtual image plane shown in FIG. 2) when shooting the target image displayed on the display screen of the VR device 01 may be equal to the distance L2 between the image acquisition device 02 and the base image when shooting the base image. That is, the distance L2 may be the sum of the VR virtual image distance and the distance from the VR device 01 to the image acquisition device 02. The VR virtual image distance refers to the distance from the display screen of the VR device 01 to the VR virtual image plane, and can be provided by the manufacturer of the VR device 01 or obtained by measurement.
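As a quick illustration, the placement constraint just described reduces to one line of arithmetic. The sketch below only restates it; the function name and the numeric values in the example are invented for illustration and are not part of this disclosure.

```python
def camera_to_base_image_distance(vr_virtual_image_distance_mm: float,
                                  vr_device_to_camera_mm: float) -> float:
    """L2 = VR virtual image distance + distance from the VR device to the camera,
    so that L2 matches the camera-to-virtual-image-plane distance L1."""
    return vr_virtual_image_distance_mm + vr_device_to_camera_mm

# Example: a 1200 mm virtual image distance and a camera placed 50 mm in front of
# the headset (made-up numbers) put the base image 1250 mm from the camera.
L2 = camera_to_base_image_distance(1200.0, 50.0)
```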
For example, taking the case where the target image is shot first and the base image is shot afterwards, in order to make the distances L1 and L2 equal, with reference to FIG. 2, the image acquisition device 02 may first be controlled to shoot, at a preset position, the target image displayed on the display screen of the VR device 01. After the shooting is completed, the position of the image acquisition device 02 may be kept fixed, only the VR device 01 is removed, and then the base image is moved to the position where the imaging plane of the VR device 01 was located when the target image was shot, and the base image is shot. Alternatively, after the target image is shot, the VR device 01 may be left in place while the image acquisition device 02 is moved from the preset position to another position, as long as the distance L2 between the image acquisition device 02 and the base image when shooting the base image at the other position is equal to the distance L1 between the image acquisition device 02 and the imaging plane of the VR device when shooting the target image.
Optionally, the contents of the base image and of the target image displayed on the display screen of the VR device 01 can be selected according to actual needs. For example, referring to FIG. 2, the target image displayed on the display screen of the VR device 01 may be a dot-matrix image. For example, referring to FIG. 3, the base image may be a grid image.
Optionally, to ensure that the size of the captured reference image is not smaller than the size of the VR image and further facilitate subsequent determination of the distortion parameter, when the image acquisition device 02 shoots the base image, the image coverage of the base image in the image acquisition device 02 may be not smaller than the image coverage of the target image in the image acquisition device 02 when the image acquisition device 02 shoots the target image displayed on the display screen of the VR device 01.
For example, take the case where the image coverage of the base image in the image acquisition device 02 is equal to the image coverage of the target image in the image acquisition device 02, and the base image is located at the position of the imaging plane of the VR device 01. Since a VR device is generally a symmetric optical system, the base image may cover only half the field of view of the display screen of the VR device 01, i.e., when the base image is shot, only one half of the base image may enter the shooting lens of the image acquisition device 02. Alternatively, the base image may cover only half a diagonal of the display screen of the VR device 01, i.e., when the base image is shot, only one quarter of the base image may enter the shooting lens of the image acquisition device 02.
Optionally, the VR device 01 may be any form of VR device, such as head-mounted VR glasses. The image acquisition device 02 may be any device capable of capturing images, such as a single-lens reflex camera, a smartphone or a tablet computer. The distortion parameter determination device 03 may be a mobile terminal, such as an external computer device.
Optionally, the image acquisition device 02 may be integrated in the distortion parameter determination device 03, i.e., the distortion parameter determination device 03 may have a shooting component (i.e., the image acquisition device 02). Correspondingly, the distortion parameter determination device 03 may be configured to shoot the target image with its own shooting component to obtain the VR image, shoot the base image to obtain the reference image, and determine the distortion parameter of the VR device on the basis of the VR image and the reference image. That is, the distortion parameter determination device 03 can itself directly acquire the images, without needing to receive the VR image and the reference image from a separate external image acquisition device 02.
Optionally, in the embodiments of the present disclosure, with reference to FIG. 2 to FIG. 4, after acquiring the VR image and the reference image, the distortion parameter determination device 03 can reliably determine the distortion parameter of the VR device 01 by overlaying the two images. Since the distortion parameter of the whole VR device 01 is determined by comparing two images captured with the same target shooting parameters, there is no need to eliminate the distortion of the image acquisition device 02, i.e., there is no need to calibrate or correct the distortion of the image acquisition device 02. Determining the distortion parameter of the VR device 01 through the system shown in FIG. 1 is not only highly reliable but also simple in principle and convenient in operation. In addition, the system may determine the distortion parameter on the basis of the VR image before distortion correction, i.e., before anti-distortion, or on the basis of the VR image after distortion correction, i.e., after anti-distortion; the embodiments of the present disclosure do not limit this.
FIG. 5 is a flowchart of a method for determining a distortion parameter of a display device provided by an embodiment of the present disclosure, where the display device may be the VR device 01 including a display screen shown in FIG. 1. The method may be applied to the distortion parameter determination device 03 shown in FIG. 1. As shown in FIG. 5, the method may include:
Step 501: acquiring a VR image obtained by an image acquisition device shooting, with target shooting parameters, a target image displayed on the display screen.
Optionally, with reference to FIG. 1, the VR image may be an image obtained by the image acquisition device shooting, with the target shooting parameters, the target image displayed on the display screen of the VR device. That is, the distortion parameter determination device may receive the VR image sent by the image acquisition device.
Step 502: acquiring a reference image obtained by the image acquisition device shooting a base image with the target shooting parameters.
Optionally, with reference to FIG. 1, the reference image may be an image obtained by the image acquisition device shooting the base image with the target shooting parameters. That is, the distortion parameter determination device may receive the reference image sent by the image acquisition device.
Step 503: determining, according to the reference image, an actual image height between a first target point and a second target point in the VR image.
Optionally, after acquiring the reference image and the VR image, the distortion parameter determination device may automatically calculate, on the basis of the size of the base image, the actual image height between the first target point and the second target point in the VR image. Alternatively, the distortion parameter determination device may display the VR image and the reference image to a user and receive the actual image height between the first target point and the second target point in the VR image input by the user on the basis of the two displayed images.
Step 504: determining a distortion parameter of the VR device according to the actual image height.
Optionally, the distortion parameter determination device may automatically perform a calculation on the basis of the determined actual image height to determine the distortion parameter of the VR device. Alternatively, the distortion parameter determination device may receive a distortion parameter of the VR device input by the user on the basis of the actual image height.
In summary, the embodiments of the present disclosure provide a method for determining a distortion parameter of a display device. Since the method can acquire a VR image displayed by the VR device and a reference image that are captured with the same target shooting parameters, the actual image height between any two target points in the VR image can be flexibly determined on the basis of the reference image, and the distortion parameter of the VR device can be reliably determined on the basis of the actual image height. Therefore, compared with the related art in which the distortion parameter is obtained from the manufacturer, the distortion parameter determined by this determination method has higher accuracy, i.e., better reliability.
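Viewed as a data flow, steps 501 to 504 take in two photographs shot with identical target shooting parameters and return a distortion parameter. The following minimal sketch only fixes that interface; the two callables stand in for whichever concrete measurement and distortion rules are used (several are described below), Pillow is assumed purely for the image type, and all names are illustrative rather than part of this disclosure.

```python
from typing import Callable
from PIL import Image  # Pillow, assumed only to give the photos a concrete type

def determine_vr_distortion(
    vr_image: Image.Image,         # step 501: photo of the target image shown by the VR device
    reference_image: Image.Image,  # step 502: photo of the base image, same shooting parameters
    measure_actual_image_height: Callable[[Image.Image, Image.Image], float],
    distortion_from_height: Callable[[float], float],
) -> float:
    """Chain steps 503 and 504: measure the actual image height between two target
    points using the reference image as the length scale, then map that height to
    a distortion parameter."""
    d_actual = measure_actual_image_height(vr_image, reference_image)  # step 503
    return distortion_from_height(d_actual)                            # step 504
```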
FIG. 6 is a flowchart of another method for determining a distortion parameter of a display device provided by an embodiment of the present disclosure, which may be applied to the distortion parameter determination device 03 shown in FIG. 1. As shown in FIG. 6, the method may include:
Step 601: acquiring a VR image obtained by an image acquisition device shooting, with target shooting parameters, a target image displayed on the display screen.
Optionally, as described in the above embodiment with reference to FIG. 1, the distortion parameter determination device may receive the VR image sent by the image acquisition device. That is, the VR image may be an image obtained by the image acquisition device shooting, with the target shooting parameters and at a first distance from the imaging plane of the virtual reality device, the target image displayed on the display screen. It should be noted that the VR image acquired in the embodiments of the present disclosure refers only to the image displayed on the display screen of the VR device, not to the stereoscopic image viewed by the user through the VR device.
Step 602: acquiring a reference image obtained by the image acquisition device shooting a base image with the target shooting parameters.
Optionally, as described in the above embodiment with reference to FIG. 1, the distortion parameter determination device may also receive the reference image sent by the image acquisition device. That is, the reference image may be an image obtained by the image acquisition device shooting the base image with the target shooting parameters at a second distance from the base image. Moreover, as described in the above embodiments with reference to FIG. 2 and FIG. 3, to ensure consistency between the sizes of the captured VR image and reference image and facilitate subsequent determination of the distortion parameter, the first distance from the imaging plane of the VR device displaying the target image when the image acquisition device shoots the target image to obtain the VR image in step 601 may be equal to the second distance.
The optional types of the target shooting parameters, and the optional implementations of capturing the VR image and the reference image by the image acquisition device, can all be found in the above embodiments and are not repeated here.
It should be noted that the target image shot in step 601 may be the base image displayed on the display screen, i.e., only one base image may need to be provided. When the VR image needs to be acquired, the base image can be displayed on the display screen of the VR device as the target image, and the image acquisition device is used to shoot this target image (i.e., the base image displayed on the display screen of the VR device) to obtain the VR image. When the reference image needs to be acquired, the image acquisition device can be used to shoot the base image directly to obtain the reference image.
It should also be noted that, referring to FIG. 4 above, due to the inherent characteristics of the shooting lens in the image acquisition device, both the VR image and the reference image captured by the image acquisition device may be distorted images. However, in the embodiments of the present disclosure, the distortion parameter of the VR device can be determined directly on the basis of the reference image, without performing distortion correction on the image acquisition device on the basis of the distorted reference image, so the method is relatively simple.
Step 603: determining, according to the reference image, an actual image height between a first target point and a second target point in the VR image.
Optionally, the first target point and the second target point may be any two points in the VR image. The VR image may consist of a plurality of pixels arranged in a matrix; each target point in the VR image may include only one pixel, or each target point in the VR image may include two or more pixels. The actual image height between the first target point and the second target point may refer to the shortest distance (e.g., straight-line distance) between the first target point and the second target point.
Optionally, the distortion parameter determination device may, on the basis of the size of the base image, directly determine the positions of the two target points using an image recognition algorithm and calculate the actual image height between the two target points. Alternatively, the distortion parameter determination device may display the two acquired images to a user (e.g., an operator) and receive the actual image height input by the user on the basis of the displayed images.
As one optional implementation, FIG. 7 shows a flowchart of a method for determining the actual image height, taking the case where the distortion parameter determination device directly calculates the actual image height as an example. As shown in FIG. 7, the method may include:
Step 6031A: determining a first reference point and a second reference point from a plurality of reference points included in the reference image.
Optionally, the first reference point and the second reference point may be any two points in the reference image. The reference image may also consist of a plurality of pixels arranged in a matrix; each reference point in the reference image may include only one pixel, or each reference point may include two or more pixels.
The position of the first reference point in the reference image is the same as the position of the first target point in the VR image, and the position of the second reference point in the reference image is the same as the position of the second target point in the VR image. That is, if the VR image and the reference image are overlaid, the first reference point overlaps the first target point and the second reference point overlaps the second target point. Moreover, the number of pixels included in the first reference point may be the same as the number of pixels included in the first target point, and the number of pixels included in the second reference point may be the same as the number of pixels included in the second target point.
It should be noted that the coordinates of a reference point or target point in the image to which it belongs can be used to refer to the position of that reference point or target point in that image. That is, the position of the first reference point in the reference image being the same as the position of the first target point in the VR image may mean that the coordinates of the first reference point in the reference image are the same as the coordinates of the first target point in the VR image. The same applies to the second reference point and the second target point.
In the embodiments of the present disclosure, after acquiring the reference image and the VR image, the distortion parameter determination device may first compare the two images to determine the first reference point and the second reference point from the plurality of reference points included in the reference image, i.e., find the two points at corresponding identical positions in the VR image and the reference image.
Step 6032A: calculating a target distance between the first reference point and the second reference point according to the size of the base image.
Optionally, the distortion parameter determination device may calculate the size of the base image after acquiring the base image, or the distortion parameter determination device may directly receive the size of the base image input by a user. For the distortion parameter determination device to calculate the size of the base image itself, it needs to obtain in advance the arrangement rule (e.g., aspect ratio) of the base image. That is, the distortion parameter determination device has either stored the size of the base image or stored the arrangement rule of the base image. Correspondingly, the base image may also be referred to as an image with known internal dimensions or an image with a known internal arrangement rule.
After determining the first reference point and the second reference point, the distortion parameter determination device can calculate the target distance between the first reference point and the second reference point on the basis of the determined size of the base image. The target distance may be the shortest distance between the first reference point and the second reference point.
Optionally, to facilitate calculation of the target distance by the distortion parameter determination device, the base image may be a regular image. For example, the base image may be the regular grid image shown in FIG. 3.
Step 6033A: determining the target distance as the actual image height between the first target point and the second target point in the VR image.
Since the first reference point and the first target point are two points at the same position in the two images, and the second reference point and the second target point are two points at the same position in the two images, the distortion parameter determination device can directly determine the shortest distance between the first reference point and the second reference point (i.e., the target distance) as the shortest distance between the first target point and the second target point. Since the actual image height between the first target point and the second target point refers to the shortest distance between them, the target distance between the first reference point and the second reference point can be determined as the actual image height between the first target point and the second target point. In other words, the reference image can be used as a reference coordinate system, the coordinates of each target point in the VR image are found on the basis of this reference coordinate system, and the distances between target points (i.e., the actual image heights) are calculated.
It should be noted that the distortion parameter determination device may directly perform steps 6031A to 6033A after acquiring the reference image and the VR image, or it may perform steps 6031A to 6033A in sequence upon receiving a distortion parameter determination instruction. Optionally, the distortion parameter determination instruction may be generated by the user triggering a control displayed on the distortion parameter determination device. In addition, since steps 6031A to 6033A are performed automatically by the distortion parameter determination device, the distortion parameter determination device does not need to overlay and display the VR image and the reference image as shown in FIG. 4.
As another optional implementation, FIG. 8 shows a flowchart of a method for determining the actual image height, taking the case where the distortion parameter determination device acquires the actual image height input by a user as an example. As shown in FIG. 8, the method may include:
Step 6031B: displaying the VR image and the reference image in a superimposed manner, and setting the transparency of the image, of the VR image and the reference image, located closer to the display side to be greater than a transparency threshold.
To facilitate viewing by the user for distance calculation, the distortion parameter determination device may display the VR image and the reference image in a superimposed manner, and set the transparency of the image, of the VR image and the reference image, located closer to the display side to be greater than a transparency threshold, i.e., make the transparency of the upper image greater than that of the lower image. The transparency threshold may be preconfigured in the distortion parameter determination device, or received from a user in real time when the distortion parameter is determined. The preconfiguration may be done by developers at the factory, or configured on the spot by an operator before the distortion parameter is determined.
Optionally, the distortion parameter determination device may, upon receiving an image editing instruction, display the VR image and the reference image according to the editing requirement carried by the image editing instruction. For example, the editing requirement may be to display the VR image and the reference image in a superimposed manner and set the transparency of the image, of the VR image and the reference image, located closer to the display side to be greater than the transparency threshold. Alternatively, the distortion parameter determination device may, after receiving the distortion parameter determination instruction or acquiring the VR image and the reference image, automatically display the VR image and the reference image in a superimposed manner and set the transparency of the image located closer to the display side to be greater than the transparency threshold. In addition, both the image editing instruction and the distortion parameter determination instruction may be generated by the user triggering a control displayed on the distortion parameter determination device.
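As an illustration of the superimposed display in step 6031B, the following is a minimal sketch that blends the two photographs so both patterns remain visible. Pillow is an assumed choice of library, and the default opacity value is arbitrary; neither is prescribed by this disclosure.

```python
from PIL import Image

def overlay_for_inspection(lower_path: str, upper_path: str,
                           upper_alpha: float = 0.6) -> Image.Image:
    """Blend the image closer to the display side (upper) over the other one.

    upper_alpha is the opacity of the upper image; a value below 1.0 keeps the
    lower image visible, which plays the role of the transparency threshold in
    step 6031B (the concrete value 0.6 is just an assumed default).
    """
    lower = Image.open(lower_path).convert("RGBA")
    upper = Image.open(upper_path).convert("RGBA")
    # The two photos are assumed to share a size because they were shot with the
    # same target shooting parameters; resize defensively if they differ slightly.
    if upper.size != lower.size:
        upper = upper.resize(lower.size)
    return Image.blend(lower, upper, alpha=upper_alpha)

# Example: save the composite so an operator can read off the two reference points.
# overlay_for_inspection("reference.png", "vr.png").save("overlay.png")
```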
Step 6032B: receiving the actual image height between the first target point and the second target point in the VR image input by the user on the basis of the superimposed images.
For the user, with reference to FIG. 4, he or she may first determine the first reference point and the second reference point in the reference image by comparing the superimposed VR image and reference image, then calculate the target distance between the first reference point and the second reference point on the basis of the size of the base image, and finally input the target distance into the distortion parameter determination device as the actual image height. That is, the distortion parameter determination device may receive the actual image height between the first target point and the second target point in the VR image input by the user on the basis of the superimposed images.
Optionally, the user may already know the size of the base image, or know its arrangement rule and calculate the size of the base image on the basis of the arrangement rule. Alternatively, the user may measure the size of the base image with a tool.
Optionally, to further facilitate comparison of the two images by the user, the VR image and the reference image displayed to the user by the distortion parameter determination device may have the same size and different colors, and/or, with reference to FIG. 2 and FIG. 3, one of the VR image and the reference image may be a regular grid image and the other a regular dot-matrix image. Correspondingly, this requires that the target image initially displayed on the display screen of the VR device and the base image have the same size and different colors, and/or that one of the target image and the base image is a regular grid image and the other a regular dot-matrix image. In addition, similar to the target image, the base image may also be an image with known internal dimensions or an image with a known internal arrangement rule.
For example, FIG. 9 shows a schematic diagram including a VR image and a reference image, taking the case where the target image is a regular dot-matrix image with known internal dimensions and the base image is a regular grid image with known internal dimensions, i.e., the captured VR image is a regular dot-matrix image and the reference image is a regular grid image with known internal dimensions. With reference to FIG. 9, assume that the first reference point in the reference image determined to be at the same position as the first target point in the VR image is point p1, the second reference point in the reference image at the same position as the second target point in the VR image is point p2, and each small grid cell in the base image is a square with side length a. Then, based on the Pythagorean theorem, the target distance (i.e., the shortest distance) between the first reference point p1 and the second reference point p2 can be calculated from the numbers of grid cells separating p1 and p2 horizontally and vertically: if p1 and p2 are separated by m cells horizontally and n cells vertically, the target distance is a·√(m²+n²).
The actual image height between the first target point and the second target point in the VR image that is finally determined is then this same value, a·√(m²+n²).
It should be noted that, since the two target points in the VR image shown in FIG. 9 happen to lie exactly at intersections of grid cells in the reference image, the Pythagorean theorem can be applied directly. If a target point does not lie at a grid intersection, the actual image height can be determined by estimated reading or by bisection. Moreover, the way of determining the actual image height between any other two target points can follow the way of determining the actual image height between the first target point and the second target point, and is not repeated here.
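The grid-based calculation in the example above can be written compactly. In the sketch below, the cell offsets and the side length a are assumed to have been read off the superimposed images (or produced by an image-recognition step); the names and the example numbers are illustrative only.

```python
import math

def target_distance_on_grid(dx_cells: float, dy_cells: float, cell_side_a: float) -> float:
    """Shortest distance between two reference points on a regular grid.

    dx_cells / dy_cells: horizontal and vertical separation of p1 and p2 in grid
    cells (fractional values cover points that do not sit exactly on a grid
    intersection, e.g. after estimated reading or bisection).
    cell_side_a: side length a of one square grid cell of the base image.
    """
    return cell_side_a * math.hypot(dx_cells, dy_cells)

# Example: points separated by 3 cells horizontally and 4 cells vertically on a
# grid with a = 2.0 mm (made-up numbers) give an actual image height of 10.0 mm.
actual_image_height = target_distance_on_grid(3, 4, 2.0)
```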
Step 604: acquiring an actual object height between a first display point and a second display point on the display screen.
Optionally, the shortest distance between the first display point and the second display point may be the actual object height between the first display point and the second display point. The actual object height may be obtained by the user from the manufacturer or measured with a tool, and input into the distortion parameter determination device; that is, the distortion parameter determination device may receive the actual object height between the first display point and the second display point on the display screen input by the user. Alternatively, the user may input only the size-related parameters of the display screen into the distortion parameter determination device, and the distortion parameter determination device calculates the actual object height between any two display points. Still alternatively, the VR device may store the size-related parameters of its display screen in advance, a communication connection may also be established between the VR device and the distortion parameter determination device, the VR device may send the size-related parameters of its display screen to the distortion parameter determination device, and the distortion parameter determination device calculates the actual object height between any two display points.
The first display point and the second display point may be any two points on the display screen. The display screen may also consist of a plurality of pixels arranged in a matrix; each display point on the display screen may include only one pixel, or each display point may include two or more pixels.
The coordinates of the first display point on the display screen may be the same as the position of the first target point in the VR image, and the coordinates of the second display point on the display screen may be the same as the position of the second target point in the VR image. The number of pixels included in the first display point may be the same as the number of pixels included in the first target point, and the number of pixels included in the second display point may be the same as the number of pixels included in the second target point. As with the above description of reference points and target points having the same positions, having the same position here can also be understood as having the same coordinates.
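Where the size-related parameters of the display screen are available, the actual object height can be computed instead of measured. The sketch below assumes those parameters are given as a pixel resolution plus the physical dimensions of the active area; all names and the example numbers are illustrative, not values taken from this disclosure.

```python
import math

def actual_object_height(p1_px: tuple[float, float], p2_px: tuple[float, float],
                         resolution: tuple[int, int],
                         panel_size_mm: tuple[float, float]) -> float:
    """Physical distance (mm) between two display points given in pixel coordinates.

    resolution:    (width_px, height_px) of the display screen
    panel_size_mm: (width_mm, height_mm) of the active area of the panel
    """
    pitch_x = panel_size_mm[0] / resolution[0]   # mm per pixel, horizontal
    pitch_y = panel_size_mm[1] / resolution[1]   # mm per pixel, vertical
    dx_mm = (p2_px[0] - p1_px[0]) * pitch_x
    dy_mm = (p2_px[1] - p1_px[1]) * pitch_y
    return math.hypot(dx_mm, dy_mm)

# Example with made-up numbers: a 1440x1600 panel whose active area is 51.8 mm x 57.6 mm.
h = actual_object_height((100, 200), (400, 600), (1440, 1600), (51.8, 57.6))
```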
It should be noted that the actual image height between any two target points corresponds to an actual object height between the two display points at the same positions. The correspondence between the finally obtained actual image heights and actual object heights can therefore be referred to as the distortion relationship.
Step 605: determining an ideal image height between the first target point and the second target point according to the actual object height.
Optionally, the distortion parameter determination device may further automatically calculate the ideal image height between the first target point and the second target point on the basis of the actual object height. Alternatively, the distortion parameter determination device may display the actual object height to the user, the user calculates the ideal image height on the basis of the actual object height and inputs it into the distortion parameter determination device, i.e., the distortion parameter determination device may receive the ideal image height between the first target point and the second target point input by the user.
Step 606: determining a distortion parameter of the VR device according to the ideal image height and the actual image height.
Optionally, the distortion parameter determination device may automatically calculate the distortion parameter of the VR device on the basis of the determined ideal image height and actual image height. Alternatively, the distortion parameter determination device may display the ideal image height and the actual image height to the user, the user calculates the distortion parameter on the basis of the ideal image height and the actual image height and inputs it into the distortion parameter determination device, i.e., the distortion parameter determination device may receive the distortion parameter of the VR device input by the user.
The distortion parameter may be positively correlated with the actual image height, i.e., the larger the actual image height, the larger the distortion parameter and the greater the degree of distortion; the smaller the actual image height, the smaller the distortion parameter and the smaller the degree of distortion. For example, the distortion parameter J1 may satisfy: J1 = (d1 - d2)/d2, where d1 is the actual image height and d2 is the ideal image height.
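The relation J1 = (d1 - d2)/d2 translates directly into code, and applying it to several point pairs at different distances from the central field point yields a set of distortion parameters, as discussed below. The numbers in the example are invented for illustration.

```python
def distortion_parameter(d1_actual: float, d2_ideal: float) -> float:
    """Distortion parameter J1 = (d1 - d2) / d2.

    Positive values mean the actual image height exceeds the ideal one; the
    magnitude grows with d1, matching the positive correlation described above.
    """
    return (d1_actual - d2_ideal) / d2_ideal

# Several point pairs at increasing distance from the central field point give a
# set of distortion parameters (heights in mm, values invented for illustration):
pairs = [(10.2, 10.0), (21.1, 20.0), (33.4, 30.0)]
parameters = [distortion_parameter(d1, d2) for d1, d2 in pairs]  # [0.02, 0.055, ~0.113]
```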
Alternatively, the distortion parameter determination device may determine the field of view of the distorted VR device on the basis of the actual image height and the virtual image distance of the VR device, and determine the distortion parameter of the VR device with the actual image height near the central field of view substituted for the ideal image height. The embodiments of the present disclosure do not limit the implementation of determining the distortion parameter on the basis of the actual image height, i.e., any required distortion parameter can be determined on the basis of the actual image height. In addition, the distortion parameter described in the above embodiments refers to any parameter that causes distortion of the VR image finally presented to the user.
It should be noted that the distortion parameters of the VR device determined by the distortion parameter determination device on the basis of actual image heights between target points at different distances in the VR image are different, i.e., the VR device may have a plurality of distortion parameters. The above embodiments only describe, as an example, one distortion parameter of the VR device determined on the basis of the actual image height between the first target point and the second target point in the VR image. For the method of determining other distortion parameters on the basis of the actual image height between any other two target points, reference can be made to the above method, which is not repeated here.
For example, assume that the first target point is taken as the center point, and three different distortion parameters of the VR device need to be determined on the basis of the actual image heights between the first target point and three other target points at different distances from it (e.g., the second target point, the third target point and the fourth target point). Then, the actual image height between the first target point and the second target point, the actual image height between the first target point and the third target point, and the actual image height between the first target point and the fourth target point may first be acquired respectively. Next, on the display screen of the VR device, the actual object height between the first display point at the same position as the first target point and the second display point at the same position as the second target point, the actual object height between the first display point and the third display point at the same position as the third target point, and the actual object height between the first display point and the fourth display point at the same position as the fourth target point are acquired respectively. Finally, three distortion parameters of different magnitudes are calculated on the basis of the actual image heights and actual object heights of the corresponding points.
It should also be noted that, after determining the distortion parameter of the VR device, the distortion parameter determination device may send the distortion parameter to the VR device, so that the VR device can obtain its own distortion parameter. Afterwards, the VR device may perform distortion correction, such as anti-distortion processing, on the VR image displayed to the user on the basis of its own distortion parameter, so as to prevent the displayed VR image from being distorted or deformed, thereby effectively improving the user's viewing experience.
It should also be noted that the order of the steps of the method for determining a distortion parameter provided by the embodiments of the present disclosure can be adjusted appropriately, and steps can be added or removed as needed; for example, step 601 and step 602 can be performed synchronously. Any variation of the method that can be easily conceived by those skilled in the art within the technical scope disclosed in the present disclosure shall fall within the protection scope of the present disclosure and is not repeated here.
In summary, the embodiments of the present disclosure provide a method for determining a distortion parameter of a display device. Since the method can acquire a VR image displayed by the VR device and a reference image that are captured with the same target shooting parameters, the actual image height between any two target points in the VR image can be flexibly determined on the basis of the reference image, and the distortion parameter of the VR device can be reliably determined on the basis of the actual image height. Therefore, compared with the related art in which the distortion parameter is obtained from the manufacturer, the distortion parameter determined by this determination method has higher accuracy, i.e., better reliability.
FIG. 10 is a schematic structural diagram of a display device provided by an embodiment of the present disclosure. As shown in FIG. 10, the display device may be a VR device 01, and the distortion parameter of the VR device 01 may be determined by the method for determining a distortion parameter of a display device shown in FIG. 5 or FIG. 6.
FIG. 11 is a block diagram of an apparatus for determining a distortion parameter of a display device provided by an embodiment of the present disclosure, where the display device is a VR device and the VR device includes a display screen. As shown in FIG. 11, the apparatus may include:
a first acquisition module 111, configured to acquire a VR image obtained by an image acquisition device shooting, with target shooting parameters, a target image displayed on the display screen;
a second acquisition module 112, configured to acquire a reference image obtained by the image acquisition device shooting a base image with the target shooting parameters;
a first determination module 113, configured to determine, according to the reference image, an actual image height between a first target point and a second target point in the VR image; and
a second determination module 114, configured to determine a distortion parameter of the VR device according to the actual image height.
Optionally, the first determination module 113 may be configured to:
determine a first reference point and a second reference point from a plurality of reference points included in the reference image;
calculate a target distance between the first reference point and the second reference point according to the size of the base image; and
determine the target distance as the actual image height between the first target point and the second target point in the VR image;
where the position of the first reference point in the reference image is the same as the position of the first target point in the VR image, and the position of the second reference point in the reference image is the same as the position of the second target point in the VR image.
Optionally, FIG. 12 is a block diagram of a second determination module 114 provided by an embodiment of the present disclosure. As shown in FIG. 12, the second determination module 114 may include:
an acquisition submodule 1141, configured to acquire an actual object height between a first display point and a second display point on the display screen; and
a third determination submodule 1142, configured to determine the distortion parameter of the VR device according to the actual object height and the actual image height;
where the position of the first display point on the display screen is the same as the position of the first target point in the VR image, and the position of the second display point on the display screen is the same as the position of the second target point in the VR image.
Optionally, the third determination submodule 1142 may be configured to:
determine an ideal image height between the first target point and the second target point according to the actual object height, and determine the distortion parameter of the VR device according to the ideal image height and the actual image height, where the distortion parameter is positively correlated with the actual image height.
Optionally, the distortion parameter may satisfy: (d1-d2)/d2, where d1 is the actual image height and d2 is the ideal image height.
Optionally, the target image and the base image may both be regular grid images or regular dot-matrix images.
Optionally, the size of the base image may be the same as the size of the target image.
Optionally, the first acquisition module 111 may be configured to: acquire a VR image obtained by the image acquisition device shooting, with the target shooting parameters and at a first distance from the imaging plane of the virtual reality device, the target image displayed on the display screen. The second acquisition module 112 may be configured to: acquire a reference image obtained by the image acquisition device shooting the base image with the target shooting parameters at a second distance from the base image. The first distance and the second distance may be equal.
Optionally, the target image may be the base image displayed on the display screen of the VR device.
In summary, the embodiments of the present disclosure provide an apparatus for determining a distortion parameter of a display device. Since the apparatus can acquire a VR image displayed by the VR device and a reference image that are captured with the same target shooting parameters, the actual image height between any two target points in the VR image can be flexibly determined on the basis of the reference image, and the distortion parameter of the VR device can be reliably determined on the basis of the actual image height. Therefore, compared with the related art in which the distortion parameter is obtained from the manufacturer, the distortion parameter determined by this apparatus has higher accuracy, i.e., better reliability.
FIG. 13 is a schematic structural diagram of another apparatus for determining a distortion parameter of a display device provided by an embodiment of the present disclosure. As shown in FIG. 13, the apparatus may include: a processor 1301; and a memory 1302 configured to store executable instructions of the processor 1301. The processor 1301 may be configured to execute the method for determining a distortion parameter of a display device shown in FIG. 5 or FIG. 6.
FIG. 14 shows a structural block diagram of an apparatus 1400 for determining a distortion parameter of a display device provided by an exemplary embodiment of the present disclosure. The device 1400 may be a terminal such as a smartphone, a tablet computer, a notebook computer or a desktop computer. The device 1400 may also be referred to by other names such as user equipment, portable terminal, laptop terminal or desktop terminal. Alternatively, the device 1400 may be a server. Generally, the device 1400 includes a processor 1401 and a memory 1402.
The processor 1401 may include one or more processing cores, such as a 4-core processor or a 14-core processor. The processor 1401 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array) and PLA (Programmable Logic Array). The processor 1401 may also include a main processor and a coprocessor. The main processor is a processor for processing data in the wake-up state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1401 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 1401 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1402 may include one or more computer-readable storage media, which may be non-transitory. The memory 1402 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 1402 is used to store at least one instruction, and the at least one instruction is executed by the processor 1401 to implement the method for determining a distortion parameter provided by the embodiments of the present disclosure.
In some embodiments, the device 1400 may further include: a peripheral device interface 1403 and at least one peripheral device. The processor 1401, the memory 1402 and the peripheral device interface 1403 may be connected through a bus or signal lines. Each peripheral device may be connected to the peripheral device interface 1403 through a bus, a signal line or a circuit board. Specifically, the peripheral devices include at least one of: a radio frequency circuit 1404, a touch display screen 1405, a camera 1406, an audio circuit 1407, a positioning component 1408 and a power supply 1409.
The peripheral device interface 1403 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1401 and the memory 1402. In some embodiments, the processor 1401, the memory 1402 and the peripheral device interface 1403 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1401, the memory 1402 and the peripheral device interface 1403 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1404 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1404 communicates with a communication network and other communication devices through electromagnetic signals. The radio frequency circuit 1404 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 1404 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 1404 can communicate with other terminals through at least one wireless communication protocol. The wireless communication protocol includes but is not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G and 5G), wireless local area networks and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1404 may also include circuits related to NFC (Near Field Communication), which is not limited in the present disclosure.
The display screen 1405 is used to display a UI (User Interface). The UI may include graphics, text, icons, videos and any combination thereof. When the display screen 1405 is a touch display screen, the display screen 1405 also has the ability to collect touch signals on or above its surface. The touch signal may be input to the processor 1401 as a control signal for processing. At this time, the display screen 1405 may also be used to provide virtual buttons and/or a virtual keyboard, also called soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1405, provided on the front panel of the device 1400; in other embodiments, there may be at least two display screens 1405, respectively provided on different surfaces of the device 1400 or in a folded design; in still other embodiments, the display screen 1405 may be a flexible display screen, arranged on a curved surface or a folding surface of the device 1400. The display screen 1405 can even be set in a non-rectangular irregular shape, i.e., a special-shaped screen. The display screen 1405 may be made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
The camera assembly 1406 is used to capture images or videos. Optionally, the camera assembly 1406 includes a front camera and a rear camera. Generally, the front camera is provided on the front panel of the terminal, and the rear camera is provided on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so as to realize a background blurring function by fusing the main camera and the depth-of-field camera, panoramic shooting and VR (Virtual Reality) shooting functions by fusing the main camera and the wide-angle camera, or other fused shooting functions. In some embodiments, the camera assembly 1406 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
The audio circuit 1407 may include a microphone and a speaker. The microphone is used to collect sound waves of the user and the environment, convert the sound waves into electrical signals and input them to the processor 1401 for processing, or input them to the radio frequency circuit 1404 to implement voice communication. For the purpose of stereo collection or noise reduction, there may be a plurality of microphones, respectively provided at different parts of the device 1400. The microphone may also be an array microphone or an omnidirectional collection microphone. The speaker is used to convert electrical signals from the processor 1401 or the radio frequency circuit 1404 into sound waves. The speaker may be a traditional thin-film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can not only convert electrical signals into sound waves audible to humans, but also convert electrical signals into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 1407 may also include a headphone jack.
The positioning component 1408 is used to locate the current geographic position of the device 1400 to implement navigation or LBS (Location Based Service). The positioning component 1408 may be a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1409 is used to supply power to the components in the device 1400. The power supply 1409 may be alternating current, direct current, disposable batteries or a rechargeable battery. When the power supply 1409 includes a rechargeable battery, the rechargeable battery may support wired charging or wireless charging, and may also be used to support fast charging technology.
In some embodiments, the device 1400 further includes one or more sensors 1410. The one or more sensors 1410 include but are not limited to: an acceleration sensor 1411, a gyroscope sensor 1412, a pressure sensor 1413, a fingerprint sensor 1414, an optical sensor 1415 and a proximity sensor 1416.
The acceleration sensor 1411 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established with the device 1400. For example, the acceleration sensor 1411 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 1401 may control the touch display screen 1405 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1411. The acceleration sensor 1411 may also be used for the collection of game or user motion data.
The gyroscope sensor 1412 can detect the body direction and rotation angle of the device 1400, and the gyroscope sensor 1412 can cooperate with the acceleration sensor 1411 to collect the user's 3D actions on the device 1400. According to the data collected by the gyroscope sensor 1412, the processor 1401 can implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1413 may be arranged on the side frame of the device 1400 and/or on the lower layer of the touch display screen 1405. When the pressure sensor 1413 is arranged on the side frame of the device 1400, the user's holding signal on the device 1400 can be detected, and the processor 1401 performs left/right-hand recognition or shortcut operations according to the holding signal collected by the pressure sensor 1413. When the pressure sensor 1413 is arranged on the lower layer of the touch display screen 1405, the processor 1401 controls the operable controls on the UI according to the user's pressure operation on the touch display screen 1405. The operable controls include at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1414 is used to collect the user's fingerprint. The processor 1401 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1414, or the fingerprint sensor 1414 identifies the user's identity according to the collected fingerprint. When the user's identity is identified as a trusted identity, the processor 1401 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1414 may be provided on the front, back or side of the device 1400. When a physical button or a manufacturer logo is provided on the device 1400, the fingerprint sensor 1414 may be integrated with the physical button or the manufacturer logo.
The optical sensor 1415 is used to collect the ambient light intensity. In one embodiment, the processor 1401 may control the display brightness of the touch display screen 1405 according to the ambient light intensity collected by the optical sensor 1415. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1405 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1405 is decreased. In another embodiment, the processor 1401 may also dynamically adjust the shooting parameters of the camera assembly 1406 according to the ambient light intensity collected by the optical sensor 1415.
The proximity sensor 1416, also called a distance sensor, is usually arranged on the front panel of the device 1400. The proximity sensor 1416 is used to collect the distance between the user and the front of the device 1400. In one embodiment, when the proximity sensor 1416 detects that the distance between the user and the front of the device 1400 gradually decreases, the processor 1401 controls the touch display screen 1405 to switch from the bright-screen state to the off-screen state; when the proximity sensor 1416 detects that the distance between the user and the front of the device 1400 gradually increases, the processor 1401 controls the touch display screen 1405 to switch from the off-screen state to the bright-screen state.
Those skilled in the art can understand that the structure shown in FIG. 14 does not constitute a limitation on the device 1400, which may include more or fewer components than shown in the figure, combine certain components, or adopt a different component arrangement. In an exemplary embodiment, a non-volatile computer-readable storage medium storing instructions, such as a memory including instructions, is also provided. When the instructions in the computer-readable storage medium are run on a computer, the computer can be caused to execute the method for determining a distortion parameter of a display device shown in FIG. 5 or FIG. 6.
For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
It should be understood that "a plurality of" mentioned herein refers to two or more. "And/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A exists alone, A and B exist at the same time, or B exists alone. The character "/" generally indicates an "or" relationship between the associated objects before and after it.
The above are only optional embodiments of the present disclosure and are not intended to limit the present disclosure. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present disclosure shall be included in the protection scope of the present disclosure.

Claims (20)

  1. A method for determining a distortion parameter of a display device, wherein the display device is a virtual reality device and the virtual reality device comprises a display screen, the method comprising:
    acquiring a VR image obtained by an image acquisition device shooting, with target shooting parameters, a target image displayed on the display screen;
    acquiring a reference image obtained by the image acquisition device shooting a base image with the target shooting parameters;
    determining, according to the reference image, an actual image height between a first target point and a second target point in the VR image; and
    determining a distortion parameter of the virtual reality device according to the actual image height.
  2. The method according to claim 1, wherein the determining, according to the reference image, the actual image height between the first target point and the second target point in the VR image comprises:
    determining a first reference point and a second reference point from a plurality of reference points comprised in the reference image;
    calculating a target distance between the first reference point and the second reference point according to a size of the base image; and
    determining the target distance as the actual image height between the first target point and the second target point in the VR image;
    wherein a position of the first reference point in the reference image is the same as a position of the first target point in the VR image, and a position of the second reference point in the reference image is the same as a position of the second target point in the VR image.
  3. The method according to claim 1, wherein the determining, according to the reference image, the actual image height between the first target point and the second target point in the VR image comprises:
    in response to an image editing instruction, displaying the VR image and the reference image in a superimposed manner, wherein a transparency of the image, of the VR image and the reference image, located closer to a display side is greater than a transparency threshold; and
    receiving the actual image height between the first target point and the second target point in the VR image input on the basis of the superimposed images.
  4. The method according to any one of claims 1 to 3, wherein the determining the distortion parameter of the virtual reality device according to the actual image height comprises:
    acquiring an actual object height between a first display point and a second display point on the display screen; and
    determining the distortion parameter of the virtual reality device according to the actual object height and the actual image height;
    wherein a position of the first display point on the display screen is the same as the position of the first target point in the VR image, and a position of the second display point on the display screen is the same as the position of the second target point in the VR image.
  5. The method according to claim 4, wherein the acquiring the actual object height between the first display point and the second display point on the display screen comprises:
    receiving a size parameter of the display screen; and
    calculating the actual object height between the first display point and the second display point on the display screen according to the size parameter.
  6. The method according to claim 4, wherein the determining the distortion parameter of the virtual reality device according to the actual object height and the actual image height comprises:
    determining an ideal image height between the first target point and the second target point according to the actual object height; and
    determining the distortion parameter of the virtual reality device according to the ideal image height and the actual image height, wherein the distortion parameter is positively correlated with the actual image height.
  7. The method according to claim 6, wherein the distortion parameter satisfies:
    (d1-d2)/d2;
    wherein d1 is the actual image height and d2 is the ideal image height.
  8. The method according to any one of claims 1 to 7, wherein the target image and the base image are both regular grid images or regular dot-matrix images.
  9. The method according to any one of claims 1 to 8, wherein a size of the base image is the same as a size of the target image.
  10. The method according to any one of claims 1 to 9, wherein the acquiring the VR image obtained by the image acquisition device shooting, with the target shooting parameters, the target image displayed on the display screen comprises: acquiring a VR image obtained by the image acquisition device shooting, with the target shooting parameters and at a first distance from an imaging plane of the virtual reality device, the target image displayed on the display screen;
    the acquiring the reference image obtained by the image acquisition device shooting the base image with the target shooting parameters comprises: acquiring a reference image obtained by the image acquisition device shooting the base image with the target shooting parameters at a second distance from the base image;
    wherein the first distance is equal to the second distance.
  11. The method according to any one of claims 1 to 10, wherein the target image is the base image displayed on the display screen of the virtual reality device.
  12. A display device, wherein the display device is a virtual reality device, and a distortion parameter of the virtual reality device is determined by the method for determining a distortion parameter of a display device according to any one of claims 1 to 11.
  13. An apparatus for determining a distortion parameter of a display device, wherein the display device is a virtual reality device and the virtual reality device comprises a display screen, the apparatus comprising:
    a first acquisition module, configured to acquire a VR image obtained by an image acquisition device shooting, with target shooting parameters, a target image displayed on the display screen;
    a second acquisition module, configured to acquire a reference image obtained by the image acquisition device shooting a base image with the target shooting parameters;
    a first determination module, configured to determine, according to the reference image, an actual image height between a first target point and a second target point in the VR image; and
    a second determination module, configured to determine a distortion parameter of the virtual reality device according to the actual image height.
  14. An apparatus for determining a distortion parameter of a display device, wherein the apparatus comprises: a processor; and a memory configured to store executable instructions of the processor; wherein the processor is configured to execute the method for determining a distortion parameter of a display device according to any one of claims 1 to 11.
  15. A system for determining a distortion parameter of a display device, wherein the system comprises: a virtual reality device, an image acquisition device, and a distortion parameter determination device, the distortion parameter determination device comprising the apparatus for determining a distortion parameter of a display device according to claim 13 or 14, and a communication connection being established between the image acquisition device and the distortion parameter determination device;
    wherein the image acquisition device is configured to: shoot, with target shooting parameters, a target image displayed on a display screen of the virtual reality device to obtain a VR image, shoot a base image with the target shooting parameters to obtain a reference image, and send the VR image and the reference image to the distortion parameter determination device; and
    the distortion parameter determination device is configured to: determine a distortion parameter of the virtual reality device on the basis of the VR image and the reference image.
  16. The system according to claim 15, wherein a distance between the image acquisition device and an imaging plane of the virtual reality device when the image acquisition device shoots the target image is equal to a distance between the image acquisition device and the base image when the image acquisition device shoots the base image.
  17. The system according to claim 15 or 16, wherein, when the image acquisition device shoots the base image, an image coverage of the base image in the image acquisition device is not smaller than an image coverage of the target image in the image acquisition device when the image acquisition device shoots the target image.
  18. The system according to any one of claims 15 to 17, wherein the target shooting parameters comprise at least one of a focal length, an aperture size, a shutter duration, a contrast, and a color saturation of the image acquisition device.
  19. The system according to any one of claims 15 to 18, wherein the image acquisition device is integrated in the distortion parameter determination device.
  20. A non-volatile computer-readable storage medium, wherein instructions are stored in the computer-readable storage medium, and when the computer-readable storage medium runs on a computer, the computer is caused to execute the method for determining a distortion parameter of a display device according to any one of claims 1 to 11.
PCT/CN2021/090760 2020-05-28 2021-04-28 Display device and distortion parameter determination method, apparatus, system and storage medium therefor WO2021238564A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010466124.4 2020-05-28
CN202010466124.4A CN111565309B (zh) 2020-05-28 Display device and distortion parameter determination method, apparatus, system and storage medium therefor

Publications (1)

Publication Number Publication Date
WO2021238564A1 true WO2021238564A1 (zh) 2021-12-02

Family

ID=72072415

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/090760 WO2021238564A1 (zh) 2020-05-28 2021-04-28 Display device and distortion parameter determination method, apparatus, system and storage medium therefor

Country Status (2)

Country Link
CN (1) CN111565309B (zh)
WO (1) WO2021238564A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111565309B (zh) * 2020-05-28 2022-08-19 京东方科技集团股份有限公司 显示设备及其畸变参数确定方法、装置、系统及存储介质
CN112333385B (zh) * 2020-10-28 2022-02-22 维沃移动通信有限公司 电子防抖控制方法及装置
CN113625895A (zh) * 2021-07-02 2021-11-09 北京极豪科技有限公司 电子设备、图案校正方法及遮挡装置

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106127714B (zh) * 2016-07-01 2019-08-20 南京睿悦信息技术有限公司 Method for measuring distortion parameters of a virtual reality head-mounted display device
CN107527324B (zh) * 2017-07-13 2019-07-12 江苏泽景汽车电子股份有限公司 Image distortion correction method for a HUD
CN108680343B (zh) * 2018-05-22 2020-11-20 歌尔股份有限公司 Flexible screen detection method and detection apparatus
US11009941B2 (en) * 2018-07-25 2021-05-18 Finch Technologies Ltd. Calibration of measurement units in alignment with a skeleton model to control a computer system
CN109451302A (zh) * 2018-12-07 2019-03-08 昆山丘钛微电子科技有限公司 Camera module testing method and apparatus, electronic device and medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106652857A (zh) * 2016-09-18 2017-05-10 北京和欣之光信息技术有限公司 Testing apparatus and method for a virtual reality system
CN108876725A (zh) * 2017-05-12 2018-11-23 深圳市魔眼科技有限公司 Virtual image distortion correction method and system
US10277893B1 (en) * 2017-06-22 2019-04-30 Facebook Technologies, Llc Characterization of optical distortion in a head mounted display
KR102070997B1 (ko) * 2018-11-23 2020-01-29 재단법인 대구경북첨단의료산업진흥재단 HMD performance evaluation system and HMD performance evaluation method using same
CN111565309A (zh) * 2020-05-28 2020-08-21 京东方科技集团股份有限公司 Display device and distortion parameter determination method, apparatus, system and storage medium therefor

Also Published As

Publication number Publication date
CN111565309A (zh) 2020-08-21
CN111565309B (zh) 2022-08-19

Similar Documents

Publication Publication Date Title
WO2021008456A1 (zh) 图像处理方法、装置、电子设备及存储介质
US11517099B2 (en) Method for processing images, electronic device, and storage medium
WO2021238564A1 (zh) 显示设备及其畸变参数确定方法、装置、系统及存储介质
CN109859102B (zh) 特效显示方法、装置、终端及存储介质
CN108616691B (zh) 基于自动白平衡的拍照方法、装置、服务器及存储介质
CN111028144B (zh) 视频换脸方法及装置、存储介质
CN109302632B (zh) 获取直播视频画面的方法、装置、终端及存储介质
CN109166150B (zh) 获取位姿的方法、装置存储介质
CN109886208B (zh) 物体检测的方法、装置、计算机设备及存储介质
WO2022134632A1 (zh) 作品处理方法及装置
CN111385525B (zh) 视频监控方法、装置、终端及系统
CN108848405B (zh) 图像处理方法和装置
WO2022227893A1 (zh) 图像拍摄方法、装置、终端及存储介质
WO2022199102A1 (zh) 图像处理方法及装置
CN112184802B (zh) 标定框的调整方法、装置及存储介质
CN113824902B (zh) 红外摄像机系统时延确定方法、装置、系统、设备及介质
WO2021218926A1 (zh) 图像显示方法、装置和计算机设备
CN112967261B (zh) 图像融合方法、装置、设备及存储介质
CN110660031B (zh) 图像锐化方法及装置、存储介质
CN112243083B (zh) 抓拍方法、装置及计算机存储介质
CN111757146B (zh) 视频拼接的方法、系统及存储介质
CN110443841B (zh) 地面深度的测量方法、装置及系统
CN108881739B (zh) 图像生成方法、装置、终端及存储介质
CN108881715B (zh) 拍摄模式的启用方法、装置、终端及存储介质
CN114093020A (zh) 动作捕捉方法、装置、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21814008

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21814008

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 23/06/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21814008

Country of ref document: EP

Kind code of ref document: A1