CN112689135A - Projection correction method, projection correction device, storage medium and electronic equipment


Info

Publication number
CN112689135A
Authority
CN
China
Prior art keywords: image, projector, target feature, feature point, projection
Legal status
Granted
Application number
CN202110297235.1A
Other languages
Chinese (zh)
Other versions
CN112689135B (en)
Inventor
孙世攀 (Sun Shipan)
张聪 (Zhang Cong)
胡震宇 (Hu Zhenyu)
Current Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Original Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Huole Science and Technology Development Co Ltd filed Critical Shenzhen Huole Science and Technology Development Co Ltd
Priority to CN202110297235.1A priority Critical patent/CN112689135B/en
Publication of CN112689135A publication Critical patent/CN112689135A/en
Application granted granted Critical
Publication of CN112689135B publication Critical patent/CN112689135B/en
Priority to PCT/CN2021/115160 priority patent/WO2022193559A1/en
Legal status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]

Abstract

The disclosure relates to a projection correction method, a projection correction device, a storage medium and an electronic device in the technical field of projection. The method includes: capturing, through a camera, a preset image projected by a projector to obtain a shot image; determining target feature points in the shot image and, for each target feature point, determining its depth information in the shooting space of the camera to obtain its three-dimensional coordinates; and determining the normal vector of a fitting plane constructed from all the target feature points according to their three-dimensional coordinates, and correcting the original image of the projector according to the normal vector and the current pose information of the projector. Beneficial effects of the disclosure: projection keystone (trapezoidal) correction can be realized with a single camera, reducing the number of devices, while the depth information of the target feature points can be calculated quickly, reducing the complexity of computing their three-dimensional coordinates.

Description

Projection correction method, projection correction device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of projection technologies, and in particular, to a projection correction method and apparatus, a storage medium, and an electronic device.
Background
A projector is a device that projects an image onto a wall or a projection screen by optical projection. A traditional projector must face the wall squarely so that the projected picture is a regular rectangle; once the projector is placed improperly, the projected picture is distorted.
At present, keystone (trapezoidal) correction for projectors mainly relies on binocular correction. However, binocular correction requires either two cameras or a distance sensor paired with the camera; the extra camera or distance sensor increases the hardware cost of the projector, and more parameters must be calibrated.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a projection correction method, an apparatus, a storage medium, and an electronic device.
In order to achieve the above object, in a first aspect, the present disclosure provides a projection correction method, the method including:
controlling a projector to project a preset image to a projection plane in response to the received correction instruction;
shooting the preset image projected by the projector through a camera of the projector to obtain a shot image;
identifying target feature points of the preset image in the shot image;
for each target feature point, determining depth information of the target feature point in a shooting space of the camera according to a mapping relation calibrated in advance for the target feature point and a camera coordinate of the target feature point on the shot image so as to obtain a three-dimensional coordinate of the target feature point in a projection space of the projector, wherein the mapping relation is an association relation between the depth information of the target feature point calibrated at different depths and an offset of the camera coordinate;
determining a normal vector of the projection plane relative to the projector according to the three-dimensional coordinates of each target feature point;
correcting the two-dimensional vertex coordinates of the original image of the projector according to the normal vector of the projection plane and the current pose information of the projector to obtain the two-dimensional vertex coordinates of the corrected original image;
and controlling the projector to project according to the two-dimensional vertex coordinates of the corrected original image.
Optionally, the preset image is a checkerboard image; the identifying of the target feature point of the preset image comprises the following steps:
determining initial characteristic points in the checkerboard image by adopting an angular point detection algorithm;
performing linear fitting on the initial characteristic points in the same preset area range respectively according to the vertical direction and the horizontal direction;
and taking the intersection point between any straight line in the vertical direction and any straight line in the horizontal direction obtained by fitting as the target characteristic point.
Optionally, the determining, for each target feature point, depth information of the target feature point in the shooting space of the camera according to a mapping relationship pre-calibrated for the target feature point and the camera coordinate of the target feature point on the shot image includes:
for each target feature point, calculating depth information of the target feature point in a shooting space of the camera according to a mapping relation pre-calibrated for the target feature point and camera coordinates of the target feature point on the shot image, wherein the mapping relation is as follows:
h = a / (X + b)

wherein h is the depth information of the target feature point, a is a first preset calibration parameter of the target feature point, b is a second preset calibration parameter of the target feature point, X is the camera coordinate of the target feature point, and the first preset calibration parameter and the second preset calibration parameter are both constants.
Optionally, for each target feature point, the first preset calibration parameter and the second preset calibration parameter of the target feature point are obtained as follows:
under the condition that the projector is a first distance away from a projection plane and the projection light of the projector is perpendicular to the projection plane, projecting a preset image to the projection plane, and shooting the preset image through a camera of the projector to obtain a first image;
determining first camera coordinates and first depth information of a target feature point on the first image based on the first image and the first distance;
under the condition that the projector is at a second distance from the projection plane and the projection light of the projector is perpendicular to the projection plane, projecting a preset image to the projection plane, and shooting the preset image through a camera of the projector to obtain a second image;
determining second camera coordinates and second depth information of the target feature point on the second image according to the second image and the second distance;
according to the first camera coordinate and the first depth information, and the second camera coordinate and the second depth information, combining a mapping relation between the pre-established depth information of the target feature point and the camera coordinate to obtain the first preset calibration parameter and the second preset calibration parameter, wherein the mapping relation is as follows:
h = a / (X + b)

wherein h is the depth information of the target feature point, X is the camera coordinate, a is the first preset calibration parameter, and b is the second preset calibration parameter.
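With two calibration shots, the two constants follow in closed form. A minimal Python sketch, assuming the hyperbolic model h = a / (X + b) reconstructed above (the model form, helper names, and all numbers are illustrative, not taken verbatim from this disclosure):

```python
def calibrate(x1, h1, x2, h2):
    """Solve h = a / (x + b) from two calibration samples:
    camera coordinate x1 observed at known depth h1, and x2 at h2."""
    b = (h1 * x1 - h2 * x2) / (h2 - h1)
    a = h1 * (x1 + b)
    return a, b

def depth(x, a, b):
    """Depth of a target feature point from its camera coordinate."""
    return a / (x + b)

# Hypothetical calibration: the same corner lands at camera coordinate 1.0
# when the wall is 1.6 m away, and at 3.5 when it is 0.6 m away.
a, b = calibrate(1.0, 1.6, 3.5, 0.6)
```

Any later shot then yields a feature point's depth from its camera coordinate alone, which is the single-camera shortcut the disclosure describes.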
Optionally, the correcting, according to the normal vector of the projection plane and the current pose information of the projector, the two-dimensional vertex coordinates of the original image of the projector to obtain the two-dimensional vertex coordinates of the corrected original image includes:
calculating offset information of the projector normal vector according to the normal vector of the projection plane and the current pose information of the projector, wherein the offset information comprises a yaw angle, a pitch angle and a roll angle;
calculating to obtain two-dimensional vertex coordinates of a projection image projected to the projection plane by the projector based on the offset information;
establishing a homography matrix relation between the projected image and the original image based on the two-dimensional vertex coordinates of the projected image and the two-dimensional vertex coordinates of the original image of the projector;
selecting a target rectangle from the projection image of the projector, and determining the two-dimensional vertex coordinates of the target rectangle;
and obtaining the two-dimensional vertex coordinates of the corrected original image by combining the homography matrix relation based on the two-dimensional vertex coordinates of the target rectangle.
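The homography chain above can be sketched with a direct linear transform; in practice OpenCV's getPerspectiveTransform computes the same 3x3 matrix. All corner coordinates below are made-up illustrations, not values from this disclosure:

```python
import numpy as np

def homography(src, dst):
    # DLT: find H (3x3, up to scale) with H @ [x, y, 1] ~ [u, v, 1]
    # for the four corner correspondences.
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    return vt[-1].reshape(3, 3)

def warp(H, pts):
    # Apply a homography to a list of 2-D points.
    p = np.c_[pts, np.ones(len(pts))] @ H.T
    return p[:, :2] / p[:, 2:3]

orig = np.array([[0, 0], [1920, 0], [1920, 1080], [0, 1080]], float)    # original image corners
quad = np.array([[60, 40], [1900, 0], [1860, 1120], [0, 1060]], float)  # computed projected quad
H = homography(orig, quad)                                              # original -> projected
rect = np.array([[200, 150], [1700, 150], [1700, 993], [200, 993]], float)  # chosen target rectangle
corrected = warp(np.linalg.inv(H), rect)   # pre-warped vertices fed to the projector
```

Projecting the `corrected` quadrilateral makes the picture land exactly on the target rectangle, which is the point of the correction.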
Optionally, the selecting a target rectangle from the projection image of the projector includes:
randomly selecting a point from any side of the projected image, and generating a rectangle in the area of the projected image by taking the point as the vertex of the rectangle to be constructed and the aspect ratio of the original image as the aspect ratio of the rectangle to be constructed;
and selecting the rectangle with the largest area from the generated rectangles as the target rectangle.
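A brute-force sketch of this rectangle selection (coarse grid search over anchor points and heights; the convex-quad assumption, step counts, and counter-clockwise vertex order are mine, not the disclosure's):

```python
def cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def inside(quad, p, eps=1e-9):
    # Convex quad with counter-clockwise vertices: p must lie left of every edge.
    return all(cross(quad[i], quad[(i + 1) % 4], p) >= -eps for i in range(4))

def largest_rect(quad, aspect, steps=25):
    """Largest axis-aligned rectangle with the given width/height ratio whose
    four corners all fall inside the quad (coarse grid search)."""
    xs = [p[0] for p in quad]; ys = [p[1] for p in quad]
    x0, x1, y0, y1 = min(xs), max(xs), min(ys), max(ys)
    best, best_area = None, 0.0
    for i in range(steps):
        x = x0 + (x1 - x0) * i / (steps - 1)
        for j in range(steps):
            y = y0 + (y1 - y0) * j / (steps - 1)
            for k in range(steps, 0, -1):          # try tall rectangles first
                hgt = (y1 - y0) * k / steps
                w = hgt * aspect
                corners = [(x, y), (x + w, y), (x + w, y + hgt), (x, y + hgt)]
                if all(inside(quad, c) for c in corners):
                    if w * hgt > best_area:
                        best, best_area = corners, w * hgt
                    break  # smaller heights at this anchor cannot beat this one
    return best, best_area
```

A finer grid (or a search anchored only on the quad's edges, as the text describes) trades running time for a tighter rectangle.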
In a second aspect, the present disclosure provides a projection correction apparatus, the apparatus comprising:
the response module is configured to respond to the received correction instruction and control the projector to project a preset image to the projection plane;
the shooting module is configured to shoot the preset image projected by the projector through a camera of the projector to obtain a shot image;
a target feature point determination module configured to identify a target feature point of the preset image in the captured image;
the three-dimensional coordinate determination module is configured to determine, for each target feature point, depth information of the target feature point in a shooting space of the camera according to a mapping relationship calibrated in advance for the target feature point and a camera coordinate of the target feature point on the shot image, so as to obtain a three-dimensional coordinate of the target feature point in a projection space of the projector, wherein the mapping relationship is an association relationship between the depth information of the target feature point calibrated at different depths and an offset of the camera coordinate;
a normal vector module configured to determine a normal vector of the projection plane relative to the projector according to the three-dimensional coordinates of each of the target feature points;
the correction module is configured to correct the two-dimensional vertex coordinates of the original image of the projector according to the normal vector of the projection plane and the current pose information of the projector to obtain the two-dimensional vertex coordinates of the corrected original image;
and the projection module is configured to control the projector to project according to the two-dimensional vertex coordinates of the corrected original image.
Optionally, the preset image is a checkerboard image, and the response module includes:
the instruction response submodule is configured to determine initial feature points in the checkerboard image by adopting an angular point detection algorithm;
the straight line fitting submodule is configured to perform straight line fitting on the initial characteristic points in the same preset area range respectively in the vertical direction and the horizontal direction;
and the characteristic point determining submodule is configured to use an intersection point between any straight line in the vertical direction and any straight line in the horizontal direction, which are obtained through fitting, as the target characteristic point.
In a third aspect, the present disclosure provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processing apparatus, implements the steps of the method provided in the first aspect above.
In a fourth aspect, the present disclosure provides an electronic device comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method provided by the first aspect above.
According to the above technical solution, in response to the received correction instruction, the camera captures the preset image projected by the projector to obtain a shot image; target feature points are determined in the shot image, and for each target feature point, its depth information in the shooting space of the camera is determined from the mapping relationship pre-calibrated for that point and its camera coordinates on the shot image, yielding its three-dimensional coordinates; the normal vector of the projection plane is then determined from the three-dimensional coordinates of all the target feature points, and the two-dimensional vertex coordinates of the original image of the projector are corrected according to the normal vector and the current pose information of the projector. Projection keystone correction can therefore be realized with a single camera, reducing the number of devices; meanwhile, the pre-calibrated mapping relationship allows the depth information of each target feature point to be calculated quickly, reducing the complexity of computing its three-dimensional coordinates and thus improving correction efficiency.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale. In the drawings:
FIG. 1 is a flow chart illustrating a method of projection correction according to an exemplary embodiment;
FIG. 2 is a schematic flow chart illustrating the identification of target feature points in accordance with an exemplary embodiment;
FIG. 3 is a diagram illustrating mathematical modeling of a mapping relationship in accordance with an exemplary embodiment;
FIG. 4 is another flow chart illustrating a method of projection correction according to an exemplary embodiment;
FIG. 5 is a schematic diagram illustrating the calculation of three-dimensional imaging vertex coordinates for a standard image in accordance with an exemplary embodiment;
FIG. 6 is a schematic diagram illustrating a vector decomposition in accordance with an exemplary embodiment;
FIG. 7 is a block diagram illustrating a projection correction device in accordance with an exemplary embodiment;
FIG. 8 is a block diagram illustrating an electronic device in accordance with an example embodiment.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
FIG. 1 is a flow chart illustrating a method of projection correction according to an exemplary embodiment. The projection correction method can be applied to electronic devices such as a projector, and as shown in fig. 1, the projection correction method includes the steps of:
and S101, in response to the received correction instruction, controlling the projector to project a preset image to the projection plane.
Here, the preset image refers to the image projected onto the projection plane; in general, the preset image of this embodiment may be a checkerboard image. The projection plane refers to the area that displays the output image of the projector, such as a wall surface or a curtain. It should be understood that when the projector is perpendicular to the wall or curtain, the projected preset image is a standard rectangle and the checkerboard pattern is undistorted; when the projector is not perpendicular to the wall or curtain, the projected image is non-rectangular and the checkerboard pattern is distorted.
In some embodiments, the correction instruction may be triggered automatically or manually. For automatic triggering, the projector may issue a correction instruction when it detects that the projected image is non-rectangular. For manual triggering, the user may press a button, virtual or physical, on a controller communicatively connected to the projector, causing the controller to send a correction instruction to the projector. This embodiment is not limited in this respect.
S102, shooting the preset image projected by the projector through a camera of the projector to obtain a shot image.
Here, after the projector projects a preset image onto the projection plane, the preset image is photographed by the camera to obtain a photographed image, and the wall surface or the curtain projected by the projector is modeled according to the photographed image to obtain three-dimensional information of the wall surface or the curtain.
S103, identifying the target characteristic point of the preset image in the shot image.
Here, the target feature point is a feature point set on a preset image for modeling a wall surface or a curtain, and the form or number of the feature point may be set according to actual conditions. For example, when the preset image is a checkerboard image, the target feature point in the preset image refers to an intersection point between black and white squares in the preset image.
And S104, for each target feature point, determining the depth information of the target feature point in the shooting space of the camera according to a mapping relation calibrated in advance for the target feature point and the camera coordinate of the target feature point on the shot image so as to obtain the three-dimensional coordinate of the target feature point in the projection space of the projector, wherein the mapping relation is the association relation between the depth information of the target feature point calibrated at different depths and the offset of the camera coordinate.
Here, because the association between the depth information of the target feature point calibrated at different depths and the offset of the camera coordinate is established in advance, once the camera coordinate of the target feature point is determined in the shot image, the depth of the target feature point can be calculated from the camera coordinate and the mapping relationship. The depth information refers to the depth, relative to the camera, of the target feature point of the preset image projected on the projection plane. For example, the preset image projected by the projector is photographed at depths of 1.2 m and 1.9 m, the camera coordinates of the target feature point at 1.2 m and 1.9 m are determined, and the association between the depth information and the camera coordinates of the same target feature point is thereby calculated.
It should be noted that, after the camera coordinates and the depth information of the target feature point are calculated, the three-dimensional coordinates of the target feature point in space can be determined from them. The depth information is the Z-axis coordinate of the target feature point's three-dimensional coordinates. The X-axis and Y-axis coordinates are obtained by scaling the camera coordinates of the target feature point on the shot image according to its depth information. The principle is that of pinhole imaging: when the preset image is projected onto the projection plane, the X-axis and Y-axis coordinates of a target feature point displayed on the projection plane are obtained by converting the X-axis and Y-axis components of its camera coordinates through the depth information.
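The scaling just described can be sketched as a minimal pinhole back-projection (the intrinsic parameters fx, fy, cx, cy and their default values below are illustrative assumptions, not calibrated values from this disclosure):

```python
def backproject(u, v, h, fx=1000.0, fy=1000.0, cx=640.0, cy=360.0):
    """Lift a camera pixel (u, v) with known depth h to a 3-D point:
    X and Y are the normalized image coordinates scaled by the depth,
    and h itself is the Z coordinate."""
    return ((u - cx) * h / fx, (v - cy) * h / fy, h)
```

Running this over every target feature point yields the 3-D point cloud used for the plane fit in the next step.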
And S105, determining a normal vector of the projection plane relative to the projector according to the three-dimensional coordinates of the target feature points.
Here, the three-dimensional coordinates of all the target feature points are fitted, so that the projection plane is modeled from them to obtain a fitting plane; the fitting plane reflects the three-dimensional information of the projection plane, and from it the normal vector of the projection plane relative to the projector is obtained. The fitting may be a least-squares fit.
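The least-squares plane fit can be sketched via an SVD of the centred point cloud, a standard technique assumed here rather than quoted from the disclosure: the right singular vector with the smallest singular value is the plane normal.

```python
import numpy as np

def fit_plane_normal(points):
    """Unit normal of the least-squares plane through a set of 3-D points."""
    pts = np.asarray(points, float)
    centred = pts - pts.mean(axis=0)           # remove the centroid
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    n = vt[-1]                                  # direction of least variance
    return n / np.linalg.norm(n)
```

The sign of the normal is arbitrary; in use it would be oriented toward the projector before computing offset angles.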
And S106, correcting the two-dimensional vertex coordinates of the original image of the projector according to the normal vector of the projection plane and the current pose information of the projector to obtain the two-dimensional vertex coordinates of the corrected original image.
Here, the normal vector of the projection plane is a vector perpendicular to the projection plane, and the current pose information of the projector describes its current placement, which may be obtained from an attitude sensor (IMU). When the projector is tilted, its projection light is offset relative to the wall surface or curtain, so the picture displayed on the wall or curtain exhibits keystone (trapezoidal) distortion. From the normal vector of the projection plane and the current pose information of the projector, the pose deviation of the projector relative to the projection plane (wall surface or curtain) can be calculated, and the two-dimensional vertex coordinates of the original image of the projector are then adjusted according to this deviation.
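As an illustration of how the pose deviation might be derived from the fitted normal (a simplified sketch under assumed axis conventions; in the disclosure the roll angle would come from the attitude sensor rather than from the normal):

```python
import math

def yaw_pitch_from_normal(nx, ny, nz):
    """Horizontal (yaw) and vertical (pitch) deviation, in degrees, of the
    projection axis from the wall normal; (0, 0) means the projector
    faces the wall squarely."""
    yaw = math.degrees(math.atan2(nx, nz))
    pitch = math.degrees(math.atan2(ny, nz))
    return yaw, pitch
```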
The original image refers to an original output image of the projector, and in general, the original image is a rectangular image, such as an image with a width w and a height h. When the projector is disposed obliquely with respect to the projection plane, the rectangular original image projected on the projection plane may appear as a trapezoid, such as a convex quadrangle, and in order to make the image projected on the projection plane by the projector appear as a rectangle, it is necessary to correct the two-dimensional vertex coordinates of the original image so that the corrected original image projected on the projection plane appears as a rectangle.
It should be understood that the two-dimensional vertex coordinates refer to the four vertex coordinates of the original image. Correcting the two-dimensional vertex coordinates of the original image may be in a digitally adjusted manner, changing only the vertex coordinates of the original image output by the projector without changing the lens structure of the projector.
And S107, controlling the projector to project according to the two-dimensional vertex coordinates of the corrected original image.
Here, the projector is controlled to project according to the two-dimensional vertex coordinates of the corrected original image so that the original image of the projector appears as a rectangle on the projection plane.
With this technical solution, projection keystone correction can be realized with a single camera, reducing the number of devices; meanwhile, the pre-calibrated mapping relationship allows the depth information of each target feature point to be calculated quickly, reducing the complexity of computing its three-dimensional coordinates and thus improving correction efficiency.
In this disclosure, when the preset image is a checkerboard image, the camera is affected in actual shooting by environmental factors such as exposure, so the black and white regions of the captured checkerboard image deviate from the ideal, and feature-point detection depends on image quality. Under such conditions, feature points determined by a conventional corner detection algorithm alone are inaccurate; the parameters subsequently derived from them are therefore also inaccurate, which degrades correction accuracy. For this reason, the initial feature points detected by the corner detection algorithm are optimized, as described in detail below for the target feature point identification in S103.
Fig. 2 is a flowchart illustrating the process of identifying target feature points according to an exemplary embodiment. As shown in fig. 2, in an implementable embodiment, in S103, identifying the target feature point of the preset image includes:
s1031, determining initial feature points in the checkerboard image by adopting an angular point detection algorithm;
s1032, performing linear fitting on the initial feature points in the same preset region range respectively according to the vertical direction and the horizontal direction;
and S1033, taking an intersection point between any straight line in the vertical direction and any straight line in the horizontal direction, which is obtained through fitting, as the target feature point.
Firstly, determining initial characteristic points in a preset image by adopting an angular point detection algorithm, and then performing linear fitting on the initial characteristic points in the same preset area range according to the vertical direction and the horizontal direction respectively; and finally, taking the intersection point between any straight line in the vertical direction and any straight line in the horizontal direction, which is obtained by fitting, as a target characteristic point.
In the present disclosure, since the offset of an initial feature point is small, the same preset region range can be determined in advance. The straight-line fitting can be implemented by the least-squares method.
It should be understood that in the case of a projector perpendicular to a wall surface, the intersection point of black and white of the projected image is formed by the intersection of two straight lines. In this ideal case, the number of horizontal and vertical lines is known. Accordingly, the number of the straight lines in the horizontal direction and the straight lines in the vertical direction obtained by fitting is equal to that in an ideal case, and further, the number of the feature points is also equal.
The corner detection algorithm may be, for example, the findChessboardCorners algorithm of OpenCV.
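The row/column line-fitting refinement can be sketched with numpy alone (assuming the initial corners are already ordered as a rows x cols grid, e.g. by OpenCV's findChessboardCorners; the helper name and grid shape are illustrative):

```python
import numpy as np

def refine_grid(corners):
    """corners: array of shape (rows, cols, 2) of detected grid corners.
    Fit one straight line per row (y = m*x + c) and per column (x = m*y + c),
    then replace each corner with the intersection of its row and column lines."""
    rows, cols, _ = corners.shape
    row_lines = [np.polyfit(corners[r, :, 0], corners[r, :, 1], 1) for r in range(rows)]
    col_lines = [np.polyfit(corners[:, c, 1], corners[:, c, 0], 1) for c in range(cols)]
    refined = np.empty_like(corners)
    for r, (mr, cr) in enumerate(row_lines):
        for c, (mc, cc) in enumerate(col_lines):
            # Intersect y = mr*x + cr with x = mc*y + cc.
            x = (mc * cr + cc) / (1 - mr * mc)
            y = mr * x + cr
            refined[r, c] = (x, y)
    return refined
```

Fitting columns as x(y) keeps near-vertical lines well-conditioned; the denominator only vanishes if a row and a column line are parallel, which a valid checkerboard view rules out.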
By adopting the technical scheme, the initial characteristic points detected by the angular point detection algorithm are optimized, the accuracy of the target characteristic points of the preset image is ensured, and the correction precision is improved.
In an implementation manner, in S104, for each target feature point, determining depth information of the target feature point in a shooting space of the camera according to a mapping relationship pre-calibrated for the target feature point and camera coordinates of the target feature point on the shot image, includes:
for each target feature point, calculating depth information of the target feature point in a shooting space of the camera according to a mapping relation pre-calibrated for the target feature point and camera coordinates of the target feature point on the shot image, wherein the mapping relation is as follows:
h = a / (X + b)

wherein h is the depth information of the target feature point, a is a first preset calibration parameter of the target feature point, b is a second preset calibration parameter of the target feature point, X is the camera coordinate of the target feature point, and the first preset calibration parameter and the second preset calibration parameter are constants.
Here, after the camera coordinates of the target feature point are calculated, the depth information of the target feature point can be obtained by substituting the camera coordinates into the above calculation formula, in which a and b are constants.
It should be noted that X may be either the abscissa or the ordinate of the camera coordinate; since the abscissa and the ordinate are symmetric in the calculation, the above calculation formula is applicable to both.
Therefore, in the present disclosure, through the mapping relationship calibrated in advance, the depth information of the target feature point can be rapidly calculated, so as to calculate the three-dimensional coordinates of the target feature point in the space.
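For illustration only, assuming the mapping relation takes the form h = a1/(a2 - X) (a reconstruction consistent with the geometric derivation around Fig. 3, not a verbatim copy of the patent formula), the depth lookup is a one-line evaluation:

```python
def depth_from_camera_coord(x, a1, a2):
    """Depth h of a target feature point from its camera coordinate x.

    a1, a2 are the pre-calibrated first/second preset calibration
    parameters (constants for each feature point); the functional form
    h = a1 / (a2 - x) is an assumed reconstruction.
    """
    return a1 / (a2 - x)
```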
Next, a description is given of a calibration process of the first preset calibration parameter and the second preset calibration parameter.
For each target feature point, the first preset calibration parameter and the second preset calibration parameter of the target feature point are obtained by the following method:
under the condition that the projector is a first distance away from a projection plane and the projection light of the projector is perpendicular to the projection plane, projecting a preset image to the projection plane, and shooting the preset image through a camera of the projector to obtain a first image;
determining first camera coordinates and first depth information of a target feature point on the first image based on the first image and the first distance;
under the condition that the projector is at a second distance from the projection plane and the projection light of the projector is perpendicular to the projection plane, projecting a preset image to the projection plane, and shooting the preset image through a camera of the projector to obtain a second image;
determining second camera coordinates and second depth information of the target feature points on the preset image according to the second image and the second distance;
according to the first camera coordinate and the first depth information, and the second camera coordinate and the second depth information, combining a mapping relation between the pre-established depth information of the target feature point and the camera coordinate to obtain the first preset calibration parameter and the second preset calibration parameter, wherein the mapping relation is as follows:
h = a1 / (a2 - X)

wherein h is the depth coordinate of the target feature point, X is the camera coordinate, a1 is the first preset calibration parameter, and a2 is the second preset calibration parameter.
Here, the calibration process of the mapping relationship actually consists of projecting the preset image at two different depths and shooting it with the camera at each depth. For example, with the projection light of the projector set perpendicular to the projection plane (a wall or curtain), the projector projects the preset image onto the projection plane at distances of 1.2 m and 1.9 m respectively, and the camera shoots at each distance to obtain the first image and the second image. Then the camera coordinates and the depth information of the target feature point in the first image are calculated, the first distance corresponding to the depth information of the target feature point of the first image. The calculation of the depth information and the camera coordinates of the target feature point of the second image is the same as for the first image and is not repeated here. It is worth mentioning that the camera coordinates are two-dimensional coordinates.
After the camera coordinates and the depth information of the target feature points at the two depths are obtained, they are substituted into the mapping relation to give the system of equations:

h1 = a1 / (a2 - X1)

h2 = a1 / (a2 - X2)

from which the values of the first preset calibration parameter a1 and the second preset calibration parameter a2 can be solved.
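A minimal sketch of solving the two calibration shots for the parameters, assuming the reconstructed form h = a1/(a2 - X); the function and variable names are illustrative:

```python
def calibrate(x1, h1, x2, h2):
    """Solve h = a1 / (a2 - X) from two (camera coordinate, depth) pairs.

    From h1*(a2 - x1) = a1 and h2*(a2 - x2) = a1, eliminating a1
    gives a2, after which a1 follows directly.
    """
    a2 = (h1 * x1 - h2 * x2) / (h1 - h2)
    a1 = h1 * (a2 - x1)
    return a1, a2
```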
The above embodiment will be described with reference to fig. 3.
Fig. 3 is a diagram illustrating the mathematical modeling of the mapping relationship according to an exemplary embodiment, in which: point H is the position of the target feature point at the first distance; point I is the position of the target feature point at the second distance; point K is the projection of the target feature point at the second distance onto the camera normalization plane; point J is the projection of the target feature point at the first distance onto the camera normalization plane; point L is an auxiliary point describing the depth of point H from the camera normalization plane; point Q is the projection of the target feature point onto the camera normalization plane; point D is the camera origin; point A is the projector origin; point M is an auxiliary point describing the depth of point H from the camera origin plane; and point N is the position of the target feature point on the original image.
From the mathematical geometry it can be found that:
JQ/DN = HL/HM;

Denoting the abscissa of the Q point by x_Q, the abscissa of the projection point J of the H point (target feature point) onto the camera normalization plane by X, the length of LM by f, the length of HM by h, and the length of DN by b, transforming JQ/DN = HL/HM yields:

(X - x_Q) / b = (h - f) / h

Further, the above formula is simplified as:

h · (X - x_Q) = b · (h - f)

Further, the above formula can be rewritten as:

h = b · f / (b + x_Q - X)
the relational expression of h and X can be obtained by the above expression, wherein h is the depth information of the target characteristic point,
Figure 204765DEST_PATH_IMAGE002
a first preset calibration parameter for the target feature point,
Figure 432484DEST_PATH_IMAGE016
and setting a second preset calibration parameter of the target characteristic point, wherein X is the abscissa of the target characteristic point. Therefore, the first camera coordinate and the first depth information, and the second camera coordinate and the second depth information are substituted into the above calculation formula to obtain the values of the first preset calibration parameter and the second preset calibration parameter.
FIG. 4 is another flow chart illustrating a method of projection correction according to an exemplary embodiment. In an implementation manner, as shown in fig. 4, in S106, correcting the two-dimensional vertex coordinates of the original image of the projector according to the normal vector of the projection plane and the current pose information of the projector to obtain the two-dimensional vertex coordinates of the corrected original image, includes:
S1061, calculating offset information of the projector according to the normal vector of the projection plane and the current pose information of the projector, wherein the offset information comprises a yaw angle, a pitch angle and a roll angle;
S1062, calculating two-dimensional vertex coordinates of the projection image projected onto the projection plane by the projector based on the offset information;
S1063, establishing a homography matrix relationship between the projection image and the original image based on the two-dimensional vertex coordinates of the projection image and the two-dimensional vertex coordinates of the original image of the projector;
S1064, selecting a target rectangle from the projection image of the projector, and determining the two-dimensional vertex coordinates of the target rectangle;
S1065, obtaining the two-dimensional vertex coordinates of the corrected original image based on the two-dimensional vertex coordinates of the target rectangle in combination with the homography matrix relationship.
Here, in step S1061, the offset information of the projector is calculated from the normal vector of the projection plane and the current pose information of the projector. The roll angle may be obtained by an Inertial Measurement Unit (IMU) disposed in the projector; that is, the IMU obtains the current pose information of the projector, and the roll angle is then calculated from the current pose information. The yaw angle and the pitch angle can be calculated from the normal vector of the projection plane: the normal vector of the projection plane is projected onto the plane in which the yaw angle lies, and the angle of that projection is the yaw angle. For example, if the XOY plane is the plane of the yaw angle, the angle between the projection and the X-axis is calculated. Likewise, the normal vector of the projection plane is projected onto the plane in which the pitch angle lies, and the angle of that projection is the pitch angle. For example, if the XOZ plane is the plane of the pitch angle, the angle between the projection and the X-axis is calculated.
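Under an assumed axis convention (Y up, Z toward the projection plane; the disclosure does not fix the axes), the projections described above reduce to two arctangents:

```python
import math

def yaw_pitch_from_normal(n):
    """Yaw and pitch (radians) of a projection-plane normal vector n.

    Axis convention is an assumption: yaw is the angle of the normal's
    projection onto the XOZ plane, pitch its elevation out of that plane.
    """
    nx, ny, nz = n
    yaw = math.atan2(nx, nz)                     # angle of the XOZ projection
    pitch = math.atan2(ny, math.hypot(nx, nz))   # elevation out of XOZ
    return yaw, pitch
```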
In step S1062, the two-dimensional vertex coordinates of the projection image, that is, the vertex coordinates of the original image of the projector as projected onto the wall surface or the curtain, are calculated based on the offset information.
Wherein the two-dimensional vertex coordinates of the projection image may be calculated by:
firstly, according to a yaw angle and a pitch angle, calculating a measurement normal vector of the projection image relative to the projector by using a first preset calculation formula, wherein the first preset calculation formula is as follows:
n_x = cos(V) · sin(H)

n_y = sin(V)

n_z = cos(V) · cos(H)

wherein n_x is the X-axis coordinate of the measured normal vector, n_y is the Y-axis coordinate of the measured normal vector, n_z is the Z-axis coordinate of the measured normal vector, H is the yaw angle, and V is the pitch angle.
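Assuming the measured normal vector is formed from yaw H and pitch V as n = (cos V·sin H, sin V, cos V·cos H) (an assumed convention, chosen so the result is a unit vector), a minimal sketch is:

```python
import math

def measurement_normal(yaw_h, pitch_v):
    """Unit normal vector from yaw H and pitch V, both in radians.

    The component formulas and axis convention are assumptions made
    for illustration; H = V = 0 points straight at the wall (+Z).
    """
    return (math.cos(pitch_v) * math.sin(yaw_h),
            math.sin(pitch_v),
            math.cos(pitch_v) * math.cos(yaw_h))
```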
And then determining the position information of the plane where the projected image is located based on the measurement normal vector and the coordinate information of the preset target point, wherein the target point is a preset central point where the projected image rotates.
Here, since the target point is the preset central point about which the preset projection image is rotated for yaw, pitch, roll, and the like, the coordinate information of the target point is constant. After the measurement normal vector and the target point have been determined, the position information of the plane in which the projection image lies can be determined.
And then, obtaining the three-dimensional vertex coordinates of the projection image based on the position information and by combining a pre-established ray vector, wherein the ray vector is a unit vector of a connecting line between the vertex of the projection image projected by the projector and the optical center of the projector.
Here, the ray vector is a unit vector of the connecting line between a vertex of the projection image projected by the projector and the optical center of the projector; that is, when the projector projects the projection image outward, the connecting lines between the four vertices of the projected pattern and the optical center are determinable. After the position information of the plane in which the projection image lies is determined, the intersection points of the ray vectors with that plane can be determined, and these intersection points are the 4 vertex coordinates of the projection image that the original image forms on the projection plane.
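The ray-plane intersection step can be sketched as follows; the argument names and the (normal, point) plane representation are assumptions for illustration, not the disclosure's interface:

```python
import numpy as np

def ray_plane_intersection(origin, ray_dir, plane_normal, plane_point):
    """Point where a vertex ray from the optical center meets the image plane."""
    origin, ray_dir = np.asarray(origin, float), np.asarray(ray_dir, float)
    n, p0 = np.asarray(plane_normal, float), np.asarray(plane_point, float)
    denom = n.dot(ray_dir)
    if abs(denom) < 1e-9:
        raise ValueError("ray is parallel to the plane")
    t = n.dot(p0 - origin) / denom   # ray parameter of the intersection
    return origin + t * ray_dir
```

Applying this to the four vertex ray vectors yields the 4 three-dimensional vertex coordinates of the projection image.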
The ray vector can be calculated according to the roll angle, and the specific calculation method is as follows:
acquiring optical-mechanical parameters of the projector, wherein the optical-mechanical parameters comprise a raising angle, a projection ratio and an aspect ratio of projection light;
obtaining a three-dimensional imaging vertex coordinate of a standard image projected on a projection plane by the projector under a preset condition according to optical machine parameters of the projector, wherein the preset condition is that the projector is horizontally placed, projection light of the projector is perpendicular to the projection plane, and the projector is away from the projection plane by a preset distance threshold;
and calculating a unit vector of a connecting line between the vertex of the standard image and the optical center of the projector according to the three-dimensional imaging vertex coordinates of the standard image, and taking the unit vector as the ray vector.
Here, the projection image of the projector scales similarly with depth; for example, if the projection image projected onto the projection plane is rectangular, it remains rectangular regardless of the projection depth. Therefore, the projector projects the standard image onto the projection plane under the preset condition, and the three-dimensional imaging vertex coordinates of the standard image projected under the preset condition can be calculated from the optical-mechanical parameters of the projector. The raising angle refers to the upward tilt of the projection light of the projector; in general, the raising angle is related to the model of the projector.
The specific process of calculating the three-dimensional imaging vertex coordinates of the standard image is as follows:
FIG. 5 is a schematic diagram illustrating the principle of calculating three-dimensional imaging vertex coordinates for a standard image according to an exemplary embodiment. As shown in fig. 5, the standard image has four vertices, namely a first vertex 0, a second vertex 1, a third vertex 2, and a fourth vertex 3, where the first vertex is a vertex located at the upper right corner of the projected image, the second vertex is a vertex located at the upper left corner of the projected image, the third vertex is a vertex located at the lower right corner of the projected image, and the fourth vertex is a vertex located at the lower left corner of the projected image.
According to the optical-mechanical parameters, the preset distance threshold is denoted f, the projection ratio is denoted throwRatio, w is the width of the projection image, and h is the height of the projection image. From the triangular relation, throwRatio = f/w, so that:

w = f / throwRatio

Since throwRatio = f/w and aspectRatio = w/h, it follows that:

h = f / (throwRatio · aspectRatio)

Denoting the raising angle by doffsetAngle, the bottom edge of the standard image is raised by f·tan(doffsetAngle), and the three-dimensional imaging vertex coordinates are:

the three-dimensional imaging vertex coordinates of the first vertex 0 (upper right):

( w/2, f·tan(doffsetAngle) + h, f )

the three-dimensional imaging vertex coordinates of the second vertex 1 (upper left):

( -w/2, f·tan(doffsetAngle) + h, f )

the three-dimensional imaging vertex coordinates of the third vertex 2 (lower right):

( w/2, f·tan(doffsetAngle), f )

the three-dimensional imaging vertex coordinates of the fourth vertex 3 (lower left):

( -w/2, f·tan(doffsetAngle), f )

wherein f is the preset distance threshold, doffsetAngle is the raising angle, w is the width of the projection image, and h is the height of the projection image.
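A sketch of the vertex computation under the assumptions stated above (projector at the origin looking along +Z, bottom edge lifted by f·tan(doffsetAngle)); the vertex order follows Fig. 5, and the exact placement of the lift is an assumption:

```python
import math

def standard_image_vertices(f, throw_ratio, aspect_ratio, offset_angle):
    """3D imaging vertex coordinates of the standard image at distance f."""
    w = f / throw_ratio              # throwRatio = f / w
    h = w / aspect_ratio             # aspectRatio = w / h
    y0 = f * math.tan(offset_angle)  # lift of the bottom edge
    return [( w / 2, y0 + h, f),     # vertex 0: upper right
            (-w / 2, y0 + h, f),     # vertex 1: upper left
            ( w / 2, y0,     f),     # vertex 2: lower right
            (-w / 2, y0,     f)]     # vertex 3: lower left
```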
After the three-dimensional imaging vertex coordinates of the standard image are obtained, the four ray vectors from the optical center of the projector to the four vertices can be calculated; each unit ray vector is obtained by dividing the vertex's ray vector by its modulus.
It should be understood that the ray vectors are related to the optical-mechanical parameters of the projector and remain unchanged as long as those parameters do not change.
Finally, vector decomposition is performed on the three-dimensional imaging vertex coordinates of the projection image to obtain the two-dimensional imaging vertex coordinates of the projection image.
Here, after the three-dimensional imaging vertex coordinates of the projection image are calculated, the three-dimensional imaging vertex coordinates of the four vertices need to be converted into two-dimensional imaging vertex coordinates based on vector decomposition. This is done by decomposing the vectors onto a pair of basis vectors (e1, e2) in the plane of the projection image, where e1 is taken along the intersection line of the projection image with the horizontal plane as the X-axis of the coordinate system, and e2 is perpendicular to e1. Wherein e1 can be calculated by the following calculation:

e1 = (n_h × n_p) / |n_h × n_p|

wherein n_h is the normal vector of the horizontal plane, × is the vector cross product, n_p is the normal vector of the projection image, and |n_h × n_p| is the modulus of the vector n_h × n_p.
FIG. 6 is a schematic diagram illustrating the principle of vector decomposition according to an exemplary embodiment. As shown in fig. 6, the projected image has G, I, J vertices and H vertices. After the three-dimensional imaging vertex coordinates of the projection image are found, a coordinate system is established with any one of the points G, I, J and H as the origin of coordinates to convert the three-dimensional imaging vertex coordinates into two-dimensional imaging vertex coordinates. The process of calculating the coordinates of the two-dimensional imaging vertex by vector decomposition is explained in detail in the present disclosure by establishing a coordinate system with the point H as the origin of coordinates. The three-dimensional imaging vertex coordinates of point G, I, J may be converted to two-dimensional imaging vertex coordinates using the following calculation.
X = vectorP(0)·e1(0) + vectorP(1)·e1(1) + vectorP(2)·e1(2)

Y = vectorP(0)·e2(0) + vectorP(1)·e2(1) + vectorP(2)·e2(2)

wherein X and Y are the X-axis and Y-axis coordinates of the two-dimensional imaging vertex, vectorP(0), vectorP(1) and vectorP(2) are the X-axis, Y-axis and Z-axis coordinates of the vector vectorP from the coordinate origin H to the vertex point3D, e1(0), e1(1) and e1(2) are the X-axis, Y-axis and Z-axis coordinates of the basis vector e1, and e2(0), e2(1) and e2(2) are those of the basis vector e2. For example, when solving the two-dimensional imaging vertex coordinates of the point G, point3D is the three-dimensional imaging vertex coordinate of the point G and vectorP is the HG vector; for the points J and I, vectorP is the HJ vector and the HI vector respectively.
Thus, the three-dimensional imaging vertex coordinates of the projection image can be converted into two-dimensional imaging vertex coordinates of the projection image by the above calculation formula.
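The decomposition can be sketched as below; the horizontal-plane normal and the choice of the last vertex (point H in Fig. 6) as coordinate origin are assumptions for illustration:

```python
import numpy as np

def decompose_to_2d(vertices_3d, horizontal_normal=(0.0, 1.0, 0.0)):
    """Convert 3D imaging vertices to 2D coordinates in the image plane."""
    pts = np.asarray(vertices_3d, float)
    origin = pts[-1]                                    # point H as origin
    n_p = np.cross(pts[0] - origin, pts[1] - origin)    # image-plane normal
    n_h = np.asarray(horizontal_normal, float)
    e1 = np.cross(n_h, n_p)
    e1 /= np.linalg.norm(e1)    # along the horizontal intersection line
    e2 = np.cross(n_p, e1)
    e2 /= np.linalg.norm(e2)    # perpendicular to e1, inside the plane
    # 2D coordinate = dot products of the in-plane vector with the basis.
    return [((p - origin).dot(e1), (p - origin).dot(e2)) for p in pts]
```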
In some realizable embodiments, after obtaining the three-dimensional imaging vertex coordinates of the standard image projected on the projection plane by the projector according to the optical-mechanical parameters of the projector, the method further comprises:
acquiring a current roll angle of the projector;
when the current roll angle does not meet a preset threshold value, correcting the X-axis coordinate and the Y-axis coordinate in the three-dimensional imaging vertex coordinate of the standard image according to the current roll angle by combining a second preset calculation formula, wherein the second preset calculation formula is as follows:
X'_i = (X_i - x_c)·cos(r) - (Y_i - y_c)·sin(r) + x_c

Y'_i = (X_i - x_c)·sin(r) + (Y_i - y_c)·cos(r) + y_c

wherein X'_i is the corrected X-axis coordinate of the ith vertex of the standard image, Y'_i is the corrected Y-axis coordinate of the ith vertex of the standard image, X_i is the X-axis coordinate before correction of the ith vertex of the standard image, Y_i is the Y-axis coordinate before correction of the ith vertex of the standard image, x_c is the X-axis coordinate of the rotation center about which the projector rolls, y_c is the Y-axis coordinate of the rotation center, and r is the current roll angle;
and taking the corrected X-axis coordinate and the corrected Y-axis coordinate as a new X-axis coordinate and a new Y-axis coordinate of the vertex of the standard image.
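Assuming the second preset calculation formula is a standard 2D rotation about the rotation center (a reconstruction consistent with the description of rolling about a center), a minimal sketch, with the angle in radians, is:

```python
import math

def roll_correct(x, y, roll, cx=0.0, cy=0.0):
    """Rotate a vertex (x, y) by the roll angle about the center (cx, cy)."""
    c, s = math.cos(roll), math.sin(roll)
    return (c * (x - cx) - s * (y - cy) + cx,
            s * (x - cx) + c * (y - cy) + cy)
```

Applying this to each vertex's X and Y coordinates yields the corrected coordinates used as the new vertex coordinates of the standard image.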
Here, the current roll angle of the projector may be obtained by the Inertial Measurement Unit (IMU) provided in the projector. When the current roll angle does not satisfy the preset threshold, this indicates that the projector has rotated in roll; for example, if the current roll angle is not 0, the projector has rolled. When the projector rolls, the standard image of the projector rolls about the optical-center ray as the rotation axis, and the X-axis and Y-axis coordinates of the three-dimensional imaging vertex coordinates of the standard image change. Therefore, the X-axis and Y-axis coordinates of the rolled standard image need to be calculated based on the second preset calculation formula, yielding the corrected X-axis and Y-axis coordinates of each vertex and hence new three-dimensional imaging vertex coordinates of the standard image. The ray vectors are then recalculated based on the new three-dimensional imaging vertex coordinates, and the three-dimensional imaging vertex coordinates of the projection image are solved.
It should be understood that the coordinates of the rotation center rotap may be (0, 0); the rotation center rotap refers to the center about which the projector rolls, while the preset central point describes the offset of the projection image after the yaw and pitch rotations of the projector.
In this way, the roll angle takes into account the change of the projection image after the projector rolls, thereby realizing accurate trapezoidal correction.
In step S1063, homography is a concept in projective geometry, also called projective transformation. It maps points (represented as three-dimensional homogeneous vectors) on one projective plane onto another projective plane. If the homography between two images is known, an image in one plane can be converted into the other plane, and projection correction is performed by means of this plane-to-plane conversion. Therefore, after the two-dimensional vertex coordinates of the original image of the projector and the two-dimensional vertex coordinates of the projection image are known, a corresponding homography matrix relationship can be constructed, the homography matrix relationship referring to the association by which the original image of the projector is mapped onto the projection image on the wall surface or curtain.
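As a self-contained sketch of the homography relation (a direct linear transform over the four vertex correspondences; this is a textbook method, not code from the disclosure, and in practice a routine such as OpenCV's findHomography could be used instead):

```python
import numpy as np

def homography_from_points(src, dst):
    """3x3 homography mapping four src vertices to four dst vertices (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null-space vector of the 8x9 constraint matrix.
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

def apply_homography(h, pt):
    """Map a 2D point through the homography via homogeneous coordinates."""
    x, y, w = h @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)
```

In step S1065 the inverse of this matrix maps the target rectangle's vertices back to corrected original-image coordinates.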
In step S1064, the target rectangle is a rectangle selected within the area of the projection image, and may be the rectangle having the largest area within that region. The target rectangle is set to be the rectangle with the largest area so as to maximize the projection area and improve the user experience.
It should be understood that, in the above-described embodiment, one manner of calculating the two-dimensional vertex coordinates of the projection image is proposed; in specific applications, the two-dimensional vertex coordinates of the projection image may be calculated not only by the method disclosed above but also by other methods. For example, the vertex coordinates of the rotated original image may be calculated based on the offset information and the vertex coordinates of the original image, where the vertex coordinates of the rotated original image refer to the vertex coordinates of the original image after rotation by the yaw angle, the pitch angle and the roll angle; the two-dimensional vertex coordinates of the projection image mapped onto the projection plane by the rotated original image are then calculated based on the calculated projection depth of the projector, the projection depth being the distance between the projector and the projection plane.
In one practical implementation, the selecting of a target rectangle from the projection image in step S1064 may include:
randomly selecting a point from any side of the projected image, and generating a rectangle in the area of the projected image by taking the point as the vertex of the rectangle to be constructed and the aspect ratio of the original image as the aspect ratio of the rectangle to be constructed;
and selecting the rectangle with the largest area from the generated rectangles as the target rectangle.
Here, the specific way of selecting the target rectangle may be to arbitrarily select a point on any side of the projection image, generate a rectangle in the region of the projection image with the point as a vertex of the rectangle to be constructed and the aspect ratio of the original image as the aspect ratio of the rectangle to be constructed, and select a rectangle with the largest area from the generated rectangles as the target rectangle.
For example, traversing the longest side of the projected image and the side adjacent to the longest side, selecting any point as the vertex of the rectangle to be constructed, generating the rectangle with the aspect ratio consistent with that of the original image to the periphery of the projected image, and finding out the rectangle with the largest area from all the generated rectangles as the target rectangle after the traversal is completed.
Therefore, the rectangle with the largest area is selected as the target rectangle, the area of the projected image watched by the user is ensured to be the largest, and the watching experience of the user is improved.
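A coarse brute-force sketch of the search (a sampled grid rather than the exact edge traversal described above; the helper names and the counter-clockwise vertex-order assumption are mine):

```python
def inside_convex(quad, pt, eps=1e-9):
    """True if pt lies inside a convex quadrilateral given in CCW order."""
    for i in range(4):
        (x1, y1), (x2, y2) = quad[i], quad[(i + 1) % 4]
        # Cross product must be non-negative for every CCW edge.
        if (x2 - x1) * (pt[1] - y1) - (y2 - y1) * (pt[0] - x1) < -eps:
            return False
    return True

def largest_rectangle(quad, aspect, samples=30):
    """Sampled search for a large rectangle of the given aspect ratio (w/h)."""
    xs = [p[0] for p in quad]
    ys = [p[1] for p in quad]
    span_x, span_y = max(xs) - min(xs), max(ys) - min(ys)
    best, best_area = None, 0.0
    for i in range(samples):            # candidate top-left corners
        for j in range(samples):
            x = min(xs) + span_x * i / (samples - 1)
            y = min(ys) + span_y * j / (samples - 1)
            for k in range(1, samples):  # grow until a corner leaves the quad
                w = span_x * k / samples
                h = w / aspect
                corners = [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]
                if not all(inside_convex(quad, c) for c in corners):
                    break
                if w * h > best_area:
                    best, best_area = corners, w * h
    return best
```

A finer sample count trades runtime for a tighter approximation of the true maximum-area rectangle.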
In step S1065, after the two-dimensional vertex coordinates of the target rectangle are obtained, they may be used as input values of the homography matrix relationship to calculate the two-dimensional vertex coordinates of the corrected original image, so that when the projector projects according to the corrected two-dimensional vertex coordinates, the projection image presented in the user's field of view is a rectangle. That is, before correction the projection image in the user's field of view appears as a trapezoid, and after correction it appears as a rectangle.
It should be noted that the vertex coordinates mentioned in the above embodiments refer to the 4 vertex coordinates of the projection image on the projection plane.
Fig. 7 is a block diagram illustrating a projection correction apparatus according to an exemplary embodiment. As shown in fig. 7, the apparatus 400 includes:
a response module 401 configured to control the projector to project a preset image to the projection plane in response to the received correction instruction;
a shooting module 402 configured to shoot the preset image projected by the projector through a camera of the projector to obtain a shot image;
a target feature point determination module 403 configured to identify a target feature point of the preset image in the captured image;
a three-dimensional coordinate determination module 404, configured to determine, for each target feature point, depth information of the target feature point in a shooting space of the camera according to a mapping relationship calibrated in advance for the target feature point and a camera coordinate of the target feature point on the shot image, so as to obtain a three-dimensional coordinate of the target feature point in a projection space of the projector, where the mapping relationship is an association relationship between the depth information of the target feature point calibrated at different depths and an offset of the camera coordinate;
a normal vector module 405 configured to determine a normal vector of the projection plane relative to the projector according to the three-dimensional coordinates of each of the target feature points;
a correcting module 406, configured to correct the two-dimensional vertex coordinates of the original image of the projector according to the normal vector of the projection plane and the current pose information of the projector, to obtain two-dimensional vertex coordinates of the corrected original image;
and a projection module 407 configured to control the projector to project according to the two-dimensional vertex coordinates of the corrected original image.
Optionally, the preset image is a checkerboard image, and the response module 401 includes:
the instruction response submodule is configured to determine initial feature points in the checkerboard image by adopting an angular point detection algorithm;
the straight line fitting submodule is configured to perform straight line fitting on the initial characteristic points in the same preset area range respectively in the vertical direction and the horizontal direction;
and the characteristic point determining submodule is configured to use an intersection point between any straight line in the vertical direction and any straight line in the horizontal direction, which are obtained through fitting, as the target characteristic point.
Optionally, the three-dimensional coordinate determination module 404 is specifically configured to:
for each target feature point, calculating depth information of the target feature point in a shooting space of the camera according to a mapping relation pre-calibrated for the target feature point and camera coordinates of the target feature point on the shot image, wherein the mapping relation is as follows:
h = a1 / (a2 - X)

wherein h is the depth information of the target feature point, a1 is a first preset calibration parameter of the target feature point, a2 is a second preset calibration parameter of the target feature point, X is a camera coordinate of the target feature point, and the first preset calibration parameter and the second preset calibration parameter are constants.
Optionally, the three-dimensional coordinate determination module 404 is specifically configured to:
under the condition that the projector is a first distance away from a projection plane and the projection light of the projector is perpendicular to the projection plane, projecting a preset image to the projection plane, and shooting the preset image through a camera of the projector to obtain a first image;
determining first camera coordinates and first depth information of a target feature point on the first image based on the first image and the first distance;
under the condition that the projector is at a second distance from the projection plane and the projection light of the projector is perpendicular to the projection plane, projecting a preset image to the projection plane, and shooting the preset image through a camera of the projector to obtain a second image;
determining second camera coordinates and second depth information of the target feature points on the second image according to the second image and the second distance;
according to the first camera coordinates and the first depth information, and the second camera coordinates and the second depth information, obtaining the first preset calibration parameter and the second preset calibration parameter by combining the pre-established mapping relation between the depth information of the target feature point and the camera coordinate, wherein the mapping relation is:

[equation rendered only as an image in the original publication; not reproduced here]

wherein h is the depth information of the target feature point, X is the camera coordinate, and the two remaining symbols are the first preset calibration parameter and the second preset calibration parameter.
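The two-distance procedure above determines two constants per feature point. The patent's exact mapping equation survives only as an image in this extraction, so purely as an illustration the sketch below assumes a linear model h = a·X + b, which two calibration shots at known depths suffice to solve; all names and the model form are assumptions, not the patent's.

```python
def calibrate(x1, h1, x2, h2):
    """Solve the assumed model h = a*x + b from two calibration shots.

    (x1, h1) comes from the first image at the known first distance,
    (x2, h2) from the second image at the known second distance.
    """
    if x1 == x2:
        raise ValueError("calibration shots must give distinct camera coordinates")
    a = (h1 - h2) / (x1 - x2)   # first preset calibration parameter
    b = h1 - a * x1             # second preset calibration parameter
    return a, b

def depth(x, a, b):
    """Depth of the feature point from its camera coordinate, per the assumed model."""
    return a * x + b
```

In use, each target feature point gets its own (a, b) pair, so at run time a single captured camera coordinate per point is enough to recover per-point depth.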
Optionally, the correction module 406 includes:
the offset information calculation module is configured to calculate offset information of the projector normal vector according to the normal vector of the projection plane and current pose information of the projector, wherein the offset information comprises a yaw angle, a pitch angle and a roll angle;
the vertex calculation module is configured to calculate and obtain two-dimensional vertex coordinates of a projection image projected to the projection plane by the projector based on the offset information;
the homography matrix module is configured to establish a homography matrix relation between the projection image and the original image based on the two-dimensional vertex coordinates of the projection image and the two-dimensional vertex coordinates of the original image of the projector;
the target rectangle selection module is configured to select a target rectangle from the projection image of the projector and determine two-dimensional vertex coordinates of the target rectangle;
and the vertex correction module is configured to obtain the two-dimensional vertex coordinates of the corrected original image by combining the homography matrix relation based on the two-dimensional vertex coordinates of the target rectangle.
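The last two submodule steps — relating projected and original vertices through a homography, then pulling the target rectangle's vertices back to the original image — can be sketched as follows. The 3×3 matrix H and all names are illustrative; a real implementation would estimate H from the four vertex correspondences rather than receive it ready-made.

```python
def apply_h(H, pt):
    """Apply a 3x3 homography (nested lists) to a 2D point, with normalization."""
    x, y = pt
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return u / w, v / w

def invert_3x3(H):
    """Inverse via the adjugate; adequate for a well-conditioned homography."""
    (a, b, c), (d, e, f), (g, h, i) = H
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [[e * i - f * h, c * h - b * i, b * f - c * e],
           [f * g - d * i, a * i - c * g, c * d - a * f],
           [d * h - e * g, b * g - a * h, a * e - b * d]]
    return [[m / det for m in row] for row in adj]

def correct_vertices(H, rect_vertices):
    """Map the target rectangle's vertices back to original-image coordinates."""
    Hinv = invert_3x3(H)
    return [apply_h(Hinv, p) for p in rect_vertices]
```

Rendering the original image warped to these corrected vertices makes the physically projected picture land exactly on the target rectangle, which is the essence of the keystone correction described here.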
Optionally, the target rectangle selecting module is specifically configured to:
randomly selecting a point from any side of the projected image, and generating a rectangle in the area of the projected image by taking the point as the vertex of the rectangle to be constructed and the aspect ratio of the original image as the aspect ratio of the rectangle to be constructed;
and selecting the rectangle with the largest area from the generated rectangles as the target rectangle.
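A toy version of this random selection, assuming a convex projected quadrilateral; the names, the fixed search scale, and the sampling strategy are ours, not the patent's.

```python
import random

def inside(quad, p):
    """True if p is inside (or on) a convex quad given in consistent winding order."""
    sign = 0
    for i in range(4):
        ax, ay = quad[i]
        bx, by = quad[(i + 1) % 4]
        cross = (bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax)
        if cross != 0:
            if sign == 0:
                sign = 1 if cross > 0 else -1
            elif (cross > 0) != (sign > 0):
                return False
    return True

def largest_rect(quad, aspect, tries=2000, seed=0):
    """Best (x, y, w, h) rectangle with w/h == aspect found by random search."""
    rng = random.Random(seed)
    best = None
    for _ in range(tries):
        # Random point on a random edge serves as one rectangle corner.
        i = rng.randrange(4)
        t = rng.random()
        ax, ay = quad[i]
        bx, by = quad[(i + 1) % 4]
        x, y = ax + t * (bx - ax), ay + t * (by - ay)
        w = rng.random() * 2.0  # toy search scale; tune to the image extent
        h = w / aspect
        corners = [(x, y), (x + w, y), (x, y + h), (x + w, y + h)]
        if all(inside(quad, c) for c in corners):
            if best is None or w * h > best[2] * best[3]:
                best = (x, y, w, h)
    return best
```

Anchoring candidates on the quadrilateral's edges, as the text describes, is what lets the search find maximal rectangles: the largest inscribed rectangle of a fixed aspect ratio always touches the boundary.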
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
The present disclosure also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the projection correction method provided by the present disclosure.
FIG. 8 is a block diagram illustrating an electronic device in accordance with an example embodiment. As shown in FIG. 8, the electronic device 500 may include: a processor 501 and a memory 502. The electronic device 500 may also include one or more of a multimedia component 503, an input/output (I/O) interface 504, and a communication component 505.
The processor 501 is configured to control the overall operation of the electronic device 500, so as to complete all or part of the steps in the projection correction method.
The memory 502 is used to store various types of data to support operation at the electronic device 500, such as instructions for any application or method operating on the electronic device 500 and application-related data such as contact data, messages, pictures, audio, and video. The memory 502 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disc.
The multimedia component 503 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals. The received audio signal may further be stored in the memory 502 or transmitted through the communication component 505. The audio component also includes at least one speaker for outputting audio signals.
The I/O interface 504 provides an interface between the processor 501 and other interface modules, such as a keyboard, mouse, buttons, etc. These buttons may be virtual buttons or physical buttons.
The communication component 505 is used for wired or wireless communication between the electronic device 500 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, or 4G, or a combination of one or more of them; accordingly, the communication component 505 may include a Wi-Fi module, a Bluetooth module, and an NFC module.
In an exemplary embodiment, the electronic Device 500 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the above-described projection correction method.
In another exemplary embodiment, a computer-readable storage medium is also provided, which comprises program instructions, which when executed by a processor, implement the steps of the projection correction method described above. For example, the computer readable storage medium may be the memory 502 described above that includes program instructions that are executable by the processor 501 of the electronic device 500 to perform the projection correction method described above.
The preferred embodiments of the present disclosure are described in detail with reference to the accompanying drawings, however, the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present disclosure within the technical idea of the present disclosure, and these simple modifications all belong to the protection scope of the present disclosure.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner without departing from the scope of the present disclosure. To avoid unnecessary repetition, the various possible combinations are not separately described in this disclosure.
In addition, any combination of various embodiments of the present disclosure may be made, and the same should be considered as the disclosure of the present disclosure, as long as it does not depart from the spirit of the present disclosure.

Claims (10)

1. A method of projection correction, the method comprising:
controlling a projector to project a preset image to a projection plane in response to the received correction instruction;
shooting the preset image projected by the projector through a camera of the projector to obtain a shot image;
identifying target feature points of the preset image in the shot image;
for each target feature point, determining depth information of the target feature point in a shooting space of the camera according to a mapping relation calibrated in advance for the target feature point and a camera coordinate of the target feature point on the shot image so as to obtain a three-dimensional coordinate of the target feature point in a projection space of the projector, wherein the mapping relation is an association relation between the depth information of the target feature point calibrated at different depths and an offset of the camera coordinate;
determining a normal vector of the projection plane relative to the projector according to the three-dimensional coordinates of each target feature point;
correcting the two-dimensional vertex coordinates of the original image of the projector according to the normal vector of the projection plane and the current pose information of the projector to obtain the two-dimensional vertex coordinates of the corrected original image;
and controlling the projector to project according to the two-dimensional vertex coordinates of the corrected original image.
2. The method according to claim 1, wherein the preset image is a checkerboard image; the identifying of the target feature point of the preset image comprises the following steps:
determining initial feature points in the checkerboard image by adopting a corner detection algorithm;
performing straight-line fitting, in the vertical direction and the horizontal direction respectively, on the initial feature points within the same preset area range;
and taking each intersection between a fitted straight line in the vertical direction and a fitted straight line in the horizontal direction as the target feature point.
3. The method according to claim 1, wherein the determining, for each target feature point, depth information of the target feature point in the shooting space of the camera according to a mapping relationship pre-calibrated for the target feature point and camera coordinates of the target feature point on the shot image includes:
for each target feature point, calculating depth information of the target feature point in the shooting space of the camera according to a mapping relation pre-calibrated for the target feature point and the camera coordinates of the target feature point on the shot image, wherein the mapping relation is:

[equation rendered only as an image in the original publication; not reproduced here]

wherein h is the depth information of the target feature point, X is the camera coordinate of the target feature point, and the two remaining symbols are the first preset calibration parameter and the second preset calibration parameter of the target feature point, both of which are constants.
4. The method according to claim 3, wherein for each target feature point, the first preset calibration parameter and the second preset calibration parameter of the target feature point are obtained by:
under the condition that the projector is a first distance away from a projection plane and the projection light of the projector is perpendicular to the projection plane, projecting a preset image to the projection plane, and shooting the preset image through a camera of the projector to obtain a first image;
determining first camera coordinates and first depth information of a target feature point on the first image based on the first image and the first distance;
under the condition that the projector is at a second distance from the projection plane and the projection light of the projector is perpendicular to the projection plane, projecting a preset image to the projection plane, and shooting the preset image through a camera of the projector to obtain a second image;
determining second camera coordinates and second depth information of the target feature points on the second image according to the second image and the second distance;
according to the first camera coordinates and the first depth information, and the second camera coordinates and the second depth information, obtaining the first preset calibration parameter and the second preset calibration parameter by combining the pre-established mapping relation between the depth information of the target feature point and the camera coordinate, wherein the mapping relation is:

[equation rendered only as an image in the original publication; not reproduced here]

wherein h is the depth information of the target feature point, X is the camera coordinate, and the two remaining symbols are the first preset calibration parameter and the second preset calibration parameter.
5. The method according to claim 1, wherein the correcting the two-dimensional vertex coordinates of the original image of the projector according to the normal vector of the projection plane and the current pose information of the projector to obtain the two-dimensional vertex coordinates of the corrected original image includes:
calculating offset information of the projector normal vector according to the normal vector of the projection plane and the current pose information of the projector, wherein the offset information comprises a yaw angle, a pitch angle and a roll angle;
calculating to obtain two-dimensional vertex coordinates of a projection image projected to the projection plane by the projector based on the offset information;
establishing a homography matrix relation between the projected image and the original image based on the two-dimensional vertex coordinates of the projected image and the two-dimensional vertex coordinates of the original image of the projector;
selecting a target rectangle from the projection image of the projector, and determining the two-dimensional vertex coordinates of the target rectangle;
and obtaining the two-dimensional vertex coordinates of the corrected original image by combining the homography matrix relation based on the two-dimensional vertex coordinates of the target rectangle.
6. The method of claim 5, wherein the selecting a target rectangle from the projection image of the projector comprises:
randomly selecting a point from any side of the projected image, and generating a rectangle in the area of the projected image by taking the point as the vertex of the rectangle to be constructed and the aspect ratio of the original image as the aspect ratio of the rectangle to be constructed;
and selecting the rectangle with the largest area from the generated rectangles as the target rectangle.
7. A projection correction apparatus, characterized in that the apparatus comprises:
the response module is configured to respond to the received correction instruction and control the projector to project a preset image to the projection plane;
the shooting module is configured to shoot the preset image projected by the projector through a camera of the projector to obtain a shot image;
a target feature point determination module configured to identify a target feature point of the preset image in the captured image;
the three-dimensional coordinate determination module is configured to determine, for each target feature point, depth information of the target feature point in a shooting space of the camera according to a mapping relationship calibrated in advance for the target feature point and a camera coordinate of the target feature point on the shot image, so as to obtain a three-dimensional coordinate of the target feature point in a projection space of the projector, wherein the mapping relationship is an association relationship between the depth information of the target feature point calibrated at different depths and an offset of the camera coordinate;
a normal vector module configured to determine a normal vector of the projection plane relative to the projector according to the three-dimensional coordinates of each of the target feature points;
the correction module is configured to correct the two-dimensional vertex coordinates of the original image of the projector according to the normal vector of the projection plane and the current pose information of the projector to obtain the two-dimensional vertex coordinates of the corrected original image;
and the projection module is configured to control the projector to project according to the two-dimensional vertex coordinates of the corrected original image.
8. The apparatus of claim 7, wherein the preset image is a checkerboard image, and the response module comprises:
the instruction response submodule is configured to determine initial feature points in the checkerboard image by adopting an angular point detection algorithm;
the straight line fitting submodule is configured to perform straight line fitting on the initial characteristic points in the same preset area range respectively in the vertical direction and the horizontal direction;
and the characteristic point determining submodule is configured to use an intersection point between any straight line in the vertical direction and any straight line in the horizontal direction, which are obtained through fitting, as the target characteristic point.
9. A computer-readable storage medium, on which a computer program is stored, which program, when being executed by processing means, carries out the steps of the method of any one of claims 1 to 6.
10. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 6.
CN202110297235.1A 2021-03-19 2021-03-19 Projection correction method, projection correction device, storage medium and electronic equipment Active CN112689135B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110297235.1A CN112689135B (en) 2021-03-19 2021-03-19 Projection correction method, projection correction device, storage medium and electronic equipment
PCT/CN2021/115160 WO2022193559A1 (en) 2021-03-19 2021-08-27 Projection correction method and apparatus, storage medium, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110297235.1A CN112689135B (en) 2021-03-19 2021-03-19 Projection correction method, projection correction device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN112689135A true CN112689135A (en) 2021-04-20
CN112689135B CN112689135B (en) 2021-07-02

Family

ID=75455702

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110297235.1A Active CN112689135B (en) 2021-03-19 2021-03-19 Projection correction method, projection correction device, storage medium and electronic equipment

Country Status (2)

Country Link
CN (1) CN112689135B (en)
WO (1) WO2022193559A1 (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112804508A (en) * 2021-03-19 2021-05-14 深圳市火乐科技发展有限公司 Projector correction method, projector correction system, storage medium, and electronic device
CN113645456A (en) * 2021-09-22 2021-11-12 业成科技(成都)有限公司 Projection image correction method, projection system, and readable storage medium
CN113824939A (en) * 2021-09-29 2021-12-21 深圳市火乐科技发展有限公司 Projection image adjusting method and device, projection equipment and storage medium
CN113983951A (en) * 2021-09-10 2022-01-28 深圳市辰卓科技有限公司 Three-dimensional target measuring method and device, imager and storage medium
CN114383812A (en) * 2022-01-17 2022-04-22 深圳市火乐科技发展有限公司 Method and device for detecting stability of sensor, electronic equipment and medium
CN114449249A (en) * 2022-01-29 2022-05-06 深圳市火乐科技发展有限公司 Image projection method, image projection device, storage medium and projection equipment
CN114615478A (en) * 2022-02-28 2022-06-10 青岛信芯微电子科技股份有限公司 Projection picture correction method, projection picture correction system, projection device, and storage medium
CN115086625A (en) * 2022-05-12 2022-09-20 峰米(重庆)创新科技有限公司 Correction method, device and system of projection picture, correction equipment and projection equipment
WO2022193558A1 (en) * 2021-03-19 2022-09-22 深圳市火乐科技发展有限公司 Projector correction method and system, and storage medium and electronic device
WO2022193559A1 (en) * 2021-03-19 2022-09-22 深圳市火乐科技发展有限公司 Projection correction method and apparatus, storage medium, and electronic device
CN115103169A (en) * 2022-06-10 2022-09-23 深圳市火乐科技发展有限公司 Projection picture correction method, projection picture correction device, storage medium and projection equipment
CN115314689A (en) * 2022-08-05 2022-11-08 深圳海翼智新科技有限公司 Projection correction method, projection correction device, projector and computer program product
CN115442584A (en) * 2022-08-30 2022-12-06 中国传媒大学 Multi-sensor fusion irregular surface dynamic projection method
WO2022267027A1 (en) * 2021-06-25 2022-12-29 闻泰科技(深圳)有限公司 Image correction method and apparatus, and electronic device and storage medium
CN116033131A (en) * 2022-12-29 2023-04-28 深圳创维数字技术有限公司 Image correction method, device, electronic equipment and readable storage medium
CN116433476A (en) * 2023-06-09 2023-07-14 有方(合肥)医疗科技有限公司 CT image processing method and device
WO2023216822A1 (en) * 2022-05-13 2023-11-16 北京字节跳动网络技术有限公司 Image correction method and apparatus, electronic device, and storage medium
CN117066702A (en) * 2023-08-25 2023-11-17 上海频准激光科技有限公司 Laser marking control system based on laser
CN116033131B (en) * 2022-12-29 2024-05-17 深圳创维数字技术有限公司 Image correction method, device, electronic equipment and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107147888A (en) * 2017-05-16 2017-09-08 深圳市火乐科技发展有限公司 One kind corrects distortion methods and device automatically using graph processing chips
EP3287986A1 (en) * 2016-08-23 2018-02-28 National Taiwan University of Science and Technology Image correction method of projector and image correction system
CN110099267A (en) * 2019-05-27 2019-08-06 广州市讯码通讯科技有限公司 Trapezoidal correcting system, method and projector
CN110769217A (en) * 2018-10-10 2020-02-07 成都极米科技股份有限公司 Image processing method, projection apparatus, and photographing apparatus
CN112422939A (en) * 2021-01-25 2021-02-26 深圳市橙子数字科技有限公司 Trapezoidal correction method and device for projection equipment, projection equipment and medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6176114B2 (en) * 2011-09-15 2017-08-09 日本電気株式会社 Projected image automatic correction system, projected image automatic correction method and program
CN102611822B (en) * 2012-03-14 2015-07-01 海信集团有限公司 Projector and projection image rectifying method thereof
JP2015007866A (en) * 2013-06-25 2015-01-15 ローランドディー.ジー.株式会社 Projection image correction system, projection image correction method, projection image correction program, and computer-readable recording medium
CN107749979B (en) * 2017-09-20 2021-08-31 神画科技(深圳)有限公司 Left-right trapezoidal correction method for projector
CN110336987B (en) * 2019-04-03 2021-10-08 北京小鸟听听科技有限公司 Projector distortion correction method and device and projector
CN111093067B (en) * 2019-12-31 2023-03-24 歌尔光学科技有限公司 Projection apparatus, lens distortion correction method, distortion correction device, and storage medium
CN112689135B (en) * 2021-03-19 2021-07-02 深圳市火乐科技发展有限公司 Projection correction method, projection correction device, storage medium and electronic equipment
CN112804507B (en) * 2021-03-19 2021-08-31 深圳市火乐科技发展有限公司 Projector correction method, projector correction system, storage medium, and electronic device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3287986A1 (en) * 2016-08-23 2018-02-28 National Taiwan University of Science and Technology Image correction method of projector and image correction system
CN107147888A (en) * 2017-05-16 2017-09-08 深圳市火乐科技发展有限公司 One kind corrects distortion methods and device automatically using graph processing chips
CN110769217A (en) * 2018-10-10 2020-02-07 成都极米科技股份有限公司 Image processing method, projection apparatus, and photographing apparatus
CN110099267A (en) * 2019-05-27 2019-08-06 广州市讯码通讯科技有限公司 Trapezoidal correcting system, method and projector
CN112422939A (en) * 2021-01-25 2021-02-26 深圳市橙子数字科技有限公司 Trapezoidal correction method and device for projection equipment, projection equipment and medium

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022193560A1 (en) * 2021-03-19 2022-09-22 深圳市火乐科技发展有限公司 Projector correction method and system, and storage medium and electronic device
CN112804508B (en) * 2021-03-19 2021-08-31 深圳市火乐科技发展有限公司 Projector correction method, projector correction system, storage medium, and electronic device
CN112804508A (en) * 2021-03-19 2021-05-14 深圳市火乐科技发展有限公司 Projector correction method, projector correction system, storage medium, and electronic device
WO2022193559A1 (en) * 2021-03-19 2022-09-22 深圳市火乐科技发展有限公司 Projection correction method and apparatus, storage medium, and electronic device
WO2022193558A1 (en) * 2021-03-19 2022-09-22 深圳市火乐科技发展有限公司 Projector correction method and system, and storage medium and electronic device
WO2022267027A1 (en) * 2021-06-25 2022-12-29 闻泰科技(深圳)有限公司 Image correction method and apparatus, and electronic device and storage medium
CN113983951B (en) * 2021-09-10 2024-03-29 深圳市辰卓科技有限公司 Three-dimensional target measuring method, device, imager and storage medium
CN113983951A (en) * 2021-09-10 2022-01-28 深圳市辰卓科技有限公司 Three-dimensional target measuring method and device, imager and storage medium
CN113645456B (en) * 2021-09-22 2023-11-07 业成科技(成都)有限公司 Projection image correction method, projection system and readable storage medium
CN113645456A (en) * 2021-09-22 2021-11-12 业成科技(成都)有限公司 Projection image correction method, projection system, and readable storage medium
CN113824939A (en) * 2021-09-29 2021-12-21 深圳市火乐科技发展有限公司 Projection image adjusting method and device, projection equipment and storage medium
CN114383812A (en) * 2022-01-17 2022-04-22 深圳市火乐科技发展有限公司 Method and device for detecting stability of sensor, electronic equipment and medium
CN114449249A (en) * 2022-01-29 2022-05-06 深圳市火乐科技发展有限公司 Image projection method, image projection device, storage medium and projection equipment
CN114449249B (en) * 2022-01-29 2024-02-09 深圳市火乐科技发展有限公司 Image projection method, image projection device, storage medium and projection apparatus
CN114615478A (en) * 2022-02-28 2022-06-10 青岛信芯微电子科技股份有限公司 Projection picture correction method, projection picture correction system, projection device, and storage medium
CN114615478B (en) * 2022-02-28 2023-12-01 青岛信芯微电子科技股份有限公司 Projection screen correction method, projection screen correction system, projection apparatus, and storage medium
CN115086625A (en) * 2022-05-12 2022-09-20 峰米(重庆)创新科技有限公司 Correction method, device and system of projection picture, correction equipment and projection equipment
CN115086625B (en) * 2022-05-12 2024-03-15 峰米(重庆)创新科技有限公司 Correction method, device and system for projection picture, correction equipment and projection equipment
WO2023216822A1 (en) * 2022-05-13 2023-11-16 北京字节跳动网络技术有限公司 Image correction method and apparatus, electronic device, and storage medium
CN115103169B (en) * 2022-06-10 2024-02-09 深圳市火乐科技发展有限公司 Projection picture correction method, projection picture correction device, storage medium and projection device
CN115103169A (en) * 2022-06-10 2022-09-23 深圳市火乐科技发展有限公司 Projection picture correction method, projection picture correction device, storage medium and projection equipment
CN115314689A (en) * 2022-08-05 2022-11-08 深圳海翼智新科技有限公司 Projection correction method, projection correction device, projector and computer program product
CN115442584B (en) * 2022-08-30 2023-08-18 中国传媒大学 Multi-sensor fusion type special-shaped surface dynamic projection method
CN115442584A (en) * 2022-08-30 2022-12-06 中国传媒大学 Multi-sensor fusion irregular surface dynamic projection method
CN116033131A (en) * 2022-12-29 2023-04-28 深圳创维数字技术有限公司 Image correction method, device, electronic equipment and readable storage medium
CN116033131B (en) * 2022-12-29 2024-05-17 深圳创维数字技术有限公司 Image correction method, device, electronic equipment and readable storage medium
CN116433476B (en) * 2023-06-09 2023-09-08 有方(合肥)医疗科技有限公司 CT image processing method and device
CN116433476A (en) * 2023-06-09 2023-07-14 有方(合肥)医疗科技有限公司 CT image processing method and device
CN117066702A (en) * 2023-08-25 2023-11-17 上海频准激光科技有限公司 Laser marking control system based on laser
CN117066702B (en) * 2023-08-25 2024-04-19 上海频准激光科技有限公司 Laser marking control system based on laser

Also Published As

Publication number Publication date
WO2022193559A1 (en) 2022-09-22
CN112689135B (en) 2021-07-02

Similar Documents

Publication Publication Date Title
CN112689135B (en) Projection correction method, projection correction device, storage medium and electronic equipment
WO2021103347A1 (en) Projector keystone correction method, apparatus, and system, and readable storage medium
CN112804508B (en) Projector correction method, projector correction system, storage medium, and electronic device
CN112804507B (en) Projector correction method, projector correction system, storage medium, and electronic device
US20240153143A1 (en) Multi view camera registration
JP5961945B2 (en) Image processing apparatus, projector and projector system having the image processing apparatus, image processing method, program thereof, and recording medium recording the program
CN106846409B (en) Calibration method and device of fisheye camera
US20200177866A1 (en) Calibration apparatus, chart for calibration, chart pattern generation apparatus, and calibration method
CN114727081B (en) Projector projection correction method and device and projector
WO2021031781A1 (en) Method and device for calibrating projection image and projection device
CN112272292B (en) Projection correction method, apparatus and storage medium
KR101694651B1 (en) Distortion compensation apparatus and method for wide-angle imaging lens using three-dimensional position estimate
KR101649753B1 (en) Calibrating method for images from multiview cameras and controlling system for multiview cameras
US20130314533A1 (en) Data deriving apparatus
JP6990694B2 (en) Projector, data creation method for mapping, program and projection mapping system
CN114286066A (en) Projection correction method, projection correction device, storage medium and projection equipment
CN111131801A (en) Projector correction system and method and projector
CN114286068A (en) Focusing method, focusing device, storage medium and projection equipment
CN115174878B (en) Projection picture correction method, apparatus and storage medium
TW202038185A (en) Method for correcting distortion image and apparatus thereof
CN114071103A (en) Adaptive left-right trapezoidal correction method for projector
CN114339179A (en) Projection correction method, projection correction device, storage medium and projection equipment
CN115174879B (en) Projection screen correction method, apparatus, computer device and storage medium
CN114740681B (en) Intelligent ranging adjustment system of monolithic liquid crystal projector with rotary lens
WO2024080234A1 (en) Projection device, correction device, projection system, correction method, and computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant