CN110336987A - Projector distortion correction method, apparatus, and projector - Google Patents
- Publication number
- CN110336987A (application CN201910266198.0A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- point
- camera lens
- optical engine
- plane
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
Abstract
The present invention discloses a projector distortion correction method, a corresponding apparatus, and a projector. The method comprises: projecting a standard grid image from the raster pixel plane onto the projection surface through the optical engine lens, and capturing the projection surface with a camera to obtain a projected point-cloud image of the grid points on the projection surface; obtaining, from a pre-built three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the optical engine lens coordinate system; fitting a reference projection plane to those three-dimensional coordinates, constructing a world coordinate system from the reference projection plane and the projector attitude measured by an inertial measurement unit, and obtaining the transformation between the world coordinate system and the optical engine lens coordinate system; determining, from that transformation, the target region of the raster pixel plane that projects without distortion; and obtaining the mapping between the raster pixel plane and the target region, then texture-mapping the content to be projected through that mapping. The present invention eliminates keystone distortion caused by a non-ideal projection angle.
Description
Technical field
The present invention relates to a projector distortion correction method, a corresponding apparatus, and a projector.
Background art
With the maturing of short-throw optical engine technology and the sharp decline in its cost, smart projectors are used in the home more and more widely. To meet household users' demand for a high-quality viewing experience, the image quality of smart projectors in terms of resolution, brightness, and color has improved rapidly. In general, if the projection direction of the projector is not perpendicular to the projection screen, the projected picture exhibits keystone distortion. In practice, users adjust the vertical tilt of the projector to produce projected pictures of different heights and sizes, which introduces vertical keystone distortion; besides vertical keystone correction, projectors also commonly face keystone distortion produced by a horizontal offset of the projector. Moreover, under the influence of factors such as an uneven mounting surface, the projector may also have an axial-rotation (roll) component, so that the projector's horizontal plane cannot stay aligned with the absolute horizontal plane; this, too, distorts the final projected picture.
At present, most projectors provide vertical keystone correction: by adjusting the strength of the keystone correction in the vertical direction, the finally projected picture is made rectangular. Some projector manufacturers have developed a "horizontal keystone correction" function, which removes the keystone distortion produced when the projector lens cannot be perpendicular to the screen in the horizontal direction, so that the projector can still form a standard rectangular projected image when placed to the side of the projection screen.
However, existing solutions still cannot satisfactorily remove the distortion caused by axial-rotation deviation, so the projector adapts poorly to the projection environment and the user's viewing experience suffers. In addition, most current keystone correction techniques require the user to complete the correction by visual inspection and manual setting of correction parameters; the degree of automation is generally low.
Summary of the invention
The present invention provides a projector distortion correction method, a corresponding apparatus, and a projector, so as to at least partly solve the above problems.
In a first aspect, the present invention provides a projector distortion correction method for a projector having an optical engine lens, a camera, and an inertial measurement unit. The method comprises: projecting the standard grid image from the raster pixel plane onto the projection surface through the optical engine lens, and capturing the projection surface with the camera to obtain the projected point-cloud image of the grid points on the projection surface; obtaining, from the pixel correspondence between the standard grid image and the projected point-cloud image and from a pre-built three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the optical engine lens coordinate system, wherein the three-dimensional projection model is constructed from the optical engine lens coordinate system and the camera coordinate system of the projector; fitting a reference projection plane to the three-dimensional coordinates of the grid points on the projection surface, constructing a world coordinate system from the reference projection plane and the projector attitude obtained by the inertial measurement unit, and obtaining the transformation between the world coordinate system and the optical engine lens coordinate system; determining, from that transformation, the target region of the raster pixel plane that projects without distortion; and obtaining the mapping between the raster pixel plane and the target region and texture-mapping the content to be projected through that mapping, thereby realizing distortion-free projection.
In a second aspect, the present invention provides a projector distortion correction apparatus for a projector having an optical engine lens, a camera, and an inertial measurement unit. The apparatus comprises: a projected point-cloud acquisition unit, which projects the standard grid image from the raster pixel plane onto the projection surface through the optical engine lens and captures the projection surface with the camera to obtain the projected point-cloud image of the grid points on the projection surface; a projected point-cloud coordinate calculation unit, which obtains, from the pixel correspondence between the standard grid image and the projected point-cloud image and from a pre-built three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the optical engine lens coordinate system, wherein the three-dimensional projection model is constructed from the optical engine lens coordinate system and the camera coordinate system of the projector; a transformation calculation unit, which fits a reference projection plane to the three-dimensional coordinates of the grid points on the projection surface, constructs a world coordinate system from the reference projection plane and the projector attitude obtained by the inertial measurement unit, and obtains the transformation between the world coordinate system and the optical engine lens coordinate system; a target-region determination unit, which determines, from that transformation, the target region of the raster pixel plane that projects without distortion; and a texture mapping unit, which obtains the mapping between the raster pixel plane and the target region and texture-maps the content to be projected through that mapping, thereby realizing distortion-free projection.
In a third aspect, the present invention provides a projector, comprising: an optical engine lens, which projects the standard grid image from the raster pixel plane onto the projection surface; a camera, which captures the projection surface to obtain the projected point-cloud image and sends it to a graphics processor; an inertial measurement unit, which measures the attitude of the projector and sends it to the graphics processor; a memory, which stores computer-executable instructions; and the graphics processor, which, when the computer-executable instructions are executed, performs the projector distortion correction method described above.
In a fourth aspect, the present invention provides a computer-readable storage medium storing one or more computer programs which, when executed, implement the projector distortion correction method described above.
By building the camera, the optical engine lens, and the inertial measurement unit into the projector, the present invention gives the projector stereoscopic vision through the built-in camera and optical engine lens: the projector can actively reconstruct the projection environment in three dimensions and establish a three-dimensional projection model. Based on that model, the projector obtains the layout of the projection surface automatically with computer-vision methods, without any external equipment or instruments, and obtains its own attitude through the built-in inertial measurement unit. Combining the layout of the projection surface with the projector attitude, it computes the target region of the raster pixel plane that projects without distortion; by matching the vertices of the original raster image to that target region, it obtains the mapping between the original raster image and the distortion-free pre-corrected image, and at projection time it can use texture-mapping techniques to apply distortion correction to the image to be projected, thereby eliminating keystone distortion caused by a non-ideal projection angle between the projector and the projection surface.
Brief description of the drawings
Fig. 1 is a flowchart of the projector distortion correction method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of the optical system of the projector according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of pinhole imaging according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the generation of the projected point-cloud image according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of the three-dimensional projection model according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of fitting the reference projection plane according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of the relations among the world coordinate system, the optical engine lens coordinate system, the raster pixel plane, and the reference projection plane according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of the quadrilateral region and its maximum inscribed rectangle in the reference projection plane according to an embodiment of the present invention;
Fig. 9 is a schematic diagram of the target region in the raster pixel plane according to an embodiment of the present invention;
Fig. 10 is a schematic diagram of the triangle mesh according to an embodiment of the present invention;
Fig. 11 is a schematic diagram of the grid of the pre-corrected target picture according to an embodiment of the present invention;
Fig. 12 is a schematic diagram of the projection effect after distortion correction according to an embodiment of the present invention;
Fig. 13 is a structural block diagram of the projector distortion correction apparatus according to an embodiment of the present invention;
Fig. 14 is a structural schematic diagram of the projector according to an embodiment of the present invention.
Specific embodiments
An embodiment of the present invention provides a projector distortion correction method that uses computer-vision methods together with the projector's own attitude to compute the surface distribution parameters of the projection surface, lays out the grid texture of the pre-corrected target picture according to those distribution parameters, and has the GPU texture-map the input image accordingly, obtaining the pre-distorted projected image so that distortion-free projection is achieved even when the projection angle between the projector and the projection surface is not ideal. Embodiments of the present invention also provide a corresponding apparatus, a projector, and a computer-readable storage medium, each described in detail below.
To make the objects, technical solutions, and advantages of the present invention clearer, embodiments of the present invention are described in further detail below with reference to the accompanying drawings. It should be understood, however, that these descriptions are merely exemplary and are not intended to limit the scope of the present invention. In addition, descriptions of well-known structures and technologies are omitted below to avoid unnecessarily obscuring the concepts of the present invention.
The terms used herein are only for describing specific embodiments and are not intended to limit the present invention. The words "a", "an", and "the" used here also cover the plural, unless the context clearly indicates otherwise. In addition, the terms "include", "comprise", and the like used herein indicate the presence of the stated features, steps, operations, and/or components, but do not exclude the presence or addition of one or more other features, steps, operations, or components.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the meanings commonly understood by those skilled in the art. Terms used herein should be interpreted consistently with the context of this specification, rather than in an idealized or overly rigid way.
Some block diagrams and/or flowcharts are shown in the drawings. It should be understood that some blocks of the block diagrams and/or flowcharts, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, or other programmable data-processing apparatus, so that, when executed by the processor, the instructions create means for realizing the functions/operations illustrated in the block diagrams and/or flowcharts.
Accordingly, the technology of the present invention may be implemented in hardware and/or software (including firmware, microcode, and the like). In addition, the technology of the present invention may take the form of a computer program product on a computer-readable storage medium storing instructions, for use by or in connection with an instruction-execution system. In the context of the present invention, a computer-readable storage medium may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, the computer-readable storage medium may include, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a propagation medium. Specific examples of the computer-readable storage medium include: magnetic storage devices such as magnetic tape or hard disk (HDD); optical storage devices such as optical disc (CD-ROM); memories such as random-access memory (RAM) or flash memory; and/or wired/wireless communication links.
The present invention provides a projector distortion correction method; the projector in this embodiment has an optical engine lens, a camera, and an inertial measurement unit.
Fig. 1 is a flowchart of the projector distortion correction method according to an embodiment of the present invention. As shown in Fig. 1, the method of this embodiment comprises:
S110: projecting the standard grid image from the raster pixel plane onto the projection surface through the optical engine lens, and capturing the projection surface with the camera to obtain the projected point-cloud image of the grid points on the projection surface.
The optical engine in this embodiment can be understood as the projection module of a projection device. In general, the optical engine integrates the digital micromirror display core (also called the raster pixel plane), the light source, the lens optical path, and the heat dissipation into one mechanism, forming a single dust-proof, shock-resistant component.
S120: obtaining, from the pixel correspondence between the standard grid image and the projected point-cloud image and from the pre-built three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the optical engine lens coordinate system; the three-dimensional projection model is constructed from the optical engine lens coordinate system and the camera coordinate system of the projector and is used to calculate the three-dimensional coordinates of the grid points on the projection surface in the optical engine lens coordinate system.
S130: fitting a reference projection plane to the three-dimensional coordinates of the grid points on the projection surface, constructing a world coordinate system from the reference projection plane and the projector attitude obtained by the inertial measurement unit, and obtaining the transformation between the world coordinate system and the optical engine lens coordinate system.
In some embodiments, the axial-rotation angle measured by a gyroscope identifies how far the horizontal plane of the projector deviates from the absolute horizontal plane. Here, the absolute horizontal plane can be understood as the XOZ plane of the optical engine lens coordinate system of an ideal projector, i.e., a projector with no vertical, horizontal, or rotational deviation; the horizontal plane of the projector can be understood as the XOZ plane of the actual optical engine lens coordinate system.
In some embodiments, the inertial measurement unit includes a gyroscope, and constructing the world coordinate system from the reference projection plane and the projector attitude obtained by the inertial measurement unit specifically means constructing the world coordinate system from the reference projection plane and the axial-rotation angle measured by the gyroscope.
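The reference-plane fit that step S130 performs on the reconstructed grid points can be sketched as an ordinary least-squares plane fit. The following is a minimal illustration, not the patent's implementation; the function name and the z = a·x + b·y + c parameterization are assumptions (that parameterization degenerates only when the fitted plane contains the lens z axis, which cannot happen for a surface the projector is facing):

```python
def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to a list of 3D points.

    Returns (a, b, c). Builds and solves the 3x3 normal equations
    of the overdetermined system [x y 1] * [a b c]^T = z.
    """
    sxx = sxy = syy = sx = sy = 0.0
    sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y
        sxz += x * z; syz += y * z; sz += z
    m = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, float(len(points))]]
    v = [sxz, syz, sz]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        v[i], v[p] = v[p], v[i]
        for r in range(i + 1, 3):
            f = m[r][i] / m[i][i]
            for c in range(i, 3):
                m[r][c] -= f * m[i][c]
            v[r] -= f * v[i]
    sol = [0.0, 0.0, 0.0]
    for i in range(2, -1, -1):
        sol[i] = (v[i] - sum(m[i][c] * sol[c] for c in range(i + 1, 3))) / m[i][i]
    return tuple(sol)
```

The plane normal (a, b, -1) then supplies two axes of the world coordinate system, with the gyroscope's axial-rotation angle fixing the remaining roll about that normal.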
S140: determining, from the transformation between the world coordinate system and the optical engine lens coordinate system, the target region of the raster pixel plane that projects without distortion.
S150: obtaining the mapping between the raster pixel plane and the target region, and texture-mapping the content to be projected through that mapping, thereby realizing distortion-free projection.
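For a planar projection surface, the mapping of step S150 between the raster pixel plane and the target region is a plane homography determined by the four corner correspondences. The sketch below computes it by direct linear transformation with h33 fixed to 1; this only illustrates the mapping relation — the patent's own implementation texture-maps through a GPU triangle mesh (Figs. 10 and 11) — and all corner coordinates used with it are invented:

```python
def solve(m, v):
    """Gaussian elimination with partial pivoting for an n x n system."""
    n = len(v)
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        v[i], v[p] = v[p], v[i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n):
                m[r][c] -= f * m[i][c]
            v[r] -= f * v[i]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (v[i] - sum(m[i][c] * x[c] for c in range(i + 1, n))) / m[i][i]
    return x

def homography(src, dst):
    """3x3 homography H with dst ~ H*src from 4 point pairs (h33 = 1)."""
    m, v = [], []
    for (x, y), (u, w) in zip(src, dst):
        m.append([x, y, 1.0, 0.0, 0.0, 0.0, -u * x, -u * y]); v.append(u)
        m.append([0.0, 0.0, 0.0, x, y, 1.0, -w * x, -w * y]); v.append(w)
    h = solve(m, v) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def apply_h(h, pt):
    """Map a 2D point through the homography (projective division)."""
    x, y = pt
    d = h[2][0] * x + h[2][1] * y + h[2][2]
    return ((h[0][0] * x + h[0][1] * y + h[0][2]) / d,
            (h[1][0] * x + h[1][1] * y + h[1][2]) / d)
```

In the actual pipeline the same per-vertex mapping would be evaluated at every mesh vertex and handed to the GPU as texture coordinates.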
In this embodiment, the camera, the optical engine lens, and the inertial measurement unit are built into the projector. The built-in camera and optical engine lens give the projector stereoscopic vision, so the projector can actively reconstruct the projection environment in three dimensions and establish a three-dimensional projection model. Based on that model, the projector obtains the layout of the projection surface automatically with computer-vision methods, without external equipment or instruments, and obtains its own attitude through the built-in inertial measurement unit. Combining the layout of the projection surface with the projector attitude, it computes the target region of the raster pixel plane that projects without distortion; by matching the vertices of the original raster image to that target region, it obtains the mapping between the original raster image and the distortion-free pre-corrected image, and at projection time it uses texture mapping to apply distortion correction to the image to be projected, eliminating keystone distortion caused by a non-ideal projection angle between the projector and the projection surface.
To improve the precision of projector distortion correction, this embodiment calibrates the projector in advance to obtain its distortion parameters, which include the intrinsic and extrinsic parameters of the optical engine lens and the camera. An example calibration procedure is as follows.
As shown in Fig. 2, there is a certain distance between the optical engine lens and the camera of the projector, so the same world point appears with parallax in the optical engine raster pixel plane and in the sensor pixel plane. For example, point A in the projection region of Fig. 2 corresponds to pixel position a1 in the optical engine raster pixel plane and to pixel position a2 in the sensor pixel plane. This satisfies the conditions for binocular stereo vision, so a three-dimensional projection model can be constructed from the optical engine lens coordinate system and the camera coordinate system of the projector.
The optical engine lens can be regarded as an inverse camera, and a pinhole imaging model similar to that of the camera can be established for it; the correction principle of the optical engine lens is therefore similar to that of the camera, and this embodiment uses the acquisition of the camera's distortion parameters as the example.
As shown in Fig. 3, the pinhole camera model is sm′ = A[R|t]M′, where s is the normalization scale factor; A is the intrinsic matrix of the camera; [R|t] is the extrinsic matrix, which transforms coordinates from the world coordinate system into the camera coordinate system, R being the rotation matrix and t the translation vector; m′ is the homogeneous pixel coordinate of image point P, and M′ is the homogeneous coordinate of the point in the world coordinate system. In the pinhole imaging optical path shown in Fig. 3, a point with camera coordinates (x, y, z) corresponds in the uv plane to the plane coordinates (u, v), with u = fx·x′ + cx, v = fy·y′ + cy, where x′ = x/z and y′ = y/z are the normalized coordinates of image point P; fx and fy are the focal lengths of the camera in pixels; and (cx, cy) are the pixel coordinates of the principal point, the projection of the optical center Fc onto the uv plane. The coordinate frame at Fc shown in Fig. 3 is equivalent to the camera coordinate system of this embodiment, and the uv plane coordinate system is equivalent to the sensor pixel plane coordinate system of this embodiment. Therefore, once the transformation between the world coordinate system and the camera coordinate system has been obtained, the correspondence between the world coordinate system and the sensor pixel plane coordinate system follows from u = fx·x′ + cx, v = fy·y′ + cy.
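The two relations u = fx·x′ + cx and v = fy·y′ + cy can be sketched directly in code. This is only an illustration of the formulas above; the function name and the numeric intrinsics in the example are invented, not calibration values from this patent:

```python
def project_pinhole(point_cam, fx, fy, cx, cy):
    """Project a 3D point given in camera coordinates to pixel coordinates.

    x' = x/z and y' = y/z are the normalized image coordinates, and
    u = fx*x' + cx, v = fy*y' + cy as in the pinhole model above.
    """
    x, y, z = point_cam
    xn, yn = x / z, y / z  # normalized coordinates x', y'
    return fx * xn + cx, fy * yn + cy

# A point on the optical axis projects exactly to the principal point (cx, cy).
assert project_pinhole((0.0, 0.0, 2.0), 1000.0, 1000.0, 960.0, 540.0) == (960.0, 540.0)
```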
For the camera intrinsic matrix, the intrinsic parameters of the projector's camera can be obtained with a calibration board and structured-light projection; the intrinsic parameters include the focal length, the radial distortion parameters, the tangential distortion parameters, and the principal point coordinates (i.e., the center point of the sensor image). The correspondence between the camera coordinate system and the sensor pixel plane coordinate system is then u = fx·x″ + cx, v = fy·y″ + cy, where
x″ = x′·(1 + k1·r² + k2·r⁴ + k3·r⁶)/(1 + k4·r² + k5·r⁴ + k6·r⁶) + 2p1·x′y′ + p2·(r² + 2x′²) + s1·r² + s2·r⁴,
y″ = y′·(1 + k1·r² + k2·r⁴ + k3·r⁶)/(1 + k4·r² + k5·r⁴ + k6·r⁶) + p1·(r² + 2y′²) + 2p2·x′y′ + s3·r² + s4·r⁴,
with r² = x′² + y′²; k1, k2, k3, k4, k5, k6 are the radial distortion parameters of the camera, p1, p2 are the tangential distortion parameters of the camera, and s1, s2, s3, s4 are the thin-prism distortion parameters of the camera.
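The parameter list k1–k6, p1, p2, s1–s4 corresponds to the standard rational radial model with tangential and thin-prism terms. The sketch below is an illustration of that standard model, not code from the patent, and every coefficient value used with it is invented:

```python
def distort(xn, yn, k, p, s):
    """Apply radial (k[0..5]), tangential (p[0..1]) and thin-prism
    (s[0..3]) distortion to normalized coordinates (x', y').

    Returns the distorted normalized coordinates (x'', y''), which are
    then mapped to pixels by u = fx*x'' + cx, v = fy*y'' + cy.
    """
    r2 = xn * xn + yn * yn
    r4 = r2 * r2
    r6 = r4 * r2
    radial = (1 + k[0] * r2 + k[1] * r4 + k[2] * r6) / \
             (1 + k[3] * r2 + k[4] * r4 + k[5] * r6)
    xd = xn * radial + 2 * p[0] * xn * yn + p[1] * (r2 + 2 * xn * xn) \
         + s[0] * r2 + s[1] * r4
    yd = yn * radial + p[0] * (r2 + 2 * yn * yn) + 2 * p[1] * xn * yn \
         + s[2] * r2 + s[3] * r4
    return xd, yd
```

With all coefficients zero the mapping is the identity, which is a convenient sanity check during calibration.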
In this embodiment, the translation vector and rotation matrix between the optical engine lens and the camera also need to be obtained. Specifically, the intrinsic calibration process also yields the rotation matrix Rp and translation vector tp from the world coordinate system to the optical engine lens coordinate system, and the rotation matrix Rc and translation vector tc from the world coordinate system to the camera coordinate system. By the pinhole camera model:
(Xp, Yp, Zp)ᵀ = Rp·(X, Y, Z)ᵀ + tp,  (Xc, Yc, Zc)ᵀ = Rc·(X, Y, Z)ᵀ + tc,
where (X, Y, Z) is a three-dimensional point in the world coordinate system, and (Xp, Yp, Zp) and (Xc, Yc, Zc) are the corresponding points in the optical engine lens coordinate system and the camera coordinate system respectively. Solving the two equations simultaneously gives the relative position of the optical engine lens and the camera:
(Xp, Yp, Zp)ᵀ = R·(Xc, Yc, Zc)ᵀ + t,  with R = Rp·Rcᵀ and t = tp − Rp·Rcᵀ·tc.
In this way the extrinsic parameters of the projector's binocular vision are obtained, realizing the coordinate transformation from the camera coordinate system to the optical engine lens coordinate system.
Before the projector leaves the factory, the above calibration method is used to obtain the intrinsic and extrinsic parameters of the optical engine lens and the camera. In some embodiments, the optical engine resolution used for calibration is 1920x1080, and the calibration residual is computed from the grid points re-projected into the virtual image of the raster plane. In some embodiments the intrinsic calibration residual is 0.2137 pixel and the extrinsic calibration residual is 0.4324 pixel. The calibrated intrinsic and extrinsic parameters are downloaded into the projector for use in the distortion correction process below.
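The relative pose between the optical engine lens and the camera follows from the world-to-lens pose (Rp, tp) and the world-to-camera pose (Rc, tc) as R = Rp·Rcᵀ, t = tp − Rp·Rcᵀ·tc. This can be checked with plain 3x3 matrix arithmetic; the sketch below and the pose values used with it are illustrative, not the patent's code:

```python
def mat_mul(a, b):
    """3x3 matrix product a*b."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_vec(a, v):
    """3x3 matrix times 3-vector."""
    return [sum(a[i][k] * v[k] for k in range(3)) for i in range(3)]

def transpose(a):
    return [[a[j][i] for j in range(3)] for i in range(3)]

def relative_pose(rp, tp, rc, tc):
    """R = Rp*Rc^T, t = tp - Rp*Rc^T*tc, so that for any world point
    X_engine = R*X_cam + t (camera to optical-engine-lens transform)."""
    r = mat_mul(rp, transpose(rc))
    rtc = mat_vec(r, tc)
    return r, [tp[i] - rtc[i] for i in range(3)]
```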
In use, the projector provides a user interface for the projection-surface distortion correction process. After the user selects and enters this interface, the projector prompts that it is entering the distortion correction process; the user is asked to aim the projector at the projection surface to be corrected and to keep the projector position unchanged, after which the automatic correction process begins. Steps S110-S150 above are described in detail below with reference to Figs. 4-12.
First, step S110 is executed: the standard grid image from the raster pixel plane is projected onto the projection surface through the optical engine lens, and the camera captures the projection surface to obtain the projected point-cloud image of the grid points on the projection surface.
In general, the denser the projected point cloud, the higher the measurement precision of the projection surface. However, because of lens distortion, acquisition noise, the scattering of the dots, background light, and other factors, an excessively high grid density slows down the grid search and may even make the grid points impossible to find. This embodiment therefore introduces a method of generating and projecting staggered grids, which raises the grid-point density, and hence the measurement precision of the projected point cloud, without hurting the grid-point search efficiency.
In some embodiments, the projection-surface point-cloud image is obtained as follows: the first standard grid image of the raster pixel plane is projected onto the projection surface through the optical engine lens, and the camera captures the projection surface to obtain the first projected point-cloud image; the second standard grid image of the raster pixel plane is projected onto the projection surface through the optical engine lens, and the camera captures the projection surface to obtain the second projected point-cloud image, the dots of the first standard grid image and of the second standard grid image being staggered with respect to each other; the dot grids in the first and second projected point-cloud images are identified separately, giving the pixel position in the raster pixel plane of each dot in the two images; and the combined projected point-cloud image is obtained by superimposing the first and second projected point-cloud images, each dot of the two images keeping its raster-pixel-plane position in the combined image.
As shown in Fig. 4, the projector's application program generates two asymmetric dot-grid pictures whose dots are mutually staggered by one grid spacing. The two pictures are projected onto the projection surface through the optical engine lens in turn; the projector's camera then captures the projected pictures, the grid dots are found in the captured point-cloud images and their pixel coordinates recorded, and the dot coordinates of the two point-cloud images are merged, yielding a projected point-cloud image of dot-center coordinates whose grid density is the sum of the densities of the two point-cloud images.
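The merge of the two staggered point clouds can be sketched as a keyed union; the dot detection itself (finding the dot centers in the captured images) is assumed already done, and the data layout and coordinates below are illustrative assumptions, not the patent's format:

```python
def merge_point_clouds(cloud_a, cloud_b):
    """Merge two staggered dot grids into one point cloud.

    Each cloud maps a dot's pixel position in the raster pixel plane
    (the identity of the dot) to its detected center in the camera
    image. Staggering guarantees the raster positions are disjoint,
    so the merged density is the sum of the two densities.
    """
    merged = dict(cloud_a)
    merged.update(cloud_b)
    return merged
```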
On obtaining perspective plane after the point cloud projection image of mesh point, step S120 is continued to execute, i.e., according to standard network
Pixel point correspondence between table images and point cloud projection image and the three-dimensional projection model constructed in advance, obtain perspective plane
Three-dimensional coordinate of the upper mesh point based on ray machine camera lens coordinate system, wherein ray machine mirror of the three-dimensional projection model based on the projector
Head coordinate system and camera coordinate system and construct, for calculating three-dimensional of the mesh point based on ray machine camera lens coordinate system on perspective plane
Coordinate.
Before the distribution of the grid points on the projection surface is obtained, the acquired distortion parameters (the intrinsic parameters of the optical engine lens and of the camera) are used to correct the images in the raster pixel plane and in the sensor pixel plane, giving corrected grid-point coordinates (u_c, v_c) in the camera sensor image and (u_p, v_p) in the optical engine image; the acquisition of the distortion parameters is described above.
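As a sketch of this correction step — assuming the common radial/tangential (Brown–Conrady) distortion model with coefficients k1, k2, p1, p2, which the text does not name — a distorted pixel can be mapped back to its corrected position by fixed-point iteration:

```python
def undistort_point(u_d, v_d, fx, fy, cx, cy, k1, k2, p1, p2, iters=10):
    """Iteratively invert radial/tangential lens distortion for one pixel.

    A sketch assuming the Brown-Conrady model; the patent only says
    'distortion parameters'. Returns the corrected pixel coordinate (u, v)."""
    # Normalize the distorted pixel coordinate with the intrinsics.
    xd = (u_d - cx) / fx
    yd = (v_d - cy) / fy
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        radial = 1.0 + k1 * r2 + k2 * r2 * r2
        dx = 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
        dy = p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
        # Fixed-point update: remove the tangential term, divide out the
        # radial gain, and re-estimate from the improved (x, y).
        x = (xd - dx) / radial
        y = (yd - dy) / radial
    # Re-project with the pinhole model u = fx*x + cx, v = fy*y + cy.
    return fx * x + cx, fy * y + cy
```

With zero distortion the mapping is the identity; with small coefficients the iteration converges in a few steps.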
In some embodiments, the three-dimensional projection model is constructed as follows: first, a first linear relationship is established from the optical center of the camera light path and its first calibration point in the camera sensor pixel plane; then a second linear relationship is established from the optical center of the optical engine lens light path and its second calibration point in the raster pixel plane of the optical engine lens; next, a third linear relationship between the first calibration point and the second calibration point is established from the extrinsic parameter matrix between the camera and the optical engine lens; finally, the three-dimensional coordinates of the grid points on the projection surface in the optical engine lens coordinate system are obtained from the first, second, and third linear relationships.
The three-dimensional projection model of this embodiment calculates the three-dimensional coordinates of the grid points on the projection surface by triangulation. As shown in Fig. 5, for any grid point on the projection surface, its mapped points in the sensor pixel plane and in the raster pixel plane can be obtained, and the pixel correspondence between the standard grid image and the point cloud projection image is known; the three-dimensional coordinates of the grid points on the projection surface can therefore be reconstructed. That is, because the point cloud image on the projection surface is the standard grid image projected through the optical engine lens and captured by the camera, the three-dimensional coordinates of every grid point on the projection surface can be reconstructed with the three-dimensional projection model shown in Fig. 5.
As an example, in Fig. 5 the u_c v_c plane coordinate system corresponds to the coordinate system of the sensor pixel plane, and the u_p v_p plane coordinate system corresponds to the coordinate system of the raster pixel plane. From the previously described formulas u = f_x · x″ + c_x and v = f_y · y″ + c_y, the coordinate of the camera light-path optical center O_c in the sensor pixel plane coordinate system is q_c, and the coordinate of the optical engine lens light-path optical center O_p in the raster pixel plane coordinate system is q_p. With the three-dimensional projection model constructed in this embodiment, the three-dimensional coordinate of the corresponding grid point Q_w on the projection surface in the optical engine lens coordinate system can then be calculated.
Assuming the coordinate of a grid point on the projection surface is (X_p, Y_p, Z_p), then

s_c · [u_c, v_c, 1]^T = A_c · [R | t] · [X_p, Y_p, Z_p, 1]^T and s_p · [u_p, v_p, 1]^T = A_p · [X_p, Y_p, Z_p]^T,

where s_p and s_c are the scale factors of the optical engine lens and of the camera respectively, (u_c, v_c) and (u_p, v_p) are the two-dimensional coordinates of the projections of the spatial three-dimensional point in the sensor pixel plane and in the raster pixel plane respectively, A_c and A_p are the intrinsic parameter matrices of the camera and of the optical engine lens respectively, and [R | t] is the extrinsic parameter matrix of the projector.
Then, from the first linear relationship established by the camera light-path optical center O_c and the first pixel (u_c, v_c) in the camera sensor pixel plane, and the second linear relationship established by the optical engine lens light-path optical center O_p and the second pixel (u_p, v_p) in the raster pixel plane of the optical engine lens, (X_p, Y_p, Z_p) can be calculated.
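The triangulation just described can be sketched as follows: one ray from the projector optical center O_p through (u_p, v_p), one from the camera optical center O_c through (u_c, v_c), intersected in the optical-engine-lens frame. The midpoint closest-approach solver used here is one standard choice among several, not necessarily the patent's:

```python
import numpy as np

def triangulate(uc, vc, up, vp, Ac, Ap, R, t):
    """Midpoint triangulation of one grid point in the projector-lens frame.

    Ac, Ap are the camera and optical engine intrinsic matrices and [R|t]
    the camera extrinsics, as in the text; the midpoint solver itself is an
    assumption -- any linear triangulation would do."""
    # Projector ray: origin at Op = 0, direction from the raster pixel.
    o1 = np.zeros(3)
    d1 = np.linalg.inv(Ap) @ np.array([up, vp, 1.0])
    d1 /= np.linalg.norm(d1)
    # Camera ray expressed in the projector-lens frame.
    o2 = -R.T @ t                      # camera centre Oc in projector frame
    d2 = R.T @ np.linalg.inv(Ac) @ np.array([uc, vc, 1.0])
    d2 /= np.linalg.norm(d2)
    # Closest point between the two rays (midpoint method).
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b              # nonzero for non-parallel rays
    s = (b * e - c * d) / denom
    u = (a * e - b * d) / denom
    return 0.5 * ((o1 + s * d1) + (o2 + u * d2))
```

For noise-free pixels the two rays intersect exactly and the midpoint is the grid point itself.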
After the three-dimensional coordinates of the grid points on the projection surface in the optical engine lens coordinate system are obtained, step S130 is executed: the reference projection plane is obtained from the three-dimensional coordinates of the grid points on the projection surface, the world coordinate system is constructed from the reference projection plane and the attitude information of the projector obtained by the inertial measurement unit, and the transformation relationship between the world coordinate system and the optical engine lens coordinate system is obtained.
In some embodiments, spatial filtering is applied to the grid points on the projection surface according to their three-dimensional coordinates: the invalid grid points are filtered out, leaving the valid grid points; the valid grid points are then plane-fitted by least squares, and the fitted plane is taken as the reference projection plane. The valid grid points are the grid points on the projection surface that lie substantially in one plane; the invalid grid points are those that lie far from that plane.
As shown in Fig. 6, the three-dimensional point cloud formed by all the grid points on the projection surface may not lie in one plane: the grid points may contain spatial noise points, discontinuous points, and off-plane points. Before the reference projection plane is reconstructed, these invalid grid points must be removed to form a smooth projected point cloud, and plane fitting is then applied to the filtered point cloud to obtain the reference projection plane.
In some embodiments, the spatial noise points and discontinuous points among the invalid grid points can first be removed by low-pass filtering, and the off-plane points are then removed as follows to obtain the valid grid points:
Step A: randomly select three non-collinear grid points and obtain the sub-plane a0′x + a1′y + a2′z = d determined by them, where a0′, a1′, a2′, d are constants;
Step B: calculate the distance d_i = |a0′x_i + a1′y_i + a2′z_i − d| from each grid point on the projection surface to the i-th sub-plane, reject the grid points whose distance to the sub-plane exceeds a preset distance value, and count the reference grid points, a reference grid point being a grid point whose distance to the sub-plane does not exceed the preset distance value. Illustratively, the preset distance value is t = 2σ, where σ is the standard deviation of the distances from all grid points to the current sub-plane: when d_i > 2σ, the grid point is judged an outlier and removed; when d_i ≤ 2σ, the grid point is judged a reference grid point and retained.
Steps A and B are repeated; after N iterations, the reference sub-plane with the largest number of reference grid points among the N sub-planes obtained is determined, and the reference grid points of that reference sub-plane are the valid grid points.
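Steps A and B amount to a RANSAC-style plane search. A minimal sketch, keeping the 2σ threshold from the text (the iteration count and the fixed seed are illustrative assumptions):

```python
import numpy as np

def ransac_plane(points, n_iters=100, seed=0):
    """Steps A and B as a sketch: sample three non-collinear grid points,
    keep the sub-plane with the most reference (inlier) grid points."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iters):
        # Step A: three random grid points define a candidate sub-plane.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-9:        # collinear sample, skip it
            continue
        n = n / np.linalg.norm(n)           # unit normal (a0', a1', a2')
        d = n @ p0
        # Step B: distance of every grid point to the sub-plane,
        # with the 2-sigma rejection threshold from the text.
        dist = np.abs(points @ n - d)
        sigma = dist.std()
        inliers = dist <= 2.0 * sigma
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return points[best_inliers]
```

The best sub-plane is the one retaining the most grid points; its inliers are the valid grid points passed on to the least-squares fit.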
Referring to Fig. 6, the left part of Fig. 6 is the projection-surface point cloud before filtering; some points are visibly off-plane. The right part of Fig. 6 is the point cloud obtained by point-cloud plane fitting; the off-plane points have been filtered out, and the remaining points lie substantially in the fitted plane.
After the valid grid points are obtained, the plane equation of the reference projection plane can be determined from their three-dimensional coordinates as a0·x + a1·y + a2 = z, with first unit normal vector N_bp = (a0, a1, −1) / norm((a0, a1, −1)), where a0, a1, a2 are constants, N_bp is the first unit normal vector of the reference projection plane, and norm(·) is the vector-norm operator.
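A minimal least-squares fit of the stated plane model z = a0·x + a1·y + a2, returning the first unit normal vector as defined above:

```python
import numpy as np

def fit_reference_plane(points):
    """Least-squares fit of z = a0*x + a1*y + a2 to the valid grid points,
    and the first unit normal vector Nbp = (a0, a1, -1)/norm(...)."""
    # Solve [x y 1] @ [a0 a1 a2]^T = z in the least-squares sense.
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    (a0, a1, a2), *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    n = np.array([a0, a1, -1.0])
    return (a0, a1, a2), n / np.linalg.norm(n)
```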
After the reference projection plane and its attitude information are obtained, the attitude information of the projector can be obtained with the inertial measurement unit; in some embodiments, the inertial measurement unit includes a gyroscope. The axial rotation angle is measured by the gyroscope, and the world coordinate system is constructed from the axial rotation angle and the attitude information of the reference projection plane, so as to eliminate trapezoidal distortion.
In some embodiments, the world coordinate system is constructed as follows: obtain the first unit normal vector of the reference projection plane in the optical engine lens coordinate system; from the Y-axis unit vector of the optical engine lens coordinate system and the attitude information of the projector obtained with the inertial measurement unit, obtain the second unit normal vector, in the optical engine lens coordinate system, of the absolute horizontal plane; take the vector product of the first unit normal vector and the second unit normal vector as the unit vector of the world coordinate system's X axis, the second unit normal vector as the unit vector of the world coordinate system's Y axis, and the first unit normal vector as the unit vector of the world coordinate system's Z axis. After the world coordinate system is constructed, the translation matrix between the world coordinate system and the optical engine lens coordinate system can be obtained from the coordinate position of the world coordinate system's origin in the optical engine lens coordinate system, and the rotation matrix between the two coordinate systems can be obtained from the unit vectors of the world coordinate system's X, Y, and Z axes.
As shown in Fig. 7, the XOY plane of the world coordinate system constructed in this embodiment coincides with the reference projection plane, the X axis of the world coordinate system is parallel to the absolute horizontal plane, and the Z axis of the world coordinate system is perpendicular to the reference projection plane. On this basis, the vector representations of the three axes of the world coordinate system in the optical engine lens coordinate system can be calculated as follows:
First, the unit normal vector of the absolute horizontal plane is calculated. The axial rotation angle measured by the gyroscope of the inertial measurement unit can be used to determine how far the XOZ plane of the optical engine lens coordinate system deviates from the absolute horizontal plane; the axial rotation angle measured by the gyroscope is the axial deviation angle θ shown in Fig. 7, i.e. the rotation angle of the projector about the Z-axis direction relative to the absolute horizontal plane.
The deviation of the XOZ plane of the optical engine lens coordinate system from the absolute horizontal plane is described by the rotation matrix

R_z(θ) = [[cos θ, −sin θ, 0], [sin θ, cos θ, 0], [0, 0, 1]],

and the second unit normal vector of the absolute horizontal plane is N_hor = R_z(θ) · [0, 1, 0]^T, where R_z(θ) represents the deviation of the XOZ plane of the optical engine lens coordinate system from the absolute horizontal plane, N_hor is the second unit normal vector, and θ is the axial rotation angle.
Then, the unit vectors of the axes of the world coordinate system are calculated. Since the X axis of the world coordinate system is perpendicular to both the first unit normal vector N_bp and the second unit normal vector N_hor, the unit vector V_xw of the world coordinate system's X axis can be obtained from the vector product of N_bp and N_hor, and N_bp and N_hor are taken respectively as the unit vector V_zw of the Z axis and the unit vector V_yw of the Y axis. That is, V_xw = N_bp × N_hor, V_yw = N_hor, V_zw = N_bp.
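The axis construction can be sketched as follows (the explicit normalization of the cross product is an added safety step; N_bp and N_hor are unit vectors but in general not exactly orthogonal):

```python
import numpy as np

def world_axes(nbp, theta):
    """Unit vectors of the world coordinate system in the optical engine
    lens frame, following Vxw = Nbp x Nhor, Vyw = Nhor, Vzw = Nbp with
    Nhor = Rz(theta) @ [0, 1, 0]^T."""
    c, s = np.cos(theta), np.sin(theta)
    rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    nhor = rz @ np.array([0.0, 1.0, 0.0])   # second unit normal vector
    vxw = np.cross(nbp, nhor)
    vxw /= np.linalg.norm(vxw)              # unit X axis
    return vxw, nhor, nbp                   # X, Y, Z axes of the world frame
```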
In some embodiments, the origin of the world coordinate system can be obtained from the centroid of the reference projection plane. In one example, the origin of the world coordinate system is the projection, onto the reference projection plane, of the average centroid of the N sub-planes: O_w = [X_0 − a0·t, Y_0 − a1·t, Z_0 + t], where [X_0, Y_0, Z_0] = (1/N) · Σ_k [X_ck, Y_ck, Z_ck] is the average centroid of the N sub-planes, (X_ck, Y_ck, Z_ck) is the centroid coordinate of the k-th sub-plane, and t is a constant.
As shown in Fig. 7, once the world coordinate system is constructed, the transformation relationship between the optical engine lens coordinate system and the world coordinate system can be determined as the translation vector T = O_w and the rotation matrix R = (V_xw, V_yw, V_zw)^T.
After the transformation relationship between the world coordinate system and the optical engine lens coordinate system is obtained, step S140 is executed: according to that transformation relationship, obtain the target region of distortion-free projection in the raster pixel plane, so that the finally projected picture shows no trapezoidal distortion.
First, the actual projection region of the projector on the projection surface under the current relative attitude of the projector and the projection surface needs to be obtained. Combining the intrinsic parameters of the projector's optical engine with the pixel coordinates of the four vertices of the raster pixel plane, and in a manner similar to the reconstruction of the three-dimensional coordinates of the grid points on the projection surface above, the ray directions of the four vertices of the raster pixel plane, with the optical center of the optical engine as origin, are calculated; the intersection points of these four rays with the reference projection plane are then obtained, and the coordinates of the four intersection points in the reference world coordinate system are calculated, giving the two-dimensional coordinates of the four intersection points in the XOY plane of the world coordinate system, i.e. in the reference projection plane, from which the quadrilateral region actually projected by the raster pixel plane on the projection surface is calculated.
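The corner ray–plane intersections can be sketched as follows, writing the reference plane as n · X = d in the optical-engine-lens frame (an equivalent parameterization of the fitted plane, chosen here for brevity):

```python
import numpy as np

def projected_quad(Ap, corners_px, n, d):
    """Intersect the rays of the four raster-plane corners with the
    reference plane n . X = d, all in the optical engine lens frame.
    Ap is the optical engine intrinsic matrix; corners_px holds the
    four corner pixels of the raster pixel plane."""
    Ap_inv = np.linalg.inv(Ap)
    pts = []
    for (u, v) in corners_px:
        ray = Ap_inv @ np.array([u, v, 1.0])   # ray from the optical centre
        s = d / (n @ ray)                      # solve n . (s * ray) = d
        pts.append(s * ray)
    return np.array(pts)
```

Transforming the four returned points into the world frame and dropping Z yields the two-dimensional quadrilateral region described above.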
In some embodiments, the method for obtaining the target region of distortion-free projection in the raster pixel plane includes: according to the three-dimensional projection model, obtain the reference projection points formed in the reference projection plane by the four corner points of the raster pixel plane; according to the aspect ratio of the raster pixel plane, obtain the maximum inscribed rectangle of the quadrilateral region enclosed by the reference projection points, where the aspect ratio of the maximum inscribed rectangle is consistent with the aspect ratio of the raster pixel plane, and the wide side and the high side of the maximum inscribed rectangle are parallel to the X axis and the Y axis of the world coordinate system respectively; according to the transformation relationship between the world coordinate system and the optical engine lens coordinate system and the intrinsic parameter matrix of the optical engine lens, obtain the back-projection points of the maximum inscribed rectangle in the raster pixel plane; the rectangular region enclosed by the back-projection points in the raster pixel plane is the target region.
Referring to Fig. 8, once the quadrilateral region is obtained, its maximum inscribed rectangle can be obtained as follows:
Obtain the four vertices of the quadrilateral region together with its leftmost pixel and rightmost pixel, where the side formed by the bottom-left vertex and the bottom-right vertex is the bottom edge, the side formed by the top-left vertex and the top-right vertex is the top edge, and the side formed by the bottom-right vertex and the top-right vertex is the right edge;
Obtain the initial position of the left vertical line from the leftmost pixel, the left vertical line being parallel to the Y axis of the world coordinate system, and iterate the following steps a–h on the left vertical line until it reaches the rightmost pixel:
Step a: move the left vertical line one pixel to the right, then execute step b;
Step b: for the left vertical line at each pixel position, take the intersection of the left vertical line with the bottom edge as the candidate top-left vertex of the maximum inscribed rectangle, move the candidate top-left vertex up pixel by pixel, and execute step c;
Step c: for the left vertical line at each pixel position, take the intersection of the left vertical line with the top edge as the candidate bottom-left vertex of the maximum inscribed rectangle, move the candidate bottom-left vertex down pixel by pixel, and execute step d;
Step d: from the candidate top-left vertex and the candidate bottom-left vertex, obtain the first straight line and the second straight line parallel to the X axis of the world coordinate system respectively; from whichever of the two intersections of the first and second straight lines with the right edge lies further to the left, determine the candidate bottom-right vertex and the candidate top-right vertex; determine the candidate rectangle within the rectangular region enclosed by the four candidate vertices, calculate the area of the candidate rectangle, and execute step e, where the candidate rectangle is the largest rectangle whose aspect ratio is consistent with that of the raster pixel plane;
Step e: if the area of the candidate rectangle is greater than the maximum inscribed rectangle area, update the maximum inscribed rectangle area to the area of the candidate rectangle, update the coordinates of the four vertices of the maximum inscribed rectangle to those of the candidate rectangle, and execute step f; if the area of the candidate rectangle does not exceed the maximum inscribed rectangle area, keep the maximum inscribed rectangle area and its four vertex coordinates and execute step f; the initial value of the maximum inscribed rectangle area can be set to 0, and the initial four vertices of the maximum inscribed rectangle can be left unset;
Step f: if the candidate top-left vertex has not reached the pixel position of the intersection of the left vertical line with the top edge, execute step b; when it reaches that intersection, execute step g;
Step g: if the candidate bottom-left vertex has not reached the pixel position of the intersection of the left vertical line with the bottom edge, execute step c; when it reaches that intersection, execute step h;
Step h: if the left vertical line has not reached the rightmost pixel, execute step a; when the left vertical line reaches the rightmost pixel, the coordinates corresponding to the maximum inscribed rectangle area are the four vertex coordinates of the maximum inscribed rectangle.
The maximum inscribed rectangle can be obtained by iterating steps a–h above. As shown in Fig. 8, the outer quadrangle in Fig. 8 is the quadrilateral region in the reference projection plane, which appears as an irregular quadrangle because of trapezoidal distortion; the inner rectangle in Fig. 8 is the maximum distortion-free rectangle obtained within the projection region, and its sides are visibly parallel to the X axis and Y axis of the world coordinate system.
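A much-simplified sketch of the search in steps a–h — a coarse grid scan rather than the per-pixel iteration, and assuming a convex quadrilateral with vertices given in order:

```python
import numpy as np

def max_inscribed_rect(quad, aspect, step=0.05):
    """Slide a left vertical line across a convex quadrilateral and grow
    axis-aligned rectangles of fixed aspect ratio (width/height), keeping
    the largest fully inside. `step` is an assumed grid resolution; the
    patent scans pixel by pixel instead."""
    quad = np.asarray(quad, dtype=float)

    def inside(p):
        # Point-in-convex-polygon: p must lie on one side of every edge.
        sgn = []
        for i in range(4):
            a, b = quad[i], quad[(i + 1) % 4]
            sgn.append((b[0] - a[0]) * (p[1] - a[1])
                       - (b[1] - a[1]) * (p[0] - a[0]))
        return all(s >= -1e-9 for s in sgn) or all(s <= 1e-9 for s in sgn)

    xmin, ymin = quad.min(axis=0)
    xmax, ymax = quad.max(axis=0)
    best, best_area = None, 0.0
    for x in np.arange(xmin, xmax, step):            # left vertical line
        for y in np.arange(ymin, ymax, step):        # candidate bottom-left
            for h in np.arange(step, ymax - y + 0.5 * step, step):
                w = aspect * h
                corners = [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]
                if all(inside(c) for c in corners) and w * h > best_area:
                    best_area, best = w * h, corners
    return best, best_area
```

Because the quadrilateral is convex, a rectangle is fully inside exactly when its four corners are, which keeps the containment test cheap.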
After the target region of distortion-free projection is obtained, step S150 is executed: obtain the mapping relationship between the raster pixel plane and the target region, and apply texture mapping to the content to be projected using that mapping relationship, achieving distortion-free projection.
After the target region of distortion-free projection is obtained, the four vertices of the target region are re-projected into the raster pixel plane according to the intrinsic parameters of the projector, the transformation relationship between the world coordinate system and the optical engine lens coordinate system, the pinhole imaging model shown in Fig. 3, and the three-dimensional projection model shown in Fig. 5. This yields the four raster pixel coordinate points corresponding to the four vertices of the target region of distortion-free projection, and these four raster pixel coordinate points form a quadrangle. If the original raster picture is deformed into this quadrilateral region, the finally projected picture corresponds to the maximum inscribed rectangle in the reference projection plane; the quadrilateral region in the raster pixel plane formed by these four raster pixel coordinate points is therefore the target region of the raster-picture distortion correction. The resulting distortion-free target region mapping is shown in Fig. 9: the whole picture is the original grid image over the whole raster pixel plane, and the white quadrangle inside it is the calculated target region.
In some embodiments, the homography matrix between the target region and the raster pixel plane is obtained from the correspondence between the four vertices of the target region and the four vertices of the raster pixel plane, and texture mapping is applied to the picture to be projected using that homography matrix.
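The four-point homography can be computed with the standard direct linear transform; this is a textbook construction, not code taken from the patent:

```python
import numpy as np

def homography_from_quads(src, dst):
    """Direct linear transform for the 4-point homography mapping the
    src quadrilateral onto the dst quadrilateral (h33 normalized to 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, p):
    """Apply the homography to one 2D point (homogeneous divide)."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]
```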
The four vertices of the target region and the four vertices of the raster pixel plane define a homography between the two sets of two-dimensional coordinates. This mapping can be carried out with GPU acceleration: the target region is divided into a mesh of two triangles, the vertices of each triangle are ordered clockwise, and the vertex attributes include the vertex coordinates and the texture-mapping coordinates of the source picture.
As shown in Figure 10, A, B, C and D are the four vertices of a raster pixel plane of 1920×1080 resolution, the dashed quadrangle is the calculated target region, the two triangles are indicated with different fill styles, and the vertices of each triangle carry vertex-coordinate and texture-coordinate attributes.
At run time, the GPU rendering pipeline computes the texture coordinates of the pixels inside each triangle by bilinear interpolation, and the sampler of the GPU rendering pipeline extracts the pixel value at the corresponding coordinate position of the source picture as the output pixel value of that position. The texture-mapped image is output to the raster pixel plane of the projector, and projecting it completes the whole distortion pre-correction process. The effect is shown in Fig. 11: after trapezoidal distortion correction, the original raster image is deformed into the target region. The final projection effect is shown in Fig. 12: the white quadrilateral outline is the uncorrected projection region, and the image inside the quadrangle is the projected picture obtained by trapezoidal distortion pre-correction. After correction, the projected picture appears as a distortion-free rectangle whose horizontal axis is parallel to the absolute horizontal plane.
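The bilinear lookup performed by the sampler can be sketched for a single-channel texture as:

```python
import numpy as np

def sample_bilinear(img, x, y):
    """Bilinear texture lookup, as a GPU sampler does for interpolated
    texture coordinates; img is a 2-D array indexed [row, col] = [y, x]."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, img.shape[1] - 1)          # clamp at the border
    y1 = min(y0 + 1, img.shape[0] - 1)
    fx, fy = x - x0, y - y0
    # Blend the four neighbouring texels by their fractional offsets.
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
    bot = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1 - fy) * top + fy * bot
```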
Thus, without any external instrument or equipment, this embodiment automatically corrects, through the projector application itself, the trapezoidal distortion caused by a non-ideal projection angle between the projector and the projection surface.
The present invention also provides a projector distortion correction device.
Figure 13 is a structural block diagram of the projector distortion correction device shown in an embodiment of the present invention. As shown in Fig. 13, the device of this embodiment includes:
a point cloud projection acquiring unit, which projects the standard grid image in the raster pixel plane onto the projection surface through the optical engine lens, and shoots the projection surface with the camera to obtain the point cloud projection image of the grid points on the projection surface;
a point cloud projection coordinate calculating unit, which obtains the three-dimensional coordinates of the grid points on the projection surface in the optical engine lens coordinate system according to the pixel correspondence between the standard grid image and the point cloud projection image and the pre-constructed three-dimensional projection model, the three-dimensional projection model being constructed from the optical engine lens coordinate system of the projector and the camera coordinate system;
a transformation relationship computing unit, which obtains the reference projection plane from the three-dimensional coordinates of the grid points on the projection surface, constructs the world coordinate system from the reference projection plane and the attitude information of the projector obtained by the inertial measurement unit, and obtains the transformation relationship between the world coordinate system and the optical engine lens coordinate system;
a target region determination unit, which obtains the target region of distortion-free projection in the raster pixel plane according to the transformation relationship between the world coordinate system and the optical engine lens coordinate system;
a texture mapping unit, which obtains the mapping relationship between the raster pixel plane and the target region, and applies texture mapping to the content to be projected using that mapping relationship, achieving distortion-free projection.
In some embodiments, the device in Fig. 13 further includes a pre-processing unit, which establishes the first linear relationship from the camera light-path optical center and its first calibration point in the camera sensor pixel plane; establishes the second linear relationship from the optical engine lens light-path optical center and its second calibration point in the raster pixel plane; establishes the third linear relationship between the first calibration point and the second calibration point from the extrinsic parameter matrix between the camera and the optical engine lens; and obtains, from the first, second, and third linear relationships, the three-dimensional coordinates of the grid points on the projection surface in the optical engine lens coordinate system.
The transformation relationship computing unit includes a reference projection plane construction module, a world coordinate system construction module, and a matrix computing module.
The reference projection plane construction module applies spatial filtering to the grid points on the projection surface according to their three-dimensional coordinates, filters out the invalid grid points to obtain the valid grid points, applies least-squares plane fitting to the valid grid points, and takes the fitted plane as the reference projection plane. The reference projection plane construction module also repeats step A and step B; after N iterations, it determines, among the N sub-planes obtained, the reference sub-plane with the largest number of reference grid points, whose reference grid points are the valid grid points. In step A, three non-collinear grid points are randomly selected and the sub-plane determined by them is obtained; in step B, the distance from each grid point on the projection surface to the sub-plane is calculated, the grid points whose distance to the sub-plane exceeds the preset distance value are rejected, and the number of reference grid points is obtained, a reference grid point being a grid point whose distance to the sub-plane does not exceed the preset distance value.
The world coordinate system construction module obtains the first unit normal vector of the reference projection plane in the optical engine lens coordinate system; obtains the second unit normal vector of the absolute horizontal plane in the optical engine lens coordinate system from the Y-axis unit vector of the optical engine lens coordinate system and the attitude information of the projector obtained by the inertial measurement unit; and takes the vector product of the first unit normal vector and the second unit normal vector as the unit vector of the world coordinate system's X axis, the second unit normal vector as the unit vector of the Y axis, and the first unit normal vector as the unit vector of the Z axis.
The matrix computing module obtains the translation matrix between the world coordinate system and the optical engine lens coordinate system from the coordinate position of the world coordinate system's origin in the optical engine lens coordinate system, and obtains the rotation matrix between the two coordinate systems from the unit vectors of the world coordinate system's X, Y, and Z axes.
In some embodiments, the target region determination unit obtains, according to the three-dimensional projection model, the reference projection points formed in the reference projection plane by the four corner points of the raster pixel plane; obtains, according to the aspect ratio of the raster pixel plane, the maximum inscribed rectangle of the quadrilateral region enclosed by the reference projection points, the aspect ratio of the maximum inscribed rectangle being consistent with that of the raster pixel plane and its wide side and high side being parallel to the X axis and Y axis of the world coordinate system respectively; and obtains, according to the transformation relationship between the world coordinate system and the optical engine lens coordinate system and the intrinsic parameter matrix of the optical engine lens, the back-projection points of the maximum inscribed rectangle in the raster pixel plane, the rectangular region enclosed by the back-projection points in the raster pixel plane being the target region.
In conjunction with the present embodiment, target area determination unit includes computing module, obtains four tops of the quadrilateral area
points and the leftmost pixel and the rightmost pixel of the quadrilateral area, wherein the side formed by the lower-left vertex and the lower-right vertex is the bottom edge, the side formed by the upper-left vertex and the upper-right vertex is the top edge, and the side formed by the lower-right vertex and the upper-right vertex is the right edge; obtains the initial position of a left vertical line from the leftmost pixel, the left vertical line being parallel to the Y axis of the world coordinate system; and iteratively performs the following steps a to h on the left vertical line until the left vertical line reaches the rightmost pixel. Step a: move the left vertical line rightward pixel by pixel, then perform step b. Step b: for each pixel position of the left vertical line, take the intersection of the left vertical line and the bottom edge as the candidate upper-left vertex of the maximum inscribed rectangle, move the candidate upper-left vertex upward pixel by pixel, and perform step c. Step c: for each pixel position of the left vertical line, take the intersection of the left vertical line and the top edge as the candidate lower-left vertex of the maximum inscribed rectangle, move the candidate lower-left vertex downward pixel by pixel, and perform step d. Step d: from the candidate upper-left vertex and the candidate lower-left vertex, obtain respectively a first straight line and a second straight line parallel to the X axis of the world coordinate system; of the two intersections that the first and second straight lines form with the right edge, use the one closer to the left to determine the candidate lower-right vertex and the candidate upper-right vertex; determine a candidate rectangle within the rectangular region enclosed by the four candidate vertices, the candidate rectangle being the largest rectangle whose aspect ratio is consistent with that of the grating pixel plane; compute the area of the candidate rectangle and perform step e. Step e: when the area of the candidate rectangle is greater than the current maximum inscribed rectangle area, update the maximum inscribed rectangle area to the area of the candidate rectangle, update the coordinates of the four vertices of the maximum inscribed rectangle to those of the candidate rectangle, and perform step f; when the area of the candidate rectangle does not exceed the maximum inscribed rectangle area, keep the maximum inscribed rectangle area and the coordinates of its four vertices unchanged and perform step f. Step f: when the candidate upper-left vertex has not reached the pixel position of the intersection of the left vertical line and the top edge, perform step b; when it has, perform step g. Step g: when the candidate lower-left vertex has not reached the pixel position of the intersection of the left vertical line and the bottom edge, perform step c; when it has, perform step h. Step h: when the left vertical line has not reached the rightmost pixel, perform step a; when the left vertical line has reached the rightmost pixel, the coordinates of the four positions corresponding to the maximum inscribed rectangle area are the four vertex coordinates of the maximum inscribed rectangle.
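The per-pixel sweep described above can be approximated by a much simpler brute-force search. The sketch below is a hypothetical illustration only (the function names and the `step` parameter are invented here; the steps a to h are collapsed into nested loops), assuming the quadrilateral is convex and given as four vertices in counter-clockwise order:

```python
def _inside_convex(quad, x, y):
    """Point-in-convex-polygon test via edge cross products (quad given CCW)."""
    n = len(quad)
    for i in range(n):
        x1, y1 = quad[i]
        x2, y2 = quad[(i + 1) % n]
        if (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1) < 0:
            return False
    return True

def max_inscribed_rect(quad, aspect, step=1.0):
    """Return (x, y, w, h) of the largest axis-aligned rectangle with
    width/height == aspect whose four corners all lie inside `quad`."""
    xs = [p[0] for p in quad]
    ys = [p[1] for p in quad]
    best = (0.0, (0, 0, 0, 0))          # (area, rectangle)
    x = min(xs)
    while x <= max(xs):                  # sweep the left vertical line
        y = min(ys)
        while y <= max(ys):              # sweep the candidate lower-left corner
            h = step
            while True:                  # grow the rectangle while it still fits
                w = aspect * h
                corners = [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]
                if all(_inside_convex(quad, cx, cy) for cx, cy in corners):
                    if w * h > best[0]:
                        best = (w * h, (x, y, w, h))
                    h += step
                else:
                    break
            y += step
        x += step
    return best[1]
```

For an axis-aligned square the search should recover the square itself, which makes for an easy sanity check.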
In some embodiments, the texture mapping unit obtains the homography matrix between the target area and the grating pixel plane according to the correspondence between the four vertices of the target area and the four vertices of the grating pixel plane.
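A homography from four point correspondences of this kind can be computed with a direct linear transform. The following pure-Python sketch is a hypothetical illustration (helper names are invented; the element h33 is fixed to 1), not the patented implementation:

```python
def homography_from_4pts(src, dst):
    """src, dst: lists of four (x, y) pairs. Returns a 3x3 matrix H such
    that dst ~ H @ src in homogeneous coordinates (h33 fixed to 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    # Solve the 8x8 system A h = b by Gaussian elimination with partial pivoting.
    n = 8
    M = [row + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):
        h[r] = (M[r][n] - sum(M[r][c] * h[c] for c in range(r + 1, n))) / M[r][r]
    return [h[0:3], h[3:6], h[6:8] + [1.0]]

def apply_h(H, pt):
    """Apply homography H to a 2D point (perspective division included)."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

A pure translation between the two quadrilaterals is a convenient test case, since the expected homography is known in closed form.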
In some embodiments, the point cloud projection acquiring unit projects a first standard grid image of the grating pixel plane onto the projection surface using the ray machine lens, and shoots the projection surface with the camera to obtain a first point cloud projection image; projects a second standard grid image of the grating pixel plane onto the projection surface using the ray machine lens, and shoots the projection surface with the camera to obtain a second point cloud projection image, the round spots in the first standard grid image and the second standard grid image being staggered with respect to each other; identifies the round-spot grids in the first point cloud projection image and the second point cloud projection image respectively, obtaining the pixel position in the grating pixel plane of each round spot in the two images; and merges the pixel positions in the grating pixel plane of the round spots of the first and second point cloud projection images to obtain the pixel position of each round spot of the point cloud projection image.
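The merging of the two staggered round-spot grids amounts to taking the union of the two sets of detected positions. A minimal sketch, assuming each shot has already been reduced to a mapping from a (row, col) grid index to a detected pixel position (the function and parameter names are invented here):

```python
def merge_dot_grids(first_shot, second_shot):
    """first_shot / second_shot: dicts {(row, col): (px, py)} holding the
    detected round spots of the two standard grid images. Because the two
    grids are staggered, their (row, col) indices should never collide."""
    overlap = set(first_shot) & set(second_shot)
    if overlap:
        raise ValueError("staggered grids should not share indices: %r" % overlap)
    merged = dict(first_shot)    # union of the two detections
    merged.update(second_shot)
    return merged
```

The staggering doubles the density of the recovered point cloud without requiring smaller (and harder to detect) round spots in a single image.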
Since the device embodiments substantially correspond to the method embodiments, reference may be made to the description of the method embodiments for the relevant details. The device embodiments described above are merely illustrative: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment, which those of ordinary skill in the art can understand and implement without creative effort.
The present invention further provides a projector.
Figure 14 is a structural schematic diagram of the projector according to an embodiment of the present invention. As shown in Figure 14, at the hardware level the projector comprises a graphics processor and, optionally, an internal bus, a network interface and a memory. The memory may include an internal memory, such as a high-speed random-access memory (RAM), and may also include a non-volatile memory, for example at least one disk memory. The projector may of course also include hardware required for other services, such as the ray machine lens, the camera and the inertial measurement unit: the ray machine lens projects the standard grid image of the grating pixel plane onto the projection surface, the camera shoots the projection surface to obtain the point cloud projection image, and the inertial measurement unit, which includes a gyroscope, measures the attitude information of the projector.
The graphics processor, the network interface and the memory may be interconnected by the internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration only a single double-headed arrow is used in Figure 14, but this does not mean that there is only one bus or one type of bus.
The memory stores a program. Specifically, the program may include program code comprising computer-executable instructions. The memory may include an internal memory and a non-volatile memory, and provides instructions and data to the graphics processor.
The graphics processor reads the corresponding computer program from the non-volatile memory into the internal memory and runs it, forming the projector distortion correction device at the logical level. The graphics processor executes the program stored in the memory to implement the projector distortion correction method described above.
The method executed by the projector distortion correction device disclosed in the embodiment of Figure 14 may be applied to, or implemented by, the graphics processor. The graphics processor may be an integrated circuit chip with signal processing capability. During implementation, each step of the projector distortion correction method described above may be completed by integrated logic circuits of hardware in the graphics processor or by instructions in the form of software. The graphics processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or execute the methods, steps and logic diagrams disclosed in the embodiments of this specification. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the method disclosed in the embodiments of this specification may be embodied as being executed directly by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in the art, such as a random-access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
The present invention further provides a computer-readable storage medium.
The computer-readable storage medium stores one or more computer programs. The one or more computer programs include instructions which, when executed by the graphics processor of a projector, implement the projector distortion correction method described above.
For the sake of clearly describing the technical solutions of the embodiments of the present invention, the words "first", "second" and the like are used in the embodiments to distinguish identical or similar items with substantially the same function and effect. Those skilled in the art will understand that the words "first", "second" and the like do not limit the quantity or the order of execution.
The above description is merely a specific embodiment. Under the above teaching of the present invention, those skilled in the art can make further improvements or modifications on the basis of the above embodiments. Those skilled in the art will understand that the above specific description only serves to better explain the purpose of the present invention, and the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (12)
1. A projector distortion correction method, wherein the projector has a ray machine lens, a camera and an inertial measurement unit, the method comprising:
projecting a standard grid image of a grating pixel plane onto a projection surface using the ray machine lens, and shooting the projection surface with the camera to obtain a point cloud projection image of the grid points on the projection surface;
obtaining, according to the pixel correspondence between the standard grid image and the point cloud projection image and according to a pre-constructed three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the ray machine lens coordinate system, wherein the three-dimensional projection model is constructed based on the ray machine lens coordinate system and the camera coordinate system of the projector;
obtaining a reference projection plane using the three-dimensional coordinates of the grid points on the projection surface, constructing a world coordinate system using the reference projection plane and the attitude information of the projector obtained by the inertial measurement unit, and obtaining the transformation relationship between the world coordinate system and the ray machine lens coordinate system;
obtaining a target area of undistorted projection in the grating pixel plane according to the transformation relationship between the world coordinate system and the ray machine lens coordinate system; and
obtaining the mapping relationship between the grating pixel plane and the target area, and performing texture mapping on the content to be projected using the mapping relationship, thereby realizing undistorted projection.
2. The method according to claim 1, wherein the three-dimensional projection model is constructed by:
establishing a first linear relationship according to the optical center of the camera optical path and its first calibration point in the pixel plane of the camera sensor;
establishing a second linear relationship according to the optical center of the ray machine lens optical path and its second calibration point in the grating pixel plane;
establishing a third linear relationship between the first calibration point and the second calibration point according to the extrinsic parameter matrix between the camera and the ray machine lens; and
obtaining the three-dimensional coordinates of the grid points on the projection surface in the ray machine lens coordinate system according to the first linear relationship, the second linear relationship and the third linear relationship.
3. The method according to claim 1, wherein obtaining the reference projection plane using the three-dimensional coordinates of the grid points on the projection surface comprises:
spatially filtering the grid points on the projection surface according to their three-dimensional coordinates, and filtering out the invalid grid points to obtain effective grid points; and
fitting a plane to the effective grid points using the least squares method, and taking the fitted plane as the reference projection plane.
4. The method according to claim 3, wherein spatially filtering the grid points on the projection surface according to their three-dimensional coordinates and filtering out the invalid grid points to obtain the effective grid points comprises:
step A: randomly selecting three non-collinear grid points from the grid points, and obtaining the sub-plane determined by the three non-collinear grid points;
step B: calculating the distance between each grid point on the projection surface and the sub-plane, rejecting the grid points whose distance from the sub-plane is greater than a preset distance value, and obtaining the number of reference grid points, a reference grid point being a grid point whose distance from the sub-plane is not greater than the preset distance value; and
repeating steps A and B for N iterations, and determining, among the N sub-planes obtained after the N iterations, the reference sub-plane having the largest number of reference grid points, the reference grid points of the reference sub-plane being the effective grid points.
5. The method according to claim 1, wherein constructing the world coordinate system using the reference projection plane and the attitude information of the projector obtained by the inertial measurement unit, and obtaining the transformation relationship between the world coordinate system and the ray machine lens coordinate system, comprises:
obtaining the first unit normal vector of the reference projection plane in the ray machine lens coordinate system;
obtaining, according to the Y-axis unit vector of the ray machine lens coordinate system and the attitude information, the second unit normal vector of the absolute horizontal plane in the ray machine lens coordinate system;
taking the vector product of the first unit normal vector and the second unit normal vector as the unit vector of the X axis of the world coordinate system, the second unit normal vector as the unit vector of the Y axis of the world coordinate system, and the first unit normal vector as the unit vector of the Z axis of the world coordinate system; and
obtaining the translation matrix between the world coordinate system and the ray machine lens coordinate system according to the coordinate position of the world coordinate system origin in the ray machine lens coordinate system, and obtaining the rotation matrix between the world coordinate system and the ray machine lens coordinate system according to the unit vectors of the X, Y and Z axes of the world coordinate system.
6. The method according to claim 1, wherein obtaining the target area of undistorted projection in the grating pixel plane according to the transformation relationship between the world coordinate system and the ray machine lens coordinate system comprises:
obtaining, according to the three-dimensional projection model, the reference projection points formed in the reference projection plane by the four corner points of the grating pixel plane;
obtaining, according to the aspect ratio of the grating pixel plane, the maximum inscribed rectangle of the quadrilateral area enclosed by the reference projection points, wherein the aspect ratio of the maximum inscribed rectangle is consistent with the aspect ratio of the grating pixel plane, and the wide sides and the narrow sides of the maximum inscribed rectangle are respectively parallel to the X axis and the Y axis of the world coordinate system; and
obtaining the back-projection points of the maximum inscribed rectangle in the grating pixel plane according to the transformation relationship between the world coordinate system and the ray machine lens coordinate system and the intrinsic parameter matrix of the ray machine lens, the rectangular area enclosed by the back-projection points in the grating pixel plane being the target area.
7. The method according to claim 6, wherein obtaining, according to the aspect ratio of the grating pixel plane, the maximum inscribed rectangle of the quadrilateral area enclosed by the reference projection points comprises:
obtaining the four vertices of the quadrilateral area and the leftmost pixel and the rightmost pixel of the quadrilateral area, wherein the side formed by the lower-left vertex and the lower-right vertex is the bottom edge, the side formed by the upper-left vertex and the upper-right vertex is the top edge, and the side formed by the lower-right vertex and the upper-right vertex is the right edge;
obtaining the initial position of a left vertical line from the leftmost pixel, the left vertical line being parallel to the Y axis of the world coordinate system, and iteratively performing the following steps a to h on the left vertical line until the left vertical line reaches the rightmost pixel:
step a: moving the left vertical line rightward pixel by pixel, then performing step b;
step b: for each pixel position of the left vertical line, taking the intersection of the left vertical line and the bottom edge as the candidate upper-left vertex of the maximum inscribed rectangle, moving the candidate upper-left vertex upward pixel by pixel and performing step c;
step c: for each pixel position of the left vertical line, taking the intersection of the left vertical line and the top edge as the candidate lower-left vertex of the maximum inscribed rectangle, moving the candidate lower-left vertex downward pixel by pixel and performing step d;
step d: obtaining, from the candidate upper-left vertex and the candidate lower-left vertex respectively, a first straight line and a second straight line parallel to the X axis of the world coordinate system; determining the candidate lower-right vertex and the candidate upper-right vertex from the one, closer to the left, of the two intersections that the first straight line and the second straight line form with the right edge; determining a candidate rectangle within the rectangular region enclosed by the four candidate vertices, the candidate rectangle being the largest rectangle whose aspect ratio is consistent with that of the grating pixel plane; calculating its area and performing step e;
step e: when the area of the candidate rectangle is greater than the maximum inscribed rectangle area, updating the maximum inscribed rectangle area to the area of the candidate rectangle, updating the coordinates of the four vertices of the maximum inscribed rectangle to the coordinates of the four vertices of the candidate rectangle, and performing step f; when the area of the candidate rectangle does not exceed the maximum inscribed rectangle area, keeping the maximum inscribed rectangle area and the coordinates of the four vertices of the maximum inscribed rectangle unchanged, and performing step f;
step f: performing step b when the candidate upper-left vertex has not reached the pixel position of the intersection of the left vertical line and the top edge; performing step g when the candidate upper-left vertex has reached the pixel position of the intersection of the left vertical line and the top edge;
step g: performing step c when the candidate lower-left vertex has not reached the pixel position of the intersection of the left vertical line and the bottom edge; performing step h when the candidate lower-left vertex has reached the pixel position of the intersection of the left vertical line and the bottom edge;
step h: performing step a when the left vertical line has not reached the rightmost pixel; when the left vertical line has reached the rightmost pixel, the coordinates of the four positions corresponding to the maximum inscribed rectangle area being the four vertex coordinates of the maximum inscribed rectangle.
8. The method according to claim 1, wherein obtaining the mapping relationship between the grating pixel plane and the target area comprises:
obtaining the homography matrix between the target area and the grating pixel plane according to the correspondence between the four vertices of the target area and the four vertices of the grating pixel plane.
9. The method according to claim 1, wherein projecting the standard grid image of the grating pixel plane onto the projection surface using the ray machine lens and shooting the projection surface with the camera to obtain the point cloud projection image comprises:
projecting a first standard grid image of the grating pixel plane onto the projection surface using the ray machine lens, and shooting the projection surface with the camera to obtain a first point cloud projection image;
projecting a second standard grid image of the grating pixel plane onto the projection surface using the ray machine lens, and shooting the projection surface with the camera to obtain a second point cloud projection image, the round spots in the first standard grid image and the second standard grid image being staggered with respect to each other;
identifying the round-spot grids in the first point cloud projection image and the second point cloud projection image respectively, and obtaining the pixel position in the grating pixel plane of each round spot in the first point cloud projection image and the second point cloud projection image; and
merging the pixel positions in the grating pixel plane of the round spots of the first point cloud projection image and the second point cloud projection image to obtain the pixel position of each round spot of the point cloud projection image.
10. A projector distortion correction device, wherein the projector has a ray machine lens, a camera and an inertial measurement unit, the device comprising:
a point cloud projection acquiring unit, which projects a standard grid image of a grating pixel plane onto a projection surface using the ray machine lens, and shoots the projection surface with the camera to obtain a point cloud projection image of the grid points on the projection surface;
a point cloud projection coordinate calculating unit, which obtains, according to the pixel correspondence between the standard grid image and the point cloud projection image and according to a pre-constructed three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the ray machine lens coordinate system, wherein the three-dimensional projection model is constructed based on the ray machine lens coordinate system and the camera coordinate system of the projector;
a transformation relationship computing unit, which obtains a reference projection plane using the three-dimensional coordinates of the grid points on the projection surface, constructs a world coordinate system using the reference projection plane and the attitude information of the projector obtained by the inertial measurement unit, and obtains the transformation relationship between the world coordinate system and the ray machine lens coordinate system;
a target area determination unit, which obtains a target area of undistorted projection in the grating pixel plane according to the transformation relationship between the world coordinate system and the ray machine lens coordinate system; and
a texture mapping unit, which obtains the mapping relationship between the grating pixel plane and the target area, and performs texture mapping on the content to be projected using the mapping relationship, thereby realizing undistorted projection.
11. A projector, comprising:
a ray machine lens, which projects a standard grid image of a grating pixel plane onto a projection surface;
a camera, which shoots the projection surface to obtain a point cloud projection image and sends it to a graphics processor;
an inertial measurement unit, which measures the attitude information of the projector and sends it to the graphics processor;
a memory, which stores computer-executable instructions; and
the graphics processor, wherein the computer-executable instructions, when executed, cause the graphics processor to perform the method according to any one of claims 1 to 9.
12. A computer-readable storage medium, wherein one or more computer programs are stored on the computer-readable storage medium, and the one or more computer programs, when executed, implement the method according to any one of claims 1 to 9.
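The spatial filtering of claim 4 is a RANSAC-style procedure: repeatedly pick three non-collinear grid points, form the plane they determine, and keep the plane supported by the most inliers. The sketch below is a hypothetical illustration (function names, threshold and iteration count are invented here), not the patented implementation:

```python
import random

def _plane_from_3pts(p, q, r):
    """Plane through three 3D points: normal n = (q - p) x (r - p), offset d = n . p."""
    ux, uy, uz = (q[i] - p[i] for i in range(3))
    vx, vy, vz = (r[i] - p[i] for i in range(3))
    n = (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
    d = sum(n[i] * p[i] for i in range(3))
    return n, d

def filter_grid_points(points, dist_thresh=0.01, iters=100, seed=0):
    """Return the 'effective grid points': the largest inlier set over
    `iters` randomly sampled candidate sub-planes."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iters):
        p, q, r = rng.sample(points, 3)
        n, d = _plane_from_3pts(p, q, r)
        norm = sum(c * c for c in n) ** 0.5
        if norm < 1e-12:            # collinear sample: no plane, skip it
            continue
        inliers = [pt for pt in points
                   if abs(sum(n[i] * pt[i] for i in range(3)) - d) / norm
                   <= dist_thresh]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers
```

With most grid points on one plane and a few outliers (e.g. points of the image that landed off the wall), the outliers are rejected and the inliers can then be passed to the least-squares fit of claim 3.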
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910266198.0A CN110336987B (en) | 2019-04-03 | 2019-04-03 | Projector distortion correction method and device and projector |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910266198.0A CN110336987B (en) | 2019-04-03 | 2019-04-03 | Projector distortion correction method and device and projector |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110336987A true CN110336987A (en) | 2019-10-15 |
CN110336987B CN110336987B (en) | 2021-10-08 |
Family
ID=68139264
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910266198.0A Active CN110336987B (en) | 2019-04-03 | 2019-04-03 | Projector distortion correction method and device and projector |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110336987B (en) |
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110677634A (en) * | 2019-11-27 | 2020-01-10 | 成都极米科技股份有限公司 | Trapezoidal correction method, device and system for projector and readable storage medium |
CN110996081A (en) * | 2019-12-06 | 2020-04-10 | 北京一数科技有限公司 | Projection picture correction method and device, electronic equipment and readable storage medium |
CN111093067A (en) * | 2019-12-31 | 2020-05-01 | 歌尔股份有限公司 | Projection apparatus, lens distortion correction method, distortion correction device, and storage medium |
CN111443557A (en) * | 2020-05-13 | 2020-07-24 | 浙江宝妈创客品牌管理有限公司 | Multimedia projection automatic correction device based on light sensation control |
CN111800620A (en) * | 2020-08-18 | 2020-10-20 | 深圳市慧视智图科技有限公司 | Noninductive trapezoidal correction module of projector and calculation method thereof |
CN112261396A (en) * | 2020-10-26 | 2021-01-22 | 成都极米科技股份有限公司 | Projection method, projection device, projection equipment and computer readable storage medium |
CN112330794A (en) * | 2020-10-09 | 2021-02-05 | 同济大学 | Single-camera image acquisition system based on rotary bipartite prism and three-dimensional reconstruction method |
CN112819939A (en) * | 2021-03-08 | 2021-05-18 | 网易(杭州)网络有限公司 | Method, apparatus, device and computer readable storage medium for correcting UV coordinates |
CN112967348A (en) * | 2021-04-01 | 2021-06-15 | 深圳大学 | Three-dimensional reconstruction method based on one-dimensional scanning structured light system and related components thereof |
CN112995624A (en) * | 2021-02-23 | 2021-06-18 | 峰米(北京)科技有限公司 | Trapezoidal error correction method and device for projector |
CN113141492A (en) * | 2020-11-27 | 2021-07-20 | 深圳市当智科技有限公司 | Wide-screen projection method and system based on camera and readable storage medium |
CN113163186A (en) * | 2020-12-03 | 2021-07-23 | 深圳市当智科技有限公司 | Projection method, system and storage medium based on horizontal correction of distance sensor |
CN113155053A (en) * | 2020-01-22 | 2021-07-23 | 株式会社三丰 | Three-dimensional geometry measuring device and three-dimensional geometry measuring method |
CN113191963A (en) * | 2021-04-02 | 2021-07-30 | 华中科技大学 | Projector residual distortion full-field calibration method and device without additional operation |
CN113259642A (en) * | 2021-05-12 | 2021-08-13 | 华强方特(深圳)科技有限公司 | Film visual angle adjusting method and system |
CN113286132A (en) * | 2020-01-31 | 2021-08-20 | 精工爱普生株式会社 | Control method of image projection system and image projection system |
CN113470131A (en) * | 2021-06-08 | 2021-10-01 | 中国人民解放军93114部队 | Sea surface simulation image generation method and device, electronic equipment and storage medium |
CN113781550A (en) * | 2021-08-10 | 2021-12-10 | 国网河北省电力有限公司保定供电分公司 | Four-foot robot positioning method and system |
CN114018173A (en) * | 2021-11-01 | 2022-02-08 | 江苏鑫晨光热技术有限公司 | Heliostat surface shape initial normal outdoor measurement system and method |
CN114111639A (en) * | 2021-11-26 | 2022-03-01 | 凌云光技术股份有限公司 | Correction method and device for surface structured light three-dimensional measurement system |
CN114205570A (en) * | 2021-11-16 | 2022-03-18 | 海信视像科技股份有限公司 | Projection equipment and display control method for automatically correcting projected image |
WO2022062604A1 (en) * | 2020-09-28 | 2022-03-31 | 成都极米科技股份有限公司 | Projection screen adjusting method and apparatus, and projector, and storage medium |
CN114567762A (en) * | 2021-12-28 | 2022-05-31 | 上海顺久电子科技有限公司 | Method for correcting projected image in projection plane and projection equipment |
CN114760454A (en) * | 2022-05-24 | 2022-07-15 | 海信视像科技股份有限公司 | Projection equipment and trigger correction method |
CN114866751A (en) * | 2022-04-15 | 2022-08-05 | 海信视像科技股份有限公司 | Projection equipment and trigger correction method |
WO2022193559A1 (en) * | 2021-03-19 | 2022-09-22 | 深圳市火乐科技发展有限公司 | Projection correction method and apparatus, storage medium, and electronic device |
WO2022193739A1 (en) * | 2021-03-19 | 2022-09-22 | 深圳市火乐科技发展有限公司 | Projection image adjustment method and apparatus, storage medium, and electronic device |
CN115174879A (en) * | 2022-07-18 | 2022-10-11 | 峰米(重庆)创新科技有限公司 | Projection picture correction method, projection picture correction device, computer equipment and storage medium |
CN115190281A (en) * | 2022-06-30 | 2022-10-14 | 海宁奕斯伟集成电路设计有限公司 | Device and method for adjusting projection position of projector |
CN115760620A (en) * | 2022-11-18 | 2023-03-07 | 荣耀终端有限公司 | Document correction method and device and electronic equipment |
WO2023087951A1 (en) * | 2021-11-16 | 2023-05-25 | 海信视像科技股份有限公司 | Projection device, and display control method for projected image |
WO2023087947A1 (en) * | 2021-11-16 | 2023-05-25 | 海信视像科技股份有限公司 | Projection device and correction method |
CN117058342A (en) * | 2023-10-12 | 2023-11-14 | 天津科汇新创科技有限公司 | Spine 3D voxel model construction method based on projection image |
WO2024066776A1 (en) * | 2022-09-29 | 2024-04-04 | 海信视像科技股份有限公司 | Projection device and projection-picture processing method |
CN113470131B (en) * | 2021-06-08 | 2024-04-26 | 中国人民解放军93114部队 | Sea surface simulation image generation method and device, electronic equipment and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1845002A (en) * | 2005-04-06 | 2006-10-11 | 精工爱普生株式会社 | Distortion correction for projector |
CN103517016A (en) * | 2012-06-22 | 2014-01-15 | 精工爱普生株式会社 | Projector, image display system, and method of controlling projector |
CN107454373A (en) * | 2016-05-31 | 2017-12-08 | 财团法人工业技术研究院 | Projection system and non-planar automatic correction method and automatic correction processing device thereof |
US20180176520A1 (en) * | 2016-12-16 | 2018-06-21 | Cj Cgv Co., Ltd. | Method of automatically correcting projection area based on image photographed by photographing device and system therefor |
CN108377371A (en) * | 2018-02-09 | 2018-08-07 | 深圳市火乐科技发展有限公司 | A kind of method and device of projection image correction |
Priority application: 2019-04-03 — CN201910266198.0A filed in China; granted as CN110336987B (status: Active)
Cited By (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022528659A (en) * | 2019-11-27 | 2022-06-15 | チョントゥー ジミー テクノロジー カンパニー リミテッド | Projector keystone correction methods, devices, systems and readable storage media |
WO2021103347A1 (en) * | 2019-11-27 | 2021-06-03 | 成都极米科技股份有限公司 | Projector keystone correction method, apparatus, and system, and readable storage medium |
CN110677634A (en) * | 2019-11-27 | 2020-01-10 | 成都极米科技股份有限公司 | Trapezoidal correction method, device and system for projector and readable storage medium |
JP7291244B2 (en) | 2019-11-27 | 2023-06-14 | チョントゥー ジミー テクノロジー カンパニー リミテッド | Projector Keystone Correction Method, Apparatus, System and Readable Storage Medium |
CN110996081A (en) * | 2019-12-06 | 2020-04-10 | 北京一数科技有限公司 | Projection picture correction method and device, electronic equipment and readable storage medium |
CN111093067A (en) * | 2019-12-31 | 2020-05-01 | 歌尔股份有限公司 | Projection apparatus, lens distortion correction method, distortion correction device, and storage medium |
CN113155053A (en) * | 2020-01-22 | 2021-07-23 | 株式会社三丰 | Three-dimensional geometry measuring device and three-dimensional geometry measuring method |
CN113286132A (en) * | 2020-01-31 | 2021-08-20 | 精工爱普生株式会社 | Control method of image projection system and image projection system |
CN111443557A (en) * | 2020-05-13 | 2020-07-24 | 浙江宝妈创客品牌管理有限公司 | Multimedia projection automatic correction device based on light sensation control |
CN111800620A (en) * | 2020-08-18 | 2020-10-20 | 深圳市慧视智图科技有限公司 | Noninductive trapezoidal correction module of projector and calculation method thereof |
WO2022062604A1 (en) * | 2020-09-28 | 2022-03-31 | 成都极米科技股份有限公司 | Projection screen adjusting method and apparatus, and projector, and storage medium |
CN112330794A (en) * | 2020-10-09 | 2021-02-05 | 同济大学 | Single-camera image acquisition system based on rotary bipartite prism and three-dimensional reconstruction method |
CN112261396A (en) * | 2020-10-26 | 2021-01-22 | 成都极米科技股份有限公司 | Projection method, projection device, projection equipment and computer readable storage medium |
CN112261396B (en) * | 2020-10-26 | 2022-02-25 | 成都极米科技股份有限公司 | Projection method, projection device, projection equipment and computer readable storage medium |
WO2022088419A1 (en) * | 2020-10-26 | 2022-05-05 | 成都极米科技股份有限公司 | Projection method, apparatus, and device, and computer-readable storage medium |
CN113141492B (en) * | 2020-11-27 | 2022-06-10 | 深圳市当智科技有限公司 | Wide-screen projection method and system based on camera and readable storage medium |
CN113141492A (en) * | 2020-11-27 | 2021-07-20 | 深圳市当智科技有限公司 | Wide-screen projection method and system based on camera and readable storage medium |
CN113163186A (en) * | 2020-12-03 | 2021-07-23 | 深圳市当智科技有限公司 | Projection method, system and storage medium based on horizontal correction of distance sensor |
CN113163186B (en) * | 2020-12-03 | 2022-06-10 | 深圳市当智科技有限公司 | Projection method, system and storage medium based on horizontal correction of distance sensor |
CN112995624B (en) * | 2021-02-23 | 2022-11-08 | 峰米(北京)科技有限公司 | Trapezoidal error correction method and device for projector |
CN112995624A (en) * | 2021-02-23 | 2021-06-18 | 峰米(北京)科技有限公司 | Trapezoidal error correction method and device for projector |
CN112819939B (en) * | 2021-03-08 | 2023-07-07 | 网易(杭州)网络有限公司 | Method, apparatus, device and computer readable storage medium for correcting UV coordinates |
CN112819939A (en) * | 2021-03-08 | 2021-05-18 | 网易(杭州)网络有限公司 | Method, apparatus, device and computer readable storage medium for correcting UV coordinates |
WO2022193739A1 (en) * | 2021-03-19 | 2022-09-22 | 深圳市火乐科技发展有限公司 | Projection image adjustment method and apparatus, storage medium, and electronic device |
WO2022193559A1 (en) * | 2021-03-19 | 2022-09-22 | 深圳市火乐科技发展有限公司 | Projection correction method and apparatus, storage medium, and electronic device |
CN112967348A (en) * | 2021-04-01 | 2021-06-15 | 深圳大学 | Three-dimensional reconstruction method based on one-dimensional scanning structured light system and related components thereof |
CN113191963A (en) * | 2021-04-02 | 2021-07-30 | 华中科技大学 | Projector residual distortion full-field calibration method and device without additional operation |
CN113191963B (en) * | 2021-04-02 | 2022-08-05 | 华中科技大学 | Projector residual distortion full-field calibration method and device without additional operation |
CN113259642A (en) * | 2021-05-12 | 2021-08-13 | 华强方特(深圳)科技有限公司 | Film visual angle adjusting method and system |
CN113470131B (en) * | 2021-06-08 | 2024-04-26 | 中国人民解放军93114部队 | Sea surface simulation image generation method and device, electronic equipment and storage medium |
CN113470131A (en) * | 2021-06-08 | 2021-10-01 | 中国人民解放军93114部队 | Sea surface simulation image generation method and device, electronic equipment and storage medium |
CN113781550A (en) * | 2021-08-10 | 2021-12-10 | 国网河北省电力有限公司保定供电分公司 | Four-foot robot positioning method and system |
CN114018173B (en) * | 2021-11-01 | 2024-04-02 | 江苏鑫晨光热技术有限公司 | Heliostat surface shape initial normal outdoor measurement system and method |
CN114018173A (en) * | 2021-11-01 | 2022-02-08 | 江苏鑫晨光热技术有限公司 | Heliostat surface shape initial normal outdoor measurement system and method |
CN114827563A (en) * | 2021-11-16 | 2022-07-29 | 海信视像科技股份有限公司 | Projection apparatus and projection region correction method |
CN114885136A (en) * | 2021-11-16 | 2022-08-09 | 海信视像科技股份有限公司 | Projection apparatus and image correction method |
WO2023087951A1 (en) * | 2021-11-16 | 2023-05-25 | 海信视像科技股份有限公司 | Projection device, and display control method for projected image |
WO2023087947A1 (en) * | 2021-11-16 | 2023-05-25 | 海信视像科技股份有限公司 | Projection device and correction method |
CN114205570A (en) * | 2021-11-16 | 2022-03-18 | 海信视像科技股份有限公司 | Projection equipment and display control method for automatically correcting projected image |
WO2023088329A1 (en) * | 2021-11-16 | 2023-05-25 | 海信视像科技股份有限公司 | Projection device and projection image correction method |
WO2023088304A1 (en) * | 2021-11-16 | 2023-05-25 | 海信视像科技股份有限公司 | Projection device and projection area correction method |
CN114111639A (en) * | 2021-11-26 | 2022-03-01 | 凌云光技术股份有限公司 | Correction method and device for surface structured light three-dimensional measurement system |
CN114111639B (en) * | 2021-11-26 | 2024-04-30 | 凌云光技术股份有限公司 | Correction method and device of surface structured light three-dimensional measurement system |
CN114567762B (en) * | 2021-12-28 | 2024-03-05 | 上海顺久电子科技有限公司 | Correction method for projection image in projection plane and projection equipment |
CN114567762A (en) * | 2021-12-28 | 2022-05-31 | 上海顺久电子科技有限公司 | Method for correcting projected image in projection plane and projection equipment |
CN114866751A (en) * | 2022-04-15 | 2022-08-05 | 海信视像科技股份有限公司 | Projection equipment and trigger correction method |
CN114760454A (en) * | 2022-05-24 | 2022-07-15 | 海信视像科技股份有限公司 | Projection equipment and trigger correction method |
CN115190281B (en) * | 2022-06-30 | 2024-01-02 | 海宁奕斯伟集成电路设计有限公司 | Device and method for adjusting projection position of projector |
CN115190281A (en) * | 2022-06-30 | 2022-10-14 | 海宁奕斯伟集成电路设计有限公司 | Device and method for adjusting projection position of projector |
CN115174879B (en) * | 2022-07-18 | 2024-03-15 | 峰米(重庆)创新科技有限公司 | Projection screen correction method, apparatus, computer device and storage medium |
CN115174879A (en) * | 2022-07-18 | 2022-10-11 | 峰米(重庆)创新科技有限公司 | Projection picture correction method, projection picture correction device, computer equipment and storage medium |
WO2024066776A1 (en) * | 2022-09-29 | 2024-04-04 | 海信视像科技股份有限公司 | Projection device and projection-picture processing method |
CN115760620A (en) * | 2022-11-18 | 2023-03-07 | 荣耀终端有限公司 | Document correction method and device and electronic equipment |
CN115760620B (en) * | 2022-11-18 | 2023-10-20 | 荣耀终端有限公司 | Document correction method and device and electronic equipment |
CN117058342B (en) * | 2023-10-12 | 2024-01-26 | 天津科汇新创科技有限公司 | Spine 3D voxel model construction method based on projection image |
CN117058342A (en) * | 2023-10-12 | 2023-11-14 | 天津科汇新创科技有限公司 | Spine 3D voxel model construction method based on projection image |
Also Published As
Publication number | Publication date |
---|---|
CN110336987B (en) | 2021-10-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110336987A (en) | A kind of projector distortion correction method, device and projector | |
CN110111262A (en) | A kind of projector distortion correction method, device and projector | |
CN110191326A (en) | A kind of optical projection system resolution extension method, apparatus and optical projection system | |
US8432435B2 (en) | Ray image modeling for fast catadioptric light field rendering | |
CN109272478B (en) | Screen projection method and device and related equipment | |
CN106127745B (en) | The combined calibrating method and device of structure light 3 D vision system and line-scan digital camera | |
Sajadi et al. | Auto-calibration of cylindrical multi-projector systems | |
ES2282429T3 (en) | Method and system for producing formatted information related to geometric distortions |
CN110490916A (en) | Three dimensional object modeling method and equipment, image processing apparatus and medium | |
US7583307B2 (en) | Autostereoscopic display | |
JP5486113B2 (en) | Projection method implemented by a panorama projection device | |
JPWO2018235163A1 (en) | Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method | |
WO2018077071A1 (en) | Panoramic image generating method and apparatus | |
US8917317B1 (en) | System and method for camera calibration | |
CN108257183A (en) | A kind of camera lens axis calibrating method and device | |
US20180114348A1 (en) | Method and apparatus for generating panoramic image with rotation, translation and warping process | |
JP2016091553A (en) | Automated texturing mapping and animation from images | |
CN110176030A (en) | A kind of autoegistration method, device and the electronic equipment of unmanned plane image | |
US10783607B2 (en) | Method of acquiring optimized spherical image using multiple cameras | |
Sajadi et al. | Autocalibrating tiled projectors on piecewise smooth vertically extruded surfaces | |
US20130070094A1 (en) | Automatic registration of multi-projector dome images | |
CN105025284B (en) | Demarcate the method and apparatus that integration imaging shows the display error of equipment | |
CN110648274B (en) | Method and device for generating fisheye image | |
CN106170086B (en) | Method and device thereof, the system of drawing three-dimensional image | |
CN110490967A (en) | Image procossing and object-oriented modeling method and equipment, image processing apparatus and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
Effective date of registration: 2022-07-26
Address after: Room 1903, 19/F, Building D, Zhizhen Building, No. 7 Zhichun Road, Haidian District, Beijing 100088
Patentee after: Bird innovation (Beijing) Technology Co., Ltd.
Address before: Room 1801, Block D, Zhizhen Building, 7 Zhichun Road, Haidian District, Beijing 100191
Patentee before: BEIJING XIAONIAO TINGTING TECHNOLOGY Co., Ltd.