CN110111262A - Projector distortion correction method and device, and projector

Projector distortion correction method and device, and projector

Info

Publication number
CN110111262A
Authority
CN
China
Prior art keywords
point
plane
projection
optical engine
projection surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910249736.5A
Other languages
Chinese (zh)
Other versions
CN110111262B (en)
Inventor
苏劲
蔡志博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bird Innovation Beijing Technology Co ltd
Original Assignee
Beijing Xiaoniao Tingting Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaoniao Tingting Technology Co Ltd filed Critical Beijing Xiaoniao Tingting Technology Co Ltd
Priority to CN201910249736.5A priority Critical patent/CN110111262B/en
Publication of CN110111262A publication Critical patent/CN110111262A/en
Application granted granted Critical
Publication of CN110111262B publication Critical patent/CN110111262B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/80 Geometric correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 Constructional details thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179 Video signal processing therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Projection Apparatus (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention discloses a projector distortion correction method and device, and a projector. The method comprises: projecting a standard grid image onto the projection surface through the optical-engine lens, and capturing the projection surface with the camera to obtain a projected point-cloud image; obtaining, from the pixel correspondence between the standard grid image and the projected point-cloud image and from a three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the optical-engine lens coordinate system; using those three-dimensional coordinates to obtain a reference projection plane, obtaining the projections of the surface grid points onto the reference projection plane, and deriving the texture sample coordinates corresponding to each pixel of the raster pixel plane; and texture-mapping the image to be projected with the texture sample coordinates and outputting the texture-mapped image to the raster pixel plane for projection. The technical solution of the present invention enables the projector to display a distortion-free image on a non-flat projection surface.

Description

Projector distortion correction method and device, and projector
Technical field
The present invention relates to a projector distortion correction method and device, and to projectors.
Background art
With the maturing of short-throw optical-engine technology and the sharp decline in its cost, smart projectors are increasingly used in the home. To meet household users' demand for a high-quality viewing experience, the image quality of smart projectors in terms of resolution, brightness, and colour has improved rapidly. In a typical domestic environment, however, people generally do not use a dedicated projection screen for cost reasons, but instead use an ordinary household wall as the projection surface. As a projection surface such a wall is usually far from ideal: it is uneven, its reflectance is low, and its reflection is non-directional. Low reflectance and non-directional reflection can be addressed simply by increasing the projector's luminous output and reducing the ambient light level, but the projection distortion caused by an uneven wall is difficult to resolve with conventional projector technology. Wall undulations invisible to the naked eye produce clearly visible distortion in the projected picture, which becomes especially severe when projecting high-definition or large images; this is unacceptable to high-end users, and a completely new method is needed to solve the problem.
Projectors currently on the market offer no practical solution to picture distortion caused by an uneven projection surface. Existing schemes capture the projected image with a mobile-phone camera, control the entire correction procedure through a phone application, and complete the pre-correction process through interaction with the projector. Such solutions suffer from low phone-captured image quality, uncontrollable shooting conditions, and cumbersome operation, and cannot provide a satisfactory user experience.
Summary of the invention
The present invention provides a projector distortion correction method and device, and a projector, so as to at least partially solve the above problems.
In a first aspect, the present invention provides a projector distortion correction method for a projector having an optical-engine lens and a camera. The method comprises: projecting a standard grid image of the raster pixel plane onto the projection surface through the optical-engine lens, and capturing the projection surface with the camera to obtain a projected point-cloud image of the grid points on the projection surface; obtaining, according to the pixel correspondence between the standard grid image and the projected point-cloud image and according to a pre-constructed three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the optical-engine lens coordinate system, wherein the three-dimensional projection model is constructed from the optical-engine lens coordinate system and the camera coordinate system of the projector and is used to compute those coordinates; using the three-dimensional coordinates of the grid points on the projection surface to obtain a reference projection plane, constructing a world coordinate system from the reference projection plane, and obtaining, from the transformation between the world coordinate system and the optical-engine lens coordinate system, the projections of the surface grid points onto the reference projection plane; obtaining, from the correspondence between these projected points and the grid points of the raster pixel plane of the optical-engine lens, the texture sample coordinates of each pixel of the raster pixel plane, the texture sample coordinates being used to correct the distortion displacement of each raster-plane pixel on the projection surface; and texture-mapping the image to be projected with the texture sample coordinates and outputting the texture-mapped image to the raster pixel plane for projection.
In a second aspect, the present invention provides a projector distortion correction device for a projector having an optical-engine lens and a camera. The device comprises: a projected point-cloud acquisition unit, which projects a standard grid image of the raster pixel plane onto the projection surface through the optical-engine lens and captures the projection surface with the camera to obtain a projected point-cloud image of the grid points on the projection surface; a projected point-cloud coordinate calculation unit, which obtains, according to the pixel correspondence between the standard grid image and the projected point-cloud image and according to a pre-constructed three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the optical-engine lens coordinate system, wherein the three-dimensional projection model is constructed from the optical-engine lens coordinate system and the camera coordinate system of the projector and is used to compute those coordinates; a reference-projection-plane construction unit, which uses the three-dimensional coordinates of the grid points on the projection surface to obtain a reference projection plane, constructs a world coordinate system from the reference projection plane, and obtains, from the transformation between the world coordinate system and the optical-engine lens coordinate system, the projections of the surface grid points onto the reference projection plane; a transformation calculation unit, which obtains, from the correspondence between these projected points and the grid points of the raster pixel plane of the optical-engine lens, the texture sample coordinates of each pixel of the raster pixel plane, the texture sample coordinates being used to correct the distortion displacement of each raster-plane pixel on the projection surface; and a texture mapping unit, which texture-maps the image to be projected with the texture sample coordinates and outputs the texture-mapped image to the raster pixel plane for projection.
In a third aspect, the present invention provides a projector, comprising: an optical-engine lens that projects the standard grid image of the raster pixel plane onto the projection surface; a camera that captures the projection surface to obtain the projected point-cloud image and sends it to a graphics processor; and a memory storing computer-executable instructions which, when executed, cause the graphics processor to perform the above projector distortion correction method.
In a fourth aspect, the present invention provides a computer-readable storage medium storing one or more computer programs which, when executed by the projector's graphics processor, implement the above projector distortion correction method.
By building a camera into the projector, the present invention gives the projector stereoscopic vision: with the projector's optical-engine lens and camera, active three-dimensional modelling of the projection environment can be carried out and a three-dimensional projection model established. Based on this model the projector obtains the shape of the projection surface automatically by computer-vision methods, without any external equipment; from that surface shape it computes the exact distortion displacement of each raster-plane pixel on the actual projection surface, and corrects the raster image by texture mapping so as to eliminate picture distortion on the projection surface.
Brief description of the drawings
Fig. 1 is a flow chart of the projector distortion correction method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of the optical system of the projector according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of pinhole imaging according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the projected point-cloud image generation process according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of the three-dimensional projection model according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of the reference projection plane fitting process according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of the relations among the world coordinate system, the optical-engine lens coordinate system, the raster pixel plane, and the reference projection plane according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of the orthogonal projection of the surface grid points onto the reference projection plane according to an embodiment of the present invention;
Fig. 9 is a schematic diagram of the extended projection points on the reference projection plane according to an embodiment of the present invention;
Fig. 10 is a structural block diagram of the projector distortion correction device according to an embodiment of the present invention;
Fig. 11 is a structural schematic diagram of the projector according to an embodiment of the present invention.
Detailed description of the embodiments
An embodiment of the present invention provides a projector distortion correction method: the surface shape parameters of the projection surface are obtained by computer-vision methods; from these parameters the grid texture layout of the pre-correction target picture is computed; and the corrective texture mapping of the input image is then performed on the GPU to obtain the pre-distorted projected image, so that the image finally projected onto the non-flat projection surface appears undistorted. Embodiments of the present invention also provide a corresponding device, projector, and computer-readable storage medium, each described in detail below.
To make the objects, technical solutions, and advantages of the present invention clearer, embodiments of the present invention are described in further detail below with reference to the accompanying drawings. It should be understood, however, that these descriptions are merely illustrative and are not intended to limit the scope of the invention. In addition, descriptions of well-known structures and technologies are omitted below to avoid unnecessarily obscuring the concepts of the present invention.
The terms used herein are only for describing particular embodiments and are not intended to limit the present invention. The words "a", "an", and "the" used here also cover the plural unless the context clearly indicates otherwise. Furthermore, the terms "include" and "comprise" as used herein indicate the presence of the stated features, steps, operations, and/or components, but do not exclude the presence or addition of one or more other features, steps, operations, or components.
Unless otherwise defined, all terms used herein (including technical and scientific terms) have the meanings commonly understood by those skilled in the art. Terms used herein should be interpreted consistently with the context of this specification, and not in an idealized or overly rigid manner.
Some block diagrams and/or flow charts are shown in the drawings. It should be understood that some blocks of the block diagrams and/or flow charts, or combinations thereof, can be implemented by computer program instructions. These computer program instructions can be supplied to the processor of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, so that the instructions, when executed by the processor, create means for realizing the functions/operations illustrated in the block diagrams and/or flow charts.
The technology of the present invention can therefore be realized in hardware and/or software (including firmware, microcode, etc.). In addition, the technology of the present invention can take the form of a computer program product on a computer-readable storage medium storing instructions, for use by or in combination with an instruction execution system. In the context of the present invention, a computer-readable storage medium can be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, the computer-readable storage medium can include, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a propagation medium. Specific examples of computer-readable storage media include: a magnetic storage device, such as magnetic tape or a hard disk (HDD); an optical storage device, such as a compact disc (CD-ROM); a memory, such as random access memory (RAM) or flash memory; and/or a wired/wireless communication link.
The present invention provides a projector distortion correction method.
Fig. 1 is a flow chart of the projector distortion correction method according to an embodiment of the present invention. As shown in Fig. 1, the method of this embodiment comprises:
S110: projecting the standard grid image of the raster pixel plane onto the projection surface through the optical-engine lens, and capturing the projection surface with the camera to obtain the projected point-cloud image of the grid points on the projection surface.
The optical engine in this embodiment can be understood as the projection module of the projection device. In general, the optical engine integrates the digital micromirror display core, the light source, the lens optical path, and the heat dissipation into one mechanism, forming a single dust-proof, shock-resistant component.
S120: obtaining, according to the pixel correspondence between the standard grid image and the projected point-cloud image and according to the pre-constructed three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the optical-engine lens coordinate system, wherein the three-dimensional projection model is constructed from the optical-engine lens coordinate system and the camera coordinate system of the projector and is used to compute those coordinates.
S130: using the three-dimensional coordinates of the grid points on the projection surface to obtain a reference projection plane, constructing a world coordinate system from the reference projection plane, and obtaining, from the transformation between the world coordinate system and the optical-engine lens coordinate system, the projections of the surface grid points onto the reference projection plane.
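The patent does not name the algorithm used to obtain the reference projection plane from the surface grid points. As an illustrative sketch only (an assumption, not the specification's method), a least-squares plane fit via SVD, followed by orthogonal projection of each grid point onto the fitted plane, realizes this step:

```python
import numpy as np

def fit_reference_plane(points):
    """Least-squares plane fit to an (N, 3) array of surface grid points.

    Returns (centroid, normal): the plane passes through the centroid, and
    the unit normal is the right singular vector of least variance.
    """
    centroid = points.mean(axis=0)
    # Rows of (points - centroid) span the plane; the direction of least
    # variance (smallest singular value) is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return centroid, normal

def project_onto_plane(points, centroid, normal):
    """Orthogonal projection of each 3-D grid point onto the fitted plane."""
    d = (points - centroid) @ normal          # signed distance per point
    return points - np.outer(d, normal)
```

With perfectly planar input the recovered normal is exact; with a real, uneven wall the fit minimizes the squared distances of the grid points to the plane.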
S140: obtaining, from the correspondence between these projected points and the grid points of the raster pixel plane of the optical-engine lens, the texture sample coordinates of each pixel of the raster pixel plane, the texture sample coordinates being used to correct the distortion displacement of each raster-plane pixel on the projection surface.
S150: texture-mapping the image to be projected with the texture sample coordinates, and outputting the texture-mapped image to the raster pixel plane for projection.
By building a camera into the projector, this embodiment gives the projector stereoscopic vision: with the projector's optical-engine lens and camera, active three-dimensional modelling of the projection environment can be carried out and the three-dimensional projection model established. Based on this model the projector obtains the shape of the projection surface automatically by computer-vision methods, without any external equipment; from that surface shape it computes the exact distortion displacement of each raster-plane pixel on the actual projection surface, and corrects the raster image by texture mapping so as to eliminate picture distortion on the projection surface.
To improve the accuracy of projector distortion correction, this embodiment calibrates the projector in advance to obtain its distortion parameters, which comprise the intrinsic and extrinsic parameters of the optical-engine lens and the camera. An example of the projector calibration process is as follows:
As shown in Fig. 2, there is a certain distance between the optical-engine lens and the camera of the projector, so the same world coordinate point exhibits a parallax between the optical-engine raster pixel plane and the sensor pixel plane. For example, point A in the projection area of Fig. 2 corresponds to pixel position a1 on the optical-engine raster pixel plane and to pixel position a2 on the sensor pixel plane. This satisfies the conditions for binocular stereoscopic vision, so a three-dimensional projection model can be constructed from the optical-engine lens coordinate system and the camera coordinate system of the projector.
The optical-engine lens can be regarded as a reversed camera, so a pinhole imaging model similar to that of the camera can be established for it, and the calibration principle of the optical-engine lens is similar to that of the camera. This embodiment takes obtaining the distortion parameters of the camera as an example.
As shown in Fig. 3, the pinhole camera model is s·m′ = A·[R | t]·M′, where s is the normalized scale factor; A is the intrinsic matrix of the camera; [R | t] is the extrinsic matrix, used to transform the coordinates of a point from the world coordinate system into the camera coordinate system, R being the rotation matrix and t the translation vector; m′ is the coordinate position in the image (pixel) coordinate system; and M′ is the coordinate position in the world coordinate system. In the pinhole imaging optical path shown in Fig. 3, a point (x, y, z) in the camera coordinate system maps to the plane coordinates (u, v) in the uv plane as u = fx·x′ + cx, v = fy·y′ + cy, with x′ = x/z and y′ = y/z, where fx and fy are the focal lengths of the camera along the two axes, cx and cy are the coordinates of the principal point, and x′, y′ are the normalized coordinates of the image point P. The coordinate system of the object point Fc shown in Fig. 3 corresponds to the camera coordinate system of this embodiment, and the uv plane coordinate system corresponds to the camera sensor pixel-plane coordinate system. Therefore, once the transformation between the world coordinate system and the camera coordinate system has been obtained, the correspondence between the world coordinate system and the sensor pixel-plane coordinate system follows from u = fx·x′ + cx, v = fy·y′ + cy.
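As an illustration only (not part of the patent), the pinhole relation above can be written in a few lines of NumPy; the focal lengths, principal point, and world point used below are hypothetical values:

```python
import numpy as np

def project_pinhole(M, R, t, fx, fy, cx, cy):
    """Project a world point M = (X, Y, Z) to pixel coordinates (u, v)
    via s*m' = A [R | t] M', with no lens distortion."""
    Xc, Yc, Zc = R @ M + t        # world -> camera coordinates
    x, y = Xc / Zc, Yc / Zc       # normalized image coordinates x', y'
    return fx * x + cx, fy * y + cy
```

For a camera aligned with the world frame (R = I, t = 0), the point (1, 2, 4) with fx = fy = 1000 and principal point (960, 540) projects to (1210, 1040).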
For the camera intrinsic matrix, the intrinsic parameters of the projector's camera can be obtained with a calibration board and structured-light projection; the intrinsic parameters include the focal lengths, radial distortion parameters, tangential distortion parameters, and principal-point coordinates (i.e., the centre point of the sensor image). The correspondence between the camera coordinate system and the sensor pixel-plane coordinate system is then u = fx·x″ + cx, v = fy·y″ + cy, where x″ and y″ are the distorted normalized coordinates given by the standard radial, tangential, and thin-prism lens distortion model: x″ = x′·(1 + k1·r² + k2·r⁴ + k3·r⁶)/(1 + k4·r² + k5·r⁴ + k6·r⁶) + 2p1·x′y′ + p2·(r² + 2x′²) + s1·r² + s2·r⁴ and y″ = y′·(1 + k1·r² + k2·r⁴ + k3·r⁶)/(1 + k4·r² + k5·r⁴ + k6·r⁶) + p1·(r² + 2y′²) + 2p2·x′y′ + s3·r² + s4·r⁴, with r² = x′² + y′²; here k1, k2, k3, k4, k5, k6 are the radial distortion parameters of the camera, p1, p2 are the tangential distortion parameters of the camera, and s1, s2, s3, s4 are the thin-prism distortion parameters of the camera.
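The distortion step can be sketched directly from the named coefficients. The layout below follows the standard radial-tangential-thin-prism model (the same coefficient set used by common calibration libraries; an assumption, since the original renders the formulas as figures). With all coefficients zero the mapping is the identity:

```python
import numpy as np

def distort(x, y, k=(0.0,) * 6, p=(0.0, 0.0), s=(0.0,) * 4):
    """Map undistorted normalized coords (x', y') to distorted (x'', y'')
    using radial (k1..k6), tangential (p1, p2), and thin-prism (s1..s4)
    distortion coefficients."""
    k1, k2, k3, k4, k5, k6 = k
    p1, p2 = p
    s1, s2, s3, s4 = s
    r2 = x * x + y * y
    radial = (1 + k1 * r2 + k2 * r2**2 + k3 * r2**3) / \
             (1 + k4 * r2 + k5 * r2**2 + k6 * r2**3)
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x) + s1 * r2 + s2 * r2**2
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y + s3 * r2 + s4 * r2**2
    return xd, yd
```

The distorted coordinates then enter u = fx·x″ + cx, v = fy·y″ + cy exactly as in the text above.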
In this embodiment it is also necessary to obtain the translation vector and rotation matrix between the optical-engine lens and the camera. Specifically, the intrinsic calibration process also yields the rotation matrix Rp and translation vector tp from the world coordinate system to the optical-engine lens coordinate system, and the rotation matrix Rc and translation vector tc from the world coordinate system to the camera coordinate system. According to the pinhole camera model, (Xp, Yp, Zp)ᵀ = Rp·(X, Y, Z)ᵀ + tp and (Xc, Yc, Zc)ᵀ = Rc·(X, Y, Z)ᵀ + tc, where (X, Y, Z) is a three-dimensional point in the world coordinate system and (Xp, Yp, Zp) and (Xc, Yc, Zc) are the corresponding three-dimensional points in the optical-engine lens coordinate system and the camera coordinate system respectively. Eliminating (X, Y, Z) from these two formulas yields the relative pose of the optical-engine lens and the camera: R = Rp·Rc⁻¹, t = tp − Rp·Rc⁻¹·tc. In this way the extrinsic parameters of the projector's binocular vision are obtained, realizing the coordinate transformation from the camera coordinate system to the optical-engine lens coordinate system.
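Eliminating the world point from the two rigid transforms can be sketched as follows (a NumPy illustration of the derivation above; for a rotation matrix, Rc⁻¹ = Rcᵀ):

```python
import numpy as np

def relative_pose(Rp, tp, Rc, tc):
    """Given world->optical-engine (Rp, tp) and world->camera (Rc, tc),
    return (R, t) mapping camera coordinates to optical-engine coordinates:
        Xp = R @ Xc + t
    obtained by eliminating the world point from the two transforms."""
    R = Rp @ Rc.T                 # Rc.T == inverse of the rotation Rc
    t = tp - R @ tc
    return R, t
```

The sanity check is that transforming a world point into camera coordinates and then through (R, t) must agree with transforming it directly into optical-engine coordinates.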
Before the projector leaves the factory, the intrinsic and extrinsic parameters of the optical-engine lens and the camera are obtained by the above projector calibration method. In some embodiments, the optical-engine resolution used for calibration is 1920x1080, and the calibration residual is computed from the grid points re-projected into the virtual image of the raster plane. In some embodiments the intrinsic calibration residual is 0.2137 pixels and the extrinsic calibration residual is 0.4324 pixels. The calibrated intrinsic and extrinsic parameters are stored in the projector and used in the distortion correction process below.
In use, the projector provides a user interface containing the projection-surface distortion correction procedure. After the user selects and enters this interface, the projector prompts that it is entering the distortion correction procedure; the user is asked to aim the projector at the projection surface to be corrected and keep the projector's position fixed, after which the automatic correction procedure begins. Steps S110-S150 above are described in detail below with reference to Figs. 4-9.
First, step S110 is performed: the standard grid image of the raster pixel plane is projected onto the projection surface through the optical-engine lens, and the projection surface is captured with the camera to obtain the projected point-cloud image of the grid points on the projection surface.
In general, the denser the projected point cloud, the higher the measurement accuracy of the projection surface. However, because of dot scattering caused by lens distortion, acquisition noise, background light, and other factors, an excessively high grid density slows the grid search and may even make the grid points impossible to find. This embodiment therefore introduces a method of generating and projecting staggered grids, which increases the grid-point density, and hence the point-cloud measurement accuracy, without impairing the efficiency of the grid-point search.
In some embodiments, the projection-surface point-cloud image is obtained as follows: a first standard grid image of the raster pixel plane is projected onto the projection surface through the optical-engine lens, and the projection surface is captured with the camera to obtain a first projected point-cloud image; a second standard grid image of the raster pixel plane is projected and captured in the same way to obtain a second projected point-cloud image, the dots of the first and second standard grid images being staggered with respect to each other; the dot grids in the first and second projected point-cloud images are identified separately, giving the pixel position in the raster pixel plane of each dot in the two images; and the projected point-cloud image is obtained by superimposing the first and second projected point-cloud images, the pixel position of each dot of the two images in the raster pixel plane being the pixel position of the corresponding dot in the combined point-cloud image.
As shown in Fig. 4, the projector's application program generates two asymmetric dot-grid pictures whose dots are mutually staggered, offset from each other by one grid spacing. The two pictures are projected onto the projection surface in turn through the optical-engine lens, and the projected pictures are captured by the projector's camera. The grid dots are found in the captured point-cloud images and their pixel coordinates recorded; the coordinates of the staggered grid dots in the two point-cloud images are then merged, giving a projected point-cloud image of dot-centre coordinates whose grid density is the sum of the dot densities of the two images.
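The staggering trick can be illustrated with a small sketch (hypothetical grid sizes and pitch; dot detection in the captured images is assumed done and omitted): each pattern on its own stays sparse enough for reliable blob search, while the merged cloud has twice the density of either pattern.

```python
import numpy as np

def staggered_grids(cols, rows, pitch):
    """Two sparse dot-centre grids whose centres interleave: the second
    pattern is the first shifted by half a pitch in both axes, so they are
    projected one after the other but never overlap."""
    jj, ii = np.meshgrid(np.arange(cols), np.arange(rows))
    a = np.stack([jj * pitch, ii * pitch], axis=-1).reshape(-1, 2).astype(float)
    b = a + pitch / 2.0
    return a, b

def merge_point_clouds(pts_a, pts_b):
    """Union of the dot centres recovered from the two captured images."""
    return np.vstack([pts_a, pts_b])
```

In the patent's pipeline the merged coordinates come from the two captured point-cloud images rather than being generated synthetically; the sketch only shows the geometry of the interleaving and the merge.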
After the projected point-cloud image of the grid points on the projection surface has been obtained, step S120 is performed: according to the pixel correspondence between the standard grid image and the projected point-cloud image, and according to the pre-constructed three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the optical-engine lens coordinate system are obtained.
Before the shape of the projection surface is derived from the surface grid points, the images on the raster pixel plane and the sensor pixel plane are rectified using the previously obtained distortion parameters, giving the rectified grid-point coordinates (uc, vc) in the CCD image and (up, vp) in the optical-engine image; the acquisition of the distortion parameters is described above.
In some embodiments, the three-dimensional projection model is built as follows: first, a first linear relationship is established between the optical center of the camera light path and a first calibration point in the camera sensor pixel plane; then, a second linear relationship is established between the optical center of the light engine lens light path and a second calibration point in the raster pixel plane of the light engine lens; next, a third linear relationship between the first calibration point and the second calibration point is established from the extrinsic parameter matrix between the camera and the light engine lens; finally, the three-dimensional coordinates of the projection-surface grid points in the light engine lens coordinate system are obtained from the first, second and third linear relationships.
The three-dimensional projection model of this embodiment computes the three-dimensional coordinates of the projection-surface grid points by triangulation. As shown in Fig. 5, for any grid point on the projection surface its image points in the sensor pixel plane and in the raster pixel plane can be obtained, and the pixel correspondence between the standard grid image and the projected point-cloud image is known; the three-dimensional coordinates of the projection-surface grid points can therefore be reconstructed. That is, since the projection-surface point cloud is produced by projecting the standard grid image through the light engine lens and capturing it with the camera, the three-dimensional coordinates of every grid point on the projection surface can be reconstructed from the three-dimensional projection model shown in Fig. 5.
As an example, in Fig. 5 the ucvc plane coordinate system corresponds to the sensor pixel plane and the upvp plane coordinate system to the raster pixel plane. From the previously described formulas u = fx·x′ + cx and v = fy·y′ + cy, the coordinate qc of the camera optical center Oc in the sensor pixel plane coordinate system and the coordinate qp of the light engine lens optical center Op in the raster pixel plane coordinate system are obtained; with the three-dimensional projection model built in this embodiment, the three-dimensional coordinates of the corresponding projection-surface grid point Qw in the light engine lens coordinate system can then be computed.
Assume the coordinates of a projection-surface grid point are (Xp, Yp, Zp). Then sc·[uc, vc, 1]^T = Ac·[R | t]·[Xp, Yp, Zp, 1]^T and sp·[up, vp, 1]^T = Ap·[Xp, Yp, Zp, 1]^T, where sp and sc are the scale factors of the camera and the light engine lens, (uc, vc) and (up, vp) are the two-dimensional projections of the space point in the sensor pixel plane and the raster pixel plane, Ac and Ap are the intrinsic matrices of the camera and the light engine lens, and [R | t] is the extrinsic matrix of the projector.
Then, from the first linear relationship established between the camera optical center Oc and the first pixel (uc, vc) in the camera sensor pixel plane, and the second linear relationship established between the light engine lens optical center Op and the second pixel (up, vp) in the raster pixel plane of the light engine lens, (Xp, Yp, Zp) can be solved.
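A minimal linear-triangulation sketch of the solve above (Python with NumPy; the intrinsics, extrinsics and pixel values are made-up placeholders, not calibration data from the patent): each pixel observation contributes two rows of a homogeneous system whose least-squares solution is the grid point (Xp, Yp, Zp) in the light engine lens frame.

```python
import numpy as np

def triangulate(A_c, Rt, A_p, uv_c, uv_p):
    """DLT triangulation of one grid point from its camera pixel uv_c and
    raster pixel uv_p.  P_c maps lens-frame points into the camera; P_p is
    the light engine's own projection (identity extrinsics in its frame)."""
    P_c = A_c @ Rt                                         # 3x4 camera matrix
    P_p = A_p @ np.hstack([np.eye(3), np.zeros((3, 1))])   # 3x4 projector matrix
    rows = []
    for (u, v), P in ((uv_c, P_c), (uv_p, P_p)):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    X = Vt[-1]
    return X[:3] / X[3]                                    # dehomogenize

# Synthetic check with placeholder parameters.
A = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
Rt = np.hstack([np.eye(3), np.array([[0.2], [0.0], [0.0]])])  # small baseline
Q = np.array([0.1, -0.05, 2.0])                            # ground-truth point

def project(P, X):
    q = P @ np.append(X, 1.0)
    return q[:2] / q[2]

uv_c = project(A @ Rt, Q)
uv_p = project(A @ np.hstack([np.eye(3), np.zeros((3, 1))]), Q)
print(np.allclose(triangulate(A, Rt, A, uv_c, uv_p), Q, atol=1e-6))  # True
```

The same solve is repeated for every matched spot to reconstruct the whole projection-surface point cloud.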
After the three-dimensional coordinates of the projection-surface grid points in the light engine lens coordinate system are obtained, step S130 is executed: a reference projection plane is obtained from the three-dimensional coordinates of the projection-surface grid points, a world coordinate system is built on the reference projection plane, and the projections of the projection-surface grid points onto the reference projection plane are obtained from the transformation between the world coordinate system and the light engine lens coordinate system.
In some embodiments, the projection-surface grid points are spatially filtered according to their three-dimensional coordinates: the invalid grid points are removed, leaving the valid grid points; the valid grid points are then fitted to a plane by least squares, and the fitted plane is taken as the reference projection plane. Here the valid grid points are the projection-surface grid points that lie approximately in a common plane, and the invalid grid points are the projection-surface grid points far from that plane.
As shown in Fig. 6, the three-dimensional point cloud formed by all the projection-surface grid points may not lie in a single plane; it may contain spatial noise points, discontinuity points and off-plane points. Before the projection reference surface is reconstructed, these invalid grid points must be filtered out to form a smooth projection-surface point cloud, and the reference projection plane is obtained by fitting a plane to the filtered cloud.
In some embodiments, the spatial noise points and discontinuity points among the invalid grid points are first removed by low-pass filtering, and the off-plane points are then removed by the following procedure, leaving the valid grid points:
Step A: randomly select three non-collinear grid points, which determine a sub-plane a′0x + a′1y + a′2z = d, where a′0, a′1, a′2 and d are constants;
Step B: compute the distance of every projection-surface grid point to the i-th sub-plane, di = |a′0xi + a′1yi + a′2zi − d|, and reject the grid points whose distance to the sub-plane exceeds a preset distance value; count the remaining reference grid points, a reference grid point being one whose distance to the sub-plane does not exceed the preset distance value. Illustratively, the preset distance value is t = 2σ, where σ is the standard deviation of the distances of all grid points to the current sub-plane: a grid point with di > 2σ is judged an outlier and removed, whereas one with di ≤ 2σ is judged a reference grid point and kept.
Steps A and B are repeated; after N iterations, the reference sub-plane with the largest number of reference grid points among the N resulting sub-planes is selected, and the reference grid points of that reference sub-plane are the valid grid points.
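Steps A and B amount to a RANSAC-style consensus search; a sketch under illustrative assumptions (Python with NumPy; iteration count, seed and the synthetic data are placeholders) is:

```python
import numpy as np

def ransac_plane(pts, iters=200, seed=0):
    """RANSAC-style filtering of off-plane grid points (steps A and B):
    each iteration fits a sub-plane to three random points and keeps the
    points within 2*sigma of it; the best consensus set wins."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(pts), dtype=bool)
    for _ in range(iters):
        p0, p1, p2 = pts[rng.choice(len(pts), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-9:          # collinear sample, skip
            continue
        n = n / np.linalg.norm(n)
        d = np.abs((pts - p0) @ n)            # point-to-sub-plane distances
        inliers = d <= 2.0 * d.std()          # preset threshold t = 2*sigma
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

# Grid points on z = 0 plus two gross off-plane points.
rng = np.random.default_rng(1)
plane = np.column_stack([rng.uniform(-1, 1, (40, 2)), np.zeros(40)])
outliers = np.array([[0.0, 0.0, 3.0], [0.5, -0.2, -4.0]])
pts = np.vstack([plane, outliers])
keep = ransac_plane(pts)
print(keep[:40].all(), keep[40:].any())  # True False
```

The boolean mask returned here corresponds to the valid grid points passed on to the least-squares plane fit.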
Referring to Fig. 6, the left image is the projection-surface point cloud before filtering, in which some points clearly lie off the plane; the right image is the point cloud after plane fitting, in which the off-plane points have been filtered out and the remaining points lie essentially in the fitted plane.
With the valid grid points obtained, their three-dimensional coordinates determine the plane equation of the reference projection plane, a0x + a1y + a2 = z, and its unit normal vector Nbp = [a0, a1, −1] / norm([a0, a1, −1]), where a0, a1, a2 are constants, Nbp is the normal vector of the reference projection plane, and norm(·) is the vector-norm operator.
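The least-squares fit of z = a0·x + a1·y + a2 and the unit normal can be sketched as follows (Python with NumPy; the sampled plane coefficients are made-up test values):

```python
import numpy as np

def fit_reference_plane(pts):
    """Least-squares fit of z = a0*x + a1*y + a2 to the valid grid points,
    returning (a0, a1, a2) and the unit normal [a0, a1, -1] / norm."""
    M = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(M, pts[:, 2], rcond=None)
    n = np.array([coeffs[0], coeffs[1], -1.0])
    return coeffs, n / np.linalg.norm(n)

# Points sampled exactly from z = 0.5x - 0.25y + 2.
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, (30, 2))
pts = np.column_stack([xy, 0.5 * xy[:, 0] - 0.25 * xy[:, 1] + 2.0])
coeffs, n_bp = fit_reference_plane(pts)
print(np.allclose(coeffs, [0.5, -0.25, 2.0]))  # True
```

The returned unit normal plays the role of Nbp in the world-coordinate construction that follows.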
Once the reference projection plane and its pose are obtained, the world coordinate system of this embodiment can be determined.
In some embodiments, the XOY plane of the world coordinate system coincides with the reference projection plane, the X axis is parallel to the XOY plane of the light engine lens coordinate system, and the Z axis is perpendicular to the reference projection plane. Since the normal vector of the reference projection plane is known, the vector representations of the three world-coordinate axes in the light engine lens coordinate system can be computed as Vxw = Nbp × Vyc, Vyw = Vyc, Vzw = Nbp, where Vyc = [0, 1, 0]; Vxw, Vyw and Vzw are the vector representations of the three world-coordinate axes and Vyc is the vector representation of the Y axis of the light engine lens coordinate system. The origin of the world coordinate system is Ow = [X0 − a0t, Y0 − a1t, Z0 + t], where (Xck, Yck, Zck) are the centroid coordinates of the k-th sub-plane, [X0, Y0, Z0] is the average centroid of the N sub-planes, and t is a constant.
As shown in Fig. 7, the XOY plane of the world coordinate system coincides with the reference projection plane, the Y-axis vector of the world coordinate system has the same direction as the Y-axis vector of the light engine lens coordinate system, the Z-axis vector of the world coordinate system has the same direction as the normal vector of the reference projection plane, and the world origin is the projection of the average centroid of the N sub-planes onto the reference projection plane.
After the world coordinate system is built, the translation vector T = Ow and the rotation matrix R = (Vxw, Vyw, Vzw)^T between the light engine lens coordinate system and the world coordinate system are determined; that is, the transformation between the world coordinate system and the light engine lens coordinate system is T = Ow, R = (Vxw, Vyw, Vzw)^T.
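A sketch of the frame construction (Python with NumPy; all values are placeholders). Note the assumption, made here for illustration only, that the plane normal has no component along the lens Y axis, so that the three axis vectors come out exactly orthonormal; in the general case the formula above yields only approximately orthogonal axes:

```python
import numpy as np

def world_frame(n_bp, origin):
    """Build R (rows = world-axis vectors in lens coordinates) and T from the
    reference-plane normal, following Vxw = Nbp x Vyc, Vyw = Vyc, Vzw = Nbp."""
    v_yc = np.array([0.0, 1.0, 0.0])
    v_xw = np.cross(n_bp, v_yc)
    v_xw = v_xw / np.linalg.norm(v_xw)
    R = np.vstack([v_xw, v_yc, n_bp])
    return R, np.asarray(origin, dtype=float)

# Plane tilted about the lens Y axis only, so the axes are orthonormal.
n_bp = np.array([np.sin(0.2), 0.0, np.cos(0.2)])
R, T = world_frame(n_bp, [0.1, -0.3, 2.0])
print(np.allclose(R @ R.T, np.eye(3)))  # True
```

The pair (R, T) produced here is the transformation used in the next step to move grid points into world coordinates.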
In some embodiments, the projections of the projection-surface grid points onto the reference projection plane are obtained as follows: using the transformation T = Ow, R = (Vxw, Vyw, Vzw)^T between the world coordinate system and the light engine lens coordinate system, the projection-surface grid-point coordinates in the light engine lens coordinate system are converted into projection-surface grid-point coordinates in the world coordinate system, [Xw, Yw, Zw]^T = R·([Xc, Yc, Zc]^T − T), where (Xw, Yw, Zw) and (Xc, Yc, Zc) are the coordinates of a grid point in the world coordinate system and in the light engine lens coordinate system respectively; the projection-surface grid points in the world coordinate system are then orthographically projected onto the reference projection plane to obtain the projections of the projection-surface grid points on the reference projection plane.
As shown in Fig. 8, all the mapped three-dimensional grid points are orthographically projected onto the XOY plane of the world coordinate system, i.e. onto the reference projection plane; the two-dimensional coordinates of a grid point projected onto the projection datum plane are simply the Xw and Yw of its three-dimensional coordinates. Assuming that the viewer faces the projection datum plane, the two-dimensional image obtained by projecting onto the reference projection plane in the above computation reproduces the distorted image layout, caused by the unevenness of the projection surface, that the viewer sees; the projector has thereby derived the shape of the projection surface automatically.
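The world transform and orthographic drop of the Z coordinate can be sketched as follows (Python with NumPy; an identity pose is used purely for illustration, standing in for the R and T derived from the fitted plane):

```python
import numpy as np

def to_reference_plane(pts_lens, R, T):
    """Transform grid points from lens coordinates to world coordinates and
    orthographically project onto the XOY reference plane (keep Xw, Yw)."""
    pts_world = (pts_lens - T) @ R.T   # row-vector form of R @ (p - T)
    return pts_world[:, :2]

# Identity rotation: the lens frame is taken to coincide with the world frame.
R, T = np.eye(3), np.array([0.0, 0.0, 2.0])
pts = np.array([[0.1, 0.2, 2.03], [-0.4, 0.0, 1.97]])
proj = to_reference_plane(pts, R, T)  # rows: (0.1, 0.2) and (-0.4, 0.0)
```

The residual Zw that is discarded here is exactly the out-of-plane deviation that the later homography-based correction compensates.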
After the projections of the projection-surface grid points on the reference projection plane are obtained, step S140 is executed: from the correspondence between the projected points and the grid points in the raster pixel plane of the light engine lens, the texture sampling coordinates corresponding to each pixel of the raster pixel plane are obtained; the texture sampling coordinates are used to correct the distortion displacement of each raster-pixel-plane pixel on the projection surface.
After the orthographic projection of the three-dimensional grid points onto the reference projection plane is complete, the valid projected points are matched with the corresponding grid points in the raster pixel plane, and the homography matrix between the two plane coordinate systems is computed. As shown in Fig. 8, the computed homography matrix then maps the two-dimensional grid-point coordinates in the reference projection plane to pixel coordinates in the raster pixel plane, yielding the raster-pixel-plane pixel positions onto which the distorted coordinates of the raster-pixel-plane grid points on the projection surface are mapped.
It should be understood that if, in the image, the pixel at an original pixel position of the raster pixel plane is replaced with the actual pixel value at the distorted pixel position, then the pixel value at the corresponding projected position on the reference projection plane agrees with the pixel value at the same position of the raster pixel plane, thereby achieving distortion-free projection.
In some embodiments, the homography matrix and the texture sampling coordinates of each raster-pixel-plane pixel are obtained as follows: from the raster-pixel-plane grid-point cluster matched with the valid projected-point cluster, a first homography matrix between the raster pixel plane of the light engine lens and the reference projection plane is obtained, the valid projected points being the projections of the valid grid points on the reference projection plane; the projected-point coordinates are mapped to raster-pixel-plane coordinates through the first homography matrix, yielding the texture sampling coordinates onto which the distorted coordinates of the standard grid image on the projection surface map in the raster pixel plane. The two-dimensional coordinates of the projections of all projection-surface grid points on the reference projection plane are back-projected through the first homography matrix into raster coordinates in the raster pixel plane; these back-projected raster coordinates are the texture sampling coordinates.
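The homography estimation can be sketched with the standard direct linear transform (Python with NumPy; the synthetic homography and point sets are placeholders, not measured correspondences):

```python
import numpy as np

def fit_homography(src, dst):
    """DLT: recover the 3x3 homography H with dst ~ H @ src from >= 4
    point correspondences (each an (N, 2) array)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pts):
    q = np.column_stack([pts, np.ones(len(pts))]) @ H.T
    return q[:, :2] / q[:, 2:3]

# Plane-side grid points vs. raster positions under a made-up homography.
H_true = np.array([[1.1, 0.02, 5.0], [-0.01, 0.95, -3.0], [1e-4, 2e-4, 1.0]])
src = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 7]], float)
dst = apply_h(H_true, src)
H = fit_homography(src, dst)
print(np.allclose(apply_h(H, src), dst))  # True
```

Applying the fitted H to every projected point yields the back-projected raster coordinates, i.e. the texture sampling coordinates described above.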
Since the projected and searched grid spots must remain complete, the corner regions of the projected point-cloud image (see Fig. 4) contain no grid spots, i.e. the corner regions lie outside the coverage of the spot grid. Therefore, after the raster-pixel-plane grid-point cluster is obtained, the projected points on the datum plane must additionally be extended toward the corners so that the extended projected points correspond to the grid points of the corner regions of the raster pixel plane, extending the distortion-correction range to the whole projection field. As shown in Fig. 9, the known grid points projected on the datum plane are extended horizontally or vertically toward the image edges, forming four groups of border-extension projected-point rows and columns (top, bottom, left, right); the pixel coordinates of the grid point in the raster pixel map corresponding to each extended projected point are saved, forming the complete projected-point image, containing the corner-extension projected points, shown in Fig. 9.
In some embodiments, the extended projected points are obtained, the homography matrices corresponding to the raster-pixel-plane grid-point clusters of the corner regions are computed from the extended projected points, and the texture sampling coordinates of each pixel of the raster-pixel-plane corner regions are obtained from those homography matrices, specifically as follows: first, according to the corner regions of the raster pixel plane, the projected points on the reference projection plane are extended toward the corner regions, yielding four groups of extended projected points located at the borders of the reference projection together with the grid points in the raster pixel plane matched with the extended projected points; then, a search box of preset size is scanned over the extension points with a set search step, and from the raster-pixel-plane grid-point cluster matched with the valid projected-point cluster inside the search box at each step, a second homography matrix between the raster pixel plane of the light engine lens and the reference projection plane is obtained; finally, the extended projected-point coordinates are mapped to raster-pixel-plane coordinates through the second homography matrix, yielding the texture sampling coordinates onto which the distorted coordinates of the edge points of the standard grid image on the projection surface map in the raster pixel plane.
As shown in Fig. 9, the search box is 3 × 3 points in size, the search step is 3 point spacings, and the number of projected points inside the search box at each step is 9. Among the projected points inside a search box at a corner there are 5 extended projected points and 4 known adjacent projected points, while inside a search box along an edge there are 3 extended projected points and 6 known adjacent projected points. As shown in Fig. 9, the crossed projected points inside a search box are the known adjacent projected points. The second homography matrix between the raster pixel plane and the reference projection plane is computed from these known neighbor grid points; the computed second homography matrix maps each extended projected point to its two-dimensional raster-pixel-plane coordinates, which are recorded as the texture sampling coordinates of the corner-region grid points of the raster pixel plane. The texture sampling coordinates of the corner-region grid points, together with the texture sampling coordinates of the known grid points in the raster pixel plane, form the grid-point and mapping-point cluster covering the whole raster image region, completing the coordinate mapping.
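One search-box step of the border extension can be sketched as follows (Python with NumPy; the box layout and the pure 10x plane-to-raster map are illustrative placeholders): the known adjacent projected points inside the box fix a local homography, which is then applied to the extended projected points.

```python
import numpy as np

def fit_homography(src, dst):
    """DLT homography (dst ~ H @ src) from >= 4 correspondences."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows += [[-x, -y, -1, 0, 0, 0, u * x, u * y, u],
                 [0, 0, 0, -x, -y, -1, v * x, v * y, v]]
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    return Vt[-1].reshape(3, 3)

def extend_border(known_plane, known_raster, ext_plane):
    """Fit the second homography to the known adjacent projected points of one
    search box, then map the extended projected points to raster coordinates."""
    H = fit_homography(known_plane, known_raster)
    q = np.column_stack([ext_plane, np.ones(len(ext_plane))]) @ H.T
    return q[:, :2] / q[:, 2:3]

# One edge search box: 6 known points (2 rows of 3) and 3 extension points
# one grid step above them; here the plane-to-raster map is a pure 10x scale.
known_plane = np.array([[x, y] for y in (1, 2) for x in (0, 1, 2)], float)
known_raster = 10.0 * known_plane
ext_plane = np.array([[0, 0], [1, 0], [2, 0]], float)
ext_raster = extend_border(known_plane, known_raster, ext_plane)  # 10x inputs
```

Sliding the box along all four borders (and into the corners) in this way assigns texture sampling coordinates to the grid points that lie outside the spot-grid coverage.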
After the texture sampling coordinates corresponding to each raster-pixel-plane pixel are obtained, step S150 is executed: the image to be projected undergoes texture mapping using the texture sampling coordinates, and the texture-mapped image is output to the raster pixel plane and projected.
Once the coordinate-mapping computation is complete, the pixel coordinates of every raster-pixel-plane grid point and the corresponding texture sampling coordinates are recorded and stored in the projector's non-volatile memory, completing the distortion-correction measurement procedure for the current projection environment.
Real-time distortion correction of the projected image is GPU-accelerated. Illustratively, the GPU rendering pipeline builds a mesh over all grid points of the raster pixel plane, top to bottom and left to right; every group of three grid points forms the vertices of a triangle, and the triangles are chained to cover the entire raster pixel plane. Each vertex attribute contains the vertex's own coordinates and the computed source-image texture coordinate of that grid vertex. At run time, the rendering pipeline computes the texture coordinates of the pixels inside each triangle by bilinear interpolation from the vertices, and the GPU pipeline's sampler extracts, based on the mapping-point cluster, the pixel value at the corresponding coordinate position in the original picture as the output pixel value of that position. The texture-mapped image is output to the projector's raster pixel plane and projected, completing the whole distortion pre-correction process.
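A CPU stand-in for the GPU sampler's bilinear lookup (Python with NumPy; a 2 × 2 test image replaces the real source picture) can be sketched as:

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Bilinearly interpolate img (H x W) at a fractional texture
    coordinate (x, y), mimicking the GPU texture sampler."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, img.shape[1] - 1)
    y1 = min(y0 + 1, img.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
    bot = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1 - fy) * top + fy * bot

img = np.array([[0.0, 1.0], [2.0, 3.0]])
print(bilinear_sample(img, 0.5, 0.5))  # 1.5, the average of the four corners
```

In the pipeline described above, the fractional (x, y) fed to this lookup comes from interpolating the per-vertex texture sampling coordinates across each triangle.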
Thus, in this embodiment the projector application automatically estimates the shape of the projection surface without external instruments or equipment, and performs distortion correction on the raster image accordingly, so that the picture actually projected onto a non-flat projection surface does not exhibit in-picture distortion. This embodiment is applicable to any scenario that requires measuring the spatial shape of a surface.
The present invention also provides a projector distortion correction device.
Figure 10 is a structural block diagram of the projector distortion correction device shown in an embodiment of the present invention. As shown in Figure 10, the device of this embodiment includes:
a projection point-cloud acquiring unit, which projects the standard grid image of the raster pixel plane onto the projection surface through the light engine lens and captures the projection surface with the camera to obtain the projected point-cloud image of the projection-surface grid points;
a projection point-cloud coordinate computing unit, which, according to the pixel correspondence between the standard grid image and the projected point-cloud image and according to the pre-built three-dimensional projection model, obtains the three-dimensional coordinates of the projection-surface grid points in the light engine lens coordinate system, the three-dimensional projection model being built on the light engine lens coordinate system and camera coordinate system of the projector and used to compute the three-dimensional coordinates of the projection-surface grid points in the light engine lens coordinate system;
a reference projection plane construction unit, which obtains the reference projection plane from the three-dimensional coordinates of the projection-surface grid points, builds the world coordinate system on the reference projection plane, and obtains the projections of the projection-surface grid points onto the reference projection plane from the transformation between the world coordinate system and the light engine lens coordinate system;
a transformation computing unit, which, according to the correspondence between the projected points and the grid points in the raster pixel plane of the light engine lens, obtains the texture sampling coordinates corresponding to each pixel of the raster pixel plane, the texture sampling coordinates being used to correct the distortion displacement of each raster-pixel-plane pixel on the projection surface;
a texture mapping unit, which performs texture mapping on the image to be projected using the texture sampling coordinates and outputs the texture-mapped image to the raster pixel plane for projection.
In some embodiments, the device in Figure 10 further includes a preprocessing unit, which establishes the first linear relationship between the optical center of the camera light path and its first calibration point in the camera sensor pixel plane; establishes the second linear relationship between the optical center of the light engine lens light path and its second calibration point in the raster pixel plane; establishes the third linear relationship between the first calibration point and the second calibration point according to the extrinsic parameter matrix between the camera and the light engine lens; and obtains the three-dimensional coordinates of the projection-surface grid points in the light engine lens coordinate system from the first, second and third linear relationships.
In some embodiments, the reference projection plane construction unit includes a fitting module, a filtering module and a mapping module.
The fitting module spatially filters the projection-surface grid points according to their three-dimensional coordinates, removes the invalid grid points and obtains the valid grid points; it fits a plane to the valid grid points by least squares and takes the fitted plane as the reference projection plane.
The filtering module repeats step A and step B; after N iterations it selects, among the N sub-planes obtained, the reference sub-plane with the largest number of reference grid points, the reference grid points of that reference sub-plane being the valid grid points. Step A: randomly select three non-collinear grid points, which determine a sub-plane. Step B: compute the distance of every projection-surface grid point to the sub-plane, reject the grid points whose distance to the sub-plane exceeds the preset distance value, and count the reference grid points, a reference grid point being one whose distance to the sub-plane does not exceed the preset distance value.
The mapping module converts the projection-surface grid-point coordinates in the light engine lens coordinate system into projection-surface grid-point coordinates in the world coordinate system according to the transformation between the world coordinate system and the light engine lens coordinate system, and orthographically projects the projection-surface grid points in the world coordinate system onto the reference projection plane, obtaining the projections of the projection-surface grid points on the reference projection plane.
In some embodiments, the transformation computing unit includes a first computing module and a second computing module.
The first computing module obtains the first homography matrix between the raster pixel plane of the light engine lens and the reference projection plane from the raster-pixel-plane grid-point cluster matched with the valid projected-point cluster, the valid projected points being the projections of the valid grid points on the reference projection plane; it maps the projected-point coordinates to raster-pixel-plane coordinates through the first homography matrix, obtaining the texture sampling coordinates onto which the distorted coordinates of the standard grid image on the projection surface map in the raster pixel plane.
The second computing module extends the projected points on the reference projection plane toward the corner regions of the raster pixel plane, obtaining four groups of extended projected points located at the borders of the reference projection together with the grid points in the raster pixel plane matched with the extended projected points; it scans a search box of preset size over the extension points with the set search step and, from the raster-pixel-plane grid-point cluster matched with the valid projected-point cluster inside the search box at each step, obtains the second homography matrix between the raster pixel plane of the light engine lens and the reference projection plane; it maps the extended projected-point coordinates to raster-pixel-plane coordinates through the second homography matrix, obtaining the texture sampling coordinates onto which the distorted coordinates of the edge points of the standard grid image on the projection surface map in the raster pixel plane.
In some embodiments, the projection point-cloud acquiring unit projects the first standard grid image of the raster pixel plane onto the projection surface through the light engine lens and captures the projection surface with the camera to obtain the first projected point-cloud image; it projects the second standard grid image of the raster pixel plane onto the projection surface through the light engine lens and captures the projection surface with the camera to obtain the second projected point-cloud image, the circular spots of the first and second standard grid images being staggered relative to each other; it identifies the spot grids in the first and second projected point-cloud images separately, obtaining the pixel position in the raster pixel plane of each spot in the two images; and by merging the pixel positions in the raster pixel plane of the spots of the first and second projected point-cloud images, it obtains the pixel position of each spot of the projected point-cloud image.
As the device embodiments essentially correspond to the method embodiments, the relevant parts may refer to the description of the method embodiments. The device embodiments described above are merely exemplary: the units described as separate components may or may not be physically separated, and components shown as units may or may not be physical units, i.e. they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of this embodiment's solution, which those of ordinary skill in the art can understand and implement without creative labor.
The present invention also provides a projector.
Figure 11 is a structural schematic diagram of the projector shown in an embodiment of the present invention. As shown in Figure 11, at the hardware level the projector includes a graphics processor and, optionally, an internal bus, a network interface and a memory. The memory may include internal memory, such as high-speed random-access memory (RAM), and may also include non-volatile memory, e.g. at least one disk memory. The projector may of course also include other hardware required by its function, such as the light engine lens and the camera: the standard grid image of the raster pixel plane is projected onto the projection surface through the light engine lens, and the camera captures the projection surface to obtain the projected point-cloud image.
The graphics processor, the network interface and the memory may be interconnected by the internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, etc. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of representation only one double-headed arrow is drawn in Figure 11, but this does not mean there is only one bus or one type of bus.
The memory is used to store a program. Specifically, the program may include program code, the program code including computer-executable instructions. The memory may include internal memory and non-volatile memory, and provides instructions and data to the graphics processor.
The graphics processor reads the corresponding computer program from the non-volatile memory into the internal memory and runs it, forming the projector distortion correction device at the logical level. The graphics processor executes the program stored in the memory to implement the projector distortion correction method described above.
The method executed by the projector distortion correction device disclosed in the embodiment shown in Figure 11 may be applied to, or implemented by, the graphics processor. The graphics processor may be an integrated circuit chip with signal processing capability. During implementation, each step of the projector distortion correction method described above may be completed by integrated logic circuits of hardware in the graphics processor or by instructions in the form of software. The graphics processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or execute the methods, steps and logic diagrams disclosed in the embodiments of this specification. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in the embodiments of this specification may be embodied directly as being completed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in this field, such as random-access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory or a register. The storage medium is located in the memory; the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
The present invention also provides a computer-readable storage medium.
The computer-readable storage medium stores one or more computer programs, the one or more computer programs including instructions which, when executed by the graphics processor of a projector, implement the projector distortion correction method described above.
To describe the technical solutions of the embodiments of the present invention clearly, words such as "first" and "second" are used in the embodiments of the invention to distinguish items that are identical or similar in function and effect. Those skilled in the art will understand that words such as "first" and "second" do not limit quantity or execution order.
The above description is merely of specific embodiments. Under the above teaching of the present invention, those skilled in the art can make other improvements or variations on the basis of the above embodiments. Those skilled in the art should understand that the above specific description only better explains the purpose of the present invention, and the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (12)

1. A projector distortion correction method, wherein the projector has an optical engine lens and a camera, the method comprising:
projecting a standard grid image of a raster pixel plane onto a projection surface using the optical engine lens, and shooting the projection surface with the camera to obtain a projected point-cloud image of the grid points on the projection surface;
obtaining, according to the pixel correspondence between the standard grid image and the projected point-cloud image and according to a pre-constructed three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the optical engine lens coordinate system, wherein the three-dimensional projection model is constructed based on the optical engine lens coordinate system and the camera coordinate system of the projector;
obtaining a reference projection plane using the three-dimensional coordinates of the grid points on the projection surface, constructing a world coordinate system using the reference projection plane, and obtaining, based on the conversion relationship between the world coordinate system and the optical engine lens coordinate system, the projected points of the grid points of the projection surface on the reference projection plane;
obtaining, according to the correspondence between the projected points and the grid points on the raster pixel plane of the optical engine lens, the texture sampling coordinates corresponding to each pixel of the raster pixel plane, the texture sampling coordinates being used to correct the distortion shift of each pixel of the raster pixel plane on the projection surface;
performing texture mapping on an image to be projected using the texture sampling coordinates, and outputting the texture-mapped image to the raster pixel plane for projection.
2. The method according to claim 1, wherein the three-dimensional projection model is constructed by the following method:
establishing a first linear relationship according to the optical center of the camera light path and its first calibration point on the pixel plane of the camera sensor;
establishing a second linear relationship according to the optical center of the optical engine lens light path and its second calibration point on the raster pixel plane of the optical engine lens;
establishing a third linear relationship between the first calibration point and the second calibration point according to the extrinsic parameter matrix between the camera and the optical engine lens;
obtaining, according to the first linear relationship, the second linear relationship and the third linear relationship, the three-dimensional coordinates of the grid points on the projection surface in the optical engine lens coordinate system.
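The three linear relationships of claim 2 amount to intersecting, for each grid point, the camera viewing ray with the corresponding projector ray once the extrinsic matrix relates the two optical centers. As an illustrative sketch only (hypothetical numpy code; the function and variable names are not from the patent), the 3D point can be recovered as the midpoint of the two rays' closest approach:

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint of closest approach between two rays (origin o, direction d)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve for s, t minimizing |(o1 + s*d1) - (o2 + t*d2)|^2:
    # d1.(o1 + s*d1 - o2 - t*d2) = 0 and d2.(o1 + s*d1 - o2 - t*d2) = 0
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    s, t = np.linalg.solve(a, b)
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))
```

In practice the two rays would come from back-projecting the matched pixels of the standard grid image and the captured point-cloud image through the respective intrinsics, with the camera ray first rotated into the optical engine lens coordinate system by the extrinsic matrix.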
3. The method according to claim 1, wherein obtaining the reference projection plane using the three-dimensional coordinates of the grid points on the projection surface comprises:
spatially filtering the grid points on the projection surface according to their three-dimensional coordinates, and filtering out the invalid grid points among them to obtain valid grid points;
fitting a plane to the valid grid points using the least squares method, and determining the fitted plane to be the reference projection plane.
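The least-squares plane fit of claim 3 can be written compactly as an SVD of the centered points, the normal being the singular vector of the smallest singular value. This is a generic sketch under assumed names, not the patent's own implementation:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through 3D points; returns (centroid, unit normal)."""
    c = points.mean(axis=0)
    # The normal is the right singular vector with the smallest singular value
    _, _, vt = np.linalg.svd(points - c)
    return c, vt[-1]
```

The plane is then the set of points p with (p - centroid) . normal = 0, which serves directly as the reference projection plane.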
4. The method according to claim 3, wherein spatially filtering the grid points on the projection surface according to their three-dimensional coordinates, filtering out the invalid grid points among them, and obtaining valid grid points comprises:
Step A: randomly selecting three non-collinear grid points among the grid points to obtain the sub-plane determined by the three non-collinear grid points;
Step B: computing the distance between each grid point on the projection surface and the sub-plane, rejecting the grid points whose distance from the sub-plane is greater than a preset distance value, and obtaining the number of reference grid points, a reference grid point being a grid point whose distance from the sub-plane is not greater than the preset distance value;
repeating Step A and Step B, and after N iterations, determining, among the N sub-planes obtained in the N iterations, the reference sub-plane with the largest number of reference grid points; the reference grid points of the reference sub-plane are the valid grid points.
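Steps A and B iterated N times are essentially a RANSAC-style plane consensus. A minimal sketch follows (hypothetical numpy code; the threshold and iteration count are illustrative, not values from the patent):

```python
import numpy as np

def ransac_plane_inliers(pts, n_iters=100, dist_thresh=0.05, rng=None):
    """Return a boolean mask of the grid points consistent with the best sub-plane."""
    rng = np.random.default_rng(rng)
    best_mask = np.zeros(len(pts), dtype=bool)
    for _ in range(n_iters):
        # Step A: three randomly chosen points determine a candidate sub-plane
        i = rng.choice(len(pts), 3, replace=False)
        p0, p1, p2 = pts[i]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:          # collinear sample: no plane, skip
            continue
        n /= norm
        # Step B: point-to-plane distances; keep points within the threshold
        d = np.abs((pts - p0) @ n)
        mask = d <= dist_thresh
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return best_mask
```

The surviving points play the role of the valid grid points handed to the least-squares fit of claim 3.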
5. The method according to claim 4, wherein the XOY plane of the world coordinate system XOYZ coincides with the reference projection plane, the Y-axis vector of the world coordinate system has the same direction as the Y-axis vector of the optical engine lens coordinate system, the Z-axis vector of the world coordinate system XOYZ has the same direction as the normal vector of the reference projection plane, and the origin O of the world coordinate system is the projected point, on the reference projection plane, of the average centroid of the N sub-planes.
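One way to realize the axis conventions of claim 5 (Z along the plane normal, Y following the lens Y axis, XOY in the reference plane) is a Gram-Schmidt construction. This is an illustrative sketch with assumed names, not the patent's code:

```python
import numpy as np

def world_frame(normal, y_hint, origin):
    """Rotation whose Z column is the plane normal and whose Y column follows
    y_hint orthogonalized against Z; X completes the right-handed frame."""
    z = normal / np.linalg.norm(normal)
    y = y_hint - (y_hint @ z) * z     # project the lens Y axis into the plane
    y /= np.linalg.norm(y)
    x = np.cross(y, z)                # so that x cross y = z (right-handed)
    R = np.stack([x, y, z], axis=1)   # columns: world axes in lens coordinates
    return R, origin
```

`origin` would be the projection, onto the fitted plane, of the average centroid of the N sub-planes, as the claim specifies.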
6. The method according to claim 1, wherein obtaining, based on the conversion relationship between the world coordinate system and the optical engine lens coordinate system, the projected points of the grid points of the projection surface on the reference projection plane comprises:
converting, according to the conversion relationship between the world coordinate system and the optical engine lens coordinate system, the coordinates of the grid points on the projection surface under the optical engine lens coordinate system into the coordinates of the grid points on the projection surface under the world coordinate system;
orthographically projecting the grid points on the projection surface under the world coordinate system onto the reference projection plane to obtain the projected points of the grid points of the projection surface on the reference projection plane.
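The two steps of claim 6, a rigid conversion into the world frame followed by orthographic projection, can be sketched as follows (hypothetical numpy code; `R`'s columns are assumed to be the world axes expressed in lens coordinates and `t` the world origin in lens coordinates):

```python
import numpy as np

def project_to_reference(pts_lens, R, t):
    """Convert grid points from lens to world coordinates, p_w = R^T (p - t),
    then orthographically project onto the reference plane z = 0."""
    pts_w = (pts_lens - t) @ R   # row-wise equivalent of R.T @ (p - t)
    pts_w[:, 2] = 0.0            # drop the normal component: orthographic projection
    return pts_w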
7. The method according to claim 3, wherein obtaining, according to the correspondence between the projected points and the grid points on the raster pixel plane of the optical engine lens, the texture sampling coordinates corresponding to each pixel of the raster pixel plane comprises:
obtaining a first homography matrix between the raster pixel plane of the optical engine lens and the reference projection plane according to the cluster of grid points on the raster pixel plane matched with the cluster of valid projected points, a valid projected point being the projected point of a valid grid point on the reference projection plane;
mapping the projected point coordinates to raster pixel plane coordinates using the first homography matrix, to obtain the texture sampling coordinates that map the distortion coordinates of the standard grid image on the projection surface to the raster pixel plane.
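A homography like the first homography of claim 7 can be estimated from four or more matched point pairs by the direct linear transform (DLT). The sketch below is a generic, hypothetical numpy version (in practice a robust estimator such as OpenCV's findHomography would typically be used):

```python
import numpy as np

def find_homography(src, dst):
    """DLT estimate of H with dst ~ H @ src (homogeneous); needs >= 4 pairs."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.array(A))   # null vector = flattened H
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pts):
    """Map 2D points through H with perspective division."""
    p = np.c_[pts, np.ones(len(pts))] @ H.T
    return p[:, :2] / p[:, 2:3]
```

Applying the estimated homography to each projected point on the reference plane yields the corresponding raster-pixel-plane coordinate, i.e. the texture sampling coordinate.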
8. The method according to claim 7, wherein obtaining, according to the correspondence between the projected points and the grid points on the raster pixel plane of the optical engine lens, the texture sampling coordinates corresponding to each pixel of the raster pixel plane further comprises:
extending, according to the corner regions of the raster pixel plane, the projected points on the reference projection plane in the directions of the corner regions, to obtain four groups of extended projected points located at the edges of the reference projection plane and the grid points on the raster pixel plane matched with the extended projected points;
searching for the extension points with a search box of a preset size according to a set search step, and obtaining, using the cluster of grid points on the raster pixel plane matched with the cluster of valid projected points located within the search box at each search step, a second homography matrix between the raster pixel plane of the optical engine lens and the reference projection plane;
mapping the extended projected point coordinates to raster pixel plane coordinates using the second homography matrix, to obtain the texture sampling coordinates that map the distortion coordinates of the edge points of the standard grid image on the projection surface to the raster pixel plane.
9. The method according to claim 1, wherein projecting the standard grid image of the raster pixel plane onto the projection surface using the optical engine lens and shooting the projection surface with the camera to obtain the projected point-cloud image comprises:
projecting a first standard grid image of the raster pixel plane onto the projection surface using the optical engine lens, and shooting the projection surface with the camera to obtain a first projected point-cloud image;
projecting a second standard grid image of the raster pixel plane onto the projection surface using the optical engine lens, and shooting the projection surface with the camera to obtain a second projected point-cloud image, the round spots in the first standard grid image and the second standard grid image being staggered with respect to each other;
identifying the round-spot grids in the first projected point-cloud image and the second projected point-cloud image respectively, to obtain the pixel position on the raster pixel plane of each round spot in the first projected point-cloud image and the second projected point-cloud image;
merging the pixel positions on the raster pixel plane of the round spots in the first projected point-cloud image and the second projected point-cloud image to obtain the pixel position of each round spot of the projected point-cloud image.
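Because the two dot patterns are staggered, merging the two captures doubles the sampling density of the point cloud. A minimal, hypothetical sketch (assumed names; spot detection itself is outside its scope) that stacks the detected centers and orders them row-major for later matching:

```python
import numpy as np

def merge_staggered(spots_a, spots_b):
    """Merge round-spot centers from the two staggered captures into one
    denser point cloud, sorted by row (y) then column (x)."""
    merged = np.vstack([spots_a, spots_b])
    order = np.lexsort((merged[:, 0], merged[:, 1]))  # primary key y, then x
    return merged[order]
```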
10. A projector distortion correction apparatus, wherein the projector has an optical engine lens and a camera, the apparatus comprising:
a projected point-cloud acquisition unit, which projects a standard grid image of the raster pixel plane onto a projection surface using the optical engine lens, and shoots the projection surface with the camera to obtain a projected point-cloud image of the grid points on the projection surface;
a projected point-cloud coordinate calculation unit, which obtains, according to the pixel correspondence between the standard grid image and the projected point-cloud image and according to a pre-constructed three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the optical engine lens coordinate system, wherein the three-dimensional projection model is constructed based on the optical engine lens coordinate system and the camera coordinate system of the projector;
a reference projection plane construction unit, which obtains a reference projection plane using the three-dimensional coordinates of the grid points on the projection surface, constructs a world coordinate system using the reference projection plane, and obtains, based on the conversion relationship between the world coordinate system and the optical engine lens coordinate system, the projected points of the grid points of the projection surface on the reference projection plane;
a conversion relationship calculation unit, which obtains, according to the correspondence between the projected points and the grid points on the raster pixel plane of the optical engine lens, the texture sampling coordinates corresponding to each pixel of the raster pixel plane, the texture sampling coordinates being used to correct the distortion shift of each pixel of the raster pixel plane on the projection surface;
a texture mapping unit, which performs texture mapping on an image to be projected using the texture sampling coordinates, and outputs the texture-mapped image to the raster pixel plane for projection.
11. A projector, comprising:
an optical engine lens that projects a standard grid image of the raster pixel plane onto a projection surface;
a camera that shoots the projection surface to obtain a projected point-cloud image and sends it to a graphics processor;
a memory storing computer-executable instructions which, when executed, cause the graphics processor to perform the method of any one of claims 1-9.
12. A computer-readable storage medium, wherein one or more computer programs are stored on the computer-readable storage medium, and the one or more computer programs, when executed by the graphics processor of a projector, implement the method of any one of claims 1-9.
CN201910249736.5A 2019-03-29 2019-03-29 Projector projection distortion correction method and device and projector Active CN110111262B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910249736.5A CN110111262B (en) 2019-03-29 2019-03-29 Projector projection distortion correction method and device and projector

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910249736.5A CN110111262B (en) 2019-03-29 2019-03-29 Projector projection distortion correction method and device and projector

Publications (2)

Publication Number Publication Date
CN110111262A true CN110111262A (en) 2019-08-09
CN110111262B CN110111262B (en) 2021-06-04

Family

ID=67484743

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910249736.5A Active CN110111262B (en) 2019-03-29 2019-03-29 Projector projection distortion correction method and device and projector

Country Status (1)

Country Link
CN (1) CN110111262B (en)


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1649419A (en) * 2004-01-28 2005-08-03 Hewlett-Packard Development Co., L.P. Method and system for display of facial features on nonplanar surfaces
CN1678049A (en) * 2004-03-30 2005-10-05 Seiko Epson Corporation Keystone distortion correction of a projector
CN101321303A (en) * 2008-07-17 2008-12-10 Shanghai Jiao Tong University Geometric and optical correction method for non-plane multi-projection display
CN101335901A (en) * 2007-06-29 2008-12-31 Samsung Electronics Co., Ltd. Projected picture correcting method and apparatus
US20100002193A1 (en) * 2008-07-02 2010-01-07 Hines Stephen P Projected autostereoscopic lenticular 3-D system
CN102184566A (en) * 2011-04-28 2011-09-14 Xiangtan University Micro projector mobile phone platform-based portable three-dimensional scanning system and method
CN103426149A (en) * 2013-07-24 2013-12-04 Yu Zhenming Large-viewing-angle image distortion correction and processing method
CN107454373A (en) * 2016-05-31 2017-12-08 Industrial Technology Research Institute Projection system and non-planar automatic correction method and automatic correction processing device thereof
EP3273680A1 (en) * 2015-03-19 2018-01-24 MegaChips Corporation Projection system, projector device, image capturing device, and program
CN107833253A (en) * 2017-09-22 2018-03-23 Qingdao Research Institute of Beihang University Camera pose refinement method for RGBD three-dimensional reconstruction texture generation
CN108377371A (en) * 2018-02-09 2018-08-07 Shenzhen Huole Science and Technology Development Co., Ltd. Method and device for projection image correction
CN108665536A (en) * 2018-05-14 2018-10-16 Guangzhou Urban Planning & Design Survey Research Institute Three-dimensional and real-scene data visualization method, device and computer readable storage medium


Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110784699B (en) * 2019-11-01 2021-06-25 成都极米科技股份有限公司 Projection processing method, projection processing device, projector and readable storage medium
CN110784699A (en) * 2019-11-01 2020-02-11 成都极米科技股份有限公司 Projection processing method, projection processing device, projector and readable storage medium
CN110864649A (en) * 2019-11-25 2020-03-06 歌尔股份有限公司 Method for determining compensation value and determining flatness of optical module
CN113066158B (en) * 2019-12-16 2023-03-10 杭州海康威视数字技术股份有限公司 Vehicle-mounted all-round looking method and device
CN113066158A (en) * 2019-12-16 2021-07-02 杭州海康威视数字技术股份有限公司 Vehicle-mounted all-round looking method and device
CN111385947A (en) * 2020-03-23 2020-07-07 北京经纬恒润科技有限公司 Control method and device applied to pixel lamp
CN111385947B (en) * 2020-03-23 2022-04-26 北京经纬恒润科技股份有限公司 Control method and device applied to pixel lamp
CN113643414A (en) * 2020-05-11 2021-11-12 北京达佳互联信息技术有限公司 Three-dimensional image generation method and device, electronic equipment and storage medium
CN113643414B (en) * 2020-05-11 2024-02-06 北京达佳互联信息技术有限公司 Three-dimensional image generation method and device, electronic equipment and storage medium
CN111669557A (en) * 2020-06-24 2020-09-15 歌尔光学科技有限公司 Projected image correction method and correction device
CN111669557B (en) * 2020-06-24 2022-05-13 歌尔光学科技有限公司 Projected image correction method and correction device
CN112053397A (en) * 2020-07-14 2020-12-08 北京迈格威科技有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN111935468A (en) * 2020-09-24 2020-11-13 歌尔股份有限公司 Method and device for detecting deviation of projection center and computer readable storage medium
CN111935468B (en) * 2020-09-24 2021-01-22 歌尔股份有限公司 Method and device for detecting deviation of projection center and computer readable storage medium
CN112330794A (en) * 2020-10-09 2021-02-05 同济大学 Single-camera image acquisition system based on rotary bipartite prism and three-dimensional reconstruction method
CN112295109A (en) * 2020-10-20 2021-02-02 北京理工大学 Therapeutic light control method and photodynamic therapy device using same
CN112652047A (en) * 2020-10-23 2021-04-13 成都完美时空网络技术有限公司 Warping effect generation method, device, equipment and storage medium
CN112614190A (en) * 2020-12-14 2021-04-06 北京淳中科技股份有限公司 Method and device for projecting map
CN112614190B (en) * 2020-12-14 2023-06-06 北京淳中科技股份有限公司 Method and device for projecting mapping
CN112672127A (en) * 2020-12-29 2021-04-16 视田科技(天津)有限公司 Automatic calibration method for projection reflection picture
CN112614075B (en) * 2020-12-29 2024-03-08 凌云光技术股份有限公司 Distortion correction method and equipment for surface structured light 3D system
CN112614075A (en) * 2020-12-29 2021-04-06 凌云光技术股份有限公司 Distortion correction method and equipment for surface structured light 3D system
CN112995625A (en) * 2021-02-23 2021-06-18 峰米(北京)科技有限公司 Trapezoidal correction method and device for projector
CN112995625B (en) * 2021-02-23 2022-10-11 峰米(北京)科技有限公司 Trapezoidal correction method and device for projector
CN113099198A (en) * 2021-03-19 2021-07-09 深圳市火乐科技发展有限公司 Projection image adjusting method and device, storage medium and electronic equipment
CN115412719B (en) * 2021-05-26 2024-03-01 致伸科技股份有限公司 Method for aligning camera lens and light source
CN115412719A (en) * 2021-05-26 2022-11-29 致伸科技股份有限公司 Method for aligning camera lens and light source
CN113487500B (en) * 2021-06-28 2022-08-02 北京紫光展锐通信技术有限公司 Image distortion correction method and apparatus, electronic device, and storage medium
CN113487500A (en) * 2021-06-28 2021-10-08 北京紫光展锐通信技术有限公司 Image distortion correction method and apparatus, electronic device, and storage medium
CN113838002A (en) * 2021-08-25 2021-12-24 网易(杭州)网络有限公司 Image processing method, image processing device, computer-readable storage medium and electronic equipment
WO2023029893A1 (en) * 2021-08-31 2023-03-09 北京字跳网络技术有限公司 Texture mapping method and apparatus, device and storage medium
CN113938661A (en) * 2021-09-29 2022-01-14 漳州万利达科技有限公司 Projector side projection correction method, terminal device and storage medium
CN113938661B (en) * 2021-09-29 2024-05-07 漳州万利达科技有限公司 Projector side projection correction method, terminal equipment and storage medium
WO2023087947A1 (en) * 2021-11-16 2023-05-25 海信视像科技股份有限公司 Projection device and correction method
CN115314689A (en) * 2022-08-05 2022-11-08 深圳海翼智新科技有限公司 Projection correction method, projection correction device, projector and computer program product
CN116540872A (en) * 2023-04-28 2023-08-04 中广电广播电影电视设计研究院有限公司 VR data processing method, device, equipment, medium and product
CN116540872B (en) * 2023-04-28 2024-06-04 中广电广播电影电视设计研究院有限公司 VR data processing method, device, equipment, medium and product
CN117058342A (en) * 2023-10-12 2023-11-14 天津科汇新创科技有限公司 Spine 3D voxel model construction method based on projection image
CN117058342B (en) * 2023-10-12 2024-01-26 天津科汇新创科技有限公司 Spine 3D voxel model construction method based on projection image

Also Published As

Publication number Publication date
CN110111262B (en) 2021-06-04

Similar Documents

Publication Publication Date Title
CN110111262A (en) A kind of projector distortion correction method, device and projector
CN110336987A (en) A kind of projector distortion correction method, device and projector
CN110191326A (en) A kind of optical projection system resolution extension method, apparatus and optical projection system
US8432435B2 (en) Ray image modeling for fast catadioptric light field rendering
CN106127745B (en) The combined calibrating method and device of structure light 3 D vision system and line-scan digital camera
TWI555379B (en) An image calibrating, composing and depth rebuilding method of a panoramic fish-eye camera and a system thereof
WO2023045147A1 (en) Method and system for calibrating binocular camera, and electronic device and storage medium
Sajadi et al. Auto-calibration of cylindrical multi-projector systems
CN108288292A (en) A kind of three-dimensional rebuilding method, device and equipment
US7583307B2 (en) Autostereoscopic display
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
CN102072706B (en) Multi-camera positioning and tracking method and system
CN106709865B (en) Depth image synthesis method and device
JP2010517038A (en) Method and apparatus for quantitative three-dimensional imaging
CN109559349A (en) A kind of method and apparatus for calibration
CN109379578A (en) Omnidirectional three-dimensional video-splicing method, apparatus, equipment and storage medium
CA2220721A1 (en) Texture mapping of photographic images to cad surfaces
JP6674643B2 (en) Image processing apparatus and image processing method
CN110337674A (en) Three-dimensional rebuilding method, device, equipment and storage medium
WO2023273108A1 (en) Monocular distance measurement method and apparatus, and intelligent apparatus
CN105306922A (en) Method and device for obtaining depth camera reference diagram
JP7479729B2 (en) Three-dimensional representation method and device
CN110335307A (en) Scaling method, device, computer storage medium and terminal device
JP4554231B2 (en) Distortion parameter generation method, video generation method, distortion parameter generation apparatus, and video generation apparatus
JP2001016621A (en) Multi-eye data input device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220803

Address after: Room 1903, 19 / F, building D, Zhizhen building, No. 7 Zhichun Road, Haidian District, Beijing 100088

Patentee after: Bird innovation (Beijing) Technology Co.,Ltd.

Address before: 100191 room 1801, block D, Zhizhen building, 7 Zhichun Road, Haidian District, Beijing

Patentee before: BEIJING XIAONIAO TINGTING TECHNOLOGY Co.,Ltd.