CN101650831A - Light source direction calibration method based on random position multi-globule - Google Patents
- Publication number
- CN101650831A (application CN200910092909A)
- Authority
- CN
- China
- Prior art keywords
- light source
- bead
- source direction
- highlight
- center
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention relates to a light source direction calibration method based on multiple spheres at arbitrary positions, comprising the following steps: (1) identifying the sphere center point and the highlight point in an image; (2) calculating the spatial positions of the sphere center point and the highlight point; (3) calculating the light source direction according to the specular reflection principle at the sphere highlight point. To obtain an accurate result, a plurality of spheres at different positions are photographed under the light source, and the final light source direction is obtained by solving the resulting overdetermined system of equations via SVD decomposition. The method is fast, robust, and accurate in its calculation results, and is suitable for calibrating directional light sources of any number and position.
Description
Technical field
The invention belongs to the fields of computer virtual reality and computer graphics and image processing. It specifically uses images taken by a camera to calibrate the light source direction, and is applicable to data acquisition systems composed of cameras and light sources.
Background technology
In computer virtual reality and computer graphics, much work involves acquiring data with equipment such as light sources and cameras. The light sources must be calibrated and their concrete directions computed, so realizing a simple, fast, and accurate light source direction calibration method is very necessary.
Some existing research obtains the light source direction by measurement or from equipment whose geometry is computable. Document 1 (M. Levoy, "Stanford Spherical Gantry," http://graphics.stanford.edu/projects/gantry/) describes a spherical gantry acquisition device consisting of two computer-controlled mechanical arms that move on a sphere about a common center; the arms carry a camera or light source and can accurately place the light source and camera at any position on the upper hemisphere around the central rotation platform. Document 2 (T. Weyrich, W. Matusik, H. Pfister et al., "Analysis of human faces using a measurement-based skin reflectance model," ACM Trans. Graph., vol. 25, no. 3, pp. 1013-1024, 2006) developed a spherical-frame acquisition device, Dome, consisting of a sphere 3 meters in diameter carrying a total of 150 LED light sources, with 16 digital cameras evenly distributed on the sphere; the light source directions can be calculated from the standard geometry of the device. Although these methods obtain the light source direction quickly and accurately, they place very high demands on the equipment and are not easy to realize.
Document 3 (Mark W. Powell, Sudeep Sarkar et al., "Calibration of Light Sources," IEEE CVPR, 2000) uses three spheres with known spacing: on the one hand, the camera takes pictures to obtain the sphere highlights; on the other hand, a 3D scanner obtains the spatial information of the spheres. A system of equations is set up from the known spacing and solved to obtain the light source direction. This method requires special spheres and a 3D scanner, and both the shooting and the computation are relatively complicated and inflexible.
Summary of the invention
The technical problem to be solved by the present invention is: overcoming the deficiencies of the prior art and providing a simple and accurate light source direction calibration method.
The technical solution adopted by the present invention is a light source direction calibration method based on multiple spheres at arbitrary positions, characterized by the following steps:
(1) identifying the sphere center point and the highlight point in the image;
(2) calculating the spatial positions of the sphere center point and the highlight point;
(3) calculating the light source direction according to the specular reflection principle at the sphere highlight point.
The method of identifying the sphere center point and the highlight point in the image in step (1) is: take a picture of the sphere under the given light source and a background picture without the sphere, and subtract the two to obtain a picture of the sphere with the background removed; after denoising and binarization, use a two-stage Hough transform to obtain the pixel coordinates of the sphere center and the sphere radius; then, within the identified circle, find the average coordinates of the region containing the top 10% of brightness values, taken as the pixel coordinates of the sphere highlight.
The method of calculating the spatial positions of the sphere center point and the highlight point in step (2) is: during shooting, the sphere is placed on the plane of a chessboard calibration board, so that the three-dimensional Z coordinate of the sphere center equals the sphere radius; combining this with the sphere-center pixel coordinates recognized from the picture and the intrinsic and extrinsic camera parameters, the three-dimensional position of the sphere center can be solved; in the same way, the three-dimensional position of the sphere highlight can be solved.
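The position computation just described can be sketched as follows: back-project the pixel through the pinhole model and intersect the viewing ray with the world plane at height equal to the sphere radius. This is a minimal illustration under standard assumptions (intrinsic matrix K, world-to-camera extrinsics R, t with camera point = R·X + t); the function name and example numbers are illustrative, not from the patent.

```python
import numpy as np

def backproject_to_height(u, v, K, R, t, z):
    """Intersect the camera ray through pixel (u, v) with the world plane Z = z.

    For the sphere center on the calibration board, z equals the sphere radius.
    K is the 3x3 intrinsic matrix; R, t are world-to-camera extrinsics.
    """
    d = np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray direction in the camera frame
    Rt = R.T
    s = (z + (Rt @ t)[2]) / ((Rt @ d)[2])          # depth that makes Z_world = z
    return Rt @ (s * d - t)                        # 3-D point in the world frame

# Example: camera at the world origin looking along +Z (R = I, t = 0),
# focal length 100, principal point (50, 50); sphere of radius 2 on the board.
K = np.array([[100.0, 0.0, 50.0],
              [0.0, 100.0, 50.0],
              [0.0,   0.0,  1.0]])
center = backproject_to_height(60.0, 50.0, K, np.eye(3), np.zeros(3), z=2.0)
print(center)  # [0.2 0.  2. ] -- the 3-D sphere-center position
```

The same routine applies to the highlight point, using the highlight's pixel coordinates instead of the center's.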
The principle of calculating the light source direction from specular reflection at the sphere highlight in step (3) is: according to the law of reflection, the viewing direction at the sphere highlight and the light source direction are symmetric about the normal direction at the highlight; therefore the light source direction can be computed from the viewpoint position, the sphere center point, and the highlight position.
In step (3), in order to obtain a more accurate result, a plurality of spheres at different positions are photographed under the light source, and the final light source direction is obtained by solving the overdetermined system of equations via SVD decomposition.
The method of solving the overdetermined system of equations via SVD decomposition to obtain the final light source direction in step (3) is: from the light source direction obtained at each sphere position, construct the matrix M^T M and the vector M^T b, and find the least-squares solution of the equation Mθ = b, that is, the solution minimizing the light source direction error, computed as the optimal least-squares solution via SVD decomposition.
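In symbols, the least-squares construction described above can be written as follows. This is a reconstruction from the surrounding text; the stacking of M into identity blocks is an assumption consistent with the objective being a sum of per-sphere errors and with the stated 3n × 3 size of M.

```latex
E(\theta) = \sum_{i=1}^{n} \lVert \theta - l_i \rVert^2 = \lVert M\theta - b \rVert^2,
\qquad
M = \begin{bmatrix} I_3 \\ \vdots \\ I_3 \end{bmatrix} \in \mathbb{R}^{3n \times 3},
\qquad
b = \begin{bmatrix} l_1 \\ \vdots \\ l_n \end{bmatrix},
```

and with the SVD M = UΣVᵀ the minimizer is θ* = V Σ⁻¹ Uᵀ b, equivalently the solution of the normal equations MᵀMθ = Mᵀb.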
Compared with the prior art, the present invention has the following advantages:
(1) The equipment used is simple. The method places no special requirements on the light source or the camera; only a sphere is needed as the calibration tool, so it is very easy to realize.
(2) The implementation process is easy. On the basis of a completed camera calibration, the sphere only needs to be photographed with the camera and the light source direction is computed from the image, so both the shooting process and the computation process are very easy.
(3) The calculation result is accurate. SVD decomposition is used to solve the overdetermined system, so the light source direction computed from the spheres photographed at different positions minimizes the error, and the error can be reduced to an acceptable range by shooting multiple groups of data.
Description of drawings
Fig. 1 is a schematic diagram of calculating the light source direction from the normal direction n at the sphere highlight h and the viewing direction v.
Embodiment
The concrete steps of the present invention are as follows:
1. Identify the sphere center point and the highlight point in the image
Take a picture of the sphere under the given light source and a background picture without the sphere, and subtract the two to obtain a picture of the sphere with the background removed. Apply median filtering to the result to remove noise, convert it into a binary image by setting a threshold, and use a two-stage Hough transform to obtain the pixel coordinates of the sphere center and the sphere radius. Within the circle, first obtain the maximum brightness I_max; then traverse the circle and compute the average coordinates of the pixels whose brightness lies in the top 10% relative to I_max, taking this result as the pixel coordinates of the highlight.
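The highlight-detection step above can be sketched as follows, assuming the circle center and radius have already been found (e.g. by the two-stage Hough transform). The function name, the `top_frac` parameter encoding the top-10% brightness criterion, and the synthetic test image are illustrative, not from the patent.

```python
import numpy as np

def highlight_pixel(gray, cx, cy, r, top_frac=0.10):
    """Average coordinates of the brightest pixels inside the detected circle.

    gray     : 2-D array of image brightness
    cx, cy, r: circle center (pixels) and radius from the Hough step
    top_frac : 'top 10% brightness' criterion -- pixels whose brightness is
               within top_frac of the maximum inside the circle (assumption)
    """
    ys, xs = np.indices(gray.shape)
    inside = (xs - cx) ** 2 + (ys - cy) ** 2 <= r ** 2
    i_max = gray[inside].max()
    bright = inside & (gray >= (1.0 - top_frac) * i_max)
    return xs[bright].mean(), ys[bright].mean()

# Synthetic sphere image whose brightest spot is centered at pixel (30, 22)
ys, xs = np.indices((64, 64))
img = np.exp(-((xs - 30) ** 2 + (ys - 22) ** 2) / 40.0)
hx, hy = highlight_pixel(img, cx=32, cy=32, r=20)
print(round(hx), round(hy))  # 30 22 -- the recovered highlight center
```

The averaging over the brightest region makes the result robust to saturated or noisy individual pixels.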
2. Calculate the spatial positions of the sphere center point and the highlight point
The light source direction is calibrated from pictures of a diffusely reflecting sphere taken under the given light source. During shooting, the sphere is placed on the plane of a chessboard calibration board, so that the three-dimensional Z coordinate of the sphere center equals the sphere radius; combining this with the sphere-center pixel coordinates recognized from the picture, the three-dimensional position c of the sphere center can be solved, and in the same way the three-dimensional position h of the sphere highlight. According to the law of reflection, the viewing direction at the sphere highlight and the light source direction are symmetric about the normal direction n at the highlight, so the light source direction for the highlight at this position can be obtained.
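The reflection step can be written out directly: the normal at the highlight is the direction from the sphere center c to the highlight h, and mirroring the viewing direction about it gives the light direction. A minimal sketch (the function name and example coordinates are illustrative):

```python
import numpy as np

def light_direction(e, c, h):
    """Light direction by mirroring the viewing direction about the sphere
    normal at the highlight (law of reflection).

    e : camera center, c : sphere center, h : highlight point, all 3-D.
    """
    e, c, h = map(np.asarray, (e, c, h))
    n = (h - c) / np.linalg.norm(h - c)   # outward normal at the highlight
    v = (e - h) / np.linalg.norm(e - h)   # direction from highlight to camera
    return 2.0 * np.dot(n, v) * n - v     # reflected direction = light direction

# Example: camera on the +Z axis, highlight on top of a unit sphere,
# so the light must come from straight above.
l = light_direction(e=[0.0, 0.0, 5.0], c=[0.0, 0.0, 0.0], h=[0.0, 0.0, 1.0])
print(l)  # [0. 0. 1.]
```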
3. Calculate the light source direction with minimum error from multiple groups of measurement data
Shooting spheres at different positions yields a plurality of results, and following the solution method for linear least-squares problems we obtain the objective function E:

E = Σ_{i=1}^{n} ||l − l_i||²

where l_i is the light source direction obtained at the i-th sphere position, there are n sphere positions in total, and l is the final result sought. To minimize the light source direction error and obtain a better result, the value of the objective function E should be made as small as possible. Therefore, from the above formula, construct the matrix M^T M and the vector M^T b and find the least-squares solution of the equation Ml = b to obtain the final light source direction, where M is a 3n × 3 matrix and b a 3n-dimensional vector stacking the constraints from the n sphere positions.
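The least-squares solve can be sketched as follows. Interpreting the patent's 3n × 3 matrix M as n stacked 3 × 3 identity blocks is an assumption consistent with the objective E = Σ ||l − l_i||²; the sample per-sphere directions are illustrative data, not measurements.

```python
import numpy as np

# Per-sphere light-direction estimates l_i (illustrative noisy data)
L = np.array([[0.00, 0.60, 0.80],
              [0.02, 0.59, 0.81],
              [-0.01, 0.61, 0.79]])
n = len(L)

# Stack the constraints l = l_i into one overdetermined system M l = b
M = np.vstack([np.eye(3)] * n)   # 3n x 3 (assumed identity blocks)
b = L.reshape(-1)                # 3n-vector of stacked l_i

# Least-squares solution via SVD: l = V diag(1/s) U^T b
U, s, Vt = np.linalg.svd(M, full_matrices=False)
l = Vt.T @ ((U.T @ b) / s)
l /= np.linalg.norm(l)           # normalize the final direction
print(np.round(l, 3))
```

With identity blocks the least-squares solution reduces to the mean of the per-sphere estimates; the SVD route shown here also handles a general M without modification.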
Claims (6)
1. A light source direction calibration method based on multiple spheres at arbitrary positions, characterized in that the steps are as follows:
(1) identifying the sphere center point and the highlight point in the image;
(2) calculating the spatial positions of the sphere center point and the highlight point;
(3) calculating the light source direction according to the specular reflection principle at the sphere highlight point.
2. The light source direction calibration method based on multiple spheres at arbitrary positions according to claim 1, characterized in that the method of identifying the sphere center point and the highlight point in the image in step (1) is: taking a picture of the sphere under the given light source and a background picture without the sphere, subtracting the two to obtain a picture of the sphere with the background removed, and, after denoising and binarization, using a two-stage Hough transform to obtain the pixel coordinates of the sphere center and the sphere radius; then, within the identified circle, finding the average coordinates of the region containing the top 10% of brightness values as the pixel coordinates of the sphere highlight.
3. The light source direction calibration method based on multiple spheres at arbitrary positions according to claim 1, characterized in that the method of calculating the spatial positions of the sphere center point and the highlight point in step (2) is: during shooting, placing the sphere on the plane of a chessboard calibration board, so that the three-dimensional Z coordinate of the sphere center equals the sphere radius; combining this with the sphere-center pixel coordinates recognized from the picture and the intrinsic and extrinsic camera parameters, the three-dimensional position of the sphere center is solved, and in the same way the three-dimensional position of the sphere highlight is solved.
4. The light source direction calibration method based on multiple spheres at arbitrary positions according to claim 1, characterized in that the principle of calculating the light source direction from specular reflection at the sphere highlight in step (3) is: according to the law of reflection, the viewing direction at the sphere highlight and the light source direction are symmetric about the normal direction at the highlight, so the light source direction can be computed from the viewpoint position, the sphere center point, and the highlight position.
5. The light source direction calibration method based on multiple spheres at arbitrary positions according to claim 1, characterized in that in step (3), in order to obtain a more accurate result, a plurality of spheres at different positions are photographed under the light source, and the final light source direction is obtained by solving the overdetermined system of equations via SVD decomposition.
6. The light source direction calibration method based on multiple spheres at arbitrary positions according to claim 5, characterized in that the method of solving the overdetermined system of equations via SVD decomposition in step (3) is: constructing the matrix M^T M and the vector M^T b from the light source direction obtained at each sphere position, and finding the least-squares solution of the equation Mθ = b, that is, the solution minimizing the light source direction error, computed as the optimal least-squares solution via SVD decomposition.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN200910092909A | 2009-09-10 | 2009-09-10 | Light source direction calibration method based on random position multi-globule |
Publications (1)
Publication Number | Publication Date |
---|---|
CN101650831A (en) | 2010-02-17 |
Family
ID=41673061
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200910092909A (pending) | Light source direction calibration method based on random position multi-globule | 2009-09-10 | 2009-09-10 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN101650831A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102855626A (en) * | 2012-08-09 | 2013-01-02 | 深圳先进技术研究院 | Methods and devices for light source direction calibration and human information three-dimensional collection |
CN102855626B (en) * | 2012-08-09 | 2016-01-27 | 深圳先进技术研究院 | Light source direction is demarcated and human body information three-dimensional acquisition method and apparatus |
Legal Events
Code | Title
---|---
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
C02 | Deemed withdrawal of patent application after publication (patent law 2001)
WD01 | Invention patent application deemed withdrawn after publication

Open date: 2010-02-17