CN103888750B - Three-dimensional image shooting control system and method - Google Patents


Info

Publication number
CN103888750B
Authority
CN
China
Prior art keywords
base length
pair
filming apparatus
rate
camera
Prior art date
Legal status
Active
Application number
CN201210560154.7A
Other languages
Chinese (zh)
Other versions
CN103888750A (en)
Inventor
甄梓宁
张晓林
Current Assignee
BIBIWAY CORP
Original Assignee
BIBIWAY CORP
Priority date
Filing date
Publication date
Application filed by BIBIWAY CORP
Priority to CN201210560154.7A
Publication of CN103888750A
Application granted
Publication of CN103888750B


Landscapes

  • Stereoscopic And Panoramic Photography (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention discloses a three-dimensional (3D) image shooting control system that takes both the shooting conditions and the viewing conditions into account and controls the position and orientation of a pair of shooting devices so as to obtain a good stereoscopic effect while keeping the parallax within a safe range. In one scheme, a base length determining unit first determines the base length, and a camera control unit then moves the pair of shooting devices so that they have the determined base length. In another scheme, a convergence angle control unit first controls the convergence angle according to a set or calculated parallax, and the base length determining unit then determines the base length according to a set or calculated pull-in ratio (the ratio of the subject's real distance at shooting to its perceived distance when viewed).

Description

Three-dimensional image shooting control system and method
Technical field
The invention belongs to the field of three-dimensional (3D) image shooting technology, and in particular relates to a system that controls the position and orientation of the cameras used for 3D image shooting.
Background art
In recent years, stereoscopic three-dimensional (3D) images, represented by 3D movies and 3D television, have been applied more and more widely in various fields. 3D images are generally shot with two cameras, one for the left-eye image and one for the right-eye image.
Generally, the positions of the two cameras are fixed first, and shooting is then carried out under that condition. In order to avoid problems such as the sand-table effect (a giant's-eye, miniaturized look), the spacing between the cameras (i.e. the base length) is usually kept at 65 mm or more. In addition, the camera spacing can also be adjusted manually according to the relation between the base length and the angle of view.
Patent document 1 (Japanese Unexamined Patent Application Publication No. 2010-81010) discloses a shooting method that obtains a stereoscopic effect by adjusting the base length. Specifically, the base length of the two cameras is first fixed; then, according to the relation between the base length and the angle of view, the base length and the angle of view are adjusted manually to obtain a stereoscopic 3D effect. However, such a 3D shooting method can usually only be used to shoot 3D images at close range.
When shooting a distant object, a zoom lens or a telephoto lens must be used. In that case, with the spacing between the two cameras set to 65 mm, an image of the subject with a convincing sense of depth cannot be obtained. Up to now, therefore, many practitioners in the field of 3D photography (photographers, 3D designers and the like) have had to rely on their own experience and intuition to set the spacing and angle of view of the two cameras, so that the stereoscopic effect of the captured 3D images is often poor. For example, with conventional 3D shooting methods the stereoscopic impression of close shots differs from that of long shots, and close-range shooting of sports scenes is particularly difficult.
It can be seen that the prior art, including patent document 1 above, does not provide relational expressions for calculating shooting conditions or control methods suited to 3D photography. In particular, patent document 1 fails to solve the problem that the stereoscopic impression of the captured scene disappears after a zoom operation.
On the other hand, no suitable relational expressions have yet been proposed that describe the interdependence of shooting conditions such as the camera angle of view, the base length, the convergence angle and the distance to the subject which produce a good shooting effect, and no theoretical framework has been established for a stereoscopic shooting system that takes both the shooting conditions and the viewing conditions into account. Moreover, since no relational expressions have been established that clearly express the shooting conditions which keep the parallax safe (within a range of 50 mm) while still giving a stereoscopic effect, in practice camera operators can only rely on their own experience and intuition to set the base length and angle of view of the cameras.
It is therefore necessary to develop a 3D image shooting control system that can take the shooting conditions and the viewing conditions into account in controlling the shooting process, so as to obtain a good stereoscopic effect.
Summary of the invention
The invention provides a 3D image shooting control system and method to solve the above problem that suitable shooting conditions cannot be determined from the shooting conditions and the viewing conditions, so as to obtain a good stereoscopic effect.
To solve the above technical problem, the invention provides a 3D image shooting control system, comprising:
a disparity computation unit, for calculating the relative disparity of a subject according to images obtained from at least one pair of shooting devices that shoot the subject;
a base length determining unit, for determining the base length of the pair of shooting devices; and
a base length control unit, for controlling the pair of shooting devices to move to positions at which they have the determined base length;
wherein the base length determining unit calculates, from the relative disparity, the perceived distance when the image is viewed, and determines the base length of the pair of shooting devices according to a functional relation linking the ratio of the base length of the pair of shooting devices to the human eye base length with the pull-in ratio, the pull-in ratio being the ratio of the real distance at the time of shooting to the perceived distance at the time of viewing.
Optionally, the 3D image shooting control system further comprises:
a parallax receiving unit, for receiving a set parallax; and
a convergence angle control unit, for controlling the pair of shooting devices to move to positions at which they have the convergence angle corresponding to the set parallax;
wherein the base length determining unit calculates the perceived distance from the set parallax.
Optionally, the 3D image shooting control system further comprises:
a pull-in ratio receiving unit, for receiving a set pull-in ratio; and
a convergence angle control unit, for controlling the pair of shooting devices to move to positions at which they have the convergence angle corresponding to the parallax obtained from the set pull-in ratio;
wherein the base length determining unit calculates the base length of the pair of shooting devices from the set pull-in ratio.
Optionally, the 3D image shooting control system further comprises:
a convergence angle control unit, for controlling the movement of the pair of shooting devices so as to adjust the convergence angle,
wherein the base length determining unit calculates the pull-in ratio from the angle of view of the pair of shooting devices, and the convergence angle control unit moves the pair of shooting devices to positions at which they have the convergence angle corresponding to the parallax obtained from the calculated pull-in ratio;
and the base length determining unit determines the base length of the pair of shooting devices from the calculated pull-in ratio.
Optionally, in the 3D image shooting control system, the angle of view is measured by an angle-of-view measuring unit for measuring the angle of view, or is calculated from the focal length and image sensor width of the pair of shooting devices.
Optionally, the 3D image shooting control system further comprises:
a width magnification receiving unit, for receiving a set width magnification,
wherein the base length determining unit calculates the pull-in ratio from the angle of view and the set width magnification.
Optionally, the 3D image shooting control system further comprises:
a depth-to-width ratio receiving unit, for receiving a set depth-to-width ratio, wherein the base length determining unit determines the base length of the pair of shooting devices according to the proportionality between the ratio of the base length of the pair of shooting devices to the human eye base length and the product of the pull-in ratio and the set depth-to-width ratio.
Optionally, the 3D image shooting control system further comprises:
an allowed disparity range receiving unit, for receiving a set allowed disparity range; and
a shooting condition adjusting unit, for adjusting various parameters so that the adjusted parallax falls within the set allowed disparity range;
wherein the disparity computation unit calculates the disparity range of the whole image from the images obtained from the pair of shooting devices; and the shooting condition adjusting unit determines, from the calculated disparity range of the whole image and the set allowed disparity range, at least one of the allowed range of the base length of the pair of shooting devices and the allowed range of the convergence angle, and adjusts the base length and/or the convergence angle of the pair of shooting devices so that they fall within the corresponding determined allowed range.
Optionally, in the 3D image shooting control system, the base length determining unit calculates the perceived distance from the parallax and the viewing conditions, the viewing conditions comprising the human eye base length, the screen width and the distance from the viewer's eye baseline to the screen.
Optionally, in the 3D image shooting control system, the real distance is measured by a real distance measuring unit for measuring the real distance, or is calculated from the relative disparity and from the base length, convergence angle and angle of view of the pair of shooting devices.
Optionally, in the 3D image shooting control system, the at least one pair of shooting devices is two or more pairs of shooting devices formed by three or more shooting devices.
Accordingly, the invention also provides a 3D image shooting control method, comprising:
a disparity computation step of calculating the relative disparity of a subject according to images obtained from a pair of shooting devices that shoot the subject;
a base length determining step of calculating, from the relative disparity, the perceived distance when the image is viewed, and determining the base length of the pair of shooting devices according to a functional relation linking the ratio of the base length of the pair of shooting devices to the human eye base length with the pull-in ratio; and
a base length control step of moving the pair of shooting devices to positions at which they have the determined base length;
wherein the pull-in ratio is the ratio of the real distance at the time of shooting to the perceived distance at the time of viewing.
Optionally, the 3D image shooting control method further comprises:
a parallax receiving step of receiving a set parallax; and
a convergence angle control step of moving the pair of shooting devices to positions at which they have the convergence angle corresponding to the set parallax.
Optionally, the 3D image shooting control method further comprises:
a pull-in ratio receiving step of receiving a set pull-in ratio;
a parallax determining step of calculating the parallax of the pair of shooting devices from the set pull-in ratio; and
a convergence angle control step of controlling the pair of shooting devices to move to positions at which they have the convergence angle corresponding to the parallax obtained from the set pull-in ratio.
Optionally, the 3D image shooting control method further comprises:
a parallax determining step of calculating the pull-in ratio from the angle of view of the pair of shooting devices and determining the parallax obtained from the calculated pull-in ratio; and
a convergence angle control step of controlling the pair of shooting devices to move to positions at which they have the convergence angle corresponding to the parallax obtained from the calculated pull-in ratio.
The 3D image shooting control system of the invention comprises a disparity computation unit for calculating the relative disparity of the subject, a base length determining unit for determining the base length of the pair of shooting devices, and a base length control unit for controlling the pair of shooting devices to move to positions having the determined base length. The base length determining unit calculates the perceived distance at viewing time from the parallax, and determines the base length of the pair of shooting devices according to the functional relation linking the ratio of the base length of the pair of shooting devices to the human eye base length with the ratio of the real distance at shooting time to the perceived distance at viewing time. It can be seen that, with the 3D image shooting control system of the invention, parameters that take both the shooting conditions and the viewing conditions into account can be obtained, and by using these parameters to control at least one of the position and the orientation of the pair of cameras, a good stereoscopic effect can be obtained and the parallax can be kept within a safe range.
Description of the drawings
Fig. 1 is a schematic diagram of a 3D image shooting control system and related devices according to an embodiment of the invention;
Fig. 2 is a schematic diagram of the stereoscopic shooting device of the 3D image shooting control system according to an embodiment of the invention;
Fig. 3A is a schematic diagram of the shooting conditions of the 3D image shooting control system according to an embodiment of the invention;
Fig. 3B is a schematic diagram of the viewing conditions of the 3D image shooting control system according to an embodiment of the invention;
Fig. 4A is a schematic diagram illustrating the formula for obtaining the real distance in the 3D image shooting control system according to an embodiment of the invention;
Fig. 4B is a schematic diagram illustrating the formula for obtaining the perceived distance in the 3D image shooting control system according to an embodiment of the invention;
Fig. 5 is a functional block diagram of the stereoscopic shooting device and the control device of the 3D image shooting control system according to an embodiment of the invention;
Fig. 6 is a schematic diagram of the computer hardware structure of the control device constituting the 3D image shooting control system according to an embodiment of the invention.
Reference numerals:
1 ... camera, 2 ... drive unit, 3 ... movement measuring device, 4 ... camera control platform, 5 ... stereoscopic shooting device, 6 ... control device, 7 ... 3D image shooting control system, 8 ... transmitting device, 9 ... receiving device, 10 ... 3D image quality adjustment device, 11 ... 3D projector, 12 ... screen, 13 ... 3D television.
Embodiments
In order to make the object, technical solution and advantages of the invention clearer, the invention is further described in detail below with reference to the accompanying drawings.
In one embodiment of the invention, a 3D image shooting control system is provided which, according to the shooting conditions and the viewing conditions, controls at least one of the spacing (i.e. the base length) and the orientation (i.e. the convergence angle) of a pair of cameras used for shooting, so as to obtain a good stereoscopic effect or to achieve a good safe parallax.
First, with reference to Fig. 1, the 3D image shooting control system 7 of one embodiment of the invention is described, together with the devices involved in the process from acquiring 3D image data with the 3D image shooting control system 7 to presenting the 3D images to the audience.
As shown in Fig. 1, the 3D image shooting control system 7 comprises a stereoscopic shooting device 5 and a control device 6. The stereoscopic shooting device 5 comprises cameras 1, a plurality of drive units 2, a movement measuring device 3 (see Fig. 2) and a camera control platform 4. The cameras 1 are the shooting devices used to shoot the 3D images and consist of a pair of cameras, namely a camera 1-L for shooting the left image and a camera 1-R for shooting the right image. Operations such as zooming of the cameras 1 can be controlled in synchronized linkage by the control device 6. In this specification, when the two cameras are referred to together, they are collectively called the cameras 1.
Under the control of control signals sent by the control device 6, the drive units 2 move the cameras 1 (the camera 1-L for the left image and the camera 1-R for the right image are moved separately). The drive units 2 can rotate the cameras 1 about three axes and translate them horizontally along the baseline direction. Although the drive units 2 are used to move the cameras 1 in this embodiment, the invention is not limited to this; any other known means may also be used to move the cameras 1 in any direction or to any position. The movement measuring device 3 measures the amount of movement (including rotation angles and displacements) of the cameras 1 driven by the drive units 2 and sends the measurement results to the control device 6.
The two cameras 1 are mounted side by side on the base 4a of the camera control platform 4 and are kept at substantially the same height. Specifically, the camera 1-L for shooting the left image is mounted on the base 4a in a state in which it can be rotated about three axes and translated horizontally along the baseline direction by the drive units 2. Likewise, the camera 1-R for shooting the right image is mounted on the base 4a in a state in which it can be rotated about three axes and translated horizontally along the baseline direction by the drive units 2. In addition, the base 4a can rotate horizontally relative to the camera control platform 4.
The control device 6 receives the 3D image data from the cameras 1, applies certain processing to them and sends the processed data to the transmitting device 8. The transmitting device 8 then sends these data over a network such as the Internet or a wide area network to the receiving device 9 (which may be located far away from the transmitting device 8), and the receiving device 9 forwards them to a 3D image quality adjustment device 10 installed in a cinema, a TV station, a relay station, an ordinary household or the like. After the 3D image quality adjustment device 10 receives the 3D image data, it applies image quality adjustment and other image correction processing to them.
After the above processing, the 3D image data are presented to the audience in some manner. For example, as shown in Fig. 1, in a cinema the 3D image data are first sent to the 3D projector 11, which then displays the 3D images on the screen 12. In an ordinary household, the 3D image data are played on the display screen of the 3D television 13.
Although in the example of Fig. 1 the 3D image shooting control system 7 comprises the stereoscopic shooting device 5, it may also omit components such as the cameras 1 and the camera control platform 4 and include only the control device 6, which performs the essential control of the cameras 1.
The adjustment and movement of the position and orientation of the cameras 1 in the stereoscopic shooting device 5 are described below with reference to Fig. 2.
As shown in Fig. 2, half of the angle formed, in the plane containing the two optical axes, by the optical axis 15L of the left camera (the camera for shooting the left image) 1-L and the optical axis 15R of the right camera (the camera for shooting the right image) 1-R is the convergence angle (see Fig. 4), and the subject (sighting target) of the left camera 1-L and the right camera 1-R is the subject 20 shown in the figure.
Taking the optical axis 15L of the left camera 1-L as the reference, the deflection angles of the subject 20 in the horizontal rotation (pan) direction, the vertical rotation (tilt) direction and the roll direction are denoted by the corresponding angle symbols shown in Fig. 2; similarly, taking the optical axis 15R of the right camera 1-R as the reference, the deflection angles of the subject 20 in the horizontal, vertical and roll directions are denoted by the corresponding symbols for the right camera.
Fig. 2 also shows the base length b of the left camera 1-L and the right camera 1-R. Here, the base length refers to the distance between the optical centers of the two cameras (or of the viewer's two eyes), the optical center being the position of the pinhole when the camera is approximated by a pinhole camera model. The base length b can also be taken as the distance between the axes about which the two cameras rotate horizontally. Other reference points may also be used to approximate this distance as the base length b of the two cameras (in which case the error of this parameter must be taken into account).
The drive units 2 are arranged to drive the two cameras separately. For example, they can be given three rotational degrees of freedom so as to drive each camera to rotate about each of its rotation axes. A drive unit 2 for changing the base length is also included. In addition, although, as described above, the two cameras 1 are mounted on the base 4a and are rotated left and right in the horizontal direction by the base 4a, the cameras 1 can also be made to perform this rotation by the drive units 2.
In Fig. 2, the drive unit 2-L-P drives the left camera 1-L to rotate in the horizontal direction, and the horizontal rotation angle of the left camera 1-L is denoted θl-p; the drive unit 2-R-P drives the right camera 1-R to rotate in the horizontal direction, and its horizontal rotation angle is denoted θr-p; the drive unit 2-L-T drives the left camera 1-L to rotate in the vertical direction, and its vertical rotation angle is denoted θl-t; the drive unit 2-R-T drives the right camera 1-R to rotate in the vertical direction, and its vertical rotation angle is denoted θr-t; the drive unit 2-L-R drives the left camera 1-L to rotate about its axis (the optical axis), and its roll angle is denoted θl-r; the drive unit 2-R-R drives the right camera 1-R to rotate about its axis (the optical axis), and its roll angle is denoted θr-r.
The drive unit 2-B changes the spacing between the two cameras 1 (i.e. changes the base length); as mentioned above, this spacing is denoted base length b. In addition, the drive unit 2-N drives the base 4a of the camera control platform 4 to rotate in the horizontal direction, and the horizontal rotation angle of the base 4a is denoted θn.
When the control device 6 shown in Fig. 1 sends a control signal for adjusting the base length b to a certain specific distance, the drive unit 2-B performs the corresponding operation and adjusts the base length b to that distance. Likewise, when the control device 6 sends a signal for adjusting the convergence angle of the two cameras 1 to a certain specific angle, the drive units 2-L-P and 2-R-P perform the corresponding operations and adjust the convergence angle to that angle.
Although Fig. 2 schematically shows a mechanism in which the base length is adjusted by a single drive unit 2-B, the base length can also be adjusted by moving the two cameras 1 separately with two drive units, as shown in Fig. 1. In addition, the two drive units used to adjust the convergence angle (i.e. the drive units 2-L-P and 2-R-P) generally bring the convergence angle to a specific angle by rotating the two cameras by equal angles.
Furthermore, although in the embodiments of this specification drive units (motors) are used to adjust the position and orientation of the cameras 1, the invention is not limited to this, and other mechanisms may also be used to move the cameras 1.
The shooting and playback of 3D images are modeled below with reference to Fig. 3. Fig. 3A shows the stereoscopic shooting model in which two cameras (the left camera 1-L and the right camera 1-R) shoot with the toed-in (converged) shooting method. The spacing between the left camera 1-L and the right camera 1-R is set as the base length b; half of the angle formed by the two camera optical axes is denoted the convergence angle β; the angle of view of each of the left camera 1-L and the right camera 1-R is α; the plane at a distance of 1 from the camera optical centers (i.e. at a straight-line distance of 1, measured perpendicular to the baseline from the midpoint of the baseline connecting the two optical centers) is called the normalization plane; and the extent of the field of view in this normalization plane is defined as the normalized angle of view r.
In addition, the plane that passes through the intersection of the two camera optical axes and on which the fields of view of the two cameras coincide is the convergence plane, and the distance between the camera optical centers and the convergence plane (i.e. the straight-line distance, perpendicular to the baseline, from the midpoint of the baseline connecting the two optical centers to the convergence plane) is set as g. In this model, when the two cameras are rotated horizontally, the convergence angle β and the distance g change; when a zoom operation is performed, the angle of view α and the normalized angle of view r change; and the distance g is determined jointly by the base length b and the convergence angle β. Here, the base length b, the convergence angle β and the angle of view α are the parameters that represent the shooting conditions.
Fig. 3B shows the stereoscopic playback model in which the played or projected 3D images are viewed with the two eyes. The spacing between the two eyes (i.e. the distance between their optical centers) is expressed as the base length m; the screen width is denoted w; and the distance between the midpoint of the line connecting the two eye centers and the screen (i.e. the straight-line distance perpendicular to that line) is denoted d. Here, the human eye base length m, the screen width w and the distance d between the eye baseline and the screen are the parameters that represent the viewing conditions.
In this model, the normalized angle of view r is approximated by the following equation:
r = 2 tan(α/2) ... (equation 1)
and the distance g is approximated by the following equation:
g = (b/2) cot β ... (equation 2)
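As a purely illustrative check of equations 1 and 2 (not part of the patent), the following Python sketch computes the normalized angle of view r and the convergence-plane distance g from assumed values of the angle of view, base length and convergence angle:

```python
import math

def normalized_angle_of_view(alpha_rad):
    """Equation 1: r = 2 * tan(alpha / 2)."""
    return 2.0 * math.tan(alpha_rad / 2.0)

def convergence_plane_distance(b, beta_rad):
    """Equation 2: g = (b / 2) * cot(beta)."""
    return (b / 2.0) / math.tan(beta_rad)

# Assumed example values
alpha = math.radians(40.0)  # camera angle of view
b = 0.065                   # base length (m)
beta = math.radians(1.0)    # convergence angle
print(normalized_angle_of_view(alpha), convergence_plane_distance(b, beta))
```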
The subjects involved in 3D image shooting and playback are modeled below with reference to Fig. 4. Fig. 4A shows the subject model corresponding to the stereoscopic shooting model of Fig. 3A. The midpoint of the segment (baseline) connecting the optical centers of the two cameras at shooting time is taken as the origin; the axis extending from this origin to the right, toward the optical center of the camera 1-R, is the x axis; the axis extending from this origin toward the intersection of the two camera optical axes is the y axis; and the two-dimensional plane containing this origin, the x axis and the y axis is defined as the camera plane.
In the example of Fig. 4A, a subject 20a is placed at a distance y from the origin and is shot by the two cameras simultaneously. The two cameras (the left camera 1-L and the right camera 1-R) obtain images of the subject 20a, namely a left-eye image (the image shot by the left camera 1-L) and a right-eye image (the image shot by the right camera 1-R). From the difference between the horizontal coordinates of the subject in the left-eye and right-eye images (i.e. the parallax), the ratio of the parallax to the screen width is obtained and denoted the parallax information p.
The coordinates of the subject 20a in the camera plane are denoted (x, y). Defining the real distance of the subject as y, the relation between the real distance y of the subject 20a, the base length b of the two cameras, the distance g between the cameras and the convergence plane, the normalized angle of view r and the parallax information p of the subject is given by the following equation:
(g - y)/y = grp/b ... (equation 3)
From this formula, the expression for the real distance is obtained:
y = bg/(b + grp) ... (equation 4)
Fig. 4B shows the subject model corresponding to the stereoscopic playback model of Fig. 3B. The midpoint of the segment (baseline) connecting the centers of the two eyes when the played or displayed 3D image is viewed is taken as the origin; the axis extending from this origin to the right, toward the eye center, is the a axis; the axis extending from this origin in the direction perpendicular to the line connecting the two eye centers (straight ahead) is the z axis; and the two-dimensional plane containing this origin, the a axis and the z axis is defined as the sensation plane. The relation between the stereoscopic shooting model and the stereoscopic playback model is a spatial transformation from the camera plane to the sensation plane.
In the example of Fig. 4B, the subject 20a shown in Fig. 4A is perceived as a subject 20b located at a perceived distance z from the eye baseline. The coordinates of the subject 20b in the sensation plane are denoted (a, z). Defining the perceived distance of the subject as z, the relation between the perceived distance z of the subject 20b, the human eye base length m, the distance d between the eye baseline and the screen, the screen width w and the parallax information p of the subject is given by the following equation:
(d - z)/z = wp/m ... (equation 5)
From this formula, the expression for the perceived distance is obtained:
z = md/(m + wp) ... (equation 6)
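A short numerical sketch of equations 3-6 under assumed shooting and viewing conditions may make the two distances concrete; all variable names and values below are illustrative, not taken from the patent:

```python
import math

def real_distance(b, g, r, p):
    """Equation 4: y = b*g / (b + g*r*p)."""
    return b * g / (b + g * r * p)

def perceived_distance(m, d, w, p):
    """Equation 6: z = m*d / (m + w*p)."""
    return m * d / (m + w * p)

# Shooting conditions (assumed)
b, beta, alpha = 0.065, math.radians(1.0), math.radians(40.0)
r = 2 * math.tan(alpha / 2)        # equation 1
g = (b / 2) / math.tan(beta)       # equation 2
# Viewing conditions (assumed)
m, w, d = 0.065, 2.0, 4.0
p = 0.01                           # parallax as a fraction of screen width

y = real_distance(b, g, r, p)
z = perceived_distance(m, d, w, p)
print(y, z, y / z)                 # real distance, perceived distance, and their ratio
```

The ratio y/z printed at the end is the pull-in ratio h defined in the next paragraph.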
When a shot scene is played or displayed, the degree to which the subject is perceived as pulled in compared with the real scene (i.e. the ratio between the real distance and the perceived distance of the subject) is defined as the pull-in ratio h, and the magnification of the subject in the depth direction is defined as the depth magnification t. In addition, the magnification of the subject in the width direction is defined as the width magnification s, and the ratio between the depth magnification and the width magnification is defined as the depth-to-width ratio q.
When the width magnification is small, the sand-table effect (the giant's-eye, miniaturized look) occurs; when the depth-to-width ratio is small, the cardboard effect occurs.
The combined model uniting the above stereoscopic shooting model, stereoscopic playback model and subject model is described below. First, for the stereoscopic shooting model and the stereoscopic playback model, the spatial transformation function that converts a point (x, y) in the camera plane into the corresponding point (a, z) in the sensation plane is obtained.
Defining A = m - bw/(gr), B = bw/r, C = mw/r and D = dm, the coordinates a and z are obtained from the following formulas:
a = Cx/(Ay + B) ... (equation 7)
z = Dy/(Ay + B) ... (equation 8)
Then, using the above spatial transformation functions, the pull-in ratio h, the depth magnification t, the width magnification s and the depth-to-width ratio q of a subject in the scene are obtained:
pull-in ratio h = y/z = (Ay + B)/D ... (equation 9)
depth magnification t = ∂z/∂y = BD/(Ay + B)^2 ... (equation 10)
width magnification s = ∂a/∂x = C/(Ay + B) ... (equation 11)
depth-to-width ratio q = t/s = BD/[C(Ay + B)] ... (equation 12)
Finally, the following two equations are obtained, which apply to all stereoscopic shooting models and stereoscopic playback models and to any subject in the scene:
hq = B/C = b/m ... (equation 13)
hs = C/D = w/(dr) ... (equation 14)
b/m in equation 13 is the ratio of the base length b of the two cameras to the human eye base length m, and is defined here as the base length expansion ratio. Equation 13 can thus be further expressed as the following equation 15 (combined model equation 1):
subject pull-in ratio × subject depth-to-width ratio = base length expansion ratio ... (equation 15)
In addition, d/w in equation 14 is the ratio of the distance between the eyes and the screen to the screen width, and is defined here as the viewing distance ratio. Equation 14 can thus be further expressed as the following equation 16 (combined model equation 2):
subject pull-in ratio × subject width magnification × viewing distance ratio × normalized angle of view = 1 ... (equation 16)
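The combined model lends itself to a direct numerical check. The sketch below (illustrative names and values, following the definitions of A, B, C and D given above) evaluates equations 7-12 and verifies the two identities of equations 13-16:

```python
import math

def combined_model(b, g, r, m, w, d, x, y):
    """Equations 7-12: map a camera-plane point (x, y) to the sensation plane
    and return (a, z) plus the pull-in ratio h, depth magnification t,
    width magnification s and depth-to-width ratio q."""
    A = m - b * w / (g * r)
    B = b * w / r
    C = m * w / r
    D = d * m
    a = C * x / (A * y + B)           # equation 7
    z = D * y / (A * y + B)           # equation 8
    h = (A * y + B) / D               # equation 9  (h = y / z)
    t = B * D / (A * y + B) ** 2      # equation 10
    s = C / (A * y + B)               # equation 11
    q = t / s                         # equation 12
    return a, z, h, t, s, q

b, beta, alpha = 0.065, math.radians(1.0), math.radians(40.0)
r = 2 * math.tan(alpha / 2)
g = (b / 2) / math.tan(beta)
m, w, d = 0.065, 2.0, 4.0
a, z, h, t, s, q = combined_model(b, g, r, m, w, d, x=0.3, y=2.0)

assert math.isclose(h * q, b / m)        # equation 13 / combined model equation 1
assert math.isclose(h * s, w / (d * r))  # equation 14 / combined model equation 2
```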
The various functions of the control device 6 are described below with reference to Fig. 5. As shown in Fig. 5, in one embodiment of the invention the control device 6 comprises: an image data receiving unit 6a, a shooting condition receiving unit 6b, a disparity computation unit 6c, a shooting control processing unit 6d, a camera control unit 6e, a viewing condition and other data receiving unit 6f, a set value receiving unit 6g, an encoding unit 6h, and a convergence angle computation unit 6i. The shooting control processing unit 6d further comprises a base length determining unit 6d-1, a convergence angle control unit 6d-2 and a shooting condition adjusting unit 6d-3.
The image data receiving unit 6a is connected by wires or the like to the two cameras 1 (the left camera 1-L and the right camera 1-R) and receives the 3D image data shot by the two cameras 1 with the toed-in shooting method (the left and right cameras shooting synchronously). The image data receiving unit 6a can also, as needed, receive data input to the cameras 1 by the user, or receive subject-related data (e.g. coordinate data) generated automatically when the cameras track the subject.
The shooting condition receiving unit 6b is connected to the two cameras 1 by wires or the like and receives the shooting condition parameters output by the cameras 1. The shooting condition receiving unit 6b is also connected by wires or the like to the movement measuring device 3 (which comprises the angle-of-view measuring device 3a and the real distance measuring device 3b) and receives the shooting condition parameters output by that device. In addition, it can also receive, as needed, the base length b measured by the base length measuring device 3c and the convergence angle β stored in the convergence angle computation unit 6i.
As for specific parameters, the shooting condition receiving unit 6b can receive various parameters as required: for example, it receives the focal length and image sensor width from the cameras 1, for calculating the angle of view α; it receives the measured angle of view α from the angle-of-view measuring device 3a; and it receives from the real distance measuring device 3b the measured real distance y from (the midpoint of the baseline of the cameras 1 to) the subject, or other distance data for calculating the real distance y. In the configuration shown in Fig. 5, the angle-of-view measuring device 3a and the real distance measuring device 3b are arranged outside the cameras 1 as separate devices; for example, a laser rangefinder with an active laser sensor, which is effective for measuring subjects shot at close range, can be used as the real distance measuring device 3b. However, at least part of the functional components of the angle-of-view measuring device 3a and the real distance measuring device 3b may also be arranged inside the cameras 1.
The disparity computation unit 6c extracts feature points of each subject from the left and right image data, shot by the cameras 1 and received by the image data receiving unit 6a, and calculates the difference between the horizontal coordinates in the left and right images (i.e. the parallax) by matching these feature points. Since the subject is a specific target in the shot image, the photographer or 3D designer can select and specify it by various methods during shooting, for example through an input function of the camera, or it can be selected automatically. Like the image data, the subject coordinate data specified in this way are sent by the cameras 1 to the disparity computation unit 6c through the image data receiving unit 6a.
Various methods can be used to extract and match the feature points of the subject, for example: taking a selected point of the subject as the center of a circle and extracting the feature points within a certain radius; partitioning the image and extracting the feature points of a certain region; or extracting all feature points (and using the mean or weighted mean of all parallaxes as the final parallax). The feature points can be matched by various known methods such as SIFT (Scale-Invariant Feature Transform).
The parallax obtained here is then converted by the shooting control processing unit 6d, described below, into its ratio to the screen width w (in units of pixels or of physical length (meters)), and the resulting value is used as the parallax information p in subsequent calculations.
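One possible way to realize the disparity computation unit 6c in software is sketched below; it is only an illustration using OpenCV's SIFT implementation, and the function name, the subject_xy/radius parameters and the choice of the median as the robust average are assumptions rather than features described in the patent:

```python
import cv2
import numpy as np

def subject_parallax_info(left_img, right_img, subject_xy, radius=80):
    """Estimate the parallax information p of the subject located near
    subject_xy (pixel coordinates in the left grayscale image).
    p is the horizontal disparity expressed as a fraction of the image width."""
    sift = cv2.SIFT_create()
    kp_l, des_l = sift.detectAndCompute(left_img, None)
    kp_r, des_r = sift.detectAndCompute(right_img, None)
    if des_l is None or des_r is None:
        return None

    # Keep only left-image keypoints within `radius` pixels of the subject
    cx, cy = subject_xy
    keep = [i for i, k in enumerate(kp_l)
            if (k.pt[0] - cx) ** 2 + (k.pt[1] - cy) ** 2 <= radius ** 2]
    if not keep:
        return None

    # Match with Lowe's ratio test
    matcher = cv2.BFMatcher()
    matches = matcher.knnMatch(des_l[keep], des_r, k=2)
    dx = []
    for pair in matches:
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            pl = kp_l[keep[pair[0].queryIdx]].pt
            pr = kp_r[pair[0].trainIdx].pt
            dx.append(pl[0] - pr[0])
    if not dx:
        return None
    disparity_px = float(np.median(dx))      # robust average of the matched offsets
    return disparity_px / left_img.shape[1]  # parallax as a fraction of image width
```

Here the image width stands in for the screen width w, i.e. the footage is assumed to fill the screen when displayed.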
The shooting control processing unit 6d determines, according to the conditions derived from the combined model described above, a base length of the cameras 1 that is suited to creating a good stereoscopic shooting effect, and controls the convergence angle. The shooting control processing unit 6d is also used to determine the safe ranges of the base length and the convergence angle; the implementation of this function is described in detail below.
The camera control unit 6e receives the base length determined by the base length determining unit 6d-1 of the shooting control processing unit 6d, receives measurement data from the movement measuring device 3 (its base length measuring device 3c), and makes the cameras 1 assume the determined base length by adjusting only the horizontal positions (i.e. the spacing) of the cameras 1, without adjusting their angles. To this end, the camera control unit 6e sends control data to the drive units 2 so that they perform the corresponding operations; for example, when the base length b of the cameras 1 is to be adjusted, it makes the drive unit 2-B perform the corresponding operation (as shown in Fig. 2).
In addition, the camera control unit 6e also performs feedback control of the convergence angle β of the cameras 1 through the convergence angle control unit 6d-2 of the shooting control processing unit 6d. The convergence angle control unit 6d-2 adjusts the angles of the cameras 1 according to the parallax received from the disparity computation unit 6c. To this end, the camera control unit 6e likewise sends control data to the drive units 2 so that they perform the corresponding operations; when the convergence angle β of the cameras 1 is to be adjusted, it makes the drive units 2-L-P and 2-R-P perform the corresponding operations (as shown in Fig. 2).
For example, the control of a drive unit (motor) can be realized as follows: the digital signal output by the camera control unit 6e is first converted to an analog signal (D/A conversion), the converted analog signal is input to the motor driver, and the driver drives the motor so that it moves by the specified amount. For example, a movement detection sensor mounted on the motor can be used as the movement measuring device 3; the motor rotation angle signal it detects is converted to a digital signal (A/D conversion) and sent to the camera control unit 6e.
The change in the base length of the cameras 1 caused by the control of the drive unit 2-B is detected by the base length measuring device 3c, and the detection result is sent to the camera control unit 6e. Based on this detection result, the camera control unit 6e applies feedback control such as PID control (proportional-integral-derivative control) or feedforward control to the action of the drive unit 2-B.
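A minimal sketch of such a feedback loop is given below; the PID gains and the measure_b/command_motor interfaces are hypothetical and only illustrate the closed loop between the base length measuring device 3c and the drive unit 2-B:

```python
class PID:
    """Plain proportional-integral-derivative controller."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def drive_base_length(target_b, measure_b, command_motor, dt=0.01, steps=500):
    """Repeatedly measure the base length and command the drive unit 2-B
    until the measured value settles at the target (hypothetical interfaces)."""
    pid = PID(kp=2.0, ki=0.5, kd=0.05)
    for _ in range(steps):
        error = target_b - measure_b()          # meters
        command_motor(pid.update(error, dt))    # e.g. a D/A-converted velocity command
```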
Here, for convenience of description, in the configuration shown in Fig. 5 the base length measuring device 3c and the rotation angle measuring device 3d are arranged outside the cameras 1 and the drive units 2 as separate devices. At least part of the functional components of these devices (for example, components similar to the movement detection sensor mounted on the motor mentioned above) may also be arranged inside the cameras 1 and the drive units 2. Moreover, although the invention has been described above on the premise that the drive units 2 are governed by feedback control or feedforward control, the invention does not necessarily have to be realized by this type of control.
The viewing condition and other data receiving unit 6f receives data related to the conditions under which the audience views the 3D images, such as the human eye base length m, the screen width w and the distance d between the eye baseline and the screen, and also receives data such as the image sensor width of the cameras 1, which are used for obtaining the base length b of the cameras 1 or for controlling the convergence angle β and the like. These data may be entered by the user through a keyboard or other input device, or may be sent automatically by any equipment. In addition, although in this embodiment the human eye base length m is obtained by reception through the viewing condition and other data receiving unit 6f, the invention is not limited to this; the human eye base length m may also be a constant set in the system in advance (e.g. 60 mm), in which case this constant is used as the human eye base length m.
The set value receiving unit 6g receives the data set by the user in the modes in which the base length b of the cameras 1 is obtained, or the convergence angle β is controlled, according to user-set data. For example, it may further comprise: a parallax receiving unit for receiving the set parallax; a pull-in ratio receiving unit for receiving the set pull-in ratio; a depth-to-width ratio receiving unit for receiving the set depth-to-width ratio; a width magnification receiving unit for receiving the set width magnification; and a receiving component for receiving the allowed disparity range used to regulate the safe parallax. These set data are generally entered by the user through a keyboard or other input device.
The encoding unit 6h encodes the images received by the image data receiving unit 6a, after the necessary processing, into a certain data format and outputs them as 3D image data.
The convergence angle computation unit 6i receives the rotation angles θ from the rotation angle measuring device 3d, calculates the convergence angle β of the cameras 1 from these data and stores it. The calculated value of the convergence angle β is sent to both the camera control unit 6e and the shooting condition receiving unit 6b.
The various control methods implemented by the shooting control processing unit 6d are described below.
[Control methods based on combined model equation 1]
The shooting control processing unit 6d can implement various control modes using the relation shown in combined model equation 1 above. Four representative modes are described here.
● First control mode
As shown in combined model equation 1, the base length expansion ratio b/m that is suited to creating a good shooting condition is proportional to the product of the subject pull-in ratio h and the subject depth-to-width ratio q, and the subject pull-in ratio h can be calculated from the real distance y and the perceived distance z of the subject. Here, in order to effectively prevent the cardboard effect, the subject depth-to-width ratio q in combined model equation 1 is set to the constant 1.
Once the subject depth-to-width ratio q is set to the constant 1, the base length expansion ratio b/m suited to creating a good shooting condition becomes simply proportional to the subject pull-in ratio h. Moreover, since the subject depth-to-width ratio q can also be expressed by some other specific function (here a constant can be regarded as a function, namely a constant function), the base length expansion ratio b/m suited to creating a good shooting condition and the subject pull-in ratio h are related to each other through that specific function. For example, the function may be a function of the real distance y whose value is 1 when the subject is close to the cameras 1 and 0.5 when it is far away. The subject depth-to-width ratio q can also be expressed by the following formula (the specific function), in which k is a coefficient:
q = k^(2/3) ... (equation 17)
At the same time, it follows from combined model equation 1 that the subject depth-to-width ratio is a function of the real distance y of the subject, the camera base length b, the distance g between the cameras and the convergence plane, and the normalized angle of view r.
Thus the base length value of the cameras 1 can be obtained from combined model equation 1. The base length determining unit 6d-1 accordingly determines the base length b of the cameras 1 and sends the determined base length b to the camera control unit 6e. After receiving the base length b data, the camera control unit 6e performs control so as to adjust the base length of the cameras 1 to the determined base length b.
As can be seen from the above, the base length b of the cameras 1 can be obtained if the real distance y of the subject, the perceived distance z of the subject and the human eye base length m are known. Since the measurement result of the real distance measuring device 3b can be used as the real distance y of the subject, or the real distance y can be obtained from the subject parallax information p, the base length b of the cameras 1, the distance g between the camera baseline and the convergence plane and the normalized angle of view r, the base length b of the cameras 1 can finally be obtained as long as the values of all parameters other than the base length b are available.
Among the above parameters: the subject parallax information p can be calculated from the parallax output by the disparity computation unit 6c and the screen width w; as described above, the distance g between the camera baseline and the convergence plane can be calculated from the base length b of the cameras 1 and the convergence angle β; the convergence angle β can be obtained from the rotation angles measured by the rotation angle measuring device 3d, and, for example, the convergence angle β stored in the convergence angle computation unit 6i is sent to the shooting control processing unit 6d through the shooting condition receiving unit 6b; as described above, the normalized angle of view r can be obtained from the angle of view α of the cameras 1; the angle of view α can be measured by the angle-of-view measuring device 3a or obtained from the focal length and image sensor width of the cameras 1 received by the shooting condition receiving unit 6b; and the image sensor width can be obtained by user input or the like.
In addition, the perceived distance z of the subject can be calculated from the subject parallax information p and the viewing conditions (i.e. the human eye base length m, the screen width w and the distance d between the eye baseline and the screen).
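With q fixed at the constant 1, combined model equation 1 reduces to b = m·(y/z), so the first control mode can be sketched in a few lines (names and example values are ours, not the patent's):

```python
def base_length_mode1(m, y, z, q=1.0):
    """First control mode: with the subject depth-to-width ratio q held constant,
    combined model equation 1 gives  b = m * (y / z) * q."""
    h = y / z          # pull-in ratio of the subject
    return m * h * q

# Assumed example: subject measured 5 m away, perceived at 3 m when viewed,
# human eye base length 65 mm  ->  camera base length of about 108 mm
b = base_length_mode1(m=0.065, y=5.0, z=3.0)
```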
● Second control mode
In the second control mode, the parallax value is set by the user. After the set value receiving unit 6g supplies the user-set parallax value to the shooting control processing unit 6d, the convergence angle control unit 6d-2 controls the camera control unit 6e so as to adjust the convergence angle β in such a way that the parallax value continuously calculated by the disparity computation unit 6c from the image data received in real time from the cameras 1 is brought to the set parallax value (i.e. feedback control (PID control)).
Since the parallax value is set, the parallax information p can then be obtained (the screen width w being known), the perceived distance z of the subject can be calculated from the parallax information p and the viewing conditions (i.e. the human eye base length m, the screen width w and the distance d between the eye baseline and the screen), and the real distance y of the subject, updated as the convergence angle β is adjusted, can also be obtained. The base length determining unit 6d-1 of the shooting control processing unit 6d then obtains the pull-in ratio h after the adjustment from the real distance y and the perceived distance z of the subject, determines the base length b from this value, and the camera control unit 6e performs control so as to adjust the base length of the cameras 1 to the determined base length b.
As for the real distance y of the subject, the measurement result of the real distance measuring device 3b can be used directly as its value, or, without using that measurement, it can be obtained by calculation as described for the first control mode above.
In addition, in the second control mode, after the user has set the parallax value, the adjustment of the convergence angle β and the control of the base length b can also be carried out asynchronously.
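The data flow of the second control mode can be sketched as follows; the set parallax is taken here as a physical offset on the screen, the convergence-angle servo loop is abbreviated to the assumption that it has already converged, and all names and values are illustrative:

```python
def mode2_base_length(set_parallax, m, w, d, measure_real_distance, q=1.0):
    """Second control mode (sketch): the user sets the parallax; the convergence
    angle is assumed to have already been servoed (PID) so that the measured
    parallax of the subject equals the set value, after which the base length
    is determined from combined model equation 1."""
    p = set_parallax / w                 # parallax information (fraction of screen width)
    z = m * d / (m + w * p)              # equation 6: perceived distance
    y = measure_real_distance()          # e.g. the real distance measuring device 3b
    h = y / z                            # pull-in ratio after the adjustment
    return m * h * q

b = mode2_base_length(set_parallax=0.02, m=0.065, w=2.0, d=4.0,
                      measure_real_distance=lambda: 5.0)
```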
● Third control mode
In the third control mode, the pull-in ratio h of the subject is set by the user. The user-set pull-in ratio is first supplied to the shooting control processing unit 6d by the set value receiving unit 6g. The base length determining unit 6d-1 of the shooting control processing unit 6d then determines the base length b from the set pull-in ratio h. Thereafter, the camera control unit 6e performs control so as to adjust the base length of the cameras 1 to the determined base length b.
In addition, the shooting control processing unit 6d obtains the perceived distance z of the subject from the set pull-in ratio h and the real distance y of the subject, and then obtains the parallax information p and the parallax from the perceived distance z and the viewing conditions (i.e. m, w and d, which are given values). Afterwards, the convergence angle control unit 6d-2 realizes feedback control (PID control) of the convergence angle β through the camera control unit 6e, in such a way that the parallax value continuously calculated by the disparity computation unit 6c from the image data received in real time from the cameras 1 is brought to the set parallax value.
As for the real distance y of the subject, the measurement result of the real distance measuring device 3b can be used directly as its value, or, without using that measurement, it can be obtained by calculation as described for the first control mode above.
Similarly, in the third control mode, after the user has set the value of the pull-in ratio h, the adjustment of the convergence angle β and the control of the base length b can also be carried out asynchronously.
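A corresponding sketch of the third control mode, again with illustrative names and values, computes the base length to command and the target parallax information that the convergence-angle feedback loop should servo to:

```python
def mode3_targets(set_h, m, w, d, y, q=1.0):
    """Third control mode (sketch): the user sets the pull-in ratio h.
    Returns the base length to command and the target parallax information p
    that the convergence-angle feedback loop should be driven toward."""
    b = m * set_h * q            # combined model equation 1 with q fixed
    z = y / set_h                # perceived distance implied by the set pull-in ratio
    p = m * (d - z) / (w * z)    # equation 5 solved for the parallax information
    return b, p

b, p_target = mode3_targets(set_h=1.5, m=0.065, w=2.0, d=4.0, y=5.0)
```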
● Fourth control mode
In the fourth control mode, the depth-to-width ratio q, which in the preceding modes was treated as the constant 1 or as a specific function, is set by the user. The user-set value of the depth-to-width ratio is first supplied to the shooting control processing unit 6d by the set value receiving unit 6g. The shooting control processing unit 6d then determines the base length by the various methods used in the first to third control modes above, using the value set here (rather than any of the above set values) as the depth-to-width ratio q.
[Control method based on collective model equations 1 and 2]
The shooting control processing unit 6d can also implement various control modes that use the relations of collective model equation 1 and collective model equation 2 simultaneously. Two representative modes are described here.
● Fifth control mode
As shown by collective model equation 2, the product of the further rate h of the subject, the width magnification s of the subject, the viewing-distance ratio (d/w) and the normalized angle of view r is constant (= 1). Here, in order to effectively prevent the cardboard effect, a zoom operation is applied to the cameras 1 to adjust the angle of view α so that the width magnification s of the subject is always kept at the constant 1. Moreover, since the viewing-distance ratio (d/w) is part of the viewing conditions and must be entered in advance by the user, in this mode the product of the further rate h and the normalized angle of view r is also constant.
As with the treatment of the subject depth-to-width ratio q in the preceding control modes, the width magnification s in this mode need not be a constant and may instead be a specific function. According to collective model equation 2, the width magnification s is a function of the real distance y of the subject, the camera base length b, the distance g between the cameras and the convergence plane, and the normalized angle of view r.
In this mode, the shooting control processing unit 6d first computes, according to collective model equation 2, the normalized angle of view r from the angle of view α measured by the angle-of-view measuring device 3a (or calculated from the focal length), and then computes from r the further rate h that produces a good shooting condition.
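A sketch of this step is given below. The normalization r = 2·tan(α/2) is an assumption made purely for illustration, since the patent's exact definition of the normalized angle of view is not reproduced here; s = 1 corresponds to the zoom control described above.

```python
import math

def further_rate_from_view_angle(alpha_deg, d, w, s=1.0):
    """Fifth control mode sketch: solve collective model equation 2,
    h * s * (d / w) * r = 1, for the further rate h.

    alpha_deg : angle of view in degrees;  d : viewing distance
    w : screen width;  s : subject width magnification (kept at 1 by zooming)
    """
    alpha = math.radians(alpha_deg)
    r = 2.0 * math.tan(alpha / 2.0)   # assumed normalization of the angle of view
    return 1.0 / (s * (d / w) * r)
```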
Thereafter, the base length determining unit 6d-1 determines the base length b from the further rate h so obtained, and the camera control unit 6e then performs the control that adjusts the base length of the cameras 1 to the determined value b.
In addition, the shooting control processing unit 6d obtains, according to collective model equation 1, the perceived distance z of the subject from the computed further rate h and the real distance y, and then obtains the parallax information p, i.e. the parallax, from z and the viewing conditions (m, w and d, which are known values). Afterwards, the convergence angle control unit 6d-2 continuously receives the parallax values that the disparity computation unit 6c calculates in real time from the image data captured by the cameras 1, and, through the camera control unit 6e, performs feedback control (PID control) of the convergence angle β so that the measured parallax is driven to the parallax value obtained above.
In addition, the real distance y of the subject may either be taken directly from the measurement result of the real-distance measuring device 3b or, without using that measurement, be calculated by the method described above for the first control mode.
Furthermore, in the fifth control mode, once the value of the further rate h has been calculated, the adjustment of the convergence angle β and the control of the base length b may be carried out asynchronously.
● Sixth control mode
In the sixth control mode, the width magnification s, which was kept at the constant 1 in the fifth control mode, is set by the user. The width magnification set by the user is first supplied to the shooting control processing unit 6d by the set-value receiving unit 6g. The shooting control processing unit 6d then controls the convergence angle by the method used in the fifth control mode, except that the value set here is used as the width magnification s.
[Control method based on safety]
The shooting control processing unit 6d can also implement a seventh control mode that addresses the safety of the three-dimensional image. In this mode, the shooting control processing unit 6d controls the base length b and the convergence angle β so that they fall within a safe disparity range, in accordance with safety guidelines relevant to 3D photography (for example, Japan's "3DC Safety Guidelines", issued by the safety guideline sub-committee of the 3D Consortium).
First, the shooting condition regulating unit 6d-3 of the shooting control processing unit 6d determines, by the following equation 18, the allowable range of the base length at the next instant (specifically, its maximum allowable value b_upper(t+1)). It does so from the allowed disparity range (p_lower, p_upper) set in advance by the user and received by the set-value receiving unit 6g, from the current whole-image disparity range (p_min(t), p_max(t)) calculated by the disparity computation unit 6c, and from the current base length b(t) of the cameras 1 received from the base length measuring device 3c. The parallax of the whole image here means the parallax of all, or most, of the pixels of the images captured by the cameras 1, rather than the parallax of some individual feature points; once this parallax is available, the whole-image disparity range (p_min(t), p_max(t)) follows directly.
b_upper(t+1) = b(t) × (p_upper − p_lower) / (p_max(t) − p_min(t)) ... (equation 18)
As shown in equation 18, the upper limit b_upper(t+1) of the base length b at the next instant (t+1) is obtained by multiplying the current base length b(t) by the ratio of the span of the allowed disparity range (the difference between its upper and lower limits) to the span of the measured whole-image disparity range (the difference between its maximum and minimum).
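Equation 18 translates directly into code; the short sketch below is only a restatement of the relation.

```python
def base_length_upper_limit(b_t, p_upper, p_lower, p_max_t, p_min_t):
    """Equation 18: scale the current base length b(t) by the ratio of the
    allowed disparity span to the measured whole-image disparity span."""
    return b_t * (p_upper - p_lower) / (p_max_t - p_min_t)
```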
Next, the shooting condition regulating unit 6d-3 determines, by the following equations 19 and 20, the allowable range of the convergence angle at the next instant. It does so from the allowed disparity range (p_lower, p_upper) set in advance by the user and received by the set-value receiving unit 6g, from the current whole-image disparity range (p_min(t), p_max(t)) calculated by the disparity computation unit 6c, and from the current convergence angle β(t) of the cameras 1 held by the convergence angle computing unit 6i.
β_upper(t+1) = β(t) + δ  (if p_max(t) < p_upper)
β_upper(t+1) = β(t) − δ  (if p_max(t) > p_upper) ... (equation 19)
β_lower(t+1) = β(t) + δ  (if p_min(t) > p_lower)
β_lower(t+1) = β(t) − δ  (if p_min(t) < p_lower) ... (equation 20)
Here, the magnitude of δ may be adjusted appropriately according to factors such as the allowed disparity range, the whole-image parallax and the time instant (t); a value of 0.01 degree, for example, may be used.
From equation 19, the upper limit β_upper(t+1) of the convergence angle at the next instant (t+1) is obtained from the current convergence angle β(t) as follows: when the maximum whole-image parallax p_max(t) is greater than the upper limit p_upper of the allowed disparity range, β_upper(t+1) is obtained by subtracting δ from β(t); conversely, when p_max(t) is less than p_upper, β_upper(t+1) equals β(t) plus δ. Likewise, from equation 20, the lower limit β_lower(t+1) of the convergence angle at the next instant (t+1) is obtained from β(t) as follows: when the minimum whole-image parallax p_min(t) is less than the lower limit p_lower of the allowed disparity range, β_lower(t+1) is obtained by subtracting δ from β(t); conversely, when p_min(t) is greater than p_lower, β_lower(t+1) equals β(t) plus δ.
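Equations 19 and 20 amount to stepping the limits by ±δ at each instant; the sketch below restates them, with δ defaulting to the 0.01-degree example mentioned above.

```python
def convergence_angle_limits(beta_t, p_max_t, p_min_t, p_upper, p_lower, delta=0.01):
    """Equations 19 and 20: move the convergence-angle limits up or down by
    delta depending on whether the whole-image disparity extremes lie inside
    the allowed disparity range (angles and delta in degrees)."""
    beta_upper = beta_t + delta if p_max_t < p_upper else beta_t - delta
    beta_lower = beta_t + delta if p_min_t > p_lower else beta_t - delta
    return beta_lower, beta_upper
```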
In addition, when the maximum whole-image parallax p_max(t) is greater than the upper limit p_upper of the allowed disparity range and, at the same time, the minimum whole-image parallax p_min(t) is less than the lower limit p_lower, equation 18 yields a value of b_upper(t+1) smaller than b(t); the base length b is therefore reduced accordingly, and after a certain period of time the whole-image parallax falls back within the allowable range.
The shooting condition regulating unit 6d-3 checks the values determined by the base length determining unit 6d-1 and the convergence angle control unit 6d-2. If it finds that the base length b exceeds the above upper limit b_upper(t+1), or that the convergence angle β lies outside the convergence angle range (β_lower(t+1) to β_upper(t+1)), it adjusts the base length b and the convergence angle β so that they fall within the respective ranges. Note that, in embodiments of the present invention, the 3-dimensional image shoot control system 7 need not include this function of adjusting the base length b and the convergence angle β through the shooting condition regulating unit 6d-3; the function may be added as actually required.
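The adjustment performed by the shooting condition regulating unit 6d-3 is, in essence, a clamp of the determined values into the ranges computed above; the fragment below is only an illustrative sketch of that behaviour.

```python
def regulate_shooting_condition(b, beta, b_upper, beta_lower, beta_upper):
    """Clamp the determined base length and convergence angle back into
    their admissible ranges (shooting condition regulating unit 6d-3)."""
    b = min(b, b_upper)
    beta = min(max(beta, beta_lower), beta_upper)
    return b, beta
```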
Although the 3-dimensional image shoot control system 7 of the embodiment of the invention described above with reference to Fig. 1 to Fig. 5 includes only the cameras 1 formed by a single pair of cameras, it may also use a plurality of camera groups (camera pairs), so that several shooting modes can be applied simultaneously within a single shoot and several different viewing conditions can be satisfied. In that case the control device 6 determines, for each camera group, the base length and the convergence angle that create suitable shooting conditions, and controls the position and direction of each camera group accordingly.
For example, when the 3-dimensional image shoot control system 7 comprises four cameras (camera A, camera B, camera C and camera D), these four cameras can be combined with one another to form camera groups such as (camera A + camera B)(camera C + camera D), or (camera A + camera B)(camera A + camera C)(camera A + camera D).
The computer configuration of the control device 6 that forms the 3-dimensional image shoot control system 7 in one embodiment of the invention is illustrated by Fig. 6. The computer 100 shown in Fig. 6 is only one representative example, given in order to describe a computer structure capable of realizing each function of the control device 6.
The computer 100 comprises: a CPU (Central Processing Unit) 101, a memory 102, a filming apparatus interface 103, a driver part interface 104, a movement measuring device interface 105, a display controller 106, a display 107, an input device interface 108, a keyboard 109, a mouse 110, an external storage medium interface 111, an external storage device 112, a network interface 113, and a bus 114 interconnecting the above components.
The CPU 101 controls the other components of the computer 100 under an OS (Operating System) so as to implement the various functions; for example, it causes the disparity computation unit 6c and the shooting control processing unit 6d shown in Fig. 5 to carry out their respective processing.
The memory 102 generally consists of RAM (Random Access Memory). It loads the programs that realize the functions executed by the CPU 101 (such as the functions of the disparity computation unit 6c and the shooting control processing unit 6d) and temporarily stores the data those programs require (such as the current convergence angle β and angle of view α).
The filming apparatus interface 103 is connected to the cameras 1 (connected respectively to the camera 1-L for capturing the left image and the camera 1-R for capturing the right image); it sends and receives control data to and from the cameras 1 and receives the image data transmitted by the cameras 1.
The driver part interface 104 is connected to the driver part 2 and sends it the control data that causes it to perform the corresponding operations. The movement measuring device interface 105 receives the measurement data, such as angles and distances, sent by the movement measuring device 3.
The display controller 106 processes the image data sent by the CPU 101 and displays images on the display 107; the display 107 comprises a display unit such as an LCD (Liquid Crystal Display) and a touch screen. For example, when the user needs to enter a set value into the control device 6, a set-value input interface is shown on the display 107.
The input device interface 108 receives signals from input devices such as the keyboard 109, the mouse 110 and the touch screen, which the user operates to enter data into the control device 6, and passes them to the CPU 101. The entered data are stored in the memory 102.
The external storage medium interface 111 accesses the external storage device 112 and controls the sending and receiving of data. For example, it can drive an optical disc 121 and read the data recorded on it by accessing its recording surface, or write data from the external storage device 112 onto the optical disc 121. The external storage medium interface 111 can also access a portable flash memory device 122 to exchange data with the computer 100.
The external storage device 112 is generally a storage device such as a hard disk; besides the programs that realize the functions executed by the CPU 101, it can also store the data those programs use.
The network interface 113 can be connected to a network 130 such as a wide area network or the Internet, and controls the sending and receiving of data between the computer 100 and the network 130. Through the network interface 113, the computer 100 can access a wide area network or the Internet to transmit and receive data. The programs that realize the functions of the present invention executed by the CPU 101 can be supplied to the computer 100 from outside through the network interface 113 or through the above external storage medium interface 111.
In addition, all or part of the programs that realize the functions of the present invention may be written into a chip and then sold.
In summary, the 3-dimensional image shoot control system of the present invention obtains parameters that take both the shooting conditions and the viewing conditions into account and, by using these parameters to control at least one of the position and the direction of a pair of cameras, can achieve a good three-dimensional effect while keeping the parallax within a safe range.
Obviously, those skilled in the art can make various changes and modifications to the invention without departing from its spirit and scope. If such modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is intended to encompass them as well.

Claims (15)

1. A 3-dimensional image shoot control system, comprising:
a disparity computation unit for calculating the relative disparity of a subject from images obtained from at least one pair of filming apparatus that photograph the subject;
a base length determining unit for determining the base length of the pair of filming apparatus; and
a base length control unit for controlling the pair of filming apparatus to move to positions at which they have the determined base length;
wherein the base length determining unit calculates, from the relative disparity, the perceived distance of the subject when the image is viewed, and determines the base length of the pair of filming apparatus from the mutually related functional relation between the further rate and the ratio of the base length of the pair of filming apparatus to the human-eye base length, wherein further rate × subject depth-to-width ratio = base length expansion ratio; the further rate is the ratio of the real distance at shooting to the perceived distance when viewing; the ratio of the base length of the pair of filming apparatus to the human-eye base length is the base length expansion ratio; the subject depth-to-width ratio is the ratio between the depth magnification and the width magnification, the depth magnification being the magnification of the subject in the depth direction and the width magnification being the magnification of the subject's width.
2. The 3-dimensional image shoot control system of claim 1, further comprising:
a parallax receiving unit for receiving a set parallax; and
a convergence angle control unit for controlling the pair of filming apparatus to move to positions at which they have the convergence angle corresponding to the set parallax;
wherein the base length determining unit calculates the perceived distance from the set parallax.
3. The 3-dimensional image shoot control system of claim 1, further comprising:
a further rate receiving unit for receiving a set further rate; and
a convergence angle control unit for controlling the pair of filming apparatus to move to positions at which they have the convergence angle corresponding to the parallax obtained from the set further rate;
wherein the base length determining unit calculates the base length of the pair of filming apparatus from the set further rate.
4. The 3-dimensional image shoot control system of claim 1, further comprising:
a convergence angle control unit for controlling the movement of the pair of filming apparatus so as to adjust the convergence angle,
wherein the base length determining unit calculates the further rate from the angle of view of the pair of filming apparatus, and the convergence angle control unit moves the pair of filming apparatus to positions at which they have the convergence angle corresponding to the parallax obtained from the calculated further rate; and
the base length determining unit determines the base length of the pair of filming apparatus from the calculated further rate.
5. The 3-dimensional image shoot control system of claim 4, wherein the angle of view is measured by an angle-of-view measuring unit, or is calculated from the focal length and the image sensor width of the pair of filming apparatus.
6. The 3-dimensional image shoot control system of claim 4, further comprising:
a width magnification receiving unit for receiving a set width magnification,
wherein the base length determining unit calculates the further rate from the angle of view and the set width magnification.
7. The 3-dimensional image shoot control system of any one of claims 1 to 6, further comprising:
a depth-to-width ratio receiving unit for receiving a set depth-to-width ratio, wherein the base length determining unit determines the base length of the pair of filming apparatus from the proportional relation between the ratio of the base length of the pair of filming apparatus to the human-eye base length and the product of the further rate and the set depth-to-width ratio, wherein further rate × subject depth-to-width ratio = base length expansion ratio.
8. The 3-dimensional image shoot control system of any one of claims 1 to 6, further comprising:
an allowed disparity range receiving unit for receiving a set allowed disparity range; and
a shooting condition regulating unit for adjusting the relevant parameters so that the adjusted parallax falls within the set allowed disparity range;
wherein the disparity computation unit calculates the disparity range of the whole image from the images obtained from the pair of filming apparatus; and the shooting condition regulating unit determines, from the calculated whole-image disparity range and the set allowed disparity range, at least one of the allowed range of the base length and the allowed range of the convergence angle of the pair of filming apparatus, and adjusts the base length and/or the convergence angle of the pair of filming apparatus so that they fall within the determined corresponding allowed ranges.
9. The 3-dimensional image shoot control system of any one of claims 1 to 6, wherein the base length determining unit calculates the perceived distance from the relative disparity and the viewing conditions, the viewing conditions comprising the human-eye base length, the screen width and the distance from the viewer's eyes to the screen.
10. The 3-dimensional image shoot control system of any one of claims 1 to 6, wherein the real distance is measured by a real-distance measuring unit, or is calculated from the relative disparity and from the base length, the convergence angle and the angle of view of the pair of filming apparatus.
11. The 3-dimensional image shoot control system of any one of claims 1 to 6, wherein said at least one pair of filming apparatus comprises two or more pairs of filming apparatus formed by three or more filming apparatus.
12. A 3-dimensional image shoot control method, comprising:
a disparity computation step of calculating the relative disparity of a subject from images obtained from a pair of filming apparatus that photograph the subject;
a base length determining step of calculating, from the relative disparity, the perceived distance of the subject when the image is viewed, and determining the base length of the pair of filming apparatus from the mutually related functional relation between the further rate and the ratio of the base length of the pair of filming apparatus to the human-eye base length; and
a base length control step of moving the pair of filming apparatus to positions at which they have the determined base length;
wherein further rate × subject depth-to-width ratio = base length expansion ratio; the further rate is the ratio of the real distance at shooting to the perceived distance when viewing; the ratio of the base length of the pair of filming apparatus to the human-eye base length is the base length expansion ratio; the subject depth-to-width ratio is the ratio between the depth magnification and the width magnification, the depth magnification being the magnification of the subject in the depth direction and the width magnification being the magnification of the subject's width.
13. The 3-dimensional image shoot control method of claim 12, further comprising:
a parallax receiving step of receiving a set parallax; and
a convergence angle control step of moving the pair of filming apparatus to positions at which they have the convergence angle corresponding to the set parallax.
14. The 3-dimensional image shoot control method of claim 12, further comprising:
a further rate receiving step of receiving a set further rate;
a parallax determining step of calculating the parallax of the pair of filming apparatus from the set further rate; and
a convergence angle control step of controlling the pair of filming apparatus to move to positions at which they have the convergence angle corresponding to the parallax obtained from the set further rate.
15. The 3-dimensional image shoot control method of claim 12, further comprising:
a parallax determining step of calculating the further rate from the angle of view of the pair of filming apparatus and obtaining the parallax from the calculated further rate; and
a convergence angle control step of controlling the pair of filming apparatus to move to positions at which they have the convergence angle corresponding to the parallax obtained from the calculated further rate.
CN201210560154.7A 2012-12-20 2012-12-20 3-dimensional image shoot control system and method Active CN103888750B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210560154.7A CN103888750B (en) 2012-12-20 2012-12-20 3-dimensional image shoot control system and method


Publications (2)

Publication Number Publication Date
CN103888750A CN103888750A (en) 2014-06-25
CN103888750B true CN103888750B (en) 2016-02-24

Family

ID=50957441

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210560154.7A Active CN103888750B (en) 2012-12-20 2012-12-20 3-dimensional image shoot control system and method

Country Status (1)

Country Link
CN (1) CN103888750B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6205069B2 (en) 2014-12-04 2017-09-27 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Imaging system and method
CN104614378A (en) * 2015-02-16 2015-05-13 无锡力优医药自动化技术有限公司 Boomerang disc image detection assembly
CN104793643B (en) * 2015-04-13 2018-01-19 北京迪生数字娱乐科技股份有限公司 A kind of three-dimensional stop motion animation camera system and control method
CN104883560B (en) 2015-06-10 2017-07-04 京东方科技集团股份有限公司 binocular stereo vision device and its adjusting method, device and display device
CN106412557A (en) * 2016-11-02 2017-02-15 深圳市魔眼科技有限公司 3D camera control method and 3D camera control device
CN106773508B (en) * 2017-02-15 2022-05-24 天津长瑞大通流体控制系统有限公司 Shooting system for watching 3D (three-dimensional) images by naked eyes and using method
CN106657974B (en) * 2017-02-27 2024-02-09 北京图森智途科技有限公司 Control method and device of binocular camera and binocular camera
CN107289247B (en) * 2017-08-04 2020-05-05 南京管科智能科技有限公司 Double-camera three-dimensional imaging device and imaging method thereof
CN107642654B (en) * 2017-09-27 2023-07-28 南京管科智能科技有限公司 Pipeline robot shell assembly
JP6892134B2 (en) * 2019-01-25 2021-06-18 学校法人福岡工業大学 Measurement system, measurement method and measurement program
CN110133958A (en) * 2019-05-21 2019-08-16 广州悦享环球文化科技有限公司 A kind of tracking system and method for three-dimensional film
CN113141499A (en) * 2020-01-20 2021-07-20 北京芯海视界三维科技有限公司 Method and device for realizing 3D shooting and 3D display terminal
CN117751399A (en) * 2021-12-29 2024-03-22 广州工商学院 Simulation teaching experience device based on multidimensional sensing technology

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201188667Y (en) * 2008-04-30 2009-01-28 北京工业大学 Binocular stereoscopic camera capable of self-adjusting base length
CN101843107A (en) * 2007-10-08 2010-09-22 斯特瑞欧比亚株式会社 OSMU(one source multi use)-type stereoscopic camera and method of making stereoscopic video content thereof
CN102165785A (en) * 2008-09-24 2011-08-24 富士胶片株式会社 Three-dimensional imaging device, method, and program
CN102665087A (en) * 2012-04-24 2012-09-12 浙江工业大学 Automatic shooting parameter adjusting system of three dimensional (3D) camera device
WO2012128178A1 (en) * 2011-03-18 2012-09-27 富士フイルム株式会社 Lens system for capturing stereoscopic images


Also Published As

Publication number Publication date
CN103888750A (en) 2014-06-25

Similar Documents

Publication Publication Date Title
CN103888750B (en) 3-dimensional image shoot control system and method
US11398042B2 (en) Method for displaying a virtual image, a virtual image display system and device, a non-transient computer-readable storage medium
US10187633B2 (en) Head-mountable display system
CN104661012B (en) Personal holographic 3 D displaying method and equipment
US11190756B2 (en) Head-mountable display system
US20190045125A1 (en) Virtual reality video processing
CN102289144A (en) Intelligent three-dimensional (3D) video camera equipment based on all-around vision sensor
US9161020B2 (en) 3D video shooting control system, 3D video shooting control method and program
JP2014501086A (en) Stereo image acquisition system and method
CN107545537A (en) A kind of method from dense point cloud generation 3D panoramic pictures
CN110915206A (en) Systems, methods, and software for generating a virtual three-dimensional image that appears to be projected in front of or above an electronic display
CN102325262A (en) Control system for stereo video camera
WO2018000892A1 (en) Imaging method, apparatus and system for panoramic stereo image
KR101670328B1 (en) The appratus and method of immersive media display and image control recognition using real-time image acquisition cameras
Thatte et al. Depth augmented stereo panorama for cinematic virtual reality with focus cues
JP5223096B2 (en) 3D video shooting control system, 3D video shooting control method, and program
CA2875252A1 (en) Three-dimensional moving picture photographing apparatus and camera
Hasmanda et al. The modelling of stereoscopic 3D scene acquisition
US10110876B1 (en) System and method for displaying images in 3-D stereo
CN201178472Y (en) Three-dimensional video taking apparatus capable of direct playing on ordinary video playing apparatus
KR101873161B1 (en) Method and apparatus for providing personal 3-dimensional image using convergence matching algorithm
Smith et al. Perception of size and shape in stereoscopic 3d imagery
CN206181270U (en) Novel 3D video mobile communication terminal
CN109429057A (en) Synchronous 3D panoramic video playing system
JP4856775B2 (en) 3D image presentation device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant