JP3353737B2 - 3D information input camera - Google Patents

3D information input camera

Info

Publication number
JP3353737B2
JP3353737B2 (granted from application JP08715699A)
Authority
JP
Japan
Prior art keywords
image
dimensional information
photographing
subject
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP08715699A
Other languages
Japanese (ja)
Other versions
JP2000283739A (en)
Inventor
Masataka Hamada (浜田正隆)
Original Assignee
Minolta Co., Ltd. (ミノルタ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Minolta Co., Ltd.
Priority to JP08715699A
Publication of JP2000283739A
Application granted
Publication of JP3353737B2
Anticipated expiration
Legal status: Expired - Fee Related

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a three-dimensional information input camera.

[0002]

2. Description of the Related Art

Conventionally, two approaches to three-dimensional information input are known: obtaining three-dimensional information from two images taken through a plurality of photographing lenses, and detecting a distance distribution by projecting light onto the object and applying the principle of triangulation, as illustrated in the drawings.

Further, as disclosed in, for example, JP-A-6-249624, there is a method of projecting a fringe pattern, capturing the pattern with a separate camera, and detecting the distance distribution by so-called triangulation. A method has also been proposed in which a grid pattern is projected onto the object and observed from a different angular direction; the projected grid deforms according to the undulation of the object, and the undulation is obtained from this deformation data (Journal of the Japan Society for Precision Engineering, 55, 10, 85 (1989)). Further, as shown in FIG. 17, there is a method in which a gray code pattern is projected instead of the grid pattern, and the light distribution is measured by a CCD camera.

[0004] To obtain three-dimensional information by these methods, either a plurality of images must be taken or the processing of the image information is laborious. Therefore, while they pose no problem for a measuring instrument, they are considered unsuitable for use in a camera.

As methods for obtaining three-dimensional information with high accuracy through short photographing times and subsequent calculation, the following have been proposed.

For example, as shown in FIG. 18 (source: "Optical Three-Dimensional Measurement", Toru Yoshizawa, New Technology Communications, p. 89, Fig. 5.2.12 (a)), a stripe pattern is projected onto the subject and observed from an angle determined by the design, and the distance distribution of the subject is detected from the deformation of the stripes caused by the unevenness of the subject. That is, the phase shift from the original fringe is calculated for the phase measured at each image point. This phase shift contains information on the height of the subject, so the distance distribution is obtained from the phase information together with triangulation. However, high accuracy is required for detection, and there is a limit to the density distribution and luminous intensity of the stripe pattern. A method has therefore been adopted in which the distance distribution of the subject is obtained from a plurality of captured images in which the position of the stripe pattern is slightly shifted: for example, four patterns phase-shifted by 0°, 90°, 180° and 270° are projected.
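The four-step phase-shift calculation mentioned above can be sketched as follows. This is a minimal illustration, not the patent's implementation: with captures I_k = A + B·cos(φ + k·π/2), the background A and modulation B cancel in the arctangent ratio.

```python
import numpy as np

def four_step_phase(i0, i90, i180, i270):
    # With I_k = A + B*cos(phi + k*pi/2):
    #   i0 - i180 = 2B*cos(phi),  i270 - i90 = 2B*sin(phi)
    # so background A and modulation B cancel out of the ratio.
    return np.arctan2(i270 - i90, i0 - i180)  # wrapped phase in (-pi, pi]

# Synthetic check with an arbitrary background and modulation.
phi = np.linspace(0.0, 4.0 * np.pi, 200)
A, B = 0.5, 0.4
i0, i90, i180, i270 = (A + B * np.cos(phi + k * np.pi / 2) for k in range(4))
recovered = four_step_phase(i0, i90, i180, i270)
err = np.angle(np.exp(1j * (recovered - phi)))  # agreement modulo 2*pi
```

In a real system the recovered phase is still wrapped to one fringe period and must be unwrapped before conversion to distance.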

An improvement on this method is to project a sine-wave stripe pattern onto the subject and detect the phase shift from another angle, as shown in FIGS. 19 and 20, for example. With this approach, the distance distribution at a large number of phase positions within one stripe can be detected in a single photographing.
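As a rough illustration of how a detected phase shift maps to surface height, the following sketch uses a simplified telecentric crossed-axes model (the fringe period and projection angle are assumed design values, not figures from the patent; real systems calibrate this mapping):

```python
import math

def height_from_phase_shift(delta_phi, fringe_period_mm, projection_angle_deg):
    """A phase shift of 2*pi corresponds to a lateral fringe displacement
    of one period on the reference plane; the oblique projection geometry
    converts that lateral shift into height = shift / tan(angle)."""
    lateral_shift = delta_phi / (2 * math.pi) * fringe_period_mm
    return lateral_shift / math.tan(math.radians(projection_angle_deg))

# A full-period shift of a 5 mm fringe projected at 45 degrees -> 5 mm of height.
h = height_from_phase_shift(2 * math.pi, 5.0, 45.0)
```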

[0008]

By the way, the above-mentioned fringe pattern projection methods presuppose use in a fixed setting such as a room, where the projection pattern can be made large enough to cover the object. A hand-held device, however, cannot use such a large projection pattern. A device in which the projection pattern unit is reduced in size for easy carrying is therefore conceivable.

In that case, however, the photographing distance and the photographing screen must be restricted. If the subject falls outside these restrictions, a picture can still be taken, but three-dimensional information cannot be obtained.

Accordingly, a technical problem to be solved by the present invention is to provide a three-dimensional information input camera which does not cause a three-dimensional information input error.

[0011]

SUMMARY OF THE INVENTION In order to solve the above technical problem, the present invention displays the three-dimensional information input conditions so as to prevent input errors of three-dimensional information, and provides a three-dimensional information input camera having the following configuration and basic features.

[0012] The three-dimensional information input camera includes imaging means for photographing a photographing area and projecting means for projecting pattern light onto the photographing area; the pattern formed on a subject in the photographing area by the projected pattern light is photographed by the imaging means. The camera is provided with restriction condition transmitting means for transmitting information on the restriction conditions under which three-dimensional information can be obtained by the projection of the pattern light. Outside these conditions, even if photographing is possible, three-dimensional information cannot be obtained.

In the above configuration, the restriction condition transmitting means transmits to the operator, by visual display or sound, information concerning the conditions for obtaining three-dimensional information by projecting the pattern light, such as the distance to the subject and the position of the subject relative to the pattern light, the shooting range in which three-dimensional information can be obtained, and whether or not the restriction conditions are actually satisfied. The operator can thus know whether the restriction conditions are satisfied, and can act to satisfy them.

Therefore, it is possible to prevent an input error of three-dimensional information.

[0015] Specifically, the restriction condition transmitting means is configured in various modes.

As a first aspect, the restriction condition transmitting means includes photographing area display means and inputtable area display means. The photographing area display means displays a photographed image of the photographing area. The inputtable area display means displays, within the image displayed by the photographing area display means, the inputtable area in which three-dimensional information can be obtained by projection of the pattern light if a subject is present there.

In the above configuration, the inputtable area is generally determined by the projection angle of view of the pattern light, the reach of the pattern light (light emission amount), and the imaging sensitivity of the imaging means (brightness of the lens, photographing time, etc.).

According to the above arrangement, whether or not three-dimensional information can be input can be determined from whether or not the subject is present in the inputtable area; if the subject is within the area, input is possible. Therefore, an input error of three-dimensional information can be prevented in a manner that is easy for the operator to understand.
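A simple one-dimensional sketch of how such an inputtable area could be derived from the camera and projector geometry (all angles, the baseline, and the parallel-axis simplification are illustrative assumptions, not values from the patent):

```python
import math

def inputtable_interval(distance_mm, baseline_mm,
                        cam_half_angle_deg, proj_half_angle_deg):
    """At a given subject distance, return the horizontal interval (mm,
    camera-axis coordinates) covered BOTH by the camera's field of view
    and by the projected pattern.  Camera at x=0, projector at
    x=baseline, both axes parallel.  Returns None when the two fields do
    not overlap, i.e. 3-D input is impossible at that distance."""
    cam_half = distance_mm * math.tan(math.radians(cam_half_angle_deg))
    proj_half = distance_mm * math.tan(math.radians(proj_half_angle_deg))
    lo = max(-cam_half, baseline_mm - proj_half)
    hi = min(cam_half, baseline_mm + proj_half)
    return (lo, hi) if lo < hi else None
```

At close range the baseline offset can push the pattern entirely out of the camera's view, which is exactly the situation the display is meant to warn about.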

According to a second aspect, the restriction condition transmitting means includes photographing area display means and inputtable partial area display means. The photographing area display means displays a photographed image of the photographing area. The inputtable partial area display means displays, among the partial areas where subjects are present in the captured image, the inputtable partial areas for which three-dimensional information can be obtained by projection of the pattern light.

According to the above configuration, three-dimensional information can be obtained for a subject in the input-capable partial area. Whether or not three-dimensional information can be input can be determined based on whether or not the input-capable partial area is displayed, and it can be determined for which subject the three-dimensional information can be input. Therefore, an input error of three-dimensional information can be prevented in a manner that is easy for the operator to understand.

In a third aspect, the restriction condition transmitting means includes a subject detection unit and a warning unit. The subject detection unit detects whether or not a subject exists within the effective distance range in which three-dimensional information can be obtained by projecting the pattern light. The warning unit issues a warning when the subject detection unit cannot detect a subject within the effective distance range.

According to the above configuration, when no subject exists within the effective distance range, the photographer can know that by a warning. The warning is transmitted to the operator by visual display or sound. The presence or absence of the warning indicates whether or not the input of the three-dimensional information is possible. At the time of the warning, if the subject is moved until the warning disappears, the three-dimensional information can be input. Therefore, an input error of three-dimensional information can be prevented in a manner that is easy for the operator to understand.
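The warning logic of this third aspect amounts to a range check on the measured subject distance. A minimal sketch, with a placeholder effective range (a real camera would derive it from flash reach and pattern contrast):

```python
def check_3d_input(subject_distance_mm, effective_range=(300, 2000)):
    """Return a warning string when the subject distance falls outside
    the range in which the pattern light yields usable 3-D information,
    else None (no warning -> input is possible)."""
    near, far = effective_range
    if subject_distance_mm < near:
        return "too close: move back until the warning disappears"
    if subject_distance_mm > far:
        return "too far: pattern light cannot reach the subject"
    return None
```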

In the first and second aspects, preferably, the photographing area display means is a liquid crystal display.

In the above configuration, the liquid crystal display (LCD) is thin and requires little power. It is therefore easy to reduce the size of a three-dimensional information input camera that includes the photographing area display means.

[0025] In the first aspect, preferably, the inputtable area display means changes the display size of the inputtable area within the photographed image according to a change in the photographing angle of view of the imaging means and/or a change in the projection angle of the pattern light.

According to the above arrangement, when the photographing angle of view of the imaging means and/or the projection angle of the pattern light changes and the inputtable area changes, the display size of the inputtable area in the photographed image changes accordingly. Therefore, even when the displayed size of the inputtable area changes with the angle of view, the range of the inputtable area remains known, so that three-dimensional information of the subject can still be input.

[0027] In the first aspect, preferably, pattern light projection angle changing means is provided which changes the projection angle of the pattern light according to a change in the photographing angle of view of the imaging means, so that the display of the inputtable area in the photographing area display image does not change.

According to the above arrangement, even if the angle of view is changed, the displayed size of the inputtable area in the captured image does not change, so the operator can bring the subject into the inputtable area merely by changing the angle of view, without moving. Operation therefore becomes easy.
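One way such a projection angle changing means could track the zoom is to hold the ratio of the projected width to the picture width constant, since the on-screen size of the inputtable area is proportional to tan(projection half angle)/tan(camera half angle). A sketch under that assumption (the coverage ratio is an invented design value):

```python
import math

def projection_half_angle_for_zoom(cam_half_angle_deg, coverage_ratio=0.8):
    """Choose the pattern-projection half angle so the projected area
    always fills the same fraction of the picture width, keeping the
    displayed inputtable area constant as the camera zooms."""
    t = coverage_ratio * math.tan(math.radians(cam_half_angle_deg))
    return math.degrees(math.atan(t))
```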

[0029]

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, a three-dimensional information input camera (hereinafter, referred to as a 3D camera) according to an embodiment of the present invention will be described with reference to the drawings.

As shown in the front view of FIG. 1, the 3D camera includes a stripe pattern projection unit 1, a box-shaped camera body 2, and a rectangular parallelepiped imaging unit 3 (shown by a thick line). The imaging unit 3 is detachable from the right side of the camera body 2 as viewed from the front.

The imaging unit 3 is provided with an imaging circuit having a CCD color area sensor 303 (see FIG. 5) at an appropriate position behind a zoom lens 301 with a macro function serving as the photographing lens. As in a silver halide lens-shutter camera, a light control circuit 304 (see FIG. 5), including a light control sensor 305 that receives the flash light reflected from the subject, is located at an appropriate position in the imaging unit 3. The unit further includes a distance measuring sensor AF for measuring the distance to the subject, and an optical finder 31.

Inside the imaging unit 3 are provided a zoom motor M1 (see FIG. 5) for changing the zoom ratio of the zoom lens 301 and moving the lens between the storage position and the photographing position, and a focus motor M2 (see FIG. 5) for focusing.

On the front surface of the camera body 2, a grip portion 4 is provided at an appropriate position at the left end, a built-in flash 5 at an appropriate upper position at the right end, and an IRDA port for infrared communication between the 3D camera and an external device (for example, another 3D camera or a personal computer). A shutter button 9 is provided on the upper surface of the camera body 2.

The stripe pattern projection unit 1 is located between the camera body 2 and the imaging unit 3, and houses the stripe pattern projector 501. The projector 501 is arranged so that the center of the stripe pattern lies at substantially the same height as the optical axis of the photographing lens 301, and so that the pattern direction is perpendicular to the direction away from the optical axis. Because three-dimensional information is fundamentally obtained by the principle of triangulation, the so-called base line length must be increased to secure accuracy; compared with an arrangement that is offset or tilted at an angle other than perpendicular, this arrangement aims to cover the subject with a small stripe pattern. Flash light is used here for projecting the stripe pattern, and a film is used for the stripe pattern itself.

As modifications, lamp light may be used for projection instead of flash light, and the stripe pattern is not limited to a film but may be a glass substrate bearing a pattern of pigment or dye.

As shown in the rear view of FIG. 2, the back of the camera body 2 carries an LCD display unit 10 for monitoring the captured image (corresponding to a viewfinder) and for displaying recorded images in playback. Below the LCD display unit 10 are key switches 521 to 526 for operating the 3D camera and the power switch PS of the camera body. To the left of the power switch PS are LED1, which lights when the power is ON, and the BUSY display LED2, which indicates that input to the camera is not accepted because the memory card is being accessed or photographing preparation is in progress.

The back of the camera body 2 also carries a photographing/playback mode setting switch 14 for switching between a "photographing mode" and a "playback mode". The photographing mode is for taking pictures; the playback mode reproduces a photographed image recorded on the memory card 8 and displays it on the LCD display unit 10. The photographing/playback mode setting switch 14 is a two-contact slide switch; for example, sliding it down sets the playback mode and sliding it up sets the photographing mode.

A four-way switch Z is provided at the upper right of the camera back. Pressing buttons Z1 and Z2 drives the zoom motor M1 (see FIG. 5) to perform zooming, and pressing Z4 performs exposure compensation.

An LCD button for turning the LCD display on and off is provided on the back of the imaging unit 3; each press toggles the display between on and off. For example, when photographing exclusively with the optical viewfinder 31, the LCD display is turned off to save power. For macro shooting, pressing the MACRO button drives the focus motor M2 and puts the photographing lens 301 into a state in which macro shooting is possible.

A flash power supply for projecting a stripe pattern, that is, a 3D flash power switch Z5 is arranged on the back side of the stripe pattern projection unit 1.

As shown in the side view of FIG. 3, the side surface of the 3D camera body 2 is provided with a DC input terminal and a video output terminal (Video) for outputting the contents of the liquid crystal display to an external video monitor.

As shown in the bottom view, the bottom surface of the camera body is provided with a battery loading chamber 18 and a card loading chamber 17 for the memory card 8, whose loading port is closed by a clamshell-type lid 15. The drive source of the 3D camera in this embodiment is a power battery formed by connecting four AA batteries in series. The bottom surface also carries a release lever Rel for releasing the engagement between the imaging unit 3 and the body 2, which are connected by the connector and a hook-shaped coupler.

A battery loading chamber 518 and a lid 515 are provided on the bottom surface of the stripe pattern projection unit 1 in the same manner as on the camera body 2; a flash battery separate from that of the camera body is used. A tripod screw 502 is also provided on the bottom surface of the stripe pattern projection unit 1. The tripod screw 502 is placed in the stripe pattern projection unit 1, which lies relatively close to the center, in view of the balance of the camera.

Next, the internal blocks of the imaging unit 3 will be described with reference to the block diagram of FIG. 5.

The CCD 303 photoelectrically converts the light image of the subject formed by the macro zoom lens 301 into image signals of R (red), G (green) and B (blue) color components (each a sequence of pixel signals received by the pixels) and outputs them. The timing generator 314 generates various timing pulses for controlling the drive circuit of the CCD 303.

Exposure control in the imaging unit 3 is performed by adjusting the exposure amount of the CCD 303, that is, the charge accumulation time of the CCD 303 corresponding to the shutter speed, since the stop is fixed. When the subject brightness is low and an appropriate shutter speed cannot be set, improper exposure due to underexposure is corrected by adjusting the level of the image signal output from the CCD 303. That is, at low luminance, exposure control combines the shutter speed with gain adjustment. The level adjustment of the image signal is performed in the AGC circuit, described later, within the signal processing circuit 313.
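The shutter-plus-gain combination described above can be sketched as a simple split of the required exposure (the maximum integration time is an illustrative assumption):

```python
def exposure_settings(required_exposure_s, max_integration_s=1/8):
    """Split a required total exposure into CCD integration time and AGC
    gain.  If the exposure fits within the longest allowed integration
    time, gain stays at 1.0; otherwise the shortfall is made up
    electronically, as in the low-luminance case described above."""
    if required_exposure_s <= max_integration_s:
        return required_exposure_s, 1.0
    gain = required_exposure_s / max_integration_s
    return max_integration_s, gain
```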

The timing generator 314 generates a drive control signal for the CCD 303 based on a reference clock transmitted from the timing control circuit 202 of the camera body 2. For example, it generates the integration start/end (exposure start/end) timing signals and clock signals such as the read control signals for the light-receiving signal of each pixel (horizontal synchronization signal, vertical synchronization signal, transfer signal, etc.), and outputs them to the CCD 303.

The signal processing circuit 313 performs predetermined analog signal processing on the image signal (an analog signal) output from the CCD 303. It has a CDS (correlated double sampling) circuit and an AGC (auto gain control) circuit: the CDS circuit reduces the noise of the image signal, and the level of the image signal is adjusted via the gain of the AGC circuit.

The light control circuit 304 controls the amount of light emitted from the built-in flash 5 during flash photography, under the overall control unit 211 of the camera body 2, to a predetermined value. In flash photography, the reflected flash light from the subject is received by the light control sensor 305 from the start of exposure, and when the amount of received light reaches the predetermined emission amount, the light control circuit 304 outputs an emission stop signal to the FL control circuit provided in the overall control unit 211. The FL control circuit forcibly stops the emission of the built-in flash 5 in response to this signal, whereby the emission amount of the built-in flash 5 is controlled to the predetermined value.

The 3D information input sequence will be described later; the information is obtained from two flash images, one with stripe pattern projection and one without. Ideally, the basic luminous intensity (see FIG. 20) is constant between the two images. When extracting phase information from the fringe pattern, the basic luminous intensity must be removed. Therefore, when taking the two pictures, the flash emission time is kept constant and no separate light control is performed. Note that the flash light control itself is handled by the overall control unit 211 of the camera body 2.
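The removal of the basic luminous intensity using the pattern-free image can be sketched as follows. This assumes a simple multiplicative model (pattern image = base × (1 + m·cos φ)) and equal flash exposures, as the text requires; the patent does not specify the exact computation:

```python
import numpy as np

def fringe_modulation(with_pattern, without_pattern, eps=1e-6):
    """Divide out the pattern-free flash image (surface albedo and flash
    falloff) from the stripe-pattern image, leaving m*cos(phi), from
    which the phase can then be extracted."""
    return with_pattern / np.maximum(without_pattern, eps) - 1.0
```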

As described above, the imaging unit 3 and the body 2 are electrically connected, via the stripe pattern projection unit 1, by the seven groups of connection terminals 334a to 334g provided on the mounting surface 334 of the imaging unit 3 and the seven groups of connection terminals 234a to 234g provided on the connection surface 233 of the body 2. The stripe pattern projection unit 1 and the body 2 are electrically connected by connection terminal 234h.

Next, the internal blocks of the camera body 2 will be described.

In the camera body 2, the A / D converter 205 converts each pixel signal of the image signal into a 10-bit digital signal.

The camera body 2 contains a timing control circuit 202 that generates the reference clock and the clocks for the timing generator 314 and the A/D converter 205. The timing control circuit 202 is controlled by the overall control unit 211.

The black level correction circuit 206 corrects the black level of the A/D-converted pixel signals (hereinafter, pixel data) to a reference black level. The WB circuit 207 performs level conversion of the pixel data of each of the R, G, and B color components so that the white balance is also adjusted after γ correction, using a level conversion table input from the overall control unit 211. The conversion coefficient (gradient of the characteristic) of each color component in the level conversion table is set by the overall control unit 211 for each captured image.

The γ correction circuit 208 corrects γ characteristics of pixel data.

The image memory 209 stores the pixel data output from the γ correction circuit 208 and has a storage capacity of one frame. That is, when the CCD 303 has pixels in n rows and m columns, the image memory 209 has a storage capacity for n × m pixels of pixel data, and each pixel datum is stored at the corresponding pixel position.

The VRAM 210 is a buffer memory for image data reproduced and displayed on the LCD display unit 10. The VRAM 210 has a storage capacity for image data corresponding to the number of pixels of the LCD display unit 10.

In the photographing standby state, each pixel datum of the image captured every 1/30 second by the imaging unit 3 is subjected to predetermined signal processing by the A/D converter 205 through the γ correction circuit 208, stored in the image memory 209, transferred to the VRAM 210 via the overall control unit 211, and displayed on the LCD display unit 10 (live view display). The photographer can thus visually check the subject image on the LCD display unit 10. In the playback mode, the image read from the memory card 8 is subjected to predetermined signal processing by the overall control unit 211 and then transferred to the VRAM 210 for display.

The card I / F 212 is an interface for writing image data to the memory card 8 and reading image data.

The flash control circuit 216 controls the light emission of the built-in flash 5. Based on the control signal from the overall control unit 211, it controls the presence or absence of emission, the emission amount, the emission timing, and the like, and it also acts on the emission stop signal STP input from the light control circuit 304.

The RTC 219 is a clock circuit for managing the shooting date and time, driven by a separate power source (not shown).

The operation section 250 is provided with the various switches and buttons described above.

The shutter button 9, as in a silver halide camera, is a two-stage switch capable of detecting a half-pressed state (S1) and a fully pressed state (S2). When the shutter button is set to state S1 in the standby state, distance information based on the measurement by the distance measuring sensor AF is input to the overall control unit 211. In accordance with an instruction from the overall control unit 211, the AF motor M2 is driven to move the photographing lens 301 to the in-focus position.

The overall control unit 211 is composed of a microcomputer, and organically controls the driving of each of the members inside the imaging unit 3 and the camera body 2 described above, thereby totally controlling the photographing operation of the 3D camera 1. This will be described with reference to the block diagram.

The overall control unit 211 includes a luminance determination unit 211a for setting the exposure control value (shutter speed, SS) and a shutter speed setting unit (SS setting unit) 211b.

The luminance determination unit 211a determines the brightness of the subject in the photographing standby state using the image captured by the CCD 303 every 1/30 second. That is, it determines the brightness of the subject using the image data continually updated and stored in the image memory 209.

The shutter speed setting section 211b sets the shutter speed (integration time of the CCD 303) based on the result of the brightness determination of the subject by the brightness determination section.

Further, the overall control unit 211 includes a filter unit 211f for filtering processing and a recorded image generation unit 211g for generating the thumbnail image and compressed image used in recording the photographed image, as well as a reproduced image generation unit 211h for generating the image reproduced on the LCD display unit 10.

The filter section 211f corrects high-frequency components of an image to be recorded by a digital filter to correct image quality related to contours.

The recorded image generation unit 211g reads pixel data from the image memory 209 and generates the thumbnail image and compressed image to be recorded on the memory card 8. For the thumbnail, it reads every eighth pixel in both the horizontal and vertical directions while scanning the image memory 209 in raster order, and sequentially transfers the data to the memory card 8, generating the thumbnail image while recording it.
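The every-eighth-pixel subsampling described above amounts to strided slicing of the image memory. A minimal sketch (note that 1600 × 1200 subsampled by 8 gives 200 × 150, so the patent's 80 × 60 thumbnail presumably involves a further step not detailed here):

```python
import numpy as np

def make_thumbnail(pixels):
    """Generate a thumbnail by reading every 8th pixel horizontally and
    vertically while scanning the image memory in raster order."""
    return pixels[::8, ::8]

thumb = make_thumbnail(np.zeros((1200, 1600), dtype=np.uint8))  # -> 150 x 200
```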

For the compressed image, the recorded image generation unit 211g reads all pixel data from the image memory 209, applies predetermined compression processing according to the JPEG method, such as two-dimensional DCT transform and Huffman coding, to generate compressed image data, and records the compressed image data in the main image area of the memory card 8.

In the 3D information input mode, since it is desirable not to perform JPEG compression, the data is treated as 1/1 compression when passing through the recorded image generation unit 211g.

In the photographing mode, when photographing is instructed by the shutter button 9, the overall control unit 211 generates a thumbnail image of the image taken into the image memory 209 and a compressed image compressed by the JPEG method at the compression ratio set by the compression ratio setting switch 12, and stores both in the memory card 8 together with tag information on the photographed image (frame number, exposure value, shutter speed, compression ratio, shooting date, flash on/off at shooting, scene information, image determination result, etc.).

In the 3D information input mode, as shown in FIG. 7, the 3D information of one subject is obtained from a pair of frames: the first image (a) is an image with the stripe pattern and the second image (b) is a normal image without the stripe pattern. A card that would normally hold 40 shots therefore holds 20 3D scenes.

Each frame recorded by the 3D camera consists of a tag portion, high-resolution image data (1600 × 1200 pixels) compressed in JPEG format, and image data for thumbnail display (80 × 60 pixels).

When the photographing/playback mode setting switch 14 is set to the playback mode, the image data with the largest frame number on the memory card 8 is read out, expanded by the reproduced image generation unit 211h, and transferred to the VRAM 210, so that the image with the largest frame number, that is, the most recently captured image, is displayed on the LCD display unit 10. Operating the UP switch Z3 displays an image with a larger frame number, and pressing the DOWN switch Z4 displays an image with a smaller frame number. However, image a taken in the 3D mode, that is, the image with the stripe pattern, is not displayed; only image b is displayed.

Next, the stripe pattern projection section 1 will be described. The internal circuit of the stripe pattern projection unit 1 operates when the 3D flash power switch Z5 is ON. While it is ON, the flash control circuit 216 of the camera body and the built-in flash 5 are placed in a non-operating state. The control circuit 514 of the stripe pattern projection unit 1 includes a circuit for operating the flash 505 of the stripe pattern projection unit 1 and a circuit for switching the stripe pattern. To switch the mask, a signal is sent to the mask motor M3 to move the pattern mask 530. A power supply circuit and a battery (not shown) are also arranged in the stripe pattern projection unit 1. The control circuit 514 also controls the zoom motor M4 of the flash.

The inside of the stripe pattern projection unit 1 is as shown in FIG. 14. It contains a xenon tube 531 that emits the flash light, a cylindrical concave lens 532 for projecting the pattern widely onto the subject, a mask pattern unit 530, a shaft 534 for rotating the mask pattern completely out of the projection window 533, and a motor (not shown) for rotating the shaft. The control circuit includes a capacitor for accumulating electric energy for the flash light, a switching IGBT for interrupting flash emission upon receiving a signal from the light control sensor 305, and the like. In the embodiment in which the flash is zoomed, the xenon tube 531 is moved in the optical axis direction of the cylindrical concave lens 532, as shown in FIG. 14. The zoom mechanism of the flash has the same configuration as that of a conventional zoom flash and is therefore omitted here.

On the other hand, when a configuration in which the flash is not zoomed is adopted, the xenon tube 531 shown in FIG. 14 is fixed, and the zoom motor M4 for the flash shown in FIG. 5 is omitted.

In FIG. 5, as the signal path for receiving the signal of the light control sensor and controlling the flash light amount, the flash light is controlled by the signal from the overall control circuit 211. Alternatively, the signal from the dimming circuit 304 may be input directly through the 334f terminal to control the flash emission time.

The mask pattern 530 is as shown in FIG. 8. The number of stripes is 10 to 30 (here, 13), and each stripe has a density distribution. Based on this density distribution, the phase shift of the received light can be detected, and a phase image, that is, a distance distribution image (three-dimensional image), can be obtained. Each density follows a triangular wave distributed, for example, from 20% to 70%. In principle, any combination of a monotonically increasing portion and a monotonically decreasing portion may be used; a sine wave or a Gaussian distribution per stripe is also possible. In FIG. 8A, the portion indicated by the symbol K is colored in order to identify which stripe (frequency component) it belongs to. As shown in FIG. 8B, the portion having a different density is the colored portion K.
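
As a rough illustration, the triangular-wave density profile described above can be generated as follows (pure-Python sketch; the 13 stripes and the 20% to 70% range are the example values from the text, while the sample count per stripe is an arbitrary assumption):

```python
# Sketch of the mask's density profile: each stripe is a triangular wave
# between 20% and 70% density, and 13 identical stripes are concatenated.
# samples_per_stripe is an illustrative assumption, not a patent value.

def triangular_stripe(n_samples, lo=0.20, hi=0.70):
    """One stripe: density rises linearly from lo to hi, then falls back."""
    half = n_samples // 2
    rise = [lo + (hi - lo) * i / half for i in range(half)]
    fall = [hi - (hi - lo) * i / half for i in range(n_samples - half)]
    return rise + fall

def mask_pattern(n_stripes=13, samples_per_stripe=20):
    """Concatenate identical triangular stripes into one 1-D mask profile."""
    profile = []
    for _ in range(n_stripes):
        profile.extend(triangular_stripe(samples_per_stripe))
    return profile

profile = mask_pattern()
print(len(profile), min(profile), max(profile))  # 260 0.2 0.7
```

A sine-wave or Gaussian stripe, which the text also allows, would only change `triangular_stripe`.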

Further, in order to improve the accuracy of position specification, a colored pattern may be placed, for example, in the center; in addition to stripes with gradation, markers using color information can be placed to improve the accuracy of the position information.

In order to accurately detect the phase shift of the stripe pattern projected on the subject, a contrast of about 50% is required across the entire density distribution. If the CCD sensor can detect a change of 5% (its S/N-limited detection ability), ten levels of density can be distinguished here. As the contrast increases, the resolution increases and the accuracy of the obtained 3D information improves.
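
The step count in this argument can be checked with one line of arithmetic (a sketch; the 50% contrast and 5% minimum detectable change are the values used above):

```python
# Number of distinguishable density levels = contrast / minimum detectable
# change of the sensor. round() guards against inexact float quotients.

def density_levels(contrast, min_detectable_change):
    return round(contrast / min_detectable_change)

print(density_levels(0.50, 0.05))  # 10 levels, as stated in the text
```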

Next, the operation of the three-dimensional information input camera will be described with reference to FIGS.

First, the main switch PS of the camera is turned on, and then the 3D flash switch Z5 is turned on (S1). Next, the 3D mode is set (S2). Here, the mode is set using the switches 521 to 526; alternatively, the mode may be set automatically when the 3D flash switch Z5 is turned on. If the circuit and its power are supplied only from the camera body, the settings may be made only by the switches 521 to 526.

When the mode is set, the BUSY display LED 2 is turned on (S3), and as shown in FIG. 10, the 3D photographable area display E is lit on the LCD display 10. This display shows the stripe-pattern projectable area. Then, charging of the 3D flash capacitor (not shown) is started (S4). When charging is completed (S5), the BUSY display LED 2 goes out (S6). Then, in S7, a release signal (shutter button 9 turned on) is awaited.

[0088] 3D photographing requires two continuous shots: one image is obtained with the stripe pattern, and one image is obtained without it.

When the release signal is received, photographing of the first image starts and the integration of the image sensor begins (S8). During this integration, the flash with the stripe pattern emits light, and a stripe pattern image is obtained. Here, the image with the stripe pattern is the first image, but it may instead be the second.

Next, in the camera body 2, the image data a (the image with the stripe pattern) is stored. On the other hand, unlike a general flash, the stripe pattern projection unit 1 does not begin recharging after flash emission (S31), and instead switches the mask pattern (S32).

When switching the mask pattern, the mask pattern 530 is retracted by the mask motor M3, as shown by the dotted line in FIG. 14. This retraction time is kept short so that the interval between the two shots is as small as possible and the displacement between the images is negligible even if the subject moves; the target is, for example, 100 ms or less, including the settling (bounce) of the mask 530.

This retraction by the mask motor M3 requires a large current. If flash charging were started at the same time, both would demand a large current; the motor M3 might then fail to move, the mask could not be retracted, and an image without the stripe pattern could not be obtained in the second shot. Therefore, simultaneous flash-capacitor charging and motor energization is avoided.
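
The interlock described here, never energizing the mask motor and the flash-charging circuit at the same time, can be sketched as a small guard (hypothetical class and method names, not from the patent):

```python
# Mutual exclusion of mask-motor drive and flash-capacitor charging:
# each operation refuses to start while the other is drawing current.

class ProjectionUnit:
    def __init__(self):
        self.charging = False
        self.motor_running = False

    def start_charging(self):
        if self.motor_running:
            raise RuntimeError("cannot charge while mask motor runs")
        self.charging = True

    def stop_charging(self):
        self.charging = False

    def run_mask_motor(self):
        if self.charging:
            raise RuntimeError("cannot move mask while charging")
        self.motor_running = True

    def stop_motor(self):
        self.motor_running = False

unit = ProjectionUnit()
unit.run_mask_motor()   # S32: switch mask (recharging inhibited, S31)
unit.stop_motor()       # S33: mask returned
unit.start_charging()   # S34: charging restarted only now
print("sequence ok")    # prints "sequence ok"
```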

After the pattern is switched, the second image pickup is started (S10). A flash light is similarly emitted (S11), and an image without the stripe pattern is obtained.

Then, the image b without the stripe pattern is stored in S22, and the image a with the stripe pattern and the image b without the stripe pattern are written to the memory card 8 in S23.

Here, the reason why the images a and b are written to the memory card 8 together is to shorten the interval between the two shots: writing the data one frame at a time takes time. That is, when the 3D mode is set, a mode in which data is written to the memory card 8 two frames at a time is used.

On the other hand, the stripe pattern projection unit 1 returns the mask pattern 530 that had been retracted (S33). Only here is charging of the 3D flash restarted (S34), and the BUSY display LED 2 is turned on again (S35).

If the ON state of the 3D flash switch Z5 continues, the process returns to S5 (S12).

On the other hand, the photographing preparation state before entering the photographing sequence will be described. When the subject to be photographed is captured with an LCD finder or the like, the photographable area is generally displayed over the entire finder. For the relation between three-dimensional information photographing and normal photographing, there are two approaches.

One is a method in which the flash is not zoomed with respect to the zoom of the photographing lens. The feature of this method is that, since the frequency of the stripe pattern on a given subject is constant, the accuracy of the three-dimensional information hardly changes. However, the shooting range on the viewfinder changes.

The other is a method of zooming the flash. In this case, the accuracy of the three-dimensional information changes, but the shooting range on the viewfinder does not. The reason for zooming the flash is that when the taking lens is wide-angle, a wide stripe pattern should be projected to match the lens, whereas when the taking lens is telephoto, a wide projection range is not needed, but the stripe pattern should be projected farther.

As shown in FIG. 14, when the taking lens is telephoto, the xenon tube 531 is at the solid-line position (position a); when the lens is wide-angle, it is at the chain-line position (position b). The stripe pattern projected on the subject is then as shown in FIG. 15: (a) shows position a (telephoto) and (b) shows position b (wide-angle). For example, the frequency of the stripe pattern on the projected subject at position b becomes 1/2 of that at position a, but since the magnification of the photographing lens also becomes 1/2, the frequency of the stripe pattern captured by the image sensor remains the same. Thus, the same angular resolution is obtained whether the lens is wide-angle or telephoto.
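
The compensation argument can be restated numerically (illustrative values; 13 stripes matches the mask described earlier):

```python
# Stripes captured across the frame = stripe frequency on the subject
# (cycles per metre) x width of the photographed field (metres).
# Halving the projected frequency while doubling the field width
# leaves the stripe count on the sensor unchanged.

def stripes_on_sensor(stripe_freq_per_m, field_width_m):
    return stripe_freq_per_m * field_width_m

tele = stripes_on_sensor(13.0, 1.0)  # position a: narrow field, dense pattern
wide = stripes_on_sensor(6.5, 2.0)   # position b: field and pattern both 2x wider
print(tele, wide)  # 13.0 13.0 -> same angular resolution either way
```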

When the photographing lens and the flash are zoomed simultaneously, the design can be made so that the area does not change; in that case, as shown in (b), the three-dimensional information inputtable area may be displayed by a fixed frame E.

On the other hand, when only the taking lens zooms, the three-dimensional information input area changes. FIGS. 11(a) to 11(c) show an example in which the three-dimensional information input area is displayed by a frame E of variable size. FIG. 11(a) shows a case where the focal length of the photographing lens is 7 mm, and FIG. 11(c) shows a case of a telephoto lens of 4 mm. Although the magnification on the screen changes, the three-dimensional information input area for the subject is the same.

Another problem concerning accuracy is the contrast of the stripe pattern image. The stripe pattern image on the subject has a contrast as shown in FIG. 9(b) when the distance is short, but the contrast decreases as the distance increases, as shown in FIG. 9(a). This is because the xenon tube has a finite size. If the contrast is reduced as in FIG. 9(a), the number of density steps cannot be maintained. For example, if the imaging system can only detect a 5% change, only five steps can be resolved in the range of 45% to 65%. Compared with FIG. 9(b), the ten steps are reduced to five in FIG. 9(a), and the three-dimensional information accuracy is halved. Therefore, there is a limit to information input somewhere, and the photographer must be informed of it.

For example, assuming that the three-dimensional information input limit is 1.5 m, a warning display F is issued for subjects beyond that limit, as shown in FIG. 11. The distance to the subject in the figure is 3 m, so "TOO FAR" is displayed. This distance determination is performed by the distance measuring sensor AF of the imaging unit body 3.

As another method, as shown in FIG. 11E, subjects at 3 m and 2 m cannot be input, while a subject at 1.5 m can; three-dimensional information input is indicated by surrounding the inputtable subject with the frame E. In the flash-zoom type, the limit distance is shorter when the stripe pattern projection is wide-angle; at that time the limit is, for example, 1 m, and the displayed threshold changes accordingly. The distance warning display is particularly effective when the flash has a zoom.
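
The warning decision can be sketched as follows (hypothetical helper; the 1.5 m and 1 m limits are the example values given in the text):

```python
# Decide whether to show the "TOO FAR" warning F from the AF-measured
# subject distance. The limit shortens when the stripe projection is wide.

def warning_display(subject_distance_m, flash_is_wide):
    """Return the warning text, or '' if 3D input is possible."""
    limit_m = 1.0 if flash_is_wide else 1.5  # example limits from the text
    return "TOO FAR" if subject_distance_m > limit_m else ""

print(warning_display(3.0, flash_is_wide=False))        # TOO FAR
print(repr(warning_display(1.2, flash_is_wide=False)))  # ''
print(warning_display(1.2, flash_is_wide=True))         # TOO FAR
```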

The above is the photographing operation of the camera. The data for obtaining 3D information is on the memory card 8. To reproduce it as a 3D image, the data is post-processed by a computer such as a personal computer. This processing follows the procedure shown in FIG. 13.

That is, after the memory card 8 is set in a personal computer (not shown), the data of the image a with the stripe pattern and the image b without the stripe pattern are read from the memory card 8 (D1, D2). The basic luminance information is extracted from the image a, and a basic luminance magnification n relative to the image b is obtained (D3). The basic luminance is the image data that does not depend on the stripe pattern.

Next, the basic luminance levels of the image a and the image b are matched to obtain only the stripe pattern information c (D4). Then, based on the stripe pattern information c, a phase image with normalized gain is extracted (D5).

In D6, the distance distribution of the subject is calculated from the phase image. At this time, since the positions of the individual stripes can be distinguished, the order of the stripe corresponding to each phase position can be specified accurately; in other words, the position of the projected pattern and the position of the pattern reflected from the subject can be matched exactly. In this way, the distance to the subject and the distance distribution are obtained as accurate information. When obtaining a three-dimensional image, only the distance-distribution information may be used.
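
Steps D3 and D4 can be sketched under simplifying assumptions (1-D images in pure Python; estimating the magnification n from the ratio of means is an illustrative choice, not the patent's stated method):

```python
# a: image with the stripe pattern, b: image without it.
# D3 extracts the basic luminance scale; D4 isolates the stripe signal c.

def stripe_component(a, b):
    n = sum(a) / sum(b)                           # D3: basic luminance magnification
    return [av - n * bv for av, bv in zip(a, b)]  # D4: stripe-only signal c

b = [100, 100, 100, 100]   # flat scene, no stripes
a = [90, 110, 90, 110]     # same scene with stripe modulation
c = stripe_component(a, b)
print(c)  # [-10.0, 10.0, -10.0, 10.0] -> zero-mean stripe signal
```

The phase extraction (D5) and triangulation to a distance distribution (D6) would then operate on `c`.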

The 3D camera described above displays the three-dimensional information input conditions (frame E, warning F) in order to prevent three-dimensional information input errors from occurring.

The present invention is not limited to the above embodiment, but can be implemented in various other modes.

For example, although an embodiment of a digital camera has been described, a silver halide camera may similarly capture two images, one with and one without the stripe pattern, on silver halide film, and a 3D image can be created by post-processing. In this case, if the film is developed, digitized by a film scanner, and loaded into a computer such as a personal computer, the post-processing is the same.

Further, the present invention is not limited to a three-dimensional information input camera that accumulates imaging information of the subject image both at the time of pattern projection and at the time of not projecting a pattern; it is also applicable to other types of three-dimensional information input camera.

[Brief description of the drawings]

FIG. 1 is a front view of a three-dimensional information input camera according to an embodiment of the present invention.

FIG. 2 is a rear view of the camera of FIG. 1;

FIG. 3 is a left side view of the camera shown in FIG. 1;

FIG. 4 is a bottom view of the camera of FIG. 1;

FIG. 5 is a circuit block diagram of the camera of FIG. 1;

FIG. 6 is a detailed block diagram of a main part of FIG. 5;

FIG. 7 is an explanatory diagram of a data array.

FIG. 8 is an explanatory diagram of a mask pattern.

FIG. 9 is an explanatory diagram of contrast of a mask pattern.

FIG. 10 is an explanatory diagram of an LCD display.

FIG. 11 is an explanatory diagram of an LCD display.

FIG. 12 is a flowchart of a photographing operation.

FIG. 13 is a flowchart of photographed image processing.

FIG. 14 is a configuration diagram of a flash unit.

FIG. 15 is an explanatory diagram of a mask pattern.

FIG. 16 is an explanatory diagram of a conventional example.

FIG. 17 is an explanatory diagram of a conventional example.

FIG. 18 is an explanatory diagram of a conventional example.

FIG. 19 is an explanatory diagram of a conventional example.

FIG. 20 is an explanatory diagram of a conventional example.

[Explanation of symbols]

 DESCRIPTION OF SYMBOLS 1 stripe pattern projection part (projection means) 2 camera body part (imaging means) 3 imaging part 4 grip part 5 built-in flash 8 memory card 9 shutter button 10 LCD display part (shooting area display means) 14 mode setting switch 15 cover 17 card loading chamber 18 battery loading chamber 31 optical viewfinder 301 zoom lens 302 imaging circuit 303 CCD color area sensor 304 dimming circuit 305 dimming sensor 501 stripe pattern projection unit 502 tripod screw 515 lid 518 battery loading chamber 521 to 526 key switches 530 mask pattern unit 531 xenon tube 532 concave lens 533 projection window 534 axis AF distance measuring sensor (subject detecting means) E frame (limit condition transmitting means, inputtable area display means) F warning display (limit condition transmitting means, warning means) M1 zoom motor M2 focus motor M3 mask motor M4 flash zoom motor PS power switch Rel release lever Z1 to Z4 4-way switch buttons Z5 3D flash power switch

Continuation of the front page (58) Field surveyed (Int.Cl. 7 , DB name) G01B 11/24 G06T 1/00 H04N 5/225 H04N 13/02

Claims (7)

(57) [Claims]
1. A three-dimensional information input camera comprising: an image pickup means for photographing a photographing region; and a projecting means for projecting pattern light onto the photographing region, the image pickup means photographing a projection pattern formed on a subject in the photographing region by the projected pattern light, the camera further comprising a restriction condition transmitting means for transmitting information relating to a restriction condition that makes it possible to obtain three-dimensional information by the projection of the pattern light, that is, information relating to the conditions under which, outside the restriction condition, three-dimensional information cannot be obtained even though photographing is possible.
2. The three-dimensional information input camera according to claim 1, further comprising a photographing area display means for displaying a photographed image of the photographing area, wherein the restriction condition transmitting means comprises an inputtable area display means for displaying the inputtable area in which three-dimensional information can be obtained by projecting the pattern light when a subject is present there.
3. The three-dimensional information input camera according to claim 1, further comprising a photographing area display means for displaying a photographed image of the photographing area, wherein the restriction condition transmitting means comprises an inputtable partial area display means for displaying, within the photographed image displayed by the photographing area display means, the inputtable partial area in which three-dimensional information of a partial area where a subject exists can be obtained by projecting the pattern light.
4. The three-dimensional information input camera according to claim 1, further comprising: a subject detecting means for detecting whether or not a subject exists within an effective distance range in which three-dimensional information can be obtained by projecting the pattern light; and a warning means that issues a warning when the subject detecting means cannot detect the presence of a subject within the effective distance range.
5. The three-dimensional information input camera according to claim 2, wherein the photographing area display means is a liquid crystal display.
6. The three-dimensional information input camera according to claim 2, wherein the size of the inputtable area displayed in the photographing area display image by the inputtable area display means changes in accordance with a change in the shooting angle of view of the image pickup means and/or a change in the projection angle of the pattern light.
7. The three-dimensional information input camera according to claim 2, further comprising a pattern light projection angle changing means that changes the projection angle of the pattern light in accordance with a change in the shooting angle of view of the image pickup means so that the display of the inputtable area in the photographing area display image does not change.
JP08715699A 1999-03-29 1999-03-29 3D information input camera Expired - Fee Related JP3353737B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP08715699A JP3353737B2 (en) 1999-03-29 1999-03-29 3D information input camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP08715699A JP3353737B2 (en) 1999-03-29 1999-03-29 3D information input camera

Publications (2)

Publication Number Publication Date
JP2000283739A JP2000283739A (en) 2000-10-13
JP3353737B2 true JP3353737B2 (en) 2002-12-03

Family

ID=13907137

Family Applications (1)

Application Number Title Priority Date Filing Date
JP08715699A Expired - Fee Related JP3353737B2 (en) 1999-03-29 1999-03-29 3D information input camera

Country Status (1)

Country Link
JP (1) JP3353737B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6915072B2 (en) 2002-10-23 2005-07-05 Olympus Corporation Finder, marker presentation member, and presentation method of positioning marker for calibration photography

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4501551B2 (en) * 2004-06-23 2010-07-14 富士ゼロックス株式会社 Three-dimensional shape measuring apparatus and method
US20060072122A1 (en) * 2004-09-30 2006-04-06 Qingying Hu Method and apparatus for measuring shape of an object
JP4380663B2 (en) 2006-06-08 2009-12-09 コニカミノルタセンシング株式会社 Three-dimensional shape measurement method, apparatus, and focus adjustment method
JP2012519277A (en) * 2009-02-27 2012-08-23 ボディー サーフェイス トランスレーションズ, インコーポレイテッド Physical parameter estimation using 3D display
JP5984422B2 (en) * 2012-02-23 2016-09-06 キヤノン株式会社 Imaging apparatus and imaging control method
JP6172904B2 (en) * 2012-09-05 2017-08-02 キヤノン株式会社 3D shape measuring apparatus, 3D shape measuring method, program, storage medium
JP6119232B2 (en) 2012-12-19 2017-04-26 株式会社ソシオネクスト Distance measuring device and distance measuring method

Also Published As

Publication number Publication date
JP2000283739A (en) 2000-10-13

Similar Documents

Publication Publication Date Title
JP4406937B2 (en) Imaging device
EP2352278B1 (en) Imaging apparatus, a focusing method and a program for executing such a method
US8208034B2 (en) Imaging apparatus
US7796169B2 (en) Image processing apparatus for correcting captured image
US7102686B1 (en) Image-capturing apparatus having multiple image capturing units
JP3534101B2 (en) Digital camera
US7148928B2 (en) Camera body and interchangeable lens of a digital camera with image-dependent color compensation
JP4198449B2 (en) Digital camera
JP5004726B2 (en) Imaging apparatus, lens unit, and control method
US7920782B2 (en) Imaging device
CN100550995C (en) Image sensing apparatus and control method thereof
US5550587A (en) White balance adjustment device for a still-video camera having an electronic flash
US7764321B2 (en) Distance measuring apparatus and method
JP4348118B2 (en) Solid-state imaging device and imaging device
JP5276308B2 (en) Imaging apparatus and control method thereof
JP3473552B2 (en) Digital still camera
JP4644883B2 (en) Imaging device
US20050195285A1 (en) Electronic still camera and method of image acquisition of electronic still camera
US20030020814A1 (en) Image capturing apparatus
JP2007264196A (en) Strobe control unit and method
US7071986B2 (en) Digital camera utilizing illumination from a flash light for focus detection and control
JP2014044345A (en) Imaging apparatus
US7804533B2 (en) Image sensing apparatus and correction method
JP4615458B2 (en) Exposure control method and imaging apparatus
JP3896505B2 (en) Electronic camera

Legal Events

Date Code Title Description
FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20070927

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20080927

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090927

Year of fee payment: 7

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100927

Year of fee payment: 8

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110927

Year of fee payment: 9

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120927

Year of fee payment: 10

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130927

Year of fee payment: 11

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

LAPS Cancellation because of no payment of annual fees