WO2016051871A1 - Multi-lens imaging device, method for manufacturing same, and information terminal device - Google Patents

Multi-lens imaging device, method for manufacturing same, and information terminal device Download PDF

Info

Publication number
WO2016051871A1
WO2016051871A1 (application PCT/JP2015/066616, JP2015066616W)
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
image
imaging module
lens
module
Prior art date
Application number
PCT/JP2015/066616
Other languages
French (fr)
Japanese (ja)
Inventor
慶延 岸根
達郎 岩崎
Original Assignee
富士フイルム株式会社 (FUJIFILM Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士フイルム株式会社 (FUJIFILM Corporation)
Publication of WO2016051871A1 publication Critical patent/WO2016051871A1/en

Links

Images

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/02 Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00 Stereoscopic photography
    • G03B35/08 Stereoscopic photography by simultaneous recording
    • G03B35/10 Stereoscopic photography by simultaneous recording having single camera with stereoscopic-base-defining system
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof

Definitions

  • The present invention relates to a multi-view imaging device, a manufacturing method thereof, and an information terminal device, and more particularly to a technique for arranging a plurality of imaging modules for estimating the distance to a subject.
  • A stereo camera estimates the distance to the subject from the positional deviation between the images captured by its two cameras. However, if there is an individual difference in the distortion shape of the two cameras' captured images, this difference shifts the subject's position within the captured images, so an error arises in the distance estimation.
  • Patent Document 1 describes a stereo camera that captures an initialization chart with two cameras, calculates correction parameters, and corrects the distortion accompanying the positional shift between the two cameras by image processing based on those parameters.
  • Patent Document 2 describes a stereo camera that corrects a positional shift caused by the relative position of two or more cameras, performing distortion correction with high accuracy while using only a small number of line buffers for the image processing.
  • The present invention has been made in view of such circumstances, and an object of the invention is to provide a multi-view imaging device that increases distance recognition accuracy by mechanically absorbing individual differences in the distortion shape of the imaging modules' captured images, a manufacturing method thereof, and an information terminal device.
  • To achieve the above object, one aspect of the multi-view imaging device is a device that photographs a subject with a plurality of imaging modules arranged in a uniaxial direction. Each imaging module has an imaging lens and an image sensor that converts the optical image of the subject received through the imaging lens into an image signal, and the image sensor is attached with its alignment adjusted relative to the imaging lens in order to correct the resolving power of the optical image. Let I be the height, in the direction orthogonal to the uniaxial direction in which the imaging modules are arranged, of the overlapping region between the image captured by a first imaging module serving as a reference and the image captured by a second imaging module different from the first, and let H be the height of the image captured by the first imaging module in the same orthogonal direction. Then the condition 96% < I/H × 100 < 99.8% is satisfied.
  • Preferably, I/H satisfies 98% < I/H × 100 < 99.8%, and more preferably 99% < I/H × 100 < 99.8%. When such a condition is satisfied, the distortion shapes of the images captured by the first and second imaging modules match even more closely, so the distance recognition accuracy can be increased.
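  • As a rough numerical illustration of the ratio condition above, the sketch below checks whether a given overlap height falls inside the claimed window. It is not part of the patent; the function name and the sample heights are assumptions for illustration.

```python
def overlap_ratio_ok(i_mm: float, h_mm: float,
                     lo: float = 96.0, hi: float = 99.8) -> bool:
    """Check the condition lo% < I/H * 100 < hi%.

    i_mm: height of the overlapping region, measured in the direction
          orthogonal to the axis along which the modules are lined up.
    h_mm: full image height of the reference (first) module in the
          same direction.
    """
    return lo < i_mm / h_mm * 100.0 < hi

# Example: an overlap height of 4.20 mm against a 4.29 mm image
# height gives about 97.9%, inside the loosest (96%, 99.8%) window
# but outside the tightest (99%, 99.8%) one.
print(overlap_ratio_ok(4.20, 4.29))           # True
print(overlap_ratio_ok(4.20, 4.29, lo=99.0))  # False
```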
  • Let the X axis be the line connecting the center of the light receiving surface of the first imaging module's image sensor with that of the second imaging module's image sensor, the Y axis be the axis on the first module's light receiving surface that passes through its center and is perpendicular to the X axis, and the Z axis be the axis that passes through that center and is orthogonal to both the X and Y axes. Then the tilt angle θ of the second imaging module's image sensor with respect to the Y axis, in the YZ plane passing through the center of that sensor's light receiving surface, preferably satisfies 0.1° < |θ| < 2°; more preferably 0.1° < |θ| < 1°, and still more preferably 0.1° < |θ| < 0.5°. When such a condition is satisfied, the distortion shapes of the two modules' images match, so the distance recognition accuracy can be increased.
  • Suppose a chart is placed at a predetermined distance L from the light receiving surface of the image sensors and is photographed by the first and second imaging modules. Within the overlapping region of the first image captured by the first imaging module and the second image captured by the second imaging module, let the chart mark at the center of the overlapping region be the central index and the chart marks around the overlapping region be peripheral indices. Let d millimeters be the distance between the centers of the light receiving surfaces of the two modules' image sensors, f millimeters the focal length of the imaging lenses, s millimeters the pixel pitch of the image sensors, and L = 1 meter. Then the difference Δp (in pixels) between the distance p1 from the central index to a peripheral index in the first image and the corresponding distance p2 in the second image preferably satisfies 0 < Δp < |0.1 × (d × f) / (1000 × s)|.
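  • The upper bound on Δp can be evaluated directly from the lens and sensor parameters. A minimal sketch follows; the function name is assumed, and the sample values are those used later in the embodiment.

```python
def delta_p_bound_px(d_mm: float, f_mm: float, s_mm: float) -> float:
    """Upper bound |0.1 * (d * f) / (1000 * s)| on the difference, in
    pixels, between centre-to-peripheral-index distances in the two
    images. 1000 is the chart distance L = 1 metre expressed in mm,
    and the factor 0.1 corresponds to 10% distance accuracy."""
    return abs(0.1 * (d_mm * f_mm) / (1000.0 * s_mm))

# With the embodiment's example values d = 50 mm, f = 3.5 mm,
# s = 0.0014 mm, the bound is 12.5 pixels.
print(delta_p_bound_px(50.0, 3.5, 0.0014))  # 12.5
```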
  • A distance measuring unit that measures the distance from the multi-view imaging device to the subject based on the first image captured by the first imaging module and the second image captured by the second imaging module may be provided. With this, the distance to the subject can be measured from the first and second images.
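  • Distance measurement from two such images ordinarily follows the standard stereo triangulation relation. The sketch below shows that general relation, not the patent's own implementation; the function name is an assumption.

```python
def subject_distance_mm(f_mm: float, d_mm: float,
                        s_mm: float, disparity_px: float) -> float:
    """Pinhole stereo relation Z = f * d / (s * disparity).

    f_mm:         focal length of the imaging lenses
    d_mm:         baseline between the sensors' light-receiving-surface centres
    s_mm:         pixel pitch of the image sensors
    disparity_px: shift of the same subject point between the first
                  and second images, in pixels
    Returns the subject distance Z in millimetres.
    """
    return f_mm * d_mm / (s_mm * disparity_px)

# With f = 3.5 mm, d = 50 mm, s = 0.0014 mm, a 125-pixel disparity
# corresponds to a subject 1000 mm away.
print(subject_distance_mm(3.5, 50.0, 0.0014, 125.0))  # 1000.0
```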
  • It is preferable to provide high-resolution image generation means for generating an image with a resolution higher than that of the image sensors, based on the first image captured by the first imaging module and the second image captured by the second imaging module. This makes it possible to acquire an image whose resolution exceeds that of the image sensors.
  • The multi-view imaging device preferably has two imaging modules (a binocular configuration). This allows two images to be acquired appropriately.
  • The multi-view imaging device may also be a trinocular system in which three imaging modules are arranged in the uniaxial direction, with the first imaging module at the center and second imaging modules at both ends. Using the central imaging module as the reference, three images can be acquired appropriately.
  • One aspect of the information terminal device comprises a multi-view imaging device that photographs a subject with a plurality of imaging modules arranged in a uniaxial direction, each module having an imaging lens and an image sensor that converts the optical image of the subject received through the imaging lens into an image signal, the image sensor being attached with its alignment adjusted to correct the resolving power of the optical image, and satisfying 96% < I/H × 100 < 99.8%, where I is the height of the overlapping region between the image captured by the reference first imaging module and the image captured by a different second imaging module, measured orthogonal to the uniaxial direction, and H is the height of the first module's image in that direction; together with communication means for transmitting and receiving digital data such as images.
  • One aspect of the method for manufacturing a multi-view imaging device comprises: fixing to a support member a first imaging module that includes a first imaging lens and a first image sensor converting the optical image of the subject received through the first imaging lens into an image signal, the first image sensor having been alignment-adjusted to correct the resolving power of the optical image; temporarily fixing a second imaging module that likewise includes a second imaging lens and an alignment-adjusted second image sensor; capturing images with the fixed first imaging module and the temporarily fixed second imaging module and calculating the distortion shape of each image; and adjusting the orientation of the second imaging module so that the calculated distortion shapes match, then fixing it. Since the height I of the overlapping region of the two modules' images, measured orthogonal to the uniaxial direction in which the first and second imaging modules are aligned, and the height H of the first module's image in that direction satisfy 96% < I/H × 100 < 99.8%, the distance recognition accuracy using these images can be increased.
  • According to the present invention, the distance recognition accuracy can be increased by mechanically absorbing individual differences in the distortion shape of the imaging modules' captured images.
  • FIG. 1A is a front perspective view of the multi-view camera 10.
  • FIG. 1B is a rear perspective view of the multi-view camera 10.
  • FIG. 2 is a front perspective view of the imaging module 50.
  • FIG. 3 is a diagram illustrating an internal configuration of the imaging module 50.
  • FIG. 4 is a block diagram illustrating an example of the electrical configuration of the multi-view camera 10.
  • FIG. 5 is a diagram for explaining a distortion generation mechanism of the imaging module 50.
  • FIG. 6 is a diagram for explaining a distortion correction method of the multi-view camera 10.
  • FIG. 7 is a flowchart showing the process of the manufacturing method of the multi-view camera 10.
  • FIG. 8A is a front perspective view of the fixing member 80.
  • FIG. 8B is a front perspective view of the fixing member 80.
  • FIG. 8C is a front perspective view of the fixing member 80.
  • FIG. 9A is a plan view of the fixing member 80.
  • FIG. 9B is a plan view of the fixing member 80.
  • FIG. 9C is a plan view of the fixing member 80.
  • FIG. 10A is a diagram for explaining chart photographing.
  • FIG. 10B is a diagram for explaining chart photographing.
  • FIG. 11 is a graph showing the distortion shape of the left and right viewpoint images.
  • FIG. 12 is a graph showing the distortion shape of the left and right viewpoint images.
  • FIG. 13A is a diagram illustrating an optical axis tilt angle of the left viewpoint imaging unit 30L with respect to the right viewpoint imaging unit 30R.
  • FIG. 13B is a diagram illustrating an optical axis tilt angle of the left viewpoint imaging unit 30L with respect to the right viewpoint imaging unit 30R.
  • FIG. 14A is a diagram illustrating image overlap between a left viewpoint image and a right viewpoint image.
  • FIG. 14B is a diagram illustrating image overlap between the left viewpoint image and the right viewpoint image.
  • FIG. 15 is a diagram illustrating calculation of a difference between the distortion shape of the left viewpoint image and the distortion shape of the right viewpoint image.
  • FIG. 16 is a diagram illustrating parameters of the imaging optical system of the multi-lens camera 10.
  • FIG. 17A is a front perspective view of the smartphone 100 with a multi-view camera.
  • FIG. 17B is a rear perspective view of the smartphone 100 with a multi-view camera.
  • FIG. 18 is a block diagram illustrating an example of an electrical configuration of the smartphone 100 with a multi-view camera.
  • The multi-view camera according to the present embodiment is a multi-view imaging device that photographs a subject with a plurality of imaging optical systems arranged in a uniaxial direction; here, a binocular camera that captures a stereoscopic image of the same subject viewed from two viewpoints is described as an example.
  • FIG. 1A is a front perspective view of the multi-view camera 10.
  • the multi-view camera 10 includes a housing 20, a shutter button 22, a left viewpoint imaging unit 30L, and a right viewpoint imaging unit 30R.
  • the housing 20 is formed in a substantially rectangular parallelepiped box shape, and the shutter button 22 is disposed on the upper surface of the housing 20 in the figure.
  • the shutter button 22 is an input means for making preparations for shooting such as focus lock and photometry when half-pressed and for capturing an image when fully pressed.
  • The left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R are arranged side by side in the horizontal direction (an example of the uniaxial direction) on the front of the housing 20, with the left viewpoint imaging unit 30L on the right side and the right viewpoint imaging unit 30R on the left side as seen in the figure.
  • FIG. 1B is a rear perspective view of the multi-view camera 10. As shown in the figure, a display unit 40 and a cross key 42 are arranged on the back of the multi-view camera 10.
  • the display unit 40 is a lenticular lens type 3D monitor capable of stereoscopically displaying a stereoscopic image captured by the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R.
  • the cross key 42 is an input means for the photographer to operate the multiview camera 10.
  • the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R of the multi-view camera 10 use the imaging modules 50 having the same shape.
  • the imaging module 50 includes an imaging lens 54 on the front surface of the housing 52.
  • the optical system of the imaging module 50 includes an imaging lens 54 and an imaging element 56.
  • the imaging lens 54 includes a convex lens 54-1, a convex lens 54-2, and a concave lens 54-3, and is set to a predetermined focal length f (unit: millimeter). Each lens is arranged so that a predetermined interval is provided and the respective optical axes coincide with each other. Subject light incident from a convex lens 54-1 located on the front side of the housing 52 enters the image sensor 56 via a convex lens 54-2 and a concave lens 54-3.
  • the image sensor 56 is composed of a two-dimensional color CCD (Charge-Coupled Device) solid-state image sensor.
  • A large number of photodiodes are two-dimensionally arranged at a predetermined pixel pitch s (unit: millimeters) on the light receiving surface of the image sensor 56, and a color filter is arranged in a predetermined pattern over each photodiode.
  • the optical image of the subject formed on the light receiving surface of the image sensor 56 via the imaging lens 54 is converted into signal charges corresponding to the amount of incident light by the photodiode.
  • The multi-view camera 10 includes blocks such as the left viewpoint imaging unit 30L, the right viewpoint imaging unit 30R, the display unit 40, a CPU (Central Processing Unit) 61, a ROM (Read Only Memory) 62, an operation unit 63, an imaging signal processing unit 65, a control bus 66, a data bus 67, a memory control unit 68, a main memory 69, a digital signal processing unit 70, a compression/decompression processing unit 71, an integration unit 72, a distance estimation unit 73, an external memory control unit 74, a display control unit 76, a camera shake correction control unit 77, and an angular velocity sensor 78.
  • Each block operates under the control of the CPU 61.
  • the CPU 61 controls each block by executing a control program based on the input from the operation unit 63.
  • the ROM 62 stores various data necessary for control.
  • the CPU 61 controls each block of the multi-lens camera 10 by reading the control program recorded in the ROM 62 into the main memory 69 and sequentially executing it.
  • The main memory 69 is composed of SDRAM (Synchronous Dynamic Random Access Memory) and is used as a program execution area, a temporary storage area for image data, and various work areas.
  • the operation unit 63 includes the shutter button 22 and the cross key 42 shown in FIG. 1 and outputs a signal corresponding to each operation to the CPU 61.
  • the left viewpoint imaging unit 30L includes an imaging lens 54L and an imaging element 56L.
  • the right viewpoint imaging unit 30R includes an imaging lens 54R and an imaging element 56R.
  • A large number of light receiving elements are two-dimensionally arranged at the pixel pitch s (unit: millimeters) on the light receiving surfaces of the image sensors 56L and 56R, and red (R), green (G), and blue (B) primary color filters (not shown) are arranged over the light receiving elements in a predetermined pattern. The subject light imaged on the light receiving surface is converted into an electrical signal by each light receiving element and accumulated.
  • the electrical signal accumulated in each light receiving element is read out to a vertical transfer path (not shown).
  • the vertical transfer path transfers this signal to a horizontal transfer path (not shown) line by line in synchronization with a clock supplied from an image sensor driving unit (not shown). Further, the horizontal transfer path outputs the signal for one line transferred from the vertical transfer path to the imaging signal processing unit 65 in synchronization with a clock supplied from an imaging element driving unit (not shown).
  • Output of the image signal starts when the multi-view camera 10 is set to the shooting mode; that is, an image signal begins to be output in order to display a live view image on the display unit 40. The output of the image signal for the live view image is temporarily stopped when main shooting is instructed, and restarts when the main shooting ends.
  • The live view image may be displayed based on the image signal captured by the left viewpoint imaging unit 30L, or may be displayed stereoscopically based on the image signals captured by the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R.
  • the captured image is displayed on the display unit 40 for a certain period of time (post view).
  • the user can confirm whether or not the captured image has been properly captured by confirming the postview.
  • the imaging signal processing unit 65 includes a correlated double sampling circuit, a clamp processing circuit, an automatic gain control circuit, and an analog / digital converter (not shown).
  • the correlated double sampling circuit removes noise contained in the image signal.
  • the clamp processing circuit performs processing for removing dark current components.
  • the automatic gain control circuit amplifies the image signal from which the dark current component has been removed with a predetermined gain corresponding to a set ISO (International Standards Organization) sensitivity (imaging sensitivity).
  • This image signal is so-called RAW data (raw image data), and has a gradation value indicating the density of R, G, and B for each pixel.
  • This digital image signal is stored in the main memory 69 via the data bus 67 and the memory control unit 68.
  • To the control bus 66 and the data bus 67 are connected the memory control unit 68, the digital signal processing unit 70, the compression/decompression processing unit 71, the integration unit 72, the distance estimation unit 73, the external memory control unit 74, the display control unit 76, and the like; these units can exchange information with one another via the data bus 67 based on control signals on the control bus 66.
  • The digital signal processing unit 70 performs predetermined signal processing on the R, G, and B image signals stored in the main memory 69 and outputs an image signal (Y/C signal) composed of a luminance signal Y and color difference signals Cr and Cb. The digital signal processing unit 70 may also function as high-resolution image generation means that generates, from the image signal captured by the left viewpoint imaging unit 30L and the image signal captured by the right viewpoint imaging unit 30R, an image with a resolution higher than that of the image sensors 56L and 56R.
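  • The patent does not specify the conversion matrix for the Y/C signal; a common choice is the ITU-R BT.601 transform, sketched here purely as an assumed illustration.

```python
import numpy as np

def rgb_to_ycbcr(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 float RGB image to Y, Cb, Cr planes using
    ITU-R BT.601 coefficients (an assumption; the patent only states
    that a luminance signal Y and colour-difference signals Cr and Cb
    are produced)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

# Usage: ycc = rgb_to_ycbcr(np.random.rand(480, 640, 3))
```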
  • The compression/decompression processing unit 71 compresses the input luminance signal Y and color difference signals Cr and Cb (Y/C signal) in a predetermined format (for example, JPEG (Joint Photographic Experts Group)) to generate compressed image data. In accordance with a decompression command from the CPU 61, it also decompresses input compressed image data in the predetermined format to generate uncompressed image data.
  • The integration unit 72 takes in the R, G, and B image signals stored in the main memory 69 in accordance with a command from the CPU 61 and calculates an integrated value necessary for AE control. The CPU 61 calculates a luminance value from the integrated value, obtains an exposure value from the luminance value, and then determines the aperture value and the shutter speed from the exposure value according to a predetermined program diagram.
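  • As a hedged sketch of this AE chain: a metered luminance can be converted to an exposure value with the usual reflected-light relation EV = log2(L·S/K), and a program diagram then maps the EV to an aperture/shutter pair. The constants and the deliberately simple program line below are illustrative assumptions, not the patent's actual program diagram.

```python
import math

def exposure_value(luminance_cd_m2: float, iso: float = 100.0,
                   k: float = 12.5) -> float:
    """Reflected-light metering relation EV = log2(L * S / K);
    K = 12.5 is a typical calibration constant."""
    return math.log2(luminance_cd_m2 * iso / k)

def program_point(ev: float) -> tuple[float, float]:
    """Toy program diagram: hold the aperture at f/2.8 and derive the
    shutter time from EV = log2(N^2 / t). A real program diagram
    traces a curve through aperture/shutter combinations."""
    n = 2.8
    t = n ** 2 / 2 ** ev
    return n, t

ev = exposure_value(400.0)  # a bright indoor scene
print(program_point(ev))    # (2.8, roughly 1/400 s)
```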
  • The distance estimation unit 73 (an example of a distance measurement unit and an example of a parallelization processing unit) performs parallelization processing on the image captured by the left viewpoint imaging unit 30L and the image captured by the right viewpoint imaging unit 30R, and estimates the distance from the multi-view camera 10 to the subject based on the parallelized images. The parallelization processing converts both images so that they are equivalent to images captured by a parallel optical system. The deviation amounts in the optical axis directions of the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R necessary for the parallelization processing are stored in advance in the ROM 62.
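  • One conventional way to realize such parallelization (rectification) outside the patent is OpenCV's stereo rectification pipeline. The sketch below illustrates that general technique; the patent itself instead works from the axis deviation amounts stored in ROM.

```python
import cv2

def rectify_pair(img_l, img_r, k_l, dist_l, k_r, dist_r, rot, trans):
    """Rectify a stereo pair so both images correspond to a parallel
    optical system. k_l/k_r are 3x3 intrinsic matrices, dist_l/dist_r
    distortion coefficients, and rot/trans the pose of the second
    camera relative to the first (all assumed known from calibration).
    """
    size = (img_l.shape[1], img_l.shape[0])
    r1, r2, p1, p2, q, _, _ = cv2.stereoRectify(
        k_l, dist_l, k_r, dist_r, size, rot, trans)
    m1x, m1y = cv2.initUndistortRectifyMap(
        k_l, dist_l, r1, p1, size, cv2.CV_32FC1)
    m2x, m2y = cv2.initUndistortRectifyMap(
        k_r, dist_r, r2, p2, size, cv2.CV_32FC1)
    return (cv2.remap(img_l, m1x, m1y, cv2.INTER_LINEAR),
            cv2.remap(img_r, m2x, m2y, cv2.INTER_LINEAR))
```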
  • the external memory control unit 74 controls reading / writing of data with respect to the recording medium 75 in accordance with a command from the CPU 61.
  • The recording medium 75 may be detachable from the main body of the multi-view camera 10, such as a memory card, or may be built into the main body. When it is detachable, a card slot is provided in the main body, and the recording medium is used by being loaded into the card slot.
  • the display control unit 76 controls display on the display unit 40 in accordance with a command from the CPU 61.
  • The display unit 40 is a lenticular-lens-type 3D monitor as described above, and can display the left viewpoint image captured by the left viewpoint imaging unit 30L and the right viewpoint image captured by the right viewpoint imaging unit 30R simultaneously with directivity.
  • The display unit 40 displays a moving image (live view image) and can be used as an electronic viewfinder; it can also display a captured image before recording (postview image), a reproduced image read from the recording medium 75, and the like.
  • the camera shake correction control unit 77 corrects camera shake of an image captured by moving the image sensors 56L and 56R in a plane orthogonal to the optical axis direction in accordance with camera shake accompanying the shooting operation of the multi-lens camera 10.
  • the angular velocity sensor 78 is detection means for detecting camera shake occurring in the multi-lens camera 10 and outputs a signal representing the angular velocity of the multi-lens camera 10 with respect to the horizontal direction and the vertical direction.
  • the angular velocity signals in the horizontal direction and the vertical direction detected by the angular velocity sensor 78 are input to the CPU 61 via the data bus 67.
  • The CPU 61 calculates the horizontal and vertical movement amounts of the image sensors 56L and 56R from the input angular velocity signals and outputs the movement amounts to the camera shake correction control unit 77.
  • the camera shake correction control unit 77 shifts (moves) the image sensors 56L and 56R while keeping the direction of the light receiving surface constant by the horizontal and vertical movement amounts acquired from the CPU 61. By moving the imaging elements 56L and 56R in this way, it is possible to correct camera shake of the left viewpoint image captured by the left viewpoint imaging unit 30L and the right viewpoint image captured by the right viewpoint imaging unit 30R.
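  • A small-angle sketch of the sensor-shift computation: a rotation of roughly ω·dt displaces the image on the sensor by about f·tan(ω·dt), which divided by the pixel pitch gives the shift in pixels. This model and its constants are assumptions for illustration, not the patent's exact computation.

```python
import math

def sensor_shift_px(omega_rad_s: float, dt_s: float,
                    f_mm: float, s_mm: float) -> float:
    """Sensor shift, in pixels, that compensates a small rotation of
    angle omega * dt about an axis through the lens."""
    angle_rad = omega_rad_s * dt_s
    return f_mm * math.tan(angle_rad) / s_mm

# 5 deg/s of shake integrated over 10 ms with f = 3.5 mm and
# s = 0.0014 mm calls for a shift of about 2.2 pixels.
print(sensor_shift_px(math.radians(5.0), 0.01, 3.5, 0.0014))
```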
  • FIG. 5A is a diagram showing how the imaging module 50 captures a subject.
  • the optical axes of the convex lens 54-1, the convex lens 54-2, and the concave lens 54-3 are all coincident.
  • the relationship between the subject light and the optical image of the subject is shown.
  • FIG. 5B shows an example of the optical image of the subject formed on the image sensor 56 of the imaging module 50 shown in FIG. 5A. No distortion arises in the captured image.
  • FIG. 5C is a diagram showing a state of photographing in the imaging module 50 in which the lens is decentered due to a manufacturing error.
  • Here, the optical axis of the concave lens 54-3 is decentered vertically upward relative to the common optical axis of the convex lenses 54-1 and 54-2. As a result, the upper side of the image sensor in the vertical direction (which receives the optical image of the lower part of the subject) is in a front-focus state, in which the in-focus position lies in front of the subject, and the lower side of the image sensor is in a rear-focus state, in which the in-focus position lies behind the subject.
  • FIG. 5D shows an example of the optical image of the subject formed on the image sensor 56 of the imaging module 50 shown in FIG. 5C. The subject is in focus at the center in the vertical direction, but above and below that the image becomes increasingly blurred with distance from the vertical center.
  • In the imaging module 50, therefore, the image sensor 56 is alignment-adjusted with respect to the optical axis of the imaging lens 54, and the resolving power of the image formed on the image sensor 56 is corrected.
  • FIG. 5E is a diagram illustrating a state in which the subject is photographed by the imaging module 50 that has been subjected to the alignment adjustment.
  • In the imaging module 50 shown in FIG. 5E, the image sensor 56 is tilt-adjusted about a horizontal line passing through the center of its light receiving surface as a fulcrum, so that the upper side of the image sensor 56 in the vertical direction approaches the subject and the lower side moves away from it.
  • For alignment adjustment, reference can be made to, for example, Japanese Patent Application Laid-Open No. 2010-21985.
  • FIG. 5F shows an example of the optical image of the subject formed on the image sensor 56 of the imaging module 50 shown in FIG. 5E. An image with corrected resolving power can thus be captured. However, an image captured by the imaging module 50 adjusted in this way has a distorted shape; here, a trapezoidal distortion arises.
  • [Distortion shape correction method] FIG. 6A is a diagram illustrating the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R of the multi-view camera 10 and the subject captured by the multi-view camera 10.
  • the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R are arranged so that the optical axes of the imaging lenses 54L and 54R are parallel to each other.
  • When the imaging module 50 used as the left viewpoint imaging unit 30L and the imaging module 50 used as the right viewpoint imaging unit 30R are both in the ideal state shown in FIG. 5A, the left viewpoint image captured by the left viewpoint imaging unit 30L and the right viewpoint image captured by the right viewpoint imaging unit 30R have the same distortion shape, as shown in FIG. 6B. The multi-view camera 10 can therefore estimate accurate distance information to the subject using the left and right viewpoint images.
  • By contrast, suppose the imaging module 50 used as the right viewpoint imaging unit 30R is in the ideal state of FIG. 5A while the imaging module 50 used as the left viewpoint imaging unit 30L is in the alignment-adjusted state of FIG. 5E. Then the distortion shapes of the left and right viewpoint images do not match, as shown in FIG. 6C. When the multi-view camera 10 estimates distance information to the subject using these images, the same subject is positionally shifted between the left and right viewpoint images, so an error arises in the estimated distance information.
  • Here, the distance resolution of the stereo pair is (distance to subject)^2 / {(focal length of the imaging lens ÷ pixel pitch of the image sensor) × camera interval}, where (distance to subject)^2 denotes the square of the distance to the subject. With a distance to the subject of 1000 millimeters, an imaging lens focal length of 3.5 millimeters, an image sensor pixel pitch of 1.4 micrometers (0.0014 millimeters), and a camera interval of 50 millimeters, the distance resolution is 1000^2 / {(3.5 ÷ 0.0014) × 50} = 8.0 millimeters. Therefore, even when the distortion shapes of the left and right viewpoint images match, the limit of separation is whether the measured distance is 1000 millimeters or 1008 millimeters.
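  • The worked example above translates directly into code; a minimal sketch (the function name is assumed):

```python
def distance_resolution_mm(z_mm: float, f_mm: float,
                           s_mm: float, d_mm: float) -> float:
    """Depth change corresponding to a one-pixel disparity change:
    (distance to subject)^2 / ((focal length / pixel pitch) * baseline)."""
    return z_mm ** 2 / ((f_mm / s_mm) * d_mm)

# The text's example: 1000 mm subject distance, 3.5 mm focal length,
# 0.0014 mm pixel pitch, 50 mm camera interval -> 8.0 mm resolution.
print(distance_resolution_mm(1000.0, 3.5, 0.0014, 50.0))  # 8.0
```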
  • the distortion shape of the left and right viewpoint images can be matched by actually measuring the distortion shape of the left and right viewpoint images and deforming both viewpoint images or one of the viewpoint images by image processing.
  • FIG. 6D illustrates a left and right viewpoint image in which the left viewpoint image captured by the left viewpoint imaging unit 30L is deformed to match the distortion shape.
  • the multi-lens camera 10 can estimate accurate distance information to the subject by using the left and right viewpoint images after the image processing.
  • the left and right viewpoint images are mechanically matched in distortion by providing an angle between the optical axis of the left viewpoint imaging unit 30L and the optical axis of the right viewpoint imaging unit 30R.
  • FIG. 6E shows the optical axis of the left viewpoint imaging unit 30L and the optical axis of the right viewpoint imaging unit 30R by fixing the right viewpoint imaging unit 30R as a reference and tilting the left viewpoint imaging unit 30L in the vertical direction. Shows a state in which an angle is given in the vertical direction.
  • (f) of FIG. 6 shows the left and right viewpoint images captured with this angle given; the distortion shapes of the two images match.
  • First, two imaging modules 50 and the fixing member 80 are prepared, and the imaging module 50 (an example of the first imaging module) used as the reference right viewpoint imaging unit 30R is fixed to the fixing member 80 (step S1, fixing step). It is assumed that the resolving power of the optical image has been corrected in each of the two imaging modules 50 by alignment adjustment.
  • The fixing member 80 (an example of a support member) is a plate-like member for fixing the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R, and is provided with rectangular holes 80R and 80L as shown in FIGS. 8A and 9A. As shown in FIGS. 8B and 9B, the imaging module 50 used as the right viewpoint imaging unit 30R is bonded and fixed so as to close the hole 80R.
  • the imaging module 50 (an example of the second imaging module) used as the left viewpoint imaging unit 30L is temporarily fixed to the fixing member 80 (step S2, temporary fixing step).
  • the imaging module 50 used as the left viewpoint imaging unit 30L is supported by a support member 82 and temporarily fixed so as to close the hole 80L as shown in FIGS. 8C and 9C.
  • The left viewpoint imaging unit 30L is temporarily fixed so that the optical axis of the imaging lens 54R (an example of the first imaging lens) of the right viewpoint imaging unit 30R and the optical axis of the imaging lens 54L (an example of the second imaging lens) of the left viewpoint imaging unit 30L are parallel to each other.
  • the chart according to the present embodiment is a dot pattern in which dots of a certain size are arranged at regular intervals, as shown in FIG. 10A.
  • the chart is not limited to a dot pattern, and a cross pattern, a lattice pattern, a chess board pattern, or the like can be used.
  • the chart is captured by the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R, and a left viewpoint image and a right viewpoint image (an example of an image) are acquired.
  • Let the X axis be the line connecting the center of the light receiving surface of the image sensor 56L with that of the image sensor 56R, the Y axis be the axis on the light receiving surface of the image sensor 56R that passes through its center and is perpendicular to the X axis, and the Z axis be the axis that passes through the center of the light receiving surface of the image sensor 56R and is orthogonal to the X and Y axes. Since the optical axes of the imaging lens 54L and the imaging lens 54R are parallel at this point, the imaging range of the left viewpoint imaging unit 30L and that of the right viewpoint imaging unit 30R are shifted in the X direction but coincide in the Y direction.
  • the distortion shape of the left viewpoint image and the distortion shape of the right viewpoint image are calculated from the acquired left viewpoint image and right viewpoint image (step S4, calculation step).
  • FIG. 11A is a graph showing the distortion shape of the left viewpoint image, and FIG. 11B is a graph showing the distortion shape of the right viewpoint image. In each graph, the horizontal axis is the distance from the center of the viewpoint image in percent, and the vertical axis is the distortion amount in pixels (positive values indicate a shift toward the outside of the screen, negative values a shift toward the screen center); the distortion over the entire image is plotted. Since the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R have each been alignment-adjusted individually, the distortion shape of the left viewpoint image differs from that of the right viewpoint image.
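  • A sketch of how such a distortion profile could be computed from the photographed dot chart, assuming dot detection has already produced measured and ideal (distortion-free) dot positions; the names and conventions are illustrative assumptions.

```python
import numpy as np

def distortion_profile(measured_px: np.ndarray, ideal_px: np.ndarray,
                       centre_px: np.ndarray):
    """Per-dot distortion in the plotting convention of FIG. 11:
    positive values mean the dot is shifted outward from the image
    centre, negative values toward it.

    measured_px: N x 2 detected dot positions in the viewpoint image
    ideal_px:    N x 2 positions the same dots would occupy with zero
                 distortion
    centre_px:   length-2 image centre
    Returns (radial position as a fraction of the maximum, signed
    distortion in pixels) for each dot.
    """
    r_ideal = np.linalg.norm(ideal_px - centre_px, axis=1)
    r_measured = np.linalg.norm(measured_px - centre_px, axis=1)
    return r_ideal / r_ideal.max(), r_measured - r_ideal
```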
  • Next, the optical axis tilt angle of the left viewpoint imaging unit 30L needed to match the distortion shape of the left viewpoint image to that of the right viewpoint image is calculated (step S5, calculation step). That is, the difference between the distortion shapes calculated in step S4 is computed, and the tilt angle of the optical axis of the left viewpoint imaging unit 30L relative to the right viewpoint imaging unit 30R that absorbs this difference is calculated.
  • the relationship between the tilt angle of the optical axis and the amount of change in the distortion shape is stored in advance as a table.
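  • Consulting such a table might look like the sketch below; the table values are placeholders rather than patent data, and the linear interpolation is an assumption.

```python
import numpy as np

# Hypothetical pre-measured table: distortion-shape difference at the
# image edge (pixels) produced by each optical-axis tilt (degrees).
TILT_DEG = np.array([0.0, 0.1, 0.25, 0.5, 1.0, 2.0])
SHAPE_DIFF_PX = np.array([0.0, 0.8, 2.0, 4.1, 8.5, 17.0])

def tilt_for_difference(diff_px: float) -> float:
    """Interpolate the tilt angle of the second module's optical axis
    that absorbs a measured distortion-shape difference (step S5)."""
    return float(np.interp(diff_px, SHAPE_DIFF_PX, TILT_DEG))

print(tilt_for_difference(3.0))  # about 0.37 degrees
```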
  • the left viewpoint imaging unit 30L is tilted by the support member 82 by the calculated tilt angle of the optical axis, and the left viewpoint imaging unit 30L is bonded and fixed to the fixing member 80 in this tilted state (step S6, adjustment process).
  • the fixing member 80 to which the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R are fixed is attached to the housing 20 of the multiview camera 10 (step S7).
  • (a) and (b) of FIG. 12 are graphs showing the distortion shape of the left viewpoint image and the distortion shape of the right viewpoint image in the multi-view camera 10 manufactured as described above. As shown in the figures, the distortion shape of the left viewpoint image matches that of the right viewpoint image. In this way, a multi-view camera 10 that mechanically absorbs the individual differences in the distortion shape of the left and right viewpoint images can be manufactured.
  • FIG. 13A shows the optical axis of the imaging lens 54L of the left viewpoint imaging unit 30L and the optical axis of the imaging lens 54R of the right viewpoint imaging unit 30R. As shown in the figure, the two optical axes face directions that differ by an angle θ1 in the YZ plane. Further, the light receiving surface of the image sensor 56L and the light receiving surface of the image sensor 56R face directions that differ by an angle θ2 in the YZ plane. This angle θ2 satisfies 0.1° < |θ2| < 2°, preferably 0.1° < |θ2| < 1°, and more preferably 0.1° < |θ2| < 0.5°.
  • In the multi-view camera 10, the angle θ1 produces a difference in the Y direction between the imaging range of the left viewpoint imaging unit 30L and that of the right viewpoint imaging unit 30R, as shown in FIG. 14A.
  • When the height in the Y direction of the overlapping region between the left viewpoint image captured by the left viewpoint imaging unit 30L and the right viewpoint image captured by the right viewpoint imaging unit 30R is I, and the height of the right viewpoint image in the Y direction is H, the ratio I/H of the overlapping region to the viewpoint image satisfies 96% < I/H × 100 < 99.8%, more preferably 98% < I/H × 100 < 99.8%, and still more preferably 99% < I/H × 100 < 99.8%.
  • The height of the overlapping region in the Y direction can be changed by shifting one or both of the image sensor 56L of the left viewpoint imaging unit 30L and the image sensor 56R of the right viewpoint imaging unit 30R in the Y direction with the camera shake correction control unit 77.
  • The multi-view camera 10 then captures the chart shown in FIG. 10, with the distance L between the multi-view camera 10 and the chart set to 1 meter. The shooting distance is set to 1 meter (1000 millimeters) because smartphones, the main application of small camera modules, often shoot at a subject distance of about 1 meter, so such modules are commonly designed and manufactured so that the distance measurement accuracy is highest at 1 meter.
  • FIG. 15A shows the left viewpoint image (an example of the second image) captured by the left viewpoint imaging unit 30L, the right viewpoint image (an example of the first image) captured by the right viewpoint imaging unit 30R, and their overlapping region.
  • a dot arranged at the center of the overlapping area is referred to as a center index O
  • dots arranged around the overlapping area are referred to as a peripheral index A, a peripheral index B, a peripheral index C, and a peripheral index D.
  • the dots arranged at the four corners (an example of at least four locations) of the overlapping area are used as the peripheral index A, the peripheral index B, the peripheral index C, and the peripheral index D.
  • The distances from the center index O to the peripheral indices A, B, C, and D in the left viewpoint image are denoted P A 1, P B 1, P C 1, and P D 1, respectively, and the distances from the center index O to the peripheral indices A, B, C, and D in the right viewpoint image are denoted P A 2, P B 2, P C 2, and P D 2.
  • Let the focal length of the imaging lenses 54L and 54R of the multi-view camera 10 be f millimeters, the distance between the center of the light receiving surface of the image sensor 56L and that of the image sensor 56R be d millimeters, and the pixel pitch of the image sensors 56L and 56R be s millimeters.
  • Then, with the difference between the distance P A 1 and the distance P A 2 denoted ΔP A (pixels), the difference between P B 1 and P B 2 denoted ΔP B (pixels), the difference between P C 1 and P C 2 denoted ΔP C (pixels), and the difference between P D 1 and P D 2 denoted ΔP D (pixels), each difference satisfies 0 < ΔP < |0.1 × (d × f) / (1000 × s)|, where the factor 0.1 corresponds to a distance measurement accuracy of 10%.
  • The peripheral indices may be set to dots other than those at the four corners of the overlapping region, and may be set at five or more locations.
  • a smartphone 100 with a multi-lens camera according to the present embodiment is a trinocular camera including three imaging optical systems, and is configured to be able to shoot a stereoscopic image when the same subject is viewed from three viewpoints.
  • As each of the imaging units, the imaging module 50 (see FIG. 2) in which the alignment has been adjusted is used.
  • FIG. 17A is a front perspective view of the smartphone 100 with a multi-view camera
  • FIG. 17B is a rear perspective view of the smartphone 100 with a multi-view camera
  • the smartphone 100 with a multi-eye camera is provided with a display 104 as a display unit on the front surface of the housing 102.
  • the display 104 is a lenticular lens type 3D monitor capable of stereoscopically displaying a stereoscopic image captured by the left viewpoint imaging unit 30L, the middle viewpoint imaging unit 30C, and the right viewpoint imaging unit 30R.
  • Various kinds of information can also be displayed under the control of the display control unit 76.
  • the display 104 is integrally formed with a touch panel 106 and can accept a touch operation by a user.
  • An operation button 108 is provided on the front surface of the housing 102.
  • The imaging lens 54L of the left viewpoint imaging unit 30L, the imaging lens 54C of the middle viewpoint imaging unit 30C, and the imaging lens 54R of the right viewpoint imaging unit 30R are exposed on the rear surface of the housing 102.
  • the imaging lenses 54L, 54C, and 54R have focal lengths set to 5 millimeters.
  • the touch panel 106 and the operation button 108 output a signal corresponding to the operation to the CPU 61.
  • the user can perform shooting by the left viewpoint imaging unit 30L, the middle viewpoint imaging unit 30C, and the right viewpoint imaging unit 30R by operating the touch panel 106 and the operation buttons 108.
  • the display 104 functions as an electronic viewfinder that displays a live view image.
  • the multi-camera-equipped smartphone 100 includes a call unit 110 and a data transmission / reception unit 112.
  • the call unit 110 performs telephone communication via a fixed telephone network, a mobile phone network, or the like.
  • a user can make a call with a user of another electronic device using the call unit 110.
  • the data transmitter / receiver 112 (an example of a communication unit) transmits / receives digital data via a mobile phone base station or a wireless LAN (Local Area Network).
  • the user can send and receive various digital data through the data sending / receiving unit 112.
  • digital data of images taken by the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R can be transmitted to another electronic device.
  • The left viewpoint imaging unit 30L, the middle viewpoint imaging unit 30C, and the right viewpoint imaging unit 30R are arranged side by side in the uniaxial direction, with the reference middle viewpoint imaging unit 30C at the center and the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R at both ends. The distortion shapes of the left, middle, and right viewpoint images are mechanically matched with the middle viewpoint imaging unit 30C as the reference.
  • When the height in the Y direction of the overlapping region between the middle viewpoint image captured by the middle viewpoint imaging unit 30C and the left viewpoint image captured by the left viewpoint imaging unit 30L is I L, and the height of the middle viewpoint image in the Y direction is H, the ratio I L/H of the overlapping region to the viewpoint image satisfies 96% < I L/H × 100 < 99.8%, more preferably 98% < I L/H × 100 < 99.8%, and still more preferably 99% < I L/H × 100 < 99.8%.
  • Similarly, when the height in the Y direction of the overlapping region between the middle viewpoint image captured by the middle viewpoint imaging unit 30C and the right viewpoint image captured by the right viewpoint imaging unit 30R is I R, and the height of the middle viewpoint image in the Y direction is H, the ratio I R/H of the overlapping region to the viewpoint image satisfies 96% < I R/H × 100 < 99.8%, more preferably 98% < I R/H × 100 < 99.8%, and still more preferably 99% < I R/H × 100 < 99.8%.
  • The light receiving surface of the image sensor 56C and the light receiving surface of the image sensor 56L face directions that differ by an angle θ L in the YZ plane; the angle θ L satisfies 0.1° < |θ L| < 2°, preferably 0.1° < |θ L| < 1°, and more preferably 0.1° < |θ L| < 0.5°.
  • Likewise, the light receiving surface of the image sensor 56C and the light receiving surface of the image sensor 56R face directions that differ by an angle θ R in the YZ plane; the angle θ R satisfies 0.1° < |θ R| < 2°, preferably 0.1° < |θ R| < 1°, and more preferably 0.1° < |θ R| < 0.5°.
  • For the overlapping region between the middle viewpoint image and the left viewpoint image, a dot arranged at the center is set as center index O 1 and the dots arranged at its four corners are set as peripheral indices A, B, C, and D. The distances from the center index O 1 to the peripheral indices A, B, C, and D in the middle viewpoint image are denoted P A 1, P B 1, P C 1, and P D 1, and the corresponding distances in the left viewpoint image are denoted P A 2, P B 2, P C 2, and P D 2. With the focal length of the imaging lenses 54C and 54L taken as f millimeters, the distance between the centers of the light receiving surfaces of the image sensors 56C and 56L as d millimeters, and the pixel pitch of the image sensors 56C and 56L as s millimeters, the differences ΔP A, ΔP B, ΔP C, and ΔP D (pixels) between each pair of distances each satisfy 0 < ΔP < |0.1 × (d × f) / (1000 × s)|.
  • Similarly, for the overlapping region between the middle viewpoint image and the right viewpoint image, a dot arranged at the center is set as center index O 2 and the dots arranged at its four corners are set as peripheral indices E, F, G, and H. The distances from the center index O 2 to the peripheral indices E, F, G, and H in the middle viewpoint image are denoted P E 1, P F 1, P G 1, and P H 1, and the corresponding distances in the right viewpoint image are denoted P E 2, P F 2, P G 2, and P H 2. With the focal length of the imaging lenses 54C and 54R taken as f millimeters, the distance between the centers of the light receiving surfaces of the image sensors 56C and 56R as d millimeters, and the pixel pitch of the image sensors 56C and 56R as s millimeters, the differences ΔP E, ΔP F, ΔP G, and ΔP H (pixels) each satisfy 0 < ΔP < |0.1 × (d × f) / (1000 × s)|.
  • The smartphone 100 with a multi-view camera has been described taking as an example a trinocular camera including three imaging optical systems, but it may instead be binocular like the multi-view camera 10, or may be a multi-view camera with four or more imaging units.
  • DESCRIPTION OF SYMBOLS: 10: multi-view camera; 30L: left viewpoint imaging unit; 30R: right viewpoint imaging unit; 50: imaging module; 54, 54L, 54R: imaging lens; 56, 56L, 56R: image sensor; 73: distance estimation unit; 80: fixing member; 80L, 80R: hole; 82: support member; 100: smartphone; 104: display; 106: touch panel; 108: operation button; O: center index; A, B, C, D: peripheral index

Abstract

The problem addressed by the present invention is to provide a multi-lens imaging device that mechanically absorbs individual differences in the distorted shapes of images captured by its imaging modules and thereby improves the accuracy of distance recognition, a method for manufacturing the same, and an information terminal device. The problem is resolved by a multi-lens imaging device wherein: each imaging module has an imaging lens and an imaging element that converts an optical image of a subject received via the imaging lens into an image signal, the imaging element being mounted with its alignment adjusted with respect to the imaging lens in order to correct the resolving power for the optical image; and the condition 96% < I/H × 100 < 99.8% is satisfied, where I is the height, in the direction orthogonal to the axial direction along which the plurality of imaging modules are lined up, of the region formed by superimposing an image captured by a first imaging module serving as a reference on an image captured by a second imaging module different from the first imaging module, and H is the height of the image captured by the first imaging module in the same orthogonal direction.

Description

多眼撮像装置及びその製造方法、情報端末装置Multi-eye imaging device, manufacturing method thereof, and information terminal device
 本発明は、多眼撮像装置及びその製造方法、情報端末装置に関し、特に被写体との距離を推定するための複数の撮像モジュールを配置する技術に関する。 The present invention relates to a multi-view imaging device, a manufacturing method thereof, and an information terminal device, and more particularly to a technique for arranging a plurality of imaging modules for estimating a distance from a subject.
 近年、小型カメラモジュール用レンズは、薄型化や高画素、Fナンバーの明るいレンズの搭載等の高機能化に伴い、製造難易度が上昇しており、所望の解像度を維持するためにレンズの像面の倒れや偏芯調整等の影響を軽減するために、レンズモジュールの位置に対して撮像素子位置の調整を行うアライメント調整が行われている。しかしながら、アライメント調整を行うとカメラによって撮影される画像の歪み形状(ディストーション)が回転非対称に変化するという欠点があり、アライメント調整量に応じて撮影画像の歪み形状に個体差が生じる。 In recent years, lenses for small camera modules have become more difficult to manufacture as they become thinner and have higher functions such as mounting of high-pixel lenses and bright F-number lenses. In order to maintain the desired resolution, the lens image has been increased. In order to reduce the influence of surface tilt, eccentricity adjustment, and the like, alignment adjustment is performed to adjust the position of the image sensor with respect to the position of the lens module. However, when alignment adjustment is performed, there is a drawback that the distortion shape (distortion) of the image captured by the camera changes in a rotationally asymmetric manner, and individual differences occur in the distortion shape of the captured image in accordance with the alignment adjustment amount.
 ステレオカメラは、2台のカメラの撮影画像の位置ズレ量から被写体までの距離を推定している。しかしながら、2台のカメラの撮影画像の歪み形状に個体差が存在する場合は、この個体差によって撮影画像内の被写体位置にズレが生じるため、距離推定に誤差が生じてしまう。 The stereo camera estimates the distance to the subject from the positional deviation of the images taken by the two cameras. However, if there is an individual difference in the distortion shape of the captured images of the two cameras, an error occurs in the distance estimation because the individual position causes a shift in the subject position in the captured image.
 このような課題に対し、特許文献1には、2台のカメラによりイニシャライズチャートを撮影して補正パラメータを算出し、この補正パラメータに基づいて画像処理を行うことで、2台のカメラの位置ズレに伴う歪みを補正するステレオカメラが記載されている。また、特許文献2には、2つ以上のカメラ間の相対位置による位置ズレを補正するために、少ないラインバッファを用いて画像処理を行うことで高精度に歪み補正を行うステレオカメラが記載されている。 In order to deal with such a problem, Patent Document 1 captures an initialization chart using two cameras, calculates correction parameters, and performs image processing based on the correction parameters, thereby shifting the position of the two cameras. The stereo camera which corrects the distortion accompanying this is described. Patent Document 2 describes a stereo camera that performs distortion correction with high accuracy by performing image processing using a small number of line buffers in order to correct a positional shift caused by a relative position between two or more cameras. ing.
特開2005-351855号公報JP 2005-351855 A 特開2013-201753号公報JP 2013-201753 A
 これらのステレオカメラは、補正処理に計算コストがかかるために動画撮影におけるリアルタイム処理は困難であった。また、補正処理によって画像の解像度が劣化し、距離推定に誤差が生じてしまう等の問題点もあった。 These stereo cameras are difficult to perform real-time processing in moving image shooting because of the computational cost of correction processing. Further, there has been a problem that the resolution of the image deteriorates due to the correction processing, and an error occurs in distance estimation.
 本発明はこのような事情に鑑みてなされたもので、撮像モジュールの撮影画像の歪み形状の個体差を機械的に吸収して距離認識精度を高める多眼撮像装置及びその製造方法、情報端末装置を提供することを目的とする。 The present invention has been made in view of such circumstances, and a multi-lens imaging device that increases the distance recognition accuracy by mechanically absorbing individual differences in the distortion shape of the captured image of the imaging module, a manufacturing method thereof, and an information terminal device The purpose is to provide.
 上記目的を達成するために多眼撮像装置の一の態様は、一軸方向に複数の撮像モジュールを並べて被写体の撮影を行う多眼撮像装置であって、複数の撮像モジュールは、それぞれ、撮像レンズと撮像レンズを介して受光した被写体の光学像を画像信号に変換する撮像素子とを有し、光学像の解像力を補正するために撮像素子が撮像レンズに対してアライメント調整されて取り付けられており、複数の撮像モジュールのうち基準となる第1の撮像モジュールで撮影した画像と第1の撮像モジュールとは異なる第2の撮像モジュールで撮影した画像との重なり領域の複数の撮像モジュールが並ぶ一軸方向と直行する方向の高さをI、第1の撮像モジュールで撮影した画像の一軸方向と直行する方向の画像の高さをHとしたときに、I/Hが96%<I/H×100<99.8%の条件を満たす。 In order to achieve the above object, one aspect of a multi-eye imaging device is a multi-eye imaging device that shoots a subject by arranging a plurality of imaging modules in a uniaxial direction, and each of the plurality of imaging modules includes an imaging lens and an imaging lens. An image sensor that converts an optical image of a subject received through the imaging lens into an image signal, and the image sensor is attached to the imaging lens by adjusting the alignment in order to correct the resolution of the optical image. An uniaxial direction in which a plurality of imaging modules in an overlapping region of an image captured by a first imaging module serving as a reference and an image captured by a second imaging module different from the first imaging module are arranged. I / H is 96, where I is the height in the orthogonal direction and H is the image height in the direction orthogonal to the uniaxial direction of the image captured by the first imaging module. <I / H × 100 <99.8% condition is satisfied.
 本態様によれば、第1の撮像モジュールで撮影された画像と第2の撮像モジュールで撮影された画像との歪み形状が一致しているので、これらの画像を用いた距離認識精度を高めることができる。 According to this aspect, since the distortion shape of the image image | photographed with the 2nd image pick-up module and the image image | photographed with the 1st image pick-up module is in agreement, distance recognition accuracy using these images is improved. Can do.
The ratio I/H preferably satisfies the condition 98% < I/H × 100 < 99.8%, and more preferably 99% < I/H × 100 < 99.8%. When these conditions are satisfied, the distortion shapes of the images captured by the first and second imaging modules match more closely, so the distance recognition accuracy can be further improved.
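As a minimal illustration of this overlap condition (the function name and the pixel-based inputs are assumptions for illustration, not part of the claimed device), the check can be written as:

```python
def overlap_ratio_ok(overlap_height_px, image_height_px,
                     lower_pct=96.0, upper_pct=99.8):
    """Check the condition lower < I/H x 100 < upper for the overlap
    height I and the reference image height H (both in pixels)."""
    ratio_pct = overlap_height_px / image_height_px * 100.0
    return lower_pct < ratio_pct < upper_pct

# Example: a 1080-line reference image whose overlapping region is
# 1070 lines high gives I/H x 100 = 99.07..., inside the 96-99.8 range.
print(overlap_ratio_ok(1070, 1080))  # True
```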
Let the line connecting the center of the light-receiving surface of the imaging element of the first imaging module and the center of the light-receiving surface of the imaging element of the second imaging module be the X axis; let the axis on the light-receiving surface of the imaging element of the first imaging module that passes through the center of that surface and is perpendicular to the X axis be the Y axis; and let the axis that passes through the center of the light-receiving surface of the imaging element of the first imaging module and is orthogonal to the X and Y axes be the Z axis. The inclination angle θ of the imaging element of the second imaging module with respect to the Y axis, in the YZ plane passing through the center of the light-receiving surface of that imaging element, preferably satisfies the condition 0.1° < |θ| < 2°. In this case the distortion shapes of the images captured by the first and second imaging modules match, so the distance recognition accuracy can be improved.
The angle θ preferably satisfies the condition 0.1° < |θ| < 1°, and more preferably 0.1° < |θ| < 0.5°. When these conditions are satisfied, the distortion shapes of the images captured by the first and second imaging modules match more closely, so the distance recognition accuracy can be further improved.
When a chart is placed at a predetermined distance L from the light-receiving surfaces of the imaging elements and photographed by the first and second imaging modules, consider, within the overlapping region of the first image captured by the first imaging module and the second image captured by the second imaging module, the chart mark located at the center of the overlapping region as a central index and a chart mark located at the periphery of the overlapping region as a peripheral index. With the center-to-center distance between the light-receiving surfaces of the imaging elements of the first and second imaging modules denoted by d millimeters, the focal length of the imaging lens by f millimeters, the pixel pitch of the imaging element by s millimeters, and L set to 1 meter, the difference Δp (in pixels) between the distance p1 from the central index to the peripheral index in the first image and the distance p2 from the central index to the peripheral index in the second image preferably satisfies the condition 0 < Δp < |0.1 × (d × f)/(1000 × s)|. In this case the distortion shapes of the images captured by the first and second imaging modules match, so the distance recognition accuracy can be improved.
Furthermore, Δp preferably satisfies the condition 0 < Δp < |0.1 × (d × f)/(1000 × s)| at at least four peripheral indices. In this case the distortion shapes of the images captured by the first and second imaging modules match even more closely, so the distance recognition accuracy can be improved.
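As an illustration only, the upper bound on Δp can be evaluated as follows (the parameter values are the example values that appear later in this description, not values fixed by this aspect):

```python
def delta_p_bound_px(d_mm, f_mm, s_mm, accuracy=0.1):
    """Upper bound |accuracy x (d x f) / (1000 x s)| on the index-distance
    difference Delta-p, in pixels, for a chart at L = 1 meter."""
    return abs(accuracy * (d_mm * f_mm) / (1000.0 * s_mm))

# d = 50 mm baseline, f = 3.5 mm, s = 0.0014 mm (1.4 micrometer pitch):
print(delta_p_bound_px(50, 3.5, 0.0014))  # -> 12.5 pixels
```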
The device preferably includes parallelization processing means that applies parallelization processing to the images captured by the first and second imaging modules. This makes it possible to perform distance recognition appropriately even when the optical axis of the imaging lens of the first imaging module and that of the second imaging module are inclined relative to each other.
The device may include distance measuring means that measures the distance from the multi-lens imaging device to the subject based on the first image captured by the first imaging module and the second image captured by the second imaging module. This allows the distance to the subject to be measured from the first and second images.
The device preferably includes high-resolution image generating means that generates, from the first image captured by the first imaging module and the second image captured by the second imaging module, an image with a resolution higher than that of the imaging elements. This makes it possible to acquire an image whose resolution exceeds that of the imaging elements.
The multi-lens imaging device is preferably a two-lens device. This allows two images to be acquired appropriately.
Alternatively, the multi-lens imaging device may be a three-lens device in which three imaging modules are arranged in the uniaxial direction, with the first imaging module being the central imaging module and the second imaging modules being the imaging modules at both ends. Using the central imaging module as the reference allows three images to be acquired appropriately.
To achieve the above object, one aspect of the information terminal device includes: a multi-lens imaging device in which a plurality of imaging modules are arranged in a uniaxial direction to photograph a subject, each imaging module having an imaging lens and an imaging element that converts an optical image of the subject received through the imaging lens into an image signal, the imaging element being attached to the imaging lens after alignment adjustment for correcting the resolving power of the optical image, wherein, with I denoting the height, in the direction orthogonal to the uniaxial direction in which the imaging modules are arranged, of the overlapping region between an image captured by a first imaging module serving as a reference and an image captured by a second imaging module different from the first imaging module, and H denoting the height, in the direction orthogonal to the uniaxial direction, of the image captured by the first imaging module, the ratio I/H satisfies the condition 96% < I/H × 100 < 99.8%; and communication means for transmitting and receiving digital data such as images.
According to this aspect, the distortion shape of the image captured by the first imaging module matches that of the image captured by the second imaging module, so the distance recognition accuracy obtained using these images can be improved.
To achieve the above object, one aspect of the method for manufacturing a multi-lens imaging device includes: a fixing step of fixing, to a support member, a first imaging module that includes a first imaging lens and a first imaging element converting an optical image of a subject received through the first imaging lens into an image signal, the first imaging element having undergone alignment adjustment for correcting the resolving power of the optical image; a temporary fixing step of temporarily fixing, to the support member, a second imaging module that includes a second imaging lens and a second imaging element converting an optical image of the subject received through the second imaging lens into an image signal, the second imaging element having undergone alignment adjustment for correcting the resolving power of the optical image; a photographing step of capturing an image with each of the fixed first imaging module and the temporarily fixed second imaging module; a calculating step of calculating the distortion shape of the image captured by the first imaging module and the distortion shape of the image captured by the second imaging module; and an adjusting step of adjusting and fixing the orientation of the second imaging module based on the calculated distortion shapes so that the distortion shape of the image captured by the first imaging module matches that of the image captured by the second imaging module. With I denoting the height, in the direction orthogonal to the uniaxial direction in which the first and second imaging modules are arranged, of the overlapping region between the images captured by the first and second imaging modules, and H denoting the height, in the direction orthogonal to the uniaxial direction, of the image captured by the first imaging module, the ratio I/H satisfies the condition 96% < I/H × 100 < 99.8%.
According to this aspect, the first imaging module is fixed and the second imaging module is temporarily fixed, an image is captured with each of the fixed first imaging module and the temporarily fixed second imaging module, the distortion shape of each image is calculated, and the orientation of the second imaging module is adjusted and fixed so that the calculated distortion shapes match; furthermore, the ratio I/H of the overlapping region satisfies the condition 96% < I/H × 100 < 99.8%. The distance recognition accuracy obtained using these images can therefore be improved.
According to the present invention, individual differences in the distortion shapes of the images captured by the imaging modules are absorbed mechanically, and the distance recognition accuracy can thereby be improved.
FIG. 1A is a front perspective view of the multi-lens camera 10.
FIG. 1B is a rear perspective view of the multi-lens camera 10.
FIG. 2 is a front perspective view of the imaging module 50.
FIG. 3 is a diagram showing the internal configuration of the imaging module 50.
FIG. 4 is a block diagram showing an example of the electrical configuration of the multi-lens camera 10.
FIG. 5 is a diagram for explaining the distortion generation mechanism of the imaging module 50.
FIG. 6 is a diagram for explaining the distortion correction method of the multi-lens camera 10.
FIG. 7 is a flowchart showing the processing of the method for manufacturing the multi-lens camera 10.
FIGS. 8A, 8B, and 8C are front perspective views of the fixing member 80.
FIGS. 9A, 9B, and 9C are plan views of the fixing member 80.
FIGS. 10A and 10B are diagrams for explaining photographing of the chart.
FIG. 11 is a graph showing the distortion shapes of the left and right viewpoint images.
FIG. 12 is a graph showing the distortion shapes of the left and right viewpoint images.
FIGS. 13A and 13B are diagrams showing the optical-axis tilt angle of the left viewpoint imaging unit 30L relative to the right viewpoint imaging unit 30R.
FIGS. 14A and 14B are diagrams showing the overlap between the left viewpoint image and the right viewpoint image.
FIG. 15 is a diagram showing the calculation of the difference between the distortion shape of the left viewpoint image and that of the right viewpoint image.
FIG. 16 is a diagram showing the parameters of the imaging optical system of the multi-lens camera 10.
FIG. 17A is a front perspective view of the smartphone 100 with a multi-lens camera.
FIG. 17B is a rear perspective view of the smartphone 100 with a multi-lens camera.
FIG. 18 is a block diagram showing an example of the electrical configuration of the smartphone 100 with a multi-lens camera.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.
[Configuration of stereo camera]
The multi-lens camera according to the present embodiment is a multi-lens imaging device that photographs a subject with a plurality of imaging optical systems arranged in a uniaxial direction. Here, a two-lens camera that captures stereoscopic images of the same subject viewed from two viewpoints is described as an example.
FIG. 1A is a front perspective view of the multi-lens camera 10. As shown in the figure, the multi-lens camera 10 includes a housing 20, a shutter button 22, a left viewpoint imaging unit 30L, and a right viewpoint imaging unit 30R.
The housing 20 is formed in a substantially rectangular-parallelepiped box shape, and the shutter button 22 is disposed on the top surface of the housing 20 in the figure. The shutter button 22 is an input means that initiates shooting preparations such as focus lock and photometry when half-pressed, and captures an image when fully pressed.
The left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R are arranged side by side in the horizontal direction (an example of the uniaxial direction) on the front face of the housing 20, with the left viewpoint imaging unit 30L on the right side and the right viewpoint imaging unit 30R on the left side as viewed facing the front of the multi-lens camera 10.
FIG. 1B is a rear perspective view of the multi-lens camera 10. As shown in the figure, a display unit 40 and a cross key 42 are arranged on the back of the multi-lens camera 10.
The display unit 40 is a lenticular-lens 3D monitor capable of stereoscopically displaying the stereoscopic images captured by the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R. The cross key 42 is an input means with which the photographer operates the multi-lens camera 10.
[Configuration of imaging module]
Imaging modules 50 of the same shape are used for the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R of the multi-lens camera 10. As shown in FIG. 2, the imaging module 50 includes an imaging lens 54 on the front face of a housing 52. As shown in FIG. 3, the optical system of the imaging module 50 is composed of the imaging lens 54 and an imaging element 56.
The imaging lens 54 includes a convex lens 54-1, a convex lens 54-2, and a concave lens 54-3, and is set to a predetermined focal length f (unit: millimeters). The lenses are arranged at predetermined intervals so that their optical axes coincide. Subject light incident on the convex lens 54-1 located on the front side of the housing 52 enters the imaging element 56 via the convex lens 54-2 and the concave lens 54-3.
The imaging element 56 is a two-dimensional color CCD (Charge-Coupled Device) solid-state imaging element. On the light-receiving surface of the imaging element 56, a large number of photodiodes are arranged two-dimensionally at a predetermined pixel pitch s (unit: millimeters), and color filters are disposed on the photodiodes in a predetermined arrangement. The optical image of the subject formed on the light-receiving surface of the imaging element 56 via the imaging lens 54 is converted by the photodiodes into signal charges corresponding to the amount of incident light.
[Electrical configuration of stereo camera]
As shown in FIG. 4, in addition to the left viewpoint imaging unit 30L, the right viewpoint imaging unit 30R, and the display unit 40, the multi-lens camera 10 includes blocks such as a CPU (Central Processing Unit) 61, a ROM (Read Only Memory) 62, an operation unit 63, an imaging signal processing unit 65, a control bus 66, a data bus 67, a memory control unit 68, a main memory 69, a digital signal processing unit 70, a compression/decompression processing unit 71, an integration unit 72, a distance estimation unit 73, an external memory control unit 74, a display control unit 76, a camera-shake correction control unit 77, and an angular velocity sensor 78.
Each block operates under the control of the CPU 61. The CPU 61 controls each block by executing a control program based on input from the operation unit 63. In addition to the control program executed by the CPU 61, the ROM 62 stores various data necessary for the control. The CPU 61 reads the control program recorded in the ROM 62 into the main memory 69 and executes it sequentially, thereby controlling each block of the multi-lens camera 10.
The main memory 69 is composed of SDRAM (Synchronous Dynamic Random Access Memory) and is used as an execution area for the program, as a temporary storage area for image data and the like, and as various work areas.
The operation unit 63 includes the shutter button 22 and the cross key 42 shown in FIGS. 1A and 1B, and outputs signals corresponding to the respective operations to the CPU 61.
The left viewpoint imaging unit 30L includes an imaging lens 54L and an imaging element 56L. Similarly, the right viewpoint imaging unit 30R includes an imaging lens 54R and an imaging element 56R. On the light-receiving surfaces of the imaging elements 56L and 56R, a large number of light-receiving elements (not shown) are arranged two-dimensionally at a pixel pitch s (unit: millimeters), and red (R), green (G), and blue (B) primary color filters (not shown) are disposed in a predetermined arrangement structure corresponding to the light-receiving elements. The subject light imaged on the light-receiving surfaces is converted into electrical signals by the light-receiving elements and accumulated.
The electrical signals accumulated in the light-receiving elements are read out to vertical transfer paths (not shown). The vertical transfer paths transfer the signals line by line to a horizontal transfer path (not shown) in synchronization with a clock supplied from an imaging element drive unit (not shown). The horizontal transfer path in turn outputs each transferred line of signals to the imaging signal processing unit 65 in synchronization with a clock supplied from the imaging element drive unit.
The output of the image signals starts when the multi-lens camera 10 is set to the shooting mode. That is, when the multi-lens camera 10 is set to the shooting mode, output of image signals starts in order to display a live view image on the display unit 40. The output of the image signals for the live view image is suspended when actual shooting is instructed, and restarts when the actual shooting is completed. The live view image may be displayed based on the image signal captured by the left viewpoint imaging unit 30L, or the image signal captured by the left viewpoint imaging unit 30L and the image signal captured by the right viewpoint imaging unit 30R may be displayed so as to be stereoscopically viewable.
At the end of the actual shooting, the captured image is displayed on the display unit 40 for a certain period of time (postview). By checking the postview, the user can confirm whether the image was captured properly.
The imaging signal processing unit 65 includes a correlated double sampling circuit, a clamp processing circuit, an automatic gain control circuit, and an analog/digital converter (none shown). The correlated double sampling circuit removes noise contained in the image signal. The clamp processing circuit removes the dark current component. The automatic gain control circuit then amplifies the image signal from which the dark current component has been removed with a predetermined gain corresponding to the set ISO (International Standards Organization) sensitivity (shooting sensitivity). The analog image signal that has undergone the required signal processing is converted by the analog/digital converter into a digital image signal having a gradation width of a predetermined number of bits. This image signal is so-called RAW data (raw image data) and has, for each pixel, gradation values indicating the R, G, and B densities. The digital image signal is stored in the main memory 69 via the data bus 67 and the memory control unit 68.
In addition to the CPU 61 and the imaging signal processing unit 65, the memory control unit 68, the digital signal processing unit 70, the compression/decompression processing unit 71, the integration unit 72, the distance estimation unit 73, the external memory control unit 74, the display control unit 76, and other blocks are connected to the control bus 66 and the data bus 67, and these blocks can exchange information with one another via the data bus 67 based on control signals on the control bus 66.
The digital signal processing unit 70 performs predetermined signal processing on the R, G, and B image signals stored in the main memory 69 and generates an image signal (Y/C signal) composed of a luminance signal Y and color difference signals Cr and Cb. The digital signal processing unit 70 may also be made to function as high-resolution image generating means that generates, from the image signal captured by the left viewpoint imaging unit 30L and the image signal captured by the right viewpoint imaging unit 30R, an image with a resolution higher than that of the imaging elements 56L and 56R.
In accordance with a compression command from the CPU 61, the compression/decompression processing unit 71 applies compression processing in a predetermined format (for example, JPEG (Joint Photographic Experts Group)) to the input image signal (Y/C signal) composed of the luminance signal Y and the color difference signals Cr and Cb, and generates compressed image data. In accordance with a decompression command from the CPU 61, it applies decompression processing in a predetermined format to input compressed image data and generates uncompressed image data.
In accordance with a command from the CPU 61, the integration unit 72 takes in the R, G, and B image signals stored in the main memory 69 and calculates the integrated values necessary for AE control. The CPU 61 calculates a luminance value from the integrated values and obtains an exposure value from the luminance value. It then determines the aperture value and the shutter speed from the exposure value according to a predetermined program diagram.
The distance estimation unit 73 (an example of the distance measuring means and an example of the parallelization processing means) applies parallelization processing to the image captured by the left viewpoint imaging unit 30L and the image captured by the right viewpoint imaging unit 30R, and estimates the distance from the multi-lens camera 10 to the subject based on the parallelized images. Here, the parallelization processing is processing that transforms both images so that they become equivalent to images captured by parallel optical systems. The amount of optical-axis misalignment between the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R required for the parallelization processing is stored in advance in the ROM 62.
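The description does not fix how the parallelized images are converted into a distance, but on parallelized (rectified) images the standard pinhole triangulation applies. The sketch below is illustrative only; the default parameter values are the example values used elsewhere in this description:

```python
import numpy as np

def depth_from_disparity_mm(disparity_px, f_mm=3.5, baseline_mm=50.0,
                            pixel_pitch_mm=0.0014):
    """Triangulation on parallelized images: Z = f x B / (disparity x s),
    with the disparity in pixels and the distance Z in millimeters."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    return f_mm * baseline_mm / (disparity_px * pixel_pitch_mm)

# A subject producing a 125-pixel disparity lies at about 1 meter.
print(depth_from_disparity_mm(125))  # ~1000.0 (mm)
```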
The external memory control unit 74 controls reading and writing of data to and from a recording medium 75 in accordance with commands from the CPU 61. The recording medium 75 may be removable from the body of the multi-lens camera 10, like a memory card, or may be built into the body of the multi-lens camera 10. When it is removable, a card slot is provided in the body of the multi-lens camera 10 and the medium is used by being loaded into the slot.
The display control unit 76 controls display on the display unit 40 in accordance with commands from the CPU 61. As described above, the display unit 40 is a lenticular-lens 3D monitor and can simultaneously display, with directivity, the left viewpoint image captured by the left viewpoint imaging unit 30L and the right viewpoint image captured by the right viewpoint imaging unit 30R. The display unit 40 can display moving images (live view images) for use as an electronic viewfinder, and can also display captured images before recording (postview images), reproduced images read from the recording medium 75, and the like.
The camera-shake correction control unit 77 corrects camera shake in the captured images by moving the imaging elements 56L and 56R within a plane orthogonal to the optical axis direction in response to the camera shake accompanying the shooting operation of the multi-lens camera 10. The angular velocity sensor 78 is detection means for detecting camera shake occurring in the multi-lens camera 10, and outputs signals representing the angular velocities of the multi-lens camera 10 in the horizontal and vertical directions.
The horizontal and vertical angular velocity signals detected by the angular velocity sensor 78 are input to the CPU 61 via the data bus 67. The CPU 61 calculates the horizontal and vertical movement amounts of the imaging elements 56L and 56R from the input angular velocity signals and outputs them to the camera-shake correction control unit 77. The camera-shake correction control unit 77 shifts (moves) the imaging elements 56L and 56R by the horizontal and vertical movement amounts acquired from the CPU 61 while keeping the orientation of the light-receiving surfaces constant. Moving the imaging elements 56L and 56R in this way corrects the camera shake in the left viewpoint image captured by the left viewpoint imaging unit 30L and the right viewpoint image captured by the right viewpoint imaging unit 30R.
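The description does not give the movement-amount formula; the small-angle pinhole projection below is one plausible sketch of such a calculation, offered as an assumption rather than the actual implementation:

```python
import math

def sensor_shift_mm(omega_rad_per_s, dt_s, f_mm=3.5):
    """Integrate one angular-velocity sample to a shake angle, then
    project it through the lens to obtain the shift that keeps the
    optical image stationary on the light-receiving surface."""
    theta_rad = omega_rad_per_s * dt_s
    return f_mm * math.tan(theta_rad)

# A 0.2 rad/s shake sampled at 1 kHz calls for about a 0.7 micrometer shift.
print(sensor_shift_mm(0.2, 0.001))  # ~0.0007 (mm)
```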
[Distortion generation mechanism of imaging module]
Next, the mechanism by which the distortion shape arises in the image captured by the imaging module 50 will be described with reference to FIG. 5.
FIG. 5(a) shows the imaging module 50 photographing a subject in the ideal state in which the optical axes of the convex lens 54-1, the convex lens 54-2, and the concave lens 54-3 all coincide, illustrating the relationship between the subject light and the optical image of the subject. FIG. 5(b) shows an example of the optical image of the subject formed on the imaging element 56 of the imaging module 50 shown in FIG. 5(a). When the optical axes of the convex lens 54-1, the convex lens 54-2, and the concave lens 54-3 all coincide in this way, no distortion shape occurs in the captured image.
In contrast, FIG. 5(c) shows photographing with an imaging module 50 in which a lens has become decentered due to a manufacturing error; here, the optical axis of the concave lens 54-3 is decentered vertically upward relative to the optical axes of the convex lens 54-1 and the convex lens 54-2. In this imaging module 50, although focus is achieved at the center of the imaging element 56, the vertically upper side of the imaging element (the optical image of the vertically lower side of the subject) is in a front-focus state in which the in-focus position lies in front of the subject, and the vertically lower side of the imaging element (the optical image of the vertically upper side of the subject) is in a back-focus state in which the in-focus position lies behind the subject.
FIG. 5(d) shows an example of the optical image of the subject formed on the imaging element 56 of the imaging module 50 shown in FIG. 5(c). As shown in the figure, the subject is in focus at its vertical center, but the upper and lower parts of the subject become increasingly out of focus and blurred with distance from the vertical center.
To prevent such a reduction in image resolution caused by lens decentering and the like, the imaging element 56 of the imaging module 50 is alignment-adjusted with respect to the optical axis of the imaging lens 54, and the resolving power of the optical image formed on the imaging element 56 is thereby corrected. FIG. 5(e) shows the alignment-adjusted imaging module 50 photographing a subject. In the imaging module 50 shown in FIG. 5(e), as an example of the alignment adjustment, the imaging element 56 is tilted about a horizontal line passing through the center of its light-receiving surface as a fulcrum, so that the vertically upper side of the imaging element 56 moves toward the subject and the vertically lower side moves away from the subject. For details of the alignment adjustment, reference can be made to, for example, JP 2010-21985 A.
FIG. 5(f) shows an example of the optical image of the subject formed on the imaging element 56 of the imaging module 50 shown in FIG. 5(e). As shown in the figure, the alignment adjustment makes it possible to capture an image with corrected resolution. However, an image captured by an imaging module 50 alignment-adjusted in this way has a distortion shape. In the example shown in FIG. 5(f), a trapezoidal distortion shape occurs.
[Distortion shape correction method]
FIG. 6(a) shows the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R of the multi-lens camera 10 and the subject photographed by the multi-lens camera 10. Here, the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R are arranged so that the optical axes of the imaging lenses 54L and 54R are parallel to each other.
When the imaging module 50 used as the left viewpoint imaging unit 30L and the imaging module 50 used as the right viewpoint imaging unit 30R are both in the ideal state shown in FIG. 5(a), the distortion shapes of the left viewpoint image captured by the left viewpoint imaging unit 30L and the right viewpoint image captured by the right viewpoint imaging unit 30R match, as shown in FIG. 6(b). The multi-lens camera 10 can therefore estimate accurate distance information to the subject using these left and right viewpoint images.
However, when, for example, the imaging module 50 used as the right viewpoint imaging unit 30R is in the ideal state shown in FIG. 5(a) while the imaging module 50 used as the left viewpoint imaging unit 30L is the alignment-adjusted imaging module 50 shown in FIG. 5(e), the distortion shapes of the left and right viewpoint images do not match, as shown in FIG. 6(c). If the multi-lens camera 10 estimates distance information to the subject using these left and right viewpoint images, a positional deviation occurs for the same subject between the left and right viewpoint images, and the estimated distance information therefore contains an error.
The theoretical limit when acquiring distance information with a two-lens camera can be expressed as

(distance resolution) = (subject distance)² ÷ (focal length ÷ pixel pitch) ÷ camera interval.

For example, when the distance to the subject is 1000 millimeters, the focal length of the imaging lens is 3.5 millimeters, the pixel pitch of the imaging element is 1.4 micrometers, and the camera interval is 50 millimeters,

(distance resolution) = 1000² ÷ (3.5 ÷ 0.0014) ÷ 50 = 8.0 millimeters.

Therefore, even when the distortion shapes of the left and right viewpoint images match, the best that can be achieved is to distinguish whether the measured distance is 1000 millimeters or 1008 millimeters.
Since these 8.0 millimeters correspond to one pixel in the image, a deviation of one pixel caused by distortion produces a measurement error of 8.0 millimeters. Accordingly, if the distance measurement accuracy at a subject distance of 1000 millimeters is to be ±10 percent, the allowable distortion is

1000 millimeters × ±10 percent ÷ 8.0 millimeters = ±12.5 pixels.
Here, by actually measuring the distortion shapes of the left and right viewpoint images and deforming both viewpoint images, or one of them, by image processing, the distortion shapes of the left and right viewpoint images can be made to match. FIG. 6(d) shows left and right viewpoint images whose distortion shapes have been matched by deforming the left viewpoint image captured by the left viewpoint imaging unit 30L. By using the left and right viewpoint images after this image processing, the multi-lens camera 10 can estimate accurate distance information to the subject.
However, such image processing is computationally expensive and degrades the resolution of the viewpoint images.
In the present embodiment, therefore, the distortion shapes of the left and right viewpoint images are matched mechanically by giving an angle between the optical axis of the left viewpoint imaging unit 30L and that of the right viewpoint imaging unit 30R. FIG. 6(e) shows the right viewpoint imaging unit 30R fixed as a reference and the left viewpoint imaging unit 30L tilted in the vertical direction, so that the optical axis of the left viewpoint imaging unit 30L and the optical axis of the right viewpoint imaging unit 30R form a vertical angle. FIG. 6(f) shows the left and right viewpoint images captured by the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R shown in FIG. 6(e). By adjusting the optical axis in this way, the distortion shapes of the left and right viewpoint images can be made to match, as shown in FIG. 6(f).
[Method for manufacturing multi-lens camera]
A method for manufacturing the multi-lens camera 10 will be described with reference to the flowchart shown in FIG. 7.
First, two imaging modules 50 and a fixing member 80 are prepared, and the imaging module 50 to be used as the reference right viewpoint imaging unit 30R (an example of the first imaging module) is fixed to the fixing member 80 (step S1, fixing step). It is assumed that the resolving power of the optical image of each of the two imaging modules 50 has been corrected by alignment adjustment. The fixing member 80 (an example of the support member) is a plate-like member for fixing the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R, and is provided with rectangular holes 80R and 80L as shown in FIGS. 8A and 9A. As shown in FIGS. 8B and 9B, the imaging module 50 used as the right viewpoint imaging unit 30R is bonded and fixed so as to cover the hole 80R.
Next, the imaging module 50 to be used as the left viewpoint imaging unit 30L (an example of the second imaging module) is temporarily fixed to the fixing member 80 (step S2, temporary fixing step). The imaging module 50 used as the left viewpoint imaging unit 30L is supported by a support member 82 and, as shown in FIGS. 8C and 9C, is temporarily fixed so as to cover the hole 80L. At this time, the left viewpoint imaging unit 30L is temporarily fixed so that the optical axis of the imaging lens 54R of the right viewpoint imaging unit 30R (an example of the first imaging lens) and the optical axis of the imaging lens 54L of the left viewpoint imaging unit 30L (an example of the second imaging lens) are parallel to each other.
Next, a chart is photographed with the left viewpoint imaging unit 30L (an example of the temporarily fixed second imaging module) and the right viewpoint imaging unit 30R (an example of the fixed first imaging module) (step S3, photographing step).
The chart according to the present embodiment is a dot pattern in which dots of a fixed size are arranged at fixed intervals, as shown in FIG. 10A. The chart is not limited to a dot pattern; a cross pattern, a grid pattern, a chessboard pattern, or the like can also be used. As shown in FIG. 10B, this chart is photographed by the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R to acquire a left viewpoint image and a right viewpoint image (examples of the images).
Here, let the line connecting the center of the light-receiving surface of the imaging element 56L (an example of the second imaging element, see FIG. 4) and the center of the light-receiving surface of the imaging element 56R (an example of the first imaging element, see FIG. 4) be the X axis, let the axis on the light-receiving surface of the imaging element 56R that passes through the center of that surface and is perpendicular to the X axis be the Y axis, and let the axis that passes through the center of the light-receiving surface of the imaging element 56R and is orthogonal to the X and Y axes be the Z axis. Since the optical axes of the imaging lens 54L and the imaging lens 54R are parallel to each other, the shooting range of the left viewpoint imaging unit 30L and that of the right viewpoint imaging unit 30R are offset in the X direction but coincide in the Y direction.
Next, the distortion shape of the left viewpoint image and the distortion shape of the right viewpoint image are calculated from the acquired left and right viewpoint images (step S4, calculating step). FIG. 11(a) is a graph showing the distortion shape of the left viewpoint image, and FIG. 11(b) is a graph showing the distortion shape of the right viewpoint image. In these graphs, the horizontal axis is the distance from the center of the viewpoint image in percent, and the vertical axis is the distortion amount in pixels (positive distortion indicates a deviation toward the outside of the image, negative distortion a deviation toward the image center); the distortion over the entire image is plotted. Because the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R have each undergone alignment adjustment, the distortion shape of the left viewpoint image differs from that of the right viewpoint image.
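One way to obtain such a profile from the dot-chart images is sketched below, under the assumption that the dot centers have already been detected (for example by blob detection) and that the ideal positions follow from the known chart geometry; the function name is illustrative:

```python
import numpy as np

def distortion_profile(detected_xy, ideal_xy, center_xy):
    """Signed radial deviation (pixels) of each detected chart dot from
    its ideal position, against the ideal distance from the image
    center expressed in percent of the maximum radius."""
    detected = np.asarray(detected_xy, dtype=float)
    ideal = np.asarray(ideal_xy, dtype=float)
    r_ideal = np.linalg.norm(ideal - center_xy, axis=1)
    r_detected = np.linalg.norm(detected - center_xy, axis=1)
    radius_pct = 100.0 * r_ideal / r_ideal.max()
    # Positive: deviation toward the outside; negative: toward the center.
    return radius_pct, r_detected - r_ideal
```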
Next, the optical-axis tilt angle of the left viewpoint imaging unit 30L required to make the distortion shape of the left viewpoint image match that of the right viewpoint image is calculated (step S5, calculating step). That is, the difference between the distortion shapes is calculated from the distortion shape of the left viewpoint image and that of the right viewpoint image calculated in step S4, and the tilt angle of the optical axis of the left viewpoint imaging unit 30L relative to the right viewpoint imaging unit 30R that absorbs the calculated difference is determined. The relationship between the optical-axis tilt angle and the resulting change in the distortion shape is stored in advance as a table.
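A minimal sketch of that table lookup follows; all table values below are hypothetical placeholders, since the real table would be measured in advance for the actual lens design:

```python
import numpy as np

# Hypothetical calibration table measured in advance: optical-axis tilt
# angle (degrees) versus the resulting change in the distortion metric
# (pixels).
TILT_DEG = np.array([0.0, 0.25, 0.5, 1.0, 2.0])
DISTORTION_CHANGE_PX = np.array([0.0, 1.5, 3.1, 6.4, 13.0])

def tilt_for_distortion_diff(diff_px):
    """Invert the table by linear interpolation: the tilt angle of the
    left viewpoint imaging unit that absorbs the measured difference
    between the two distortion shapes."""
    return float(np.interp(abs(diff_px), DISTORTION_CHANGE_PX, TILT_DEG))

# A measured difference of 3.1 px would call for a 0.5-degree tilt here.
print(tilt_for_distortion_diff(3.1))  # -> 0.5
```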
Next, the left viewpoint imaging unit 30L is tilted by the support member 82 through the calculated optical-axis tilt angle, and in this tilted state the left viewpoint imaging unit 30L is bonded and fixed to the fixing member 80 (step S6, adjusting step).
Finally, the fixing member 80 to which the left viewpoint imaging unit 30L and the right viewpoint imaging unit 30R are fixed is attached to the housing 20 of the multi-lens camera 10 (step S7).
FIGS. 12(a) and 12(b) are graphs showing the distortion shape of the left viewpoint image and that of the right viewpoint image in the multi-lens camera 10 manufactured in this way. As shown in the figures, the distortion shape of the left viewpoint image matches that of the right viewpoint image. A multi-lens camera 10 in which the individual differences in the distortion shapes of the left and right viewpoint images are absorbed mechanically can thus be manufactured.
[Operation of multi-lens camera]
The operation of the multi-lens camera 10 configured as described above will be described.
FIG. 13A shows the optical axis of the imaging lens 54L of the left viewpoint imaging unit 30L and the optical axis of the imaging lens 54R of the right viewpoint imaging unit 30R. As shown in the figure, the optical axis of the imaging lens 54L and that of the imaging lens 54R here point in directions that differ by an angle θ1 in the YZ plane.
As shown in FIG. 13B, the light-receiving surface of the imaging element 56L and that of the imaging element 56R face directions that differ by an angle θ2 in the YZ plane. This angle θ2 has the relationship

|θ2| < |θ1|.
Here, the angle θ2 satisfies the condition

0.1° < |θ2| < 2°,

more preferably

0.1° < |θ2| < 1°,

and even more preferably

0.1° < |θ2| < 0.5°.
In the multi-lens camera 10, the angle θ1 also produces a difference in the Y direction between the shooting range of the left viewpoint imaging unit 30L and that of the right viewpoint imaging unit 30R, as shown in FIG. 14A.
Here, as shown in FIG. 14B, with I denoting the height in the Y direction of the overlapping region between the left viewpoint image captured by the left viewpoint imaging unit 30L and the right viewpoint image captured by the right viewpoint imaging unit 30R, and H denoting the height of the right viewpoint image in the Y direction, the ratio I/H of the overlapping region to the viewpoint image satisfies the condition

96% < I/H × 100 < 99.8%,

more preferably

98% < I/H × 100 < 99.8%,

and even more preferably

99% < I/H × 100 < 99.8%.
By shifting one or both of the imaging element 56L of the left viewpoint imaging unit 30L and the imaging element 56R of the right viewpoint imaging unit 30R in the Y direction with the camera-shake correction control unit 77, the Y-direction height of the overlapping region can be changed.
Next, consider the case where the multi-lens camera 10 photographs the chart shown in FIG. 10A with the distance L between the multi-lens camera 10 and the chart set to 1 meter. The shooting distance is set to 1 meter (1000 millimeters) because smartphones, the main application of small camera modules, are mostly used at subject distances of around 1 meter, and such modules are therefore generally designed and produced so that the distance measurement accuracy is highest at 1 meter.
FIG. 15(a) shows a left-viewpoint image (an example of a second image) captured by the left-viewpoint imaging unit 30L, a right-viewpoint image (an example of a first image) captured by the right-viewpoint imaging unit 30R, and the overlapping region captured in both of the left and right viewpoint images. Here, the dot arranged at the center of the overlapping region is referred to as the center index O, and the dots arranged around the periphery of the overlapping region are referred to as peripheral index A, peripheral index B, peripheral index C, and peripheral index D. In this example, the dots arranged at the four corners of the overlapping region (an example of at least four locations) serve as the peripheral indices A, B, C, and D.
As shown in FIG. 15(b), let the distances in the left-viewpoint image from the center index O to the peripheral indices A, B, C, and D be PA1, PB1, PC1, and PD1, respectively. Likewise, as shown in FIG. 15(c), let the distances in the right-viewpoint image from the center index O to the peripheral indices A, B, C, and D be PA2, PB2, PC2, and PD2, respectively.
Further, as shown in FIG. 16, let the focal length of the imaging lenses 54L and 54R of the multi-lens camera 10 be f millimeters, the center-to-center spacing between the light-receiving surface of the image sensor 56L and that of the image sensor 56R be d millimeters, and the pixel pitch of the image sensors 56L and 56R be s millimeters.
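The disclosure does not spell out how distance is computed from these quantities, but the form of the conditions below matches the standard pinhole-stereo relation, in which subject distance follows from the disparity measured in pixels. A minimal sketch under that assumption (the function name and the values of d and s are ours; only f = 5 mm appears elsewhere in the text):

```python
def subject_distance_mm(disparity_px: float, f_mm: float, d_mm: float, s_mm: float) -> float:
    """Pinhole-stereo estimate: distance = (f * d) / (disparity * s), where
    f is the focal length [mm], d the spacing between the sensor centers [mm],
    s the pixel pitch [mm], and the disparity is in pixels."""
    return (f_mm * d_mm) / (disparity_px * s_mm)

# With f = 5 mm and assumed d = 10 mm, s = 0.0015 mm, a subject at 1 m
# yields a disparity of (5 * 10) / (1000 * 0.0015) ~ 33.3 pixels.
print(subject_distance_mm(33.3, f_mm=5.0, d_mm=10.0, s_mm=0.0015))  # ~ 1001 mm
```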
At this time, when the difference between the distance PA1 and the distance PA2 is ΔPA, the difference ΔPA (in pixels) satisfies the condition
0 < ΔPA < |0.1 × (d × f)/(1000 × s)|.
Here, the factor 0.1 corresponds to a distance-measurement accuracy of 10%.
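Under the pinhole-stereo assumption sketched above, the disparity of a subject at 1 m is (d × f)/(1000 × s) pixels, so the right-hand side of the condition is simply 10% of that 1-meter disparity. A hedged helper (the naming is ours):

```python
def index_shift_budget_px(d_mm: float, f_mm: float, s_mm: float, accuracy: float = 0.1) -> float:
    """Upper bound on the index shift dP in pixels, per the condition
    0 < dP < |accuracy * (d * f) / (1000 * s)|, i.e. `accuracy` times
    the disparity of a subject at 1 m (1000 mm)."""
    return abs(accuracy * (d_mm * f_mm) / (1000.0 * s_mm))

# With the illustrative values used earlier (f = 5 mm, d = 10 mm, s = 0.0015 mm),
# the budget is about 3.3 pixels.
print(index_shift_budget_px(d_mm=10.0, f_mm=5.0, s_mm=0.0015))
```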
Similarly, when the difference between PB1 and PB2 is ΔPB (pixels), the difference between PC1 and PC2 is ΔPC (pixels), and the difference between PD1 and PD2 is ΔPD (pixels), the following conditions are satisfied:
0 < ΔPB < |0.1 × (d × f)/(1000 × s)|
0 < ΔPC < |0.1 × (d × f)/(1000 × s)|
0 < ΔPD < |0.1 × (d × f)/(1000 × s)|
Note that it suffices for at least one of these conditions to be satisfied. The peripheral indices may also be set to dots other than those at the four corners of the overlapping region, or at five or more locations.
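Putting the pieces together, one plausible acceptance check over the four peripheral indices might look like the sketch below, which reuses index_shift_budget_px from above. All names are ours; the pass criterion ("at least one index within budget") follows the "at least one condition" reading of the preceding paragraph.

```python
import math

def index_distance_px(center: tuple, peripheral: tuple) -> float:
    """Euclidean distance in pixels between the center index and a peripheral index."""
    return math.hypot(peripheral[0] - center[0], peripheral[1] - center[1])

def meets_delta_p_condition(center1, points1, center2, points2,
                            d_mm, f_mm, s_mm) -> bool:
    """True if at least one peripheral index satisfies
    0 < dP < |0.1 * (d * f) / (1000 * s)|."""
    budget = index_shift_budget_px(d_mm, f_mm, s_mm)  # helper from the sketch above
    for p1, p2 in zip(points1, points2):
        delta = abs(index_distance_px(center1, p1) - index_distance_px(center2, p2))
        if 0.0 < delta < budget:
            return True
    return False
```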
[Configuration of the information terminal device]
Next, a smartphone 100 with a multi-lens camera, an information terminal device to which the multi-lens camera 10 is applied, will be described with reference to FIGS. 17A, 17B, and 18. Parts common to FIG. 4 are given the same reference numerals, and their detailed description is omitted.
The smartphone 100 with a multi-lens camera according to the present embodiment (an example of a multi-lens camera) is a trinocular camera including three imaging optical systems and is configured to capture stereoscopic images of the same subject viewed from three viewpoints. Imaging modules 50 (see FIG. 2) whose alignment has been adjusted are used for the left-viewpoint imaging unit 30L, the middle-viewpoint imaging unit 30C, and the right-viewpoint imaging unit 30R.
FIG. 17A is a front perspective view of the smartphone 100 with a multi-lens camera, and FIG. 17B is a rear perspective view of the same. As shown in FIG. 17A, the smartphone 100 is provided with a display 104 as a display unit on the front surface of a housing 102. The display 104 is a lenticular-lens 3D monitor capable of stereoscopically displaying stereoscopic images captured by the left-viewpoint imaging unit 30L, the middle-viewpoint imaging unit 30C, and the right-viewpoint imaging unit 30R. Various kinds of information can also be displayed under the control of the display control unit 76.
A touch panel 106 is formed integrally with the display 104 and can accept touch operations by the user. An operation button 108 is provided on the front surface of the housing 102.
As shown in FIG. 17B, the imaging lens 54L of the left-viewpoint imaging unit 30L, the imaging lens 54C of the middle-viewpoint imaging unit 30C, and the imaging lens 54R of the right-viewpoint imaging unit 30R are exposed on the rear surface of the housing 102. The focal lengths of the imaging lenses 54L, 54C, and 54R are set to 5 millimeters.
As shown in FIG. 18, the touch panel 106 and the operation button 108 output signals corresponding to their operation to the CPU 61. By operating the touch panel 106 or the operation button 108, the user can shoot with the left-viewpoint imaging unit 30L, the middle-viewpoint imaging unit 30C, and the right-viewpoint imaging unit 30R. In shooting mode, the display 104 functions as an electronic viewfinder that displays a live-view image.
The smartphone 100 with a multi-lens camera also includes a call unit 110 and a data transmission/reception unit 112. The call unit 110 performs telephone communication via a fixed telephone network, a mobile telephone network, or the like, allowing the user to talk with users of other electronic devices. The data transmission/reception unit 112 (an example of a communication means) transmits and receives digital data via a mobile-phone base station or a wireless LAN (Local Area Network). Through it, the user can transmit and receive various digital data, and digital data of images captured by the left-viewpoint imaging unit 30L and the right-viewpoint imaging unit 30R can be transmitted to other electronic devices.
In the smartphone 100 with a multi-lens camera configured as described above, the left-viewpoint imaging unit 30L, the middle-viewpoint imaging unit 30C, and the right-viewpoint imaging unit 30R are arranged side by side in a uniaxial direction, with the reference middle-viewpoint imaging unit 30C at the center and the left-viewpoint imaging unit 30L and the right-viewpoint imaging unit 30R at the two ends. The distortion shapes of the left, middle, and right viewpoint images are mechanically matched with the middle-viewpoint imaging unit 30C as the reference.
As a result, when the height in the Y direction of the overlapping region between the middle-viewpoint image captured by the middle-viewpoint imaging unit 30C and the left-viewpoint image captured by the left-viewpoint imaging unit 30L is IL, and the height in the Y direction of the middle-viewpoint image is H, the ratio IL/H of the overlapping region to the viewpoint image satisfies the condition
96% < IL/H × 100 < 99.8%,
more preferably
98% < IL/H × 100 < 99.8%,
and even more preferably
99% < IL/H × 100 < 99.8%.
Similarly, when the height in the Y direction of the overlapping region between the middle-viewpoint image captured by the middle-viewpoint imaging unit 30C and the right-viewpoint image captured by the right-viewpoint imaging unit 30R is IR, and the height in the Y direction of the middle-viewpoint image is H, the ratio IR/H of the overlapping region to the viewpoint image satisfies the condition
96% < IR/H × 100 < 99.8%,
more preferably
98% < IR/H × 100 < 99.8%,
and even more preferably
99% < IR/H × 100 < 99.8%.
Further, the light-receiving surface of the image sensor 56C and the light-receiving surface of the image sensor 56L face directions that differ by an angle θL in the YZ plane.
Here, the angle θL satisfies the condition
0.1° < |θL| < 2°,
more preferably
0.1° < |θL| < 1°,
and even more preferably
0.1° < |θL| < 0.5°.
Similarly, the light-receiving surface of the image sensor 56C and the light-receiving surface of the image sensor 56R face directions that differ by an angle θR in the YZ plane.
Here, the angle θR satisfies the condition
0.1° < |θR| < 2°,
more preferably
0.1° < |θR| < 1°,
and even more preferably
0.1° < |θR| < 0.5°.
Furthermore, consider capturing the chart shown in FIG. 10 at a distance L of 1 meter from the smartphone 100 with a multi-lens camera. Let O1 be the center index of the overlapping region between the middle-viewpoint image and the left-viewpoint image, and let the dots arranged around the periphery of that overlapping region be peripheral indices A, B, C, and D. Let the distances in the middle-viewpoint image from the center index O1 to the peripheral indices A, B, C, and D be PA1, PB1, PC1, and PD1, and the corresponding distances in the left-viewpoint image be PA2, PB2, PC2, and PD2. Let the focal length of the imaging lenses 54C and 54L be f millimeters, the center-to-center spacing between the light-receiving surfaces of the image sensors 56C and 56L be d millimeters, and the pixel pitch of the image sensors 56C and 56L be s millimeters. Then, with ΔPA (pixels) the difference between PA1 and PA2, ΔPB the difference between PB1 and PB2, ΔPC the difference between PC1 and PC2, and ΔPD the difference between PD1 and PD2, the following conditions are satisfied:
0 < ΔPA < |0.1 × (d × f)/(1000 × s)|
0 < ΔPB < |0.1 × (d × f)/(1000 × s)|
0 < ΔPC < |0.1 × (d × f)/(1000 × s)|
0 < ΔPD < |0.1 × (d × f)/(1000 × s)|
Similarly, consider capturing the chart shown in FIG. 10 at a distance L of 1 meter. Let O2 be the center index of the overlapping region between the middle-viewpoint image and the right-viewpoint image, and let the dots arranged around the periphery of that overlapping region be peripheral indices E, F, G, and H. Let the distances in the middle-viewpoint image from the center index O2 to the peripheral indices E, F, G, and H be PE1, PF1, PG1, and PH1, and the corresponding distances in the right-viewpoint image be PE2, PF2, PG2, and PH2. Let the focal length of the imaging lenses 54C and 54R be f millimeters, the center-to-center spacing between the light-receiving surfaces of the image sensors 56C and 56R be d millimeters, and the pixel pitch of the image sensors 56C and 56R be s millimeters. Then, with ΔPE (pixels) the difference between PE1 and PE2, ΔPF the difference between PF1 and PF2, ΔPG the difference between PG1 and PG2, and ΔPH the difference between PH1 and PH2, the following conditions are satisfied:
0 < ΔPE < |0.1 × (d × f)/(1000 × s)|
0 < ΔPF < |0.1 × (d × f)/(1000 × s)|
0 < ΔPG < |0.1 × (d × f)/(1000 × s)|
0 < ΔPH < |0.1 × (d × f)/(1000 × s)|
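For the trinocular arrangement, the same binocular check simply runs once per camera pair, with the middle-viewpoint image as the reference in both cases. A sketch reusing meets_delta_p_condition from above, with purely illustrative dot coordinates and the same assumed d and s values:

```python
# Illustrative pixel coordinates only; a real check would use detected dot centers.
o1_mid, abcd_mid = (960, 540), [(60, 40), (1860, 40), (60, 1040), (1860, 1040)]
o1_left, abcd_left = (960, 540), [(61, 41), (1859, 40), (60, 1041), (1861, 1039)]
o2_mid, efgh_mid = (960, 540), [(60, 40), (1860, 40), (60, 1040), (1860, 1040)]
o2_right, efgh_right = (960, 540), [(60, 42), (1858, 41), (61, 1040), (1860, 1041)]

for name, (c1, p1, c2, p2) in {
    "middle-left": (o1_mid, abcd_mid, o1_left, abcd_left),
    "middle-right": (o2_mid, efgh_mid, o2_right, efgh_right),
}.items():
    ok = meets_delta_p_condition(c1, p1, c2, p2, d_mm=10.0, f_mm=5.0, s_mm=0.0015)
    print(name, "meets the delta-P condition:", ok)
```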
Although the smartphone 100 with a multi-lens camera has been described here using a trinocular camera with three imaging optical systems as an example, it may instead have two eyes like the multi-lens camera 10, or be a multi-lens camera with four or more eyes.
The technical scope of the present invention is not limited to the scope described in the above embodiments. The configurations and the like of the respective embodiments can be combined as appropriate without departing from the spirit of the present invention.
DESCRIPTION OF REFERENCE NUMERALS: 10 ... multi-lens camera; 30L ... left-viewpoint imaging unit; 30R ... right-viewpoint imaging unit; 50 ... imaging module; 54, 54L, 54R ... imaging lens; 56, 56L, 56R ... image sensor; 73 ... distance estimation unit; 80 ... fixing member; 80L, 80R ... holes; 82 ... support member; 100 ... smartphone; 104 ... display; 106 ... touch panel; 108 ... operation button; O ... center index; A, B, C, D ... peripheral indices

Claims (15)

1.  A multi-lens imaging device that photographs a subject with a plurality of imaging modules arranged in a uniaxial direction, wherein
     each of the plurality of imaging modules includes an imaging lens and an image sensor that converts an optical image of the subject received through the imaging lens into an image signal, the image sensor being attached with its alignment adjusted with respect to the imaging lens so as to correct the resolving power of the optical image; and
     when the height, in the direction orthogonal to the uniaxial direction in which the plurality of imaging modules are arranged, of the overlapping region between an image captured by a first imaging module serving as a reference among the plurality of imaging modules and an image captured by a second imaging module different from the first imaging module is I, and the height, in the direction orthogonal to the uniaxial direction, of the image captured by the first imaging module is H, the ratio I/H satisfies the condition
     96% < I/H × 100 < 99.8%.
2.  The multi-lens imaging device according to claim 1, wherein the I/H satisfies the condition
     98% < I/H × 100 < 99.8%.
3.  The multi-lens imaging device according to claim 1, wherein the I/H satisfies the condition
     99% < I/H × 100 < 99.8%.
4.  The multi-lens imaging device according to any one of claims 1 to 3, wherein, when the line connecting the center of the light-receiving surface of the image sensor of the first imaging module and the center of the light-receiving surface of the image sensor of the second imaging module is taken as the X axis, the axis on the light-receiving surface of the image sensor of the first imaging module that passes through the center of that light-receiving surface and is perpendicular to the X axis is taken as the Y axis, and the axis that passes through the center of the light-receiving surface of the image sensor of the first imaging module and is orthogonal to the X axis and the Y axis is taken as the Z axis,
     the inclination angle θ of the image sensor of the second imaging module with respect to the Y axis, in the YZ plane passing through the center of the light-receiving surface of the image sensor of the second imaging module, satisfies the condition
     0.1° < |θ| < 2°.
5.  The multi-lens imaging device according to claim 4, wherein the θ satisfies the condition
     0.1° < |θ| < 1°.
6.  The multi-lens imaging device according to claim 4, wherein the θ satisfies the condition
     0.1° < |θ| < 0.5°.
7.  The multi-lens imaging device according to any one of claims 1 to 6, wherein, when a chart is arranged at a position separated from the light-receiving surface of the image sensor by a predetermined distance L and the chart is photographed by the first imaging module and the second imaging module, within the overlapping region between a first image captured by the first imaging module and a second image captured by the second imaging module, the chart arranged at the center of the overlapping region is taken as a center index and a chart arranged around the periphery of the overlapping region is taken as a peripheral index; and when the center-to-center spacing between the light-receiving surface of the image sensor of the first imaging module and that of the second imaging module is d millimeters, the focal length of the imaging lens is f millimeters, the pixel pitch of the image sensor is s millimeters, and L is 1 meter, the difference Δp, in pixels, between the distance p1 from the center index to the peripheral index in the first image and the distance p2 from the center index to the peripheral index in the second image satisfies the condition
     0 < Δp < |0.1 × (d × f)/(1000 × s)|.
8.  The multi-lens imaging device according to claim 7, wherein the Δp, in pixels, satisfies the condition
     0 < Δp < |0.1 × (d × f)/(1000 × s)|
     at at least four of the peripheral indices.
9.  The multi-lens imaging device according to any one of claims 1 to 8, further comprising a parallelization processing means for applying parallelization processing to the images respectively captured by the first imaging module and the second imaging module.
10.  The multi-lens imaging device according to any one of claims 1 to 9, further comprising a distance measuring means for measuring the distance from the multi-lens imaging device to the subject based on a first image captured by the first imaging module and a second image captured by the second imaging module.
11.  The multi-lens imaging device according to any one of claims 1 to 10, further comprising a high-resolution image generating means for generating an image with a resolution higher than that of the image sensor based on the first image captured by the first imaging module and the second image captured by the second imaging module.
12.  The multi-lens imaging device according to any one of claims 1 to 11, wherein the multi-lens imaging device has two eyes.
13.  The multi-lens imaging device according to any one of claims 1 to 11, wherein the multi-lens imaging device is a trinocular device in which three imaging modules are arranged in the uniaxial direction, the first imaging module being the central imaging module and the second imaging module being the imaging modules at the two ends.
14.  An information terminal device comprising:
     the multi-lens imaging device according to any one of claims 1 to 13; and
     a communication means for transmitting and receiving digital data such as images.
15.  A method of manufacturing a multi-lens imaging device, comprising:
     a fixing step of fixing, to a support member, a first imaging module that includes a first imaging lens and a first image sensor that converts an optical image of a subject received through the first imaging lens into an image signal, the alignment of the first image sensor having been adjusted to correct the resolving power of the optical image;
     a temporary fixing step of temporarily fixing, to the support member, a second imaging module that includes a second imaging lens and a second image sensor that converts an optical image of a subject received through the second imaging lens into an image signal, the alignment of the second image sensor having been adjusted to correct the resolving power of the optical image;
     a photographing step of capturing images with the fixed first imaging module and the temporarily fixed second imaging module;
     a calculation step of calculating the distortion shape of the image captured by the first imaging module and the distortion shape of the image captured by the second imaging module; and
     an adjustment step of adjusting and fixing the orientation of the second imaging module based on the calculated distortion shapes so that the distortion shape of the image captured by the first imaging module and the distortion shape of the image captured by the second imaging module match;
     wherein, when the height, in the direction orthogonal to the uniaxial direction in which the first imaging module and the second imaging module are arranged, of the overlapping region between an image captured by the first imaging module and an image captured by the second imaging module is I, and the height, in the direction orthogonal to the uniaxial direction, of the image captured by the first imaging module is H, the ratio I/H satisfies the condition
     96% < I/H × 100 < 99.8%.
PCT/JP2015/066616 2014-09-30 2015-06-09 Multi-lens imaging device, method for manufacturing same, and information terminal device WO2016051871A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-200071 2014-09-30
JP2014200071 2014-09-30

Publications (1)

Publication Number Publication Date
WO2016051871A1 true WO2016051871A1 (en) 2016-04-07

Family

ID=55629925

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/066616 WO2016051871A1 (en) 2014-09-30 2015-06-09 Multi-lens imaging device, method for manufacturing same, and information terminal device

Country Status (1)

Country Link
WO (1) WO2016051871A1 (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10115506A (en) * 1996-10-11 1998-05-06 Fuji Heavy Ind Ltd Apparatus for adjusting stereo camera
JP2006033395A (en) * 2004-07-15 2006-02-02 Fuji Photo Film Co Ltd Image pickup device and stereoscopic image pickup system
JP2008045983A (en) * 2006-08-15 2008-02-28 Fujifilm Corp Adjustment device for stereo camera
JP2011250022A (en) * 2010-05-25 2011-12-08 Olympus Imaging Corp Camera system
JP2012023561A (en) * 2010-07-14 2012-02-02 Bi2−Vision株式会社 Control system of stereo imaging device
JP2014164363A (en) * 2013-02-22 2014-09-08 Hitachi Ltd Multiple camera photographing device and multiple camera photographing method

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018021479A1 (en) * 2016-07-29 2018-02-01 ミツミ電機株式会社 Actuator, camera module, and camera-mounted device
CN109564373A (en) * 2016-07-29 2019-04-02 三美电机株式会社 Actuator, camara module and camera carrying device
KR20190055058A (en) * 2016-07-29 2019-05-22 미쓰미덴기가부시기가이샤 Actuator, camera module and camera mounting device
US10944909B2 (en) 2016-07-29 2021-03-09 Mitsumi Electric Co., Ltd. Actuator, camera module, and camera-mounted device
CN109564373B (en) * 2016-07-29 2021-06-25 三美电机株式会社 Actuator, camera module, and camera mounting device
TWI743154B (en) * 2016-07-29 2021-10-21 日商三美電機股份有限公司 Actuator, camera module and camera-equipped device
KR102354669B1 (en) * 2016-07-29 2022-01-24 미쓰미덴기가부시기가이샤 Actuators, camera modules and camera mounts
JP2019054501A (en) * 2017-09-12 2019-04-04 イズメディア カンパニー リミテッド Method of aligning dual camera modules

Similar Documents

Publication Publication Date Title
CN102227746B (en) Stereoscopic image processing device, method, recording medium and stereoscopic imaging apparatus
JP5640143B2 (en) Imaging apparatus and imaging method
JP5744263B2 (en) Imaging apparatus and focus control method thereof
US9237319B2 (en) Imaging device and automatic focus adjustment method
EP2571246A1 (en) Camera body, interchangeable lens unit, image capturing device, method for controlling camera body, method for controlling interchangeable lens unit, program, and recording medium on which program is recorded
JP5652157B2 (en) Imaging apparatus, image processing method, and computer program
JP4914420B2 (en) Imaging device, compound eye imaging device, and imaging control method
US20120268613A1 (en) Image capturing apparatus and control method thereof
WO2013183406A1 (en) Image capture device and image display method
CN104221370B (en) Image processing apparatus, camera head and image processing method
KR20120111918A (en) Three-dimensional image capture device
CN103597811B (en) Take the image-capturing element of Three-dimensional movable image and planar moving image and be equipped with its image capturing device
WO2012036019A1 (en) Monocular 3d-imaging device, shading correction method for monocular 3d-imaging device, and program for monocular 3d-imaging device
WO2014141561A1 (en) Imaging device, signal processing method, and signal processing program
JP2012133232A (en) Imaging device and imaging control method
US11747714B2 (en) Image sensor and image-capturing device that selects pixel signal for focal position
WO2013069445A1 (en) Three-dimensional imaging device and image processing method
JP2010226362A (en) Imaging apparatus and control method thereof
WO2013145821A1 (en) Imaging element and imaging device
WO2016051871A1 (en) Multi-lens imaging device, method for manufacturing same, and information terminal device
JP7009142B2 (en) Image pickup device and image processing method
JP2010103949A (en) Apparatus, method and program for photographing
JP2014158062A (en) Imaging element for capturing stereoscopic dynamic image and plane dynamic image, and imaging device mounting this imaging element
US20130088580A1 (en) Camera body, interchangeable lens unit, image capturing device, method for controlling camera body, program, and recording medium on which program is recorded
GB2496717A (en) Image detector outputting partial- and full-resolution left and right eye stereo images in video and still photography modes

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15845992

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15845992

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP