JP2009124340A - Imaging apparatus, photographing support method, and photographing support program - Google Patents

Info

Publication number: JP2009124340A
Application number: JP2007294756A
Authority: JP (Japan)
Prior art keywords: image data, imaging, shooting, means, step
Prior art date: 2007-11-13
Legal status: Pending
Other languages: Japanese (ja)
Inventor: Kenji Funamoto (憲司 船本)
Original Assignee: Fujifilm Corp (富士フイルム株式会社)
Filing date: 2007-11-13
Priority date: 2007-11-13
Publication date: 2009-06-04

Abstract

PROBLEM TO BE SOLVED: To provide an imaging apparatus that makes accurate panoramic photographing easy.

SOLUTION: The imaging apparatus includes: a display part 23; a solid-state imaging element 5 that photographs a subject and outputs an imaging signal; a digital signal processing part 17 having a function of generating, from the imaging signal output from the solid-state imaging element 5 by shooting instructed in a panoramic shooting mode, first image data for panoramic image data generation, a function of generating, from the imaging signal output from the solid-state imaging element 5 in the panoramic shooting mode, second image data for a through image to be displayed on the display part 23, and a function of generating, from the imaging signal output from the solid-state imaging element 5 by the instructed shooting, third image data in the same format as the second image data; and a correlation detecting part 26 that detects the correlation between data in a first region set at an end of the third image data generated by the shooting instruction and data in a second region set at an end of the second image data generated after the instructed shooting.

COPYRIGHT: (C)2009, JPO&INPIT

Description

  The present invention relates to an imaging apparatus having a panoramic shooting mode for generating and recording panoramic image data obtained by combining a plurality of captured image data.

  A panoramic image is created by shooting images at fixed intervals of position or angle and combining the overlapping portions of the plurality of images obtained. For this reason, when shooting the plurality of images used for a panoramic image, after an image of one shooting range has been captured, the image of the adjacent shooting range must be shot while confirming how much it overlaps the previous shooting range.

  In practice, however, it is very difficult to capture a plurality of images that each have an appropriate overlap. For example, the photographer may pick a target object within the shooting range, judge that an image with an appropriate horizontal overlap can be obtained if that target is brought to a certain position on the liquid crystal screen, and then shoot with the target as a reference. When there is no object within the shooting range that can serve as a target, however, it is difficult to capture images with such an overlap. Moreover, when shooting the plurality of images used for a panoramic image, attention must be paid not only to the horizontal overlap between images as described above but also to vertical misalignment between them.

  As a way of supporting panoramic photography, a method has been proposed (Patent Document 1) in which the amount of movement between images is detected by a physical sensor and used to determine the shooting timing.

JP 2006-217478 A

  However, because the conventional method described above detects the amount of movement with a sensor rather than from the image data itself, its detection accuracy is poor, and the panoramic image intended by the user may not be obtained.

  The present invention has been made in view of the above circumstances, and an object thereof is to provide an imaging apparatus capable of performing panoramic photography easily and with high accuracy.

  An imaging apparatus of the present invention is an imaging apparatus having a panoramic shooting mode for generating and recording panoramic image data obtained by combining a plurality of captured image data, and includes: imaging means for photographing a subject and outputting an imaging signal; first image data generation means for generating, from an imaging signal output from the imaging means by shooting according to a shooting instruction in the panoramic shooting mode, first image data for generating the panoramic image data; second image data generation means for generating, from an imaging signal output from the imaging means by shooting other than shooting according to the shooting instruction in the panoramic shooting mode, second image data for a through image to be displayed on display means; third image data generation means for generating, from the imaging signal output from the imaging means by shooting according to the shooting instruction, third image data in the same format as the second image data; and correlation degree detection means for detecting the degree of correlation between data in a first region set at an end of the third image data and data in a second region set, in the second image data generated by the second image data generation means after the shooting according to the shooting instruction, at the end opposite to that end of the third image data.

  The imaging apparatus of the present invention includes display control means for displaying the detection result of the correlation degree by the correlation degree detection means on the display means.

  In the imaging apparatus of the present invention, the imaging unit automatically images a subject when a period in which the correlation detected by the correlation level detection unit is equal to or greater than a threshold value continues for a predetermined period.

  The image pickup apparatus of the present invention includes face detection means for detecting a human face from image data generated from the image pickup signal. When a face is detected in the first region by the face detection means, the correlation degree detection means resets the first region to a region of the third image data that does not include the face, and then detects the degree of correlation.

  The imaging apparatus of the present invention includes face detection means for detecting a human face from image data generated from the imaging signal. When a face is detected by the face detection means from the second image data generated immediately before the shooting instruction is given, the imaging means performs, in accordance with the shooting instruction, a first imaging focused on the face and a second imaging focused on the background of the face. When the first imaging and the second imaging are performed, the correlation degree detection means uses, as the image data for detecting the degree of correlation, the third image data generated from the imaging signal output from the imaging means by the second imaging. The imaging apparatus further includes image data generation means for generating, using the first image data generated from the imaging signal output from the imaging means by the first imaging and the first image data generated from the imaging signal output from the imaging means by the second imaging, image data for generating the panoramic image data in which both the face and the background are in focus.

  The imaging apparatus of the present invention includes moving object detection means for detecting a moving object from image data generated from the imaging signal. The correlation degree detection means extracts feature points from the first region and obtains the degree of correlation by comparing the feature points with the data in the second region; when a moving object is detected in the first region by the moving object detection means, the correlation degree detection means extracts the feature points from the portion of the first region excluding the moving object.

  The imaging apparatus of the present invention includes warning means for giving a warning when a moving object is detected by the moving object detection means from the same area as the first area of the second image data.

  The photographing support method of the present invention is a method for supporting shooting in the panoramic shooting mode of an imaging apparatus that includes display means and imaging means for photographing a subject and outputting an imaging signal and that has a panoramic shooting mode for generating and recording panoramic image data obtained by combining a plurality of captured image data. The method includes: a first image data generation step of generating, from an imaging signal output from the imaging means by shooting according to a shooting instruction in the panoramic shooting mode, first image data for generating the panoramic image data; a second image data generation step of generating, from an imaging signal output from the imaging means by shooting other than shooting according to the shooting instruction in the panoramic shooting mode, second image data for a through image to be displayed on the display means; a third image data generation step of generating, from the imaging signal output from the imaging means by shooting according to the shooting instruction, third image data in the same format as the second image data; and a correlation degree detection step of detecting the degree of correlation between data in a first region set at an end of the third image data and data in a second region set, in the second image data generated by the second image data generation step after the shooting according to the shooting instruction, at the end opposite to that end of the third image data.

  The photographing support method of the present invention includes a display control step for displaying the detection result of the correlation degree in the correlation degree detection step on the display means.

  In the photographing support method of the present invention, when the period in which the correlation detected in the correlation level detection step is equal to or greater than a threshold value continues for a predetermined period, the imaging unit automatically captures the subject.

  The photographing support method of the present invention includes a face detection step of detecting a human face from image data generated from the imaging signal. When a face is detected in the first region, the correlation degree detection step resets the first region to a region of the third image data that does not include the face and then detects the degree of correlation.

  The imaging support method of the present invention includes a face detection step of detecting a human face from image data generated from the imaging signal, and a control step of causing the imaging means to perform, when a face is detected from the second image data generated immediately before the shooting instruction is given, a first imaging focused on the face and a second imaging focused on the background of the face in accordance with the shooting instruction. When the first imaging and the second imaging are performed, the correlation degree detection step uses, as the image data for detecting the degree of correlation, the third image data generated from the imaging signal output from the imaging means by the second imaging. The method further includes an image data generation step of generating, using the first image data generated from the imaging signal output from the imaging means by the first imaging and the first image data generated from the imaging signal output from the imaging means by the second imaging, image data for generating the panoramic image data in which both the face and the background are in focus.

  The imaging support method of the present invention includes a moving object detection step of detecting a moving object from image data generated from the imaging signal. The correlation degree detection step extracts feature points from the first region and obtains the degree of correlation by comparing the feature points with the data in the second region; when a moving object is detected in the first region, the correlation degree detection step extracts the feature points from the portion excluding the moving object.

  The imaging support method of the present invention includes a moving object detection step of detecting a moving object from image data generated from the imaging signal, and a warning step of giving a warning when a moving object is detected from the same area as the first region of the second image data.

  The imaging support method according to the present invention includes a warning step for giving a warning when a moving object is detected from the same area as the first area of the second image data.

  The shooting support program of the present invention is a program for causing a computer to execute each step of the shooting support method.

  ADVANTAGE OF THE INVENTION: According to the present invention, an imaging apparatus capable of performing panoramic shooting easily and with high accuracy can be provided.

  Embodiments of the present invention will be described below with reference to the drawings.

(First embodiment)
FIG. 1 is a diagram showing a schematic configuration of a digital camera which is an example of an imaging apparatus for explaining a first embodiment of the present invention. The digital camera of the present embodiment has a panoramic shooting mode in which panoramic image data obtained by combining a plurality of captured image data is generated and recorded.
The imaging system of the digital camera shown in the figure includes a photographing lens 1, a solid-state imaging device 5 such as a CCD (Charge Coupled Device) type image sensor, and an aperture 2, an infrared cut filter 3, and an optical low-pass filter 4 provided between them. The photographing lens 1, the aperture 2, the infrared cut filter 3, the optical low-pass filter 4, and the solid-state imaging device 5 constitute an imaging unit that photographs a subject and outputs an imaging signal.

  A system control unit 11, which performs overall control of the electrical control system of the digital camera, controls the flash light emitting unit 12 and the light receiving unit 13, controls the lens driving unit 8 to adjust the position of the photographing lens 1 for focusing and zooming, and adjusts the exposure amount by adjusting the opening of the aperture 2 via the aperture drive unit 9.

  Further, the system control unit 11 drives the solid-state imaging device 5 via the imaging device driving unit 10 and outputs a subject image captured through the photographing lens 1 as an imaging signal. An instruction signal (including a photographing instruction signal) from the user is input to the system control unit 11 through the operation unit 14.

  The electric control system of the digital camera further includes an analog signal processing unit 6, connected to the output of the solid-state imaging device 5, that performs analog signal processing such as correlated double sampling, and an A/D conversion circuit 7 that converts the RGB color signals output from the analog signal processing unit 6 into digital signals; both are controlled by the system control unit 11.

  Further, the electric control system of the digital camera includes a main memory 16, a memory control unit 15 connected to the main memory 16, a digital signal processing unit 17 that generates image data by performing interpolation, gamma correction, RGB/YC conversion, and other processing on the imaging signal output from the solid-state imaging device 5, a compression/expansion processing unit 18 that compresses the image data generated by the digital signal processing unit 17 into JPEG format or decompresses compressed image data, a panoramic image generation unit 19 that generates panoramic image data, an external memory control unit 20 to which a detachable recording medium 21 is connected, a display control unit 22 to which a display unit 23 (display means such as a liquid crystal panel mounted on the back of the camera) is connected, and a correlation degree detection unit 26. These units are interconnected by a control bus 24 and a data bus 25 and are controlled by commands from the system control unit 11.

  The digital signal processing unit 17 has a first image data generation function of generating, from the imaging signal output from the solid-state imaging device 5 by shooting according to a shooting instruction given through the operation unit 14 in the panoramic shooting mode, first image data for generating panoramic image data; a second image data generation function of generating, from the imaging signal output from the solid-state imaging device 5 by shooting other than shooting according to the shooting instruction (shooting for through-image acquisition) in the panoramic shooting mode, second image data for a through image to be displayed on the display unit 23; and a third image data generation function of generating, from the imaging signal output from the solid-state imaging device 5 by shooting according to the shooting instruction, third image data in the same format as the second image data (that is, image data obtained by applying to the imaging signal the same signal processing as that used to generate the second image data).

  The panoramic image generation unit 19 generates a single piece of panoramic image data by superimposing the end portions of the plurality of first image data generated by the digital signal processing unit 17, and records the panoramic image data on the recording medium 21 via the external memory control unit 20.

  The correlation degree detection unit 26 detects the degree of correlation between the data in the first region set at an end of the third image data generated in response to the shooting instruction and the data in a second region, of the same size as the first region, set at the opposite end of the second image data generated after the shooting according to that instruction. Specifically, feature points (object edges, corners, and the like) are extracted from the image data in the first region, and it is determined whether the extracted feature points exist in the second region. If an extracted feature point is found in the second region, the degree of correlation is detected from how much the position of the feature point in the second region differs from its position in the first region. For example, the degree of correlation is maximized when the coordinates of a feature point in the first region and of the corresponding feature point in the second region coincide exactly, and the degree of correlation decreases as the coordinate deviation between the two feature points increases.
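
To make the above concrete, the following Python/NumPy sketch shows one way such a correlation degree could be computed from two equally sized grayscale end regions. It is only an illustration of the idea described here, not the patented implementation: the feature extractor, the patch-matching search, and the mapping from average coordinate deviation to a score in [0, 1] are all assumptions of this sketch.

```python
import numpy as np

def extract_feature_points(region, num_points=16, patch=8):
    """Pick the strongest gradient-magnitude locations as simple stand-in 'feature points'
    (edges/corners); keep them away from the border so a patch fits around each."""
    if region.shape[0] <= 2 * patch or region.shape[1] <= 2 * patch:
        return []
    gy, gx = np.gradient(region.astype(np.float64))
    strength = np.hypot(gx, gy)
    mask = np.zeros_like(strength, dtype=bool)
    mask[patch:-patch, patch:-patch] = True
    strength = np.where(mask, strength, 0.0)
    flat = np.argsort(strength, axis=None)[-num_points:]
    return [np.unravel_index(i, strength.shape) for i in flat]

def match_point(template_region, search_region, pt, patch=8, search=16):
    """Locate the patch around pt (taken from template_region) inside search_region
    by exhaustive search over a small window; returns the best-matching position."""
    y, x = pt
    tmpl = template_region[y - patch:y + patch, x - patch:x + patch].astype(np.float64)
    best_err, best_pos = np.inf, None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy - patch < 0 or xx - patch < 0:
                continue
            if yy + patch > search_region.shape[0] or xx + patch > search_region.shape[1]:
                continue
            cand = search_region[yy - patch:yy + patch, xx - patch:xx + patch].astype(np.float64)
            err = float(np.mean((cand - tmpl) ** 2))
            if err < best_err:
                best_err, best_pos = err, (yy, xx)
    return best_pos

def correlation_degree(first_region, second_region):
    """Correlation is highest when matched feature points keep identical coordinates in
    both regions and falls off as the average coordinate deviation grows."""
    deviations = []
    for pt in extract_feature_points(first_region):
        pos = match_point(first_region, second_region, pt)
        if pos is not None:
            deviations.append(float(np.hypot(pos[0] - pt[0], pos[1] - pt[1])))
    if not deviations:
        return 0.0
    return 1.0 / (1.0 + sum(deviations) / len(deviations))  # 1.0 = perfect coincidence
```

A returned value near 1.0 would indicate that the current through image lines up with the previously captured frame almost exactly at the intended seam, which is the condition the user is asked to look for.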

  The correlation degree detection unit 26 displays the detected degree of correlation on the through image being displayed on the display unit 23 via the display control unit 22.

Next, the operation of the digital camera in the panoramic shooting mode will be described.
FIG. 2 is a flowchart for explaining the operation of the digital camera of the first embodiment in the panoramic shooting mode. FIG. 3 is a diagram for explaining the operation of the digital camera according to the first embodiment in the panoramic shooting mode. FIG. 4 is a diagram illustrating an example of a screen displayed on the display unit 23 in the panoramic shooting mode of the digital camera according to the first embodiment. In the following description, it is assumed that the panoramic image generation unit 19 generates panoramic image data by superimposing n (n is a natural number of 2 or more) pieces of first image data.

  When the release button included in the operation unit 14 is fully pressed and a photographing instruction is issued (step S1: YES), the system control unit 11 controls the image sensor driving unit 10 to photograph the subject with the solid-state image sensor 5 (step S2).

  If the shooting in step S2 is not the n-th shooting since the panoramic shooting mode was set (step S3: NO), the digital signal processing unit 17 generates, from the imaging signal output from the solid-state imaging device 5 in the shooting of step S2, first image data for generating the panoramic image data and third image data (step S4), and temporarily stores them in the main memory 16.

  Next, the digital signal processing unit 17 generates second image data (image data for a through image to be displayed on the display unit 23) from the imaging signal output from the solid-state imaging device 5 after the shooting in step S2 (step S5), and an image based on the generated second image data is displayed on the display unit 23 as a through image via the display control unit 22.

  Next, the digital signal processing unit 17 sets the first region at an end of the third image data generated in step S4 (for example, the right end, as shown in FIG. 3), sets a second region of the same size as the first region at the opposite end of the second image data generated in step S5 (the left end, as shown in FIG. 3), and detects the degree of correlation between the data in the first region and the data in the second region.

  For example, the digital signal processing unit 17 extracts feature points from an area inside the first region and searches for the extracted feature points in the second region. The search range for each feature point is a predetermined range inside the second region. When a feature point is found within that range, its position coordinates in the second region are compared with its position coordinates in the first region, and the degree of correlation between the data in the first region and the data in the second region is detected (step S6).

  Next, the digital signal processing unit 17 superimposes the correlation degree information detected in step S6 on the through image, based on the second image data generated in step S5, that is displayed on the display unit 23 (step S7). For example, as shown in FIG. 4, black bars are displayed on the through image, and the degree of correlation is expressed by the number of bars.
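
For illustration only, a display routine of the kind described here might quantize the correlation degree into a bar count along these lines (the function name and the five-bar scale are assumptions of this sketch, not taken from the patent):

```python
def correlation_to_bars(correlation, max_bars=5):
    """Map a correlation degree in [0.0, 1.0] to a number of indicator bars
    to draw on the through image (hypothetical helper; the bar count is arbitrary)."""
    correlation = max(0.0, min(1.0, correlation))
    return int(round(correlation * max_bars))
```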

  The processing in steps S5 to S7 is repeated until a shooting instruction is given. If there is a shooting instruction in step S8, the process proceeds to step S2.

  When the shooting in step S2 is the n-th shooting since the panoramic shooting mode was set (step S3: YES), the digital signal processing unit 17 generates, from the imaging signal output from the solid-state imaging device 5 in the shooting of step S2, first image data for generating the panoramic image data (step S9), and temporarily stores it in the main memory 16.

  Then, the panoramic image generation unit 19 performs a process of superimposing the end portions of each of the n pieces of first image data stored in the main memory 16 to generate the panoramic image data (step S10), and the panoramic shooting mode ends.

  For example, when n = 3, the panoramic image generation unit 19 sets a third region (the first region that was set in the third image data, enlarged to the size of the first image data) at the right end of the first image data obtained by the first shooting, and sets a fourth region (the second region that was set in the second image data, enlarged to the size of the first image data) at the left end of the first image data obtained by the second shooting; it likewise sets the third region at the right end of the first image data obtained by the second shooting and the fourth region at the left end of the first image data obtained by the third shooting. The panoramic image data is then generated by superimposing the third region of the first image data obtained by the first shooting on the fourth region of the first image data obtained by the second shooting, and superimposing the third region of the first image data obtained by the second shooting on the fourth region of the first image data obtained by the third shooting.
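
As a rough illustration of the superimposing step (and only that — the blend below is a generic linear cross-fade, not the method claimed in the patent), the n pieces of first image data could be chained like this, assuming equally sized H×W×3 NumPy arrays and a fixed overlap width:

```python
import numpy as np

def stitch_pair(left_img, right_img, overlap):
    """Superimpose the right-end strip of left_img on the left-end strip of right_img,
    blending linearly across the overlap (illustrative; assumes H x W x 3 arrays of equal height)."""
    alpha = np.linspace(1.0, 0.0, overlap)[None, :, None]          # weight of the left image
    blended = (alpha * left_img[:, -overlap:].astype(np.float64)
               + (1.0 - alpha) * right_img[:, :overlap].astype(np.float64))
    return np.concatenate([left_img[:, :-overlap],
                           blended.astype(left_img.dtype),
                           right_img[:, overlap:]], axis=1)

def build_panorama(first_images, overlap=64):
    """Chain the n pieces of first image data (n = 3 in the example above) into one panorama."""
    panorama = first_images[0]
    for img in first_images[1:]:
        panorama = stitch_pair(panorama, img, overlap)
    return panorama
```

Calling build_panorama([img1, img2, img3]) with the three first-image-data frames from the n = 3 example would yield one panoramic array in which each seam lies inside the overlapped third/fourth regions.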

  As described above, according to the digital camera of the present embodiment, when a shooting instruction is given after the panoramic shooting mode has been set, the degree of correlation between the image data (in the through-image format) generated from the imaging signal obtained in response to that instruction and the through-image data generated after the instruction is detected, and the detected degree of correlation is displayed on the through image. The user can therefore easily obtain an image that overlaps the image captured immediately before by moving the digital camera while searching for the point at which the degree of correlation is maximized and issuing a shooting instruction at that point. Because the digital camera of the present embodiment notifies the user of the optimal shooting timing based on the image data itself rather than on a sensor that detects camera movement, the panoramic image can be created with high accuracy.

(Second embodiment)
The digital camera of the second embodiment of the present invention is the digital camera of the first embodiment modified so that shooting is performed automatically when the detected degree of correlation remains equal to or greater than a threshold value for a predetermined period. The configuration of the digital camera of the second embodiment is the same as that shown in FIG. 1.

FIG. 5 is a flowchart for explaining the operation in the panoramic shooting mode of the digital camera of the second embodiment. In FIG. 5, the same processes as those in the flowchart of the first embodiment are denoted by the same step numbers.
When the degree of correlation is detected in step S6, the system control unit 11 determines whether or not the degree of correlation remains equal to or greater than a threshold for a predetermined period (for example, a period during which a few frames of the through image are displayed) (step S17). When the degree of correlation has remained equal to or greater than the threshold for the predetermined period (step S17: YES), the system control unit 11 causes the solid-state imaging device 5 to shoot the subject regardless of whether there is a shooting instruction from the operation unit 14 (step S18).
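
A minimal sketch of this auto-release behaviour is given below; the class name, the threshold of 0.8, and the frame count are illustrative assumptions, with only the rule itself (shoot once the correlation has stayed at or above a threshold for a run of consecutive through-image frames) taken from the text.

```python
class AutoShutter:
    """Trigger a capture once the correlation degree has stayed at or above a threshold
    for a given number of consecutive through-image frames (sketch of the second
    embodiment's behaviour; all names and default values are illustrative)."""

    def __init__(self, threshold=0.8, required_frames=5):
        self.threshold = threshold
        self.required_frames = required_frames
        self._streak = 0

    def update(self, correlation):
        """Call once per through-image frame; returns True when the camera should shoot."""
        if correlation >= self.threshold:
            self._streak += 1
        else:
            self._streak = 0
        return self._streak >= self.required_frames
```

The camera loop would call update() once per through-image frame with the latest correlation degree and release the shutter automatically when it returns True.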

  As described above, according to the digital camera of the present embodiment, shooting is performed automatically when the detected degree of correlation maintains a high value for a predetermined period. The user can therefore obtain an image that overlaps the image captured immediately before simply by moving the digital camera to a point where a high correlation value is maintained, and can easily perform panoramic shooting.

(Third embodiment)
FIG. 6 is a diagram showing a schematic configuration of a digital camera which is an example of an imaging apparatus for explaining the third embodiment of the present invention.
The digital camera shown in FIG. 6 has a configuration in which a face detection unit 27 is added to the digital camera shown in FIG.

  The face detection unit 27 detects the face of a person from image data generated from the imaging signal output from the solid-state imaging device 5. A known process can be used for the face detection process.

Next, the operation in the panoramic shooting mode of the digital camera of the present embodiment will be described.
FIG. 7 is a flowchart for explaining the operation in the panoramic shooting mode of the digital camera of the third embodiment. FIG. 8 is a diagram for explaining the operation in the panoramic shooting mode of the digital camera of the third embodiment. In FIG. 7, the same processes as those in the flowchart of the first embodiment are denoted by the same step numbers.
After step S4, the face detection unit 27 performs face detection on the third image data generated in step S4 (step S21). Next, the digital signal processing unit 17 determines whether or not a face is included in the first region to be set in the third image data. When a face is included in the first region (step S22: YES), the first region is reset to a region of the third image data that excludes the face (step S23), and the processes from step S5 onward are performed. The reset first region is set as close as possible to the end of the third image data.
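
The region-reset rule can be illustrated with a deliberately simplified one-dimensional sketch that treats the first region and the detected face as horizontal pixel intervals (x0, x1); real regions are of course two-dimensional, and the tie-breaking here (prefer the strip nearest the right end, fall back to the strip left of the face) is an assumption consistent with, but not dictated by, the text above.

```python
def reset_first_region(first_region, face_box):
    """Return a new horizontal interval for the first region that excludes the detected
    face while staying as close as possible to the right end of the third image data.
    Intervals are (x0, x1) pixel ranges; hypothetical helper, edge cases ignored."""
    rx0, rx1 = first_region
    fx0, fx1 = face_box
    if fx1 <= rx0 or fx0 >= rx1:
        return (rx0, rx1)      # the face does not overlap the region: keep it unchanged
    if fx1 < rx1:
        return (fx1, rx1)      # keep the strip between the face and the right end
    return (rx0, fx0)          # face reaches the right edge: keep the strip on its left
```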

  When the face is not included in the first area (step S22: NO), the digital signal processing unit 17 performs the processing after step S5.

  When generating the panoramic image data in step S10, the panoramic image generation unit 19 of the digital camera of the present embodiment differs from the first embodiment in that, for first image data obtained by shooting performed after the processing of step S23, the third region is the first region reset in step S23 enlarged according to the size of the first image data (that is, a region excluding the face). In other words, for the first image data obtained by shooting after step S23, the superimposing process is not applied to the portion in which the face appears.

  As described above, according to the digital camera of the present embodiment, the images are prevented from being joined at a portion where a person appears, so a more natural panoramic image can be generated. In the present embodiment as well, shooting can be performed automatically when, after step S6 in FIG. 7, the period in which the degree of correlation is equal to or greater than the threshold continues for a predetermined period.

(Fourth embodiment)
The configuration of the digital camera according to the fourth embodiment of the present invention is the same as that shown in FIG.

FIG. 9 is a flowchart for explaining the operation in the panoramic shooting mode of the digital camera of the fourth embodiment. In FIG. 9, the same processes as those in the flowcharts of the preceding embodiments are denoted by the same step numbers. FIG. 10 is a diagram for explaining the operation of the digital camera according to the fourth embodiment in the panoramic shooting mode.
When the panoramic shooting mode is set, imaging signals are continuously output from the solid-state imaging device 5, second image data is generated by the digital signal processing unit 17, and a through image based on that image data is displayed on the display unit 23. When there is a shooting instruction in step S1, the face detection unit 27 performs face detection on the second image data generated by the digital signal processing unit 17 immediately before the shooting instruction was given (step S31).

  When a face is detected (step S32: YES), the system control unit 11 controls the lens driving unit 8 and the image sensor driving unit 10 to perform a first shooting focused on the face of the subject and a second shooting focused on the background of the face (step S33).

  Next, the digital signal processing unit 17 generates first image data from each of the imaging signals obtained by the first shooting and the second shooting (step S34). The digital signal processing unit 17 then generates, using the two pieces of generated first image data, new first image data in which both the face and the background are in focus (step S35), and temporarily stores it in the main memory 16 as image data for generating the panoramic image data.

A method for generating new first image data will be described.
The digital signal processing unit 17 detects the contrast of the pixel data in each of the first image data generated from the imaging signal obtained by the first shooting and the first image data generated from the imaging signal obtained by the second shooting. It then selects, for each pixel position, the pixel data with the higher contrast from the two pieces of first image data, and generates new first image data from the selected pixel data.
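
The per-pixel selection can be sketched as follows for single-channel (luminance) NumPy arrays (NumPy 1.20+); using the variance of a small neighbourhood as the "contrast" measure is an assumption of this sketch, since the text does not specify how the contrast of pixel data is computed.

```python
import numpy as np

def local_contrast(img, k=3):
    """Local contrast as the variance of a k x k neighbourhood around each pixel
    (one simple proxy for the 'contrast of pixel data' mentioned above)."""
    img = img.astype(np.float64)
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    return windows.var(axis=(-1, -2))

def merge_by_contrast(face_focused, background_focused):
    """New first image data: at each pixel, keep the value from whichever of the two
    captures shows the higher local contrast there."""
    mask = local_contrast(face_focused) >= local_contrast(background_focused)
    return np.where(mask, face_focused, background_focused)
```

merge_by_contrast(face_shot, background_shot) would then return the new first image data in which the sharply focused pixels of each capture survive.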

  If the shooting performed in step S33 is not the n-th shooting (step S36: NO), the digital signal processing unit 17 generates third image data from the imaging signal obtained by the second shooting (step S37).

  Next, the digital signal processing unit 17 generates second image data from the imaging signal output from the solid-state imaging device 5 after imaging in step S33 (step S38).

  Next, the digital signal processing unit 17 sets the first region at an end of the third image data generated in step S37 (for example, the right end, as shown in FIG. 3), sets a second region of the same size as the first region at the opposite end of the second image data generated in step S38 (the left end, as shown in FIG. 3), and detects the degree of correlation between the data in the first region and the data in the second region (step S39). The correlation degree detection method is as described in the first embodiment.

  Next, the digital signal processing unit 17 superimposes the correlation degree information detected in step S39 on the through image, based on the second image data generated in step S38, that is displayed on the display unit 23 (step S40).

  The processing in steps S38 to S40 is repeated until a shooting instruction is given. If there is a shooting instruction in step S41, the process proceeds to step S31.

  If no face is detected in step S32, the processing from steps S2 to S7 in FIG. 2 is performed. If there is a shooting instruction after step S7 (step S48: YES), the process proceeds to step S31. If there is no shooting instruction (step S48: NO), the process proceeds to step S5.

  When the shooting in step S33 or the shooting in step S2 is the n-th shooting since the panoramic shooting mode was set (step S36: YES, step S3: YES), the digital signal processing unit 17 performs the process of step S9 described in the first embodiment.

  As described above, according to the digital camera of the present embodiment, when the subject includes a face at the time the shooting instruction is given, shooting is performed with focus on the face and with focus on the background, first image data in which both the face and the background are in focus is generated, and the panoramic image data is generated using that first image data. Panoramic photography generally covers a wide range, and the scene often contains both people and scenery; if the person is in focus, the surrounding scenery is blurred, and conversely, if the scenery is in focus, the person is blurred. Therefore, as in the present embodiment, a good panoramic image can be obtained by generating first image data in which both the person and the scenery are in focus.

  In the present embodiment as well, it is possible to automatically perform photographing when a period in which the degree of correlation is equal to or greater than a threshold continues for a predetermined period.

(Fifth embodiment)
FIG. 11 is a diagram showing a schematic configuration of a digital camera which is an example of an imaging apparatus for explaining the fifth embodiment of the present invention.
The digital camera shown in FIG. 11 has a configuration in which a moving object detection unit 28 is added to the digital camera shown in FIG.
The moving object detection unit 28 detects a moving object (a subject that is moving) from the image data generated from the imaging signal output from the solid-state imaging device 5. A well-known moving object detection process can be used.

FIG. 12 is a flowchart for explaining the operation of the digital camera according to the fifth embodiment in the panoramic shooting mode. In FIG. 12, the same processes as those in the flowcharts of the preceding embodiments are denoted by the same step numbers. FIG. 13 is a diagram for explaining the operation of the digital camera of the fifth embodiment in the panoramic shooting mode.
After step S4, the moving object detection unit 28 compares the second image data generated immediately before the shooting instruction of step S1 with the third image data obtained by the shooting of step S2, and detects whether or not there is a moving object in the first region of the third image data (step S51).

  For example, the moving object detection unit 28 sets the first region in the third image data generated in step S4, divides the first region into a plurality of blocks, and obtains the motion vector of each block with respect to the second image data generated immediately before the shooting instruction of step S1. If all the obtained motion vectors point in the same direction, it is determined that there is no moving object in the first region; if any of the obtained motion vectors points in a direction different from the others, it is determined that there is a moving object in the first region.
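
One way such a block-wise check could look in code is sketched below, assuming the first region and the same-position crop of the earlier second image data are aligned grayscale NumPy arrays; the exhaustive block-matching search, the block size, and the median-based "dominant direction" test are assumptions of this sketch rather than details given in the text.

```python
import numpy as np

def block_motion_vector(block, reference, top_left, search=8):
    """Exhaustive block matching: motion vector (dy, dx) of one block against the reference crop."""
    y0, x0 = top_left
    bh, bw = block.shape
    best_err, best_vec = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + bh > reference.shape[0] or x + bw > reference.shape[1]:
                continue
            err = np.mean((reference[y:y + bh, x:x + bw].astype(np.float64)
                           - block.astype(np.float64)) ** 2)
            if err < best_err:
                best_err, best_vec = err, (dy, dx)
    return best_vec

def region_has_moving_object(first_region, reference, block_size=16, tolerance=2):
    """Divide the first region into blocks and report a moving object when any block's
    motion vector departs from the dominant (median) direction."""
    vectors = []
    h, w = first_region.shape
    for y in range(0, h - block_size + 1, block_size):
        for x in range(0, w - block_size + 1, block_size):
            block = first_region[y:y + block_size, x:x + block_size]
            vectors.append(block_motion_vector(block, reference, (y, x)))
    if not vectors:
        return False
    vectors = np.asarray(vectors, dtype=np.float64)
    dominant = np.median(vectors, axis=0)
    return bool(np.any(np.abs(vectors - dominant).sum(axis=1) > tolerance))
```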

  If no moving object is detected in the first region in step S51, the correlation degree detection unit 26 extracts feature points from the first region set in the third image data (step S52). On the other hand, if a moving object is detected in the first region in step S51, the correlation degree detection unit 26 extracts feature points from the first region set in the third image data excluding the detected moving-object portion (the blocks whose motion vectors differ) (step S57).

  Next, the digital signal processing unit 17 generates second image data from the imaging signal output from the solid-state imaging device 5 after the shooting in step S2 (step S53), and an image based on the generated second image data is displayed as a through image on the display unit 23 via the display control unit 22.

  Next, the correlation degree detection unit 26 compares the feature points extracted from the first region of the third image data with the data in the second region set in the second image data generated in step S53, and detects the degree of correlation between the first region of the third image data and the second region of the second image data (step S54).

  Next, the digital signal processing unit 17 superimposes the correlation degree information detected in step S54 on the through image, based on the second image data generated in step S53, that is displayed on the display unit 23 (step S55).

  The processing in steps S53 to S55 is repeated until a shooting instruction is given. If there is a shooting instruction in step S56, the process proceeds to step S2.

  As described above, according to the digital camera of the present embodiment, when a moving object is included in the first region of the third image data, feature points are extracted from the region excluding the moving-object portion and the degree of correlation is detected. For this reason, even when a moving object is included in the subject, erroneous detection of the degree of correlation can be prevented, and a good panoramic image can be obtained.

  In the present embodiment, whether or not a moving object is included in the first region is determined after step S4; however, moving object detection may instead be performed continuously during through-image display, and when it is detected that a moving object is included in the first region set at the end of the through-image data, the user may be warned to that effect. As a warning method, a sound may be emitted from a speaker or characters may be displayed on the display unit 23. By following this warning and waiting until the moving object has moved away before shooting, the user can prevent erroneous detection of the degree of correlation.

  It is also possible to combine the above-described process of always performing moving object detection and giving a warning with the process shown in FIG. 12. In the process shown in FIG. 12, when a moving object is included in the first region, the number of feature points decreases, so the detection accuracy of the degree of correlation drops slightly. By also giving a warning, the user can choose between shooting immediately while accepting the reduced number of feature points and the slightly lower correlation accuracy, and waiting until the moving object has moved away so as to give priority to correlation accuracy; this improves the convenience of the digital camera.

  Further, the above-described “process for always performing moving object detection and giving a warning” can be applied to the digital cameras described in the first to fourth embodiments.

  In the present embodiment as well, it is possible to automatically perform photographing when a period in which the degree of correlation is equal to or greater than a threshold continues for a predetermined period.

  Each process in the digital cameras described in the first to fifth embodiments is executed, in accordance with a program stored in a ROM of the digital camera, by a processor such as the CPU (central processing unit) constituting the system control unit 11 and the DSP (digital signal processor) constituting the digital signal processing unit 17.

Brief description of the drawings
FIG. 1 is a diagram showing the schematic configuration of a digital camera as an example of the imaging apparatus for explaining the first embodiment of the present invention.
FIG. 2 is a flowchart for explaining the operation of the digital camera of the first embodiment of the present invention in the panoramic shooting mode.
FIG. 3 is a diagram for explaining the operation of the digital camera of the first embodiment of the present invention in the panoramic shooting mode.
FIG. 4 is a diagram showing an example of a screen displayed on the display unit in the panoramic shooting mode of the digital camera of the first embodiment of the present invention.
FIG. 5 is a flowchart for explaining the operation of the digital camera of the second embodiment of the present invention in the panoramic shooting mode.
FIG. 6 is a diagram showing the schematic configuration of a digital camera as an example of the imaging apparatus for explaining the third embodiment of the present invention.
FIG. 7 is a flowchart for explaining the operation of the digital camera of the third embodiment of the present invention in the panoramic shooting mode.
FIG. 8 is a diagram for explaining the operation of the digital camera of the third embodiment of the present invention in the panoramic shooting mode.
FIG. 9 is a flowchart for explaining the operation of the digital camera of the fourth embodiment of the present invention in the panoramic shooting mode.
FIG. 10 is a diagram for explaining the operation of the digital camera of the fourth embodiment of the present invention in the panoramic shooting mode.
FIG. 11 is a diagram showing the schematic configuration of a digital camera as an example of the imaging apparatus for describing the fifth embodiment of the present invention.
FIG. 12 is a flowchart for explaining the operation of the digital camera of the fifth embodiment of the present invention in the panoramic shooting mode.
FIG. 13 is a diagram for explaining the operation of the digital camera of the fifth embodiment of the present invention in the panoramic shooting mode.

Explanation of symbols

5: Solid-state image sensor
17: Digital signal processing unit
23: Display unit
26: Correlation degree detection unit

Claims (17)

  1. An imaging apparatus having a panoramic shooting mode for generating and recording panoramic image data obtained by combining a plurality of captured image data, the imaging apparatus comprising:
    Display means;
    An imaging means for photographing a subject and outputting an imaging signal;
    First image data generating means for generating first image data for generating panoramic image data from an image pickup signal output from the image pickup means by shooting according to a shooting instruction in the panorama shooting mode;
    Second image data generation means for generating second image data for a through image to be displayed on the display means from an imaging signal output from the imaging means by shooting other than shooting by the shooting instruction in the panoramic shooting mode;
    Third image data generating means for generating third image data in the same format as the second image data from the imaging signal output from the imaging means by imaging according to the imaging instruction;
    Correlation degree detection means for detecting a degree of correlation between data in a first region set at an end of the third image data and data in a second region set, in the second image data generated by the second image data generation means after the shooting according to the shooting instruction, at an end opposite to said end of the third image data.
  2. The imaging apparatus according to claim 1,
    An imaging apparatus comprising display control means for causing the display means to display a detection result of the correlation degree by the correlation degree detection means.
  3. The imaging apparatus according to claim 1,
    An imaging apparatus in which the imaging unit automatically images a subject when a period in which the correlation level detected by the correlation level detection unit is equal to or greater than a threshold value continues for a predetermined period.
  4. The imaging apparatus according to any one of claims 1 to 3,
    Comprising face detection means for detecting a human face from image data generated from the imaging signal,
    wherein, when a face is detected in the first region by the face detection means, the correlation degree detection means resets the first region to a region of the third image data that does not include the face and detects the degree of correlation.
  5. The imaging apparatus according to any one of claims 1 to 3,
    Comprising face detection means for detecting a human face from image data generated from the imaging signal,
    wherein, when a face is detected by the face detection means from the second image data generated immediately before the shooting instruction is given, the imaging means performs, in accordance with the shooting instruction, a first imaging focused on the face and a second imaging focused on the background of the face,
    wherein, when the first imaging and the second imaging are performed, the correlation degree detection means uses the third image data generated from the imaging signal output from the imaging means by the second imaging as the image data for detecting the degree of correlation, and
    wherein the imaging apparatus further comprises image data generation means for generating, using the first image data generated from the imaging signal output from the imaging means by the first imaging and the first image data generated from the imaging signal output from the imaging means by the second imaging, image data for generating the panoramic image data in which both the face and the background are in focus.
  6. The imaging apparatus according to any one of claims 1 to 3,
    A moving object detecting means for detecting a moving object from the image data generated from the imaging signal;
    The correlation degree detection means extracts a feature point from the first area, compares the feature point with data in the second area, and determines the correlation degree.
    When the moving object is detected from the first region by the moving object detection unit, the correlation degree detection unit extracts the feature point from a portion excluding the moving object.
  7. The imaging device according to any one of claims 1 to 5,
    Moving object detection means for detecting a moving object from image data generated from the imaging signal;
    An imaging apparatus comprising: a warning unit that issues a warning when a moving object is detected by the moving object detection unit from the same region as the first region of the second image data.
  8. The imaging apparatus according to claim 6,
    An imaging apparatus comprising warning means for giving a warning when a moving object is detected by the moving object detection means from the same area as the first area of the second image data.
  9. A shooting support method for supporting shooting in the panoramic shooting mode of an imaging apparatus comprising display means, imaging means for photographing a subject and outputting an imaging signal, and a panoramic shooting mode for generating and recording panoramic image data obtained by combining a plurality of captured image data, the method comprising:
    A first image data generating step for generating first image data for generating the panoramic image data from an image pickup signal output from the image pickup means by shooting according to a shooting instruction in the panorama shooting mode;
    A second image data generation step of generating second image data for a through image to be displayed on the display unit from an imaging signal output from the imaging unit by shooting other than shooting by the shooting instruction in the panoramic shooting mode;
    A third image data generation step of generating third image data in the same format as the second image data from the imaging signal output from the imaging means by imaging according to the imaging instruction;
    and a correlation degree detection step of detecting a degree of correlation between data in a first region set at an end of the third image data and data in a second region set, in the second image data generated by the second image data generation step after the shooting according to the shooting instruction, at an end opposite to said end of the third image data.
  10. The shooting support method according to claim 9,
    An imaging support method comprising a display control step for displaying the detection result of the correlation degree in the correlation degree detection step on the display means.
  11. The shooting support method according to claim 9,
    An imaging support method for automatically imaging a subject by the imaging unit when a period in which the correlation level detected in the correlation level detection step is equal to or greater than a threshold value continues for a predetermined period.
  12. The shooting support method according to any one of claims 9 to 11,
    A face detection step of detecting a human face from image data generated from the imaging signal,
    wherein, when a face is detected in the first region, the correlation degree detection step resets the first region to a region of the third image data that does not include the face and detects the degree of correlation.
  13. The shooting support method according to any one of claims 9 to 11,
    A face detection step of detecting a human face from image data generated from the imaging signal,
    the method further comprising a control step of causing the imaging means to perform, when a face is detected from the second image data generated immediately before the shooting instruction is given, a first imaging focused on the face and a second imaging focused on the background of the face in accordance with the shooting instruction,
    wherein, when the first imaging and the second imaging are performed, the correlation degree detection step uses the third image data generated from the imaging signal output from the imaging means by the second imaging as the image data for detecting the degree of correlation,
    the method further comprising an image data generation step of generating, using the first image data generated from the imaging signal output from the imaging means by the first imaging and the first image data generated from the imaging signal output from the imaging means by the second imaging, image data for generating the panoramic image data in which both the face and the background are in focus.
  14. The shooting support method according to any one of claims 9 to 11,
    A moving object detection step of detecting a moving object from image data generated from the imaging signal;
    In the correlation degree detection step, a feature point is extracted from the first area, and the correlation degree is obtained by comparing the feature point with data in the second area.
    When a moving object is detected from the first area, the correlation detection step extracts the feature point from a portion excluding the moving object.
  15. The shooting support method according to any one of claims 9 to 13,
    A moving object detection step of detecting a moving object from image data generated from the imaging signal;
    An imaging support method comprising: a warning step that issues a warning when a moving object is detected from the same area as the first area of the second image data.
  16. The shooting support method according to claim 14,
    An imaging support method comprising a warning step of giving a warning when a moving object is detected from the same area as the first area of the second image data.
  17. A shooting support program for causing a computer to execute each step of the shooting support method according to claim 9.
JP2007294756A 2007-11-13 2007-11-13 Imaging apparatus, photographing support method, and photographing support program Pending JP2009124340A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007294756A JP2009124340A (en) 2007-11-13 2007-11-13 Imaging apparatus, photographing support method, and photographing support program

Publications (1)

Publication Number Publication Date
JP2009124340A true JP2009124340A (en) 2009-06-04

Family

ID=40816038

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007294756A Pending JP2009124340A (en) 2007-11-13 2007-11-13 Imaging apparatus, photographing support method, and photographing support program

Country Status (1)

Country Link
JP (1) JP2009124340A (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102210136A (en) * 2009-09-16 2011-10-05 索尼公司 Device, method, and program for processing image
JP2011066635A (en) * 2009-09-16 2011-03-31 Sony Corp Image processing device, method, and program
WO2011033968A1 (en) * 2009-09-16 2011-03-24 ソニー株式会社 Device, method, and program for processing image
CN102045502A (en) * 2009-10-09 2011-05-04 索尼公司 Image processing device, image processing method, and program
CN102045501A (en) * 2009-10-09 2011-05-04 索尼公司 Image processing device and method, and program
CN102045502B (en) * 2009-10-09 2013-05-08 索尼公司 Image processing device, image processing method, and program
CN102239698A (en) * 2009-10-09 2011-11-09 索尼公司 Image processing device and method, and program
JP2011097246A (en) * 2009-10-28 2011-05-12 Sony Corp Image processing apparatus, method, and program
CN102326397A (en) * 2009-12-25 2012-01-18 索尼公司 Device, method and program for image processing
JP2011165043A (en) * 2010-02-12 2011-08-25 Tokyo Institute Of Technology Image processing device
WO2011099648A1 (en) * 2010-02-12 2011-08-18 国立大学法人東京工業大学 Image processing device
US20130094781A1 (en) * 2010-02-12 2013-04-18 Olympus Corporation Image processing apparatus
US8861895B2 (en) 2010-02-12 2014-10-14 Olympus Corporation Image processing apparatus
US8873889B2 (en) 2010-02-12 2014-10-28 Tokyo Institute Of Technology Image processing apparatus
US9781341B2 (en) 2011-07-27 2017-10-03 Olympus Corporation Image processing system, information processing device, information storage device, and image processing method
JP2013186853A (en) * 2012-03-12 2013-09-19 Casio Comput Co Ltd Image processing device, image processing method and program
