CN103888665A - Imaging device and imaging method - Google Patents

Imaging device and imaging method

Info

Publication number
CN103888665A
CN103888665A CN201310713342.3A CN201310713342A
Authority
CN
China
Prior art keywords
image
image processing
combined photograph
Prior art date
Application number
CN201310713342.3A
Other languages
Chinese (zh)
Other versions
CN103888665B (en
Inventor
市川学 (Manabu Ichikawa)
奥村洋一郎 (Yoichiro Okumura)
Original Assignee
奥林巴斯映像株式会社 (Olympus Imaging Corp.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2012279726A priority Critical patent/JP2014123896A/en
Priority to JP2012-279726 priority
Application filed by 奥林巴斯映像株式会社 (Olympus Imaging Corp.)
Publication of CN103888665A publication Critical patent/CN103888665A/en
Application granted granted Critical
Publication of CN103888665B publication Critical patent/CN103888665B/en

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225Television cameras ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/232Devices for controlling television cameras, e.g. remote control ; Control of cameras comprising an electronic image sensor
    • H04N5/23293Electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2624Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of whole input images, e.g. splitscreen

Abstract

An imaging device, capable of shooting a combined photograph, which is a single image made up of a plurality of images, and capable of live view display, comprises a display panel (135), an EVF (153), an imaging section (103) for converting a subject image to image data and outputting the image data, an image processing section (109) for subjecting the image data to image processing, and a combined photograph processing section (117) for combining a plurality of image data subjected to image processing by the image processing section to create a single image. When a live view image is to be displayed on the display panel (135), an image that has been subjected to processing by the combined photograph processing section is displayed, while when a live view image is to be displayed on the EVF (153), an image that has been subjected to image processing by the image processing section is displayed.

Description

Imaging device and imaging method

Technical field

The present invention relates to an imaging device and an imaging method for generating images such as a combined photograph made up of a plurality of photos.

Background technology

Imaging devices have been proposed that, when shooting a combined photograph, show which frames of the combined photograph displayed on the display section have already been shot, and which frame is the live view image (see, for example, Japanese Patent No. 4617417 (Patent Document 1) and Japanese Laid-Open Patent Publication No. 2001-169150 (Patent Document 2)). Here, a "frame" means one display area obtained by dividing the display section into a plurality of display areas.

Summary of the invention

The problem that invention will solve

When shooting by combining a plurality of captured images to generate a combined photograph, a user who wants to concentrate on shooting one frame at a time may wish to see only the live view image currently being shot, and not the incomplete combined photograph. With Patent Documents 1 and 2 above, the user can easily determine which frame is the live view image, but cannot easily switch between a mode that displays one frame and a mode that displays the combined photograph.

As a countermeasure to this inconvenience, providing a changeover switch or the like for switching between a mode that displays one frame image and a mode that displays the combined photograph is conceivable. With this method, however, the number of switch types increases and the UI (User Interface) becomes complicated, making it difficult to achieve an imaging device that is easy for the user to operate.

The present invention has been conceived in view of this situation, and an object thereof is to provide an imaging device and an imaging method that can easily switch between single-frame display and combined-photograph display, and that are easy to use when shooting a combined photograph.

For the means of dealing with problems

An imaging device of the present invention is capable of shooting a combined photograph, which is a single image made up of a plurality of images, and is capable of live view display. The imaging device comprises: a first display section; a second display section different from the first display section; an imaging section that converts a subject image to image data and outputs the image data; an image processing section that subjects the image data to image processing; and a combined photograph processing section that combines a plurality of image data that have been subjected to image processing by the image processing section to generate a single image. When a live view image is displayed on the first display section, an image that has been subjected to processing by the combined photograph processing section is displayed, while when a live view image is displayed on the second display section, an image that has been subjected to image processing by the image processing section is displayed.

An imaging method of the present invention shoots a combined photograph, which is a single image made up of a plurality of images, and carries out live view display. The imaging method comprises the steps of: converting a subject image to image data and outputting the image data; when displaying a single-frame live view image on a second display section, displaying an image obtained by subjecting the image data to image processing; and when displaying a live view image of the combined photograph on a first display section different from the second display section, subjecting a plurality of image data that have been subjected to image processing to combined photograph processing to generate an image of a single combined photograph, and displaying the image of this combined photograph.
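The display routing described in the device and method above can be sketched in Python as follows. This is only an illustration, not the patent's implementation: all function names are invented, images are reduced to 2-D lists of pixel values, and the single-frame "image processing" is a stand-in gain of 2.

```python
def process_frame(raw):
    """Stand-in for the image processing section (single frame)."""
    return [[px * 2 for px in row] for row in raw]  # illustrative gain

def combine_frames(frames):
    """Stand-in for the combined photograph processing section:
    lay the processed frames side by side into one image."""
    return [[px for row in rows for px in row] for rows in zip(*frames)]

def live_view_image(raw_frames, current, target_display):
    """Second display (EVF): the one processed frame being shot.
    First display (rear panel): the combined photograph."""
    processed = [process_frame(f) for f in raw_frames]
    if target_display == "second":
        return processed[current]
    return combine_frames(processed)
```

The point of the sketch is that no mode switch is needed: which image is built is decided purely by which display is the target.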

The effect of invention

According to the present invention, an imaging device and an imaging method can be provided that can easily switch between single-frame display and combined-photograph display.

Brief description of the drawings

Fig. 1 is a block diagram showing the main electrical structure of a camera of a first embodiment of the present invention.

Fig. 2 is a flowchart showing main operation of the camera of the first embodiment of the present invention.

Fig. 3 is a flowchart showing main operation of the camera of the first embodiment of the present invention.

Fig. 4 is a flowchart showing image processing operation of the camera of the first embodiment of the present invention.

Fig. 5A is a flowchart showing basic image processing operation of the camera of the first embodiment of the present invention, and Fig. 5B is a graph showing an example of RGB gamma/Y gamma of the image processing.

Fig. 6 is a flowchart showing special image processing operation of the camera of the first embodiment of the present invention.

Fig. 7 is a flowchart showing combined photograph generation operation of the camera of the first embodiment of the present invention.

Figs. 8A-8I are diagrams showing examples of correction by image analysis in the camera of the first embodiment of the present invention.

Fig. 9 is a flowchart showing still image recording operation of the camera of the first embodiment of the present invention.

Fig. 10 is a flowchart showing combined photograph operation of the camera of the first embodiment of the present invention.

Fig. 11 is a flowchart showing combined photograph operation of the camera of the first embodiment of the present invention.

Fig. 12 is a flowchart showing image processing/live view display operation of the camera of the first embodiment of the present invention.

Fig. 13A is a diagram showing switching of EVF display in the camera of the first embodiment of the present invention.

Fig. 13B is a diagram showing switching of TFT display in the camera of the first embodiment of the present invention.

Figs. 14A-14D are diagrams showing examples of masks outside the display area during EVF display in the camera of the first embodiment of the present invention.

Fig. 15 is a flowchart showing a modified example of image processing/live view display operation of the camera of the first embodiment of the present invention.

Fig. 16 is a block diagram showing the main electrical structure of a camera of a second embodiment of the present invention.

Figs. 17A-17D are diagrams showing an example of external appearance of the camera of the second embodiment of the present invention.

Figs. 18A-18E are diagrams showing modified examples of external appearance of the camera of the second embodiment of the present invention.

Fig. 19 is a diagram showing an example of connection of the camera of the second embodiment of the present invention to external equipment.

Figs. 20A and 20B are diagrams for describing an overview of the first to third embodiments of the present invention.

Embodiment

Preferred embodiments will be described below with reference to the drawings, using a camera to which an imaging device of the present invention has been applied. The camera of a preferred embodiment of the present invention is a digital camera. This digital camera has an imaging section; a subject image is converted to image data by this imaging section, and based on this converted image data the subject image is displayed as a live view on a display section arranged on the rear surface of the main body or the like. The photographer determines composition and shutter timing by observing the live view display. At the time of a shutter release operation, image data is recorded in a recording medium. When playback mode is selected, the image data recorded in the recording medium can be played back and displayed on the display section.

Further, the display section provided on the rear surface of the camera body or the like can divide its screen into a plurality of areas and play back and display a combined photograph made up of a plurality of images. Combined photograph processing is applied to the combined photograph so that the plural images have a unified feel, and the result is then displayed. This digital camera also has an electronic viewfinder (EVF), in which the live view image currently being shot can be displayed. For this live view image, a single-frame image that has not been subjected to combined photograph processing but has been subjected to basic image processing is displayed. In this way, the camera of the preferred embodiment of the present invention can shoot a photograph obtained by combining a plurality of images into a single image, and is capable of live view display.

Fig. 1 is a block diagram showing the main electrical structure of a camera of an embodiment of the present invention. This camera is made up of a camera body 100 and an interchangeable lens 200 that can be attached to and detached from the camera body 100. In this embodiment the taking lens is an interchangeable lens, but this is not limiting; the digital camera may also be of a type in which the taking lens is fixed to the camera body.

The interchangeable lens 200 is made up of a taking lens 201, an aperture 203, a driver 205, a microcomputer 207 and flash memory 209, and has an interface (hereinafter referred to as I/F) 199 between it and the camera body 100, which will be described later.

The taking lens 201 is made up of a plurality of optical lenses for forming a subject image, and is a fixed focal length lens or a zoom lens. The aperture 203 is arranged to the rear of the taking lens 201 on its optical axis; the opening diameter of the aperture 203 is variable, in order to restrict the amount of subject light flux passing through the taking lens 201. The taking lens 201 can be moved in the optical axis direction by the driver 205; the focus position of the taking lens 201 is controlled in accordance with control signals from the microcomputer 207, and in the case of a zoom lens the focal length is also controlled. The driver 205 also controls the opening diameter of the aperture 203.

The microcomputer 207, which is connected to the driver 205, is also connected to the I/F 199 and the flash memory 209. The microcomputer 207 operates in accordance with a program stored in the flash memory 209, communicates with a microcomputer 121 in the camera body 100, which will be described later, and controls the interchangeable lens 200 in accordance with control signals from the microcomputer 121.

Besides the program described above, the flash memory 209 also stores various information such as optical characteristics and adjustment values of the interchangeable lens 200. The I/F 199 is an interface for communication between the microcomputer 207 in the interchangeable lens 200 and the microcomputer 121 in the camera body 100.

Inside the camera body 100, a mechanical shutter 101 is arranged on the optical axis of the taking lens 201. This mechanical shutter 101 controls the transit time of the subject light flux, and a known focal plane shutter or the like is adopted. To the rear of this mechanical shutter 101, an image sensor 103 is arranged at a position where the subject image is formed by the taking lens 201.

The image sensor 103 realizes the function of an imaging section. Photodiodes constituting each pixel are arranged two-dimensionally in a matrix; each photodiode generates a photoelectric conversion current corresponding to the amount of received light, and the charge of this photoelectric conversion current is accumulated by a capacitor connected to each photodiode. A Bayer-array RGB color filter is arranged on the front surface of each pixel. The image sensor 103 also has an electronic shutter, which controls the exposure time by controlling the time from charge accumulation to charge readout from the image sensor 103. The image sensor 103 is not limited to a Bayer array, and may, for example, be of a stacked type such as Foveon (registered trademark).

The image sensor 103 is connected to an analog processing section 105. This analog processing section 105 performs waveform shaping on the photoelectric conversion signal (analog image signal) read out from the image sensor 103 so as to reduce reset noise, and further raises the gain so as to obtain an appropriate brightness.

The analog processing section 105 is connected to an A/D conversion section 107. This A/D conversion section 107 performs analog-to-digital conversion of the analog image signal and outputs the digital image signal (hereinafter referred to as image data) to a bus 110. In this specification, the raw image data before image processing in the image processing section 109 is called RAW data.

The bus 110 is a transfer path for transferring various data, read out or generated inside the camera body 100, within the camera body 100. Besides the A/D conversion section 107 described above, the bus 110 is connected to the image processing section 109, an AE (Auto Exposure) processing section 111, an AF (Auto Focus) processing section 113, an image compression/decompression section 115, the combined photograph processing section 117, the microcomputer 121, an SDRAM (Synchronous DRAM) 127, a memory interface (hereinafter referred to as memory I/F) 129, a display driver 133, a communication processing section 145 and an EVF driver 151.

The image processing section 109 has a basic image processing section 109a for carrying out basic image processing, a special image processing section 109b for applying special effects when an art filter has been set, and a subject detection section 109c for analyzing images and detecting a subject.

The basic image processing section 109a realizes the function of an image processing section for carrying out image processing on image data captured by the imaging section, and applies to the RAW data optical black (OB) subtraction, white balance (WB) correction, demosaicing (in the case of Bayer data), color reproduction processing, gamma correction, color matrix operations, noise reduction (NR) processing, edge enhancement processing, and so on. When shooting with no art filter set, image processing can be completed with the processing of this basic image processing section 109a alone.
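As a rough sketch of a few of the basic processing steps just listed, the following applies OB subtraction, a white balance gain and gamma correction to a single channel value. The constants (black level, gains, 10-bit range) are invented for illustration and are not from the patent.

```python
OB_LEVEL = 64                              # assumed optical-black offset
WB_GAIN = {"r": 2.0, "g": 1.0, "b": 1.5}   # assumed per-channel WB gains
GAMMA = 1 / 2.2                            # common display gamma

def basic_process(value, channel, max_val=1023):
    """Apply OB subtraction, WB gain and gamma to one raw sample."""
    v = max(value - OB_LEVEL, 0)             # OB subtraction
    v = min(v * WB_GAIN[channel], max_val)   # WB gain, clipped to range
    return (v / max_val) ** GAMMA            # gamma correction -> [0, 1]
```

In the real pipeline these steps operate on whole Bayer frames and are interleaved with demosaicing, NR and edge enhancement; the sketch only shows the per-sample arithmetic.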

The special image processing section 109b applies, to an image that has been processed by the basic image processing section, various special effects in accordance with the special effect (art filter) that has been set, such as a pinhole effect that reduces peripheral brightness, a soft focus effect that blurs the image and combines it with the original image, a noise effect that combines a noise image, a cross filter effect that draws cross patterns at bright points, and a miniature effect that blurs the periphery.

The subject detection section 109c analyzes the image using pattern matching or the like, and detects subjects such as faces or pets. When a subject can be detected, information such as the type, size, position and reliability of the subject in the image is obtained.

The AE processing section 111 measures subject brightness based on image data input via the bus 110, and outputs this subject brightness information to the microcomputer 121 via the bus 110. A dedicated photometry sensor may be provided for measuring subject brightness, but in this embodiment subject brightness is calculated from the image data.
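One minimal way to meter from the image data itself, as the paragraph above describes, is to average the luminance over the frame and nudge exposure toward a target. The patent does not specify the metering method or target, so both are assumptions here.

```python
def subject_brightness(pixels):
    """Mean luminance of a frame given as a 2-D list (assumed metering)."""
    flat = [px for row in pixels for px in row]
    return sum(flat) / len(flat)

def exposure_step(brightness, target=118):
    """Return an exposure adjustment sign: +1 open up, -1 stop down, 0 hold.
    The 10% dead band and the target of 118 are illustrative choices."""
    if brightness < target * 0.9:
        return +1
    if brightness > target * 1.1:
        return -1
    return 0
```

A real AE loop would weight metering zones and account for the current shutter/aperture/ISO, but the feedback structure is the same.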

The AF processing section 113 extracts high-frequency component signals from the image data, obtains a focus evaluation value by integration processing, and outputs it to the microcomputer 121 via the bus 110. In this embodiment, focusing of the taking lens 201 is carried out by the so-called contrast method. AF control by the contrast method is described here as an example, but it is also possible to split the subject light flux and provide a phase difference sensor in its optical path, or to provide a phase difference sensor on the image sensor, and carry out focusing by phase difference AF control.
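The contrast-method evaluation described above can be sketched as follows. The "high-frequency component" here is simply the absolute difference between neighbouring pixels, accumulated over a scan line; a real camera would use a proper band-pass filter over a focus area, so treat this as a toy model.

```python
def focus_value(line):
    """Accumulated high-frequency energy of one scan line (toy version)."""
    return sum(abs(a - b) for a, b in zip(line, line[1:]))

def best_focus_position(scans):
    """scans: mapping of lens position -> scan line captured there.
    Contrast AF picks the position where the evaluation value peaks."""
    return max(scans, key=lambda pos: focus_value(scans[pos]))
```

Sharp edges produce large neighbouring-pixel differences, so the in-focus lens position maximizes the accumulated value.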

When recording image data to the recording medium 131, the image compression/decompression section 115 compresses image data read out from the SDRAM 127 according to the JPEG scheme or the like in the case of a still image, or according to various compression schemes such as MPEG in the case of a movie. The microcomputer 121 appends to the JPEG or MPEG image data the file header required to form a JPEG file, an MPO (Multi Picture Object: an image format used for 3D stereoscopic viewing etc.) file or an MPEG file, generates the JPEG file, MPO file or MPEG file, and records the generated file in the recording medium 131 via the memory I/F 129.

The image compression/decompression section 115 also decompresses JPEG image data and MPEG image data for image playback display. For decompression, a file recorded in the recording medium 131 is read out and decompressed in the image compression/decompression section 115, and the decompressed image data is temporarily stored in the SDRAM 127. In this embodiment, the JPEG and MPEG compression schemes are adopted as image compression schemes, but the compression scheme is not limited to these; other compression schemes such as TIFF and H.264 may also be adopted.

The combined photograph processing section 117 performs processing for combining a plurality of image data to generate a single image (combined photograph). When combining images, a template is read from flash memory 125; brightness is changed with the gamma correction of the basic image processing section 109a so that the lightness of the images becomes the same, and white balance (WB) is changed with the color reproduction processing of the basic image processing section 109a so that the overall WB is unified. Further, after combination, special effects such as the pinhole effect are applied by the special image processing section 109b, and correction of the combined image, replacement of images and the like are carried out.
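One simple way to give the frames the "same lightness" mentioned above is to scale each frame so that its mean brightness matches the mean across all frames, before pasting them into the template. This is only an illustrative stand-in; the patent's actual adjustments go through gamma correction and color reproduction processing.

```python
def frame_mean(frame):
    """Mean pixel value of one frame (2-D list)."""
    flat = [px for row in frame for px in row]
    return sum(flat) / len(flat)

def unify_brightness(frames):
    """Scale every frame toward the overall mean brightness."""
    overall = sum(frame_mean(f) for f in frames) / len(frames)
    out = []
    for f in frames:
        gain = overall / frame_mean(f)
        out.append([[px * gain for px in row] for row in f])
    return out
```

After this step each frame has the same mean brightness, so the assembled combined photograph reads as one image rather than a patchwork of differently exposed shots.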

The microcomputer 121 realizes the function of a control section of this camera body, and performs overall control of the various operation sequences of the camera. Besides the I/F 199 described above, the microcomputer 121 is connected to an operation section 123 and flash memory 125.

The operation section 123 includes operation members such as a power button, a shutter release button (hereinafter referred to as a release button), a movie button, a playback button, a menu button, a cross key, an OK button, a delete button and a magnify button, as well as various input keys. The operation section 123 detects the operating state of these operation members and outputs the detection results to the microcomputer 121. Based on the detection results for the operation members from the operation section 123, the microcomputer 121 executes the various operation sequences corresponding to the user's operations. The power button is an operation member for instructing power on/off of this digital camera. When the power button is pressed, the power of this digital camera is turned on; when the power button is pressed again, the power is turned off.

The release button is made up of a first release switch that turns on at half-press, and a second release switch that turns on when the button is pressed further from the half-pressed state to a full press. When the first release switch turns on, the microcomputer 121 executes shooting preparation sequences such as AE operation and AF operation. When the second release switch turns on, a series of shooting sequences is executed in which the mechanical shutter 101 and the like are controlled, image data based on the subject image is acquired from the image sensor 103 and the like, and this image data is recorded in the recording medium 131.
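The two-stage release sequence above can be reduced to a small state handler; the names and log strings are invented purely to make the ordering visible.

```python
def on_release_button(state, log):
    """Record which sequence each release stage triggers.
    'half' = first release switch on, 'full' = second release switch on."""
    if state == "half":
        log.append("AE/AF preparation")   # shooting preparation sequence
    elif state == "full":
        log.append("expose and record")   # shutter control + recording
    return log
```

A half-press therefore always runs preparation before the full press runs the exposure sequence, mirroring how the two physical switches are nested in one button.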

The movie button is an operation button for instructing the start and end of movie shooting: movie shooting starts when the movie button is first operated, and ends when it is operated again. The playback button is an operation button for setting and canceling playback mode; when playback mode is set, image data of captured images is read from the recording medium 131 and the captured images are played back and displayed on the display panel 135. The delete button is an operation button for designating and deleting an image, for example when displaying a list of playback images or when displaying a combined photograph. The magnify button is an operation button for magnifying a displayed image during playback display.

The menu button is an operation button for displaying a menu screen on the display panel 135. Various camera settings can be made on the menu screen. Camera settings include, for example, the setting of shooting modes such as normal shooting mode and combined photograph mode, and the setting of picture modes such as natural, vivid, flat, portrait and art filter. Art filters include a pop art effect, a toy camera effect, a fantastic focus effect, a monochrome effect, a diorama effect, a crystal effect and the like. In addition, various settings are possible, such as selection of the combined photograph style (template), selection of a replacement image for the combined photograph, and whether to record the pre-edit image when editing a combined photograph. In the case of a combined photograph, the live view (LV) display is switched in accordance with the style (template) and the image selection state (see S47 of Fig. 3).

The operation section 123 is also provided with a touch input section 123a. The display panel 135 is capable of touch operation; the touch input section 123a detects the position touched by the user and the like, and outputs it to the microcomputer 121.

The flash memory 125 stores programs for executing the various operation sequences of the microcomputer 121. The microcomputer 121 performs overall control of the camera in accordance with these programs.

The SDRAM 127 is an electrically rewritable volatile memory for temporary storage of image data and the like. This SDRAM 127 temporarily stores image data output from the A/D conversion section 107 and image data that has been processed in the image processing section 109, the image compression/decompression section 115, the combined photograph processing section 117 and the like.

The memory I/F 129 is connected to the recording medium 131, and performs control for writing image data, and data such as headers attached to image data, to the recording medium 131, and for reading these data from the recording medium 131. The recording medium 131 is, for example, a recording medium such as a memory card that can be freely attached to and detached from the camera body 100, but this is not limiting; it may also be a hard disk or the like built into the camera body 100.

The display driver 133 is connected to the display panel 135, and displays images on the display panel 135 based on image data that has been read from the SDRAM 127 or the recording medium 131 and decompressed by the image compression/decompression section 115. The display panel 135 is arranged on the rear surface of the camera body 100 or the like, and carries out image display. Since the display surface of the display panel 135 is arranged on the exterior of the camera body, such as the rear surface, it is a display section that is easily affected by external light, but a large display panel can be used. Various display panels such as a liquid crystal display panel (LCD, TFT) or organic EL may be adopted as the display section.

Image display on the display panel 135 includes quick view display, in which recorded image data is displayed for a short time immediately after shooting, playback display of image files of still images and movies recorded in the recording medium 131, and movie display such as live view display. It is also possible to display a combined photograph that has been subjected to combined photograph processing in the combined photograph processing section 117.

The communication processing section 145 is connected to a wireless/wired communication section 147. The communication processing section 145 and the wireless/wired communication section 147 communicate with the outside by wired communication based on USB, LAN etc. or by wireless communication, to update or add to the division templates for combined photographs stored in the flash memory 125. A live view image (a single frame, or the image after combination) can also be sent to an external display section such as a smartphone or a television.

The EVF driver 151 is connected to the EVF 153, and displays images in the EVF 153 based on image data that has been read from the SDRAM 127 or the recording medium 131 and decompressed by the image compression/decompression section 115. The EVF 153 is a viewfinder in which an image displayed on a small liquid crystal panel or the like is observed through an eyepiece section. Since the EVF 153 is observed through the eyepiece section, it is a display section that is not easily affected by external light. As image display, live view display at the time of shooting is possible, and quick view display and playback display of image files of still images and movies are also possible.

In this embodiment, there are these two display units, the display panel and the EVF. For a clearer description, the display panel 135 will be described as the first display section and the EVF 153 as the second display section. Besides the EVF 153, the second display section may also be a display panel provided on the front surface side of the camera body, or a display panel provided on the inner side of an openable/closable panel.

The EVF display sensor 155 is a sensor for determining whether to display on the EVF 153. Specifically, an eye sensor or the like that outputs a detection signal when the photographer's eye approaches the eyepiece section is adopted as the EVF display sensor 155. As mentioned above, the second display section is not limited to the EVF 153, and may, for example, be a display panel on the front surface of the main body or an openable/closable display panel (see Figs. 17A and 17B described later). In the case of a display panel on the front surface of the main body, an on/off switch or the like is used as the EVF display sensor 155, and in the case of an openable/closable display panel, an open/close detection element such as a hall element is used. When the on/off switch is on, or when the open/close detection element indicates the open state, display is carried out on the second display section, be it the EVF 153, a display panel provided on the front surface side of the camera body, or a display panel provided on the inner side of an openable/closable panel. During EVF display, the display panel 135 may also be turned off.
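The three sensor variants described above (eye sensor, on/off switch, hall element) all reduce to one decision: whether the second display should take over the live view. A hedged sketch, with all names and return strings invented for illustration:

```python
def active_display(sensor_kind, sensor_value):
    """Map an EVF-display-sensor reading to the display that should
    show the live view. Keys and values are illustrative only."""
    use_second = {
        "eye_sensor": sensor_value == "eye_detected",   # EVF eyepiece
        "on_off_switch": sensor_value == "on",          # front panel
        "hall_element": sensor_value == "open",         # fold-out panel
    }[sensor_kind]
    return "second (single frame)" if use_second else "first (combined)"
```

This is the mechanism that removes the need for a dedicated mode switch: picking up the camera to the eye (or opening the panel) implicitly selects single-frame live view.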

Next, the main processing of the camera of this embodiment will be described using the flowcharts shown in Fig. 2 and Fig. 3. The flowcharts shown in Figs. 2-7, Figs. 9-12 and Fig. 15, which will be described later, are executed by the microcomputer 121 controlling each section in accordance with the programs stored in the flash memory 125.

When the power button in the operation section 123 is operated and the power is turned on, the microcomputer 121 commences operation in accordance with the main flow shown in Fig. 2. Once operation of the microcomputer 121 commences, initialization is first carried out (S1). As initialization, mechanical initialization and electrical initialization, such as initialization of various flags, are carried out. As one of the various flags, a recording-in-progress flag, which indicates whether movie recording is in progress, is reset to off (see steps S13 and S15).

After initialization, it is next determined whether the playback button has been pressed (S3). Here, the operation state of the playback button in the operation section 123 is detected and judged. If the result of this determination is that the playback button has been pressed, playback is carried out (S5). Here, image data is read from the recording medium 131 and a list of still images and movies is displayed on the LCD 135. The user selects an image from the list by operating the cross key, and confirms the selection with the OK key. In addition, editing of the combined photograph is also performed in this step.

After the playback of step S5, or if the result of determination in step S3 is that the playback button has not been pressed, it is determined whether camera settings are to be made (S7). Camera settings are made on the menu screen when the menu button in the operation section 123 has been operated. Accordingly, the judgment in this step is based on whether such a camera setting operation has been performed.

If the result of determination in step S7 is camera setting, camera setting is carried out (S9). As mentioned above, various camera settings can be made on the menu screen. Examples of camera settings include, as mentioned above, the setting of shooting modes such as the normal shooting mode and the combined photograph mode, and the setting of image modes such as natural, vivid, flat, portrait and art filter. As art filter effects, the pop art effect, toy camera effect, fantastic focus effect, black-and-white effect, perspective effect, Aero Glass and so on can be set. In addition, various other settings can be made, such as selecting the pattern (template) of the combined photograph, selecting a replacement image for the combined photograph, whether to record the pre-edit image when editing a combined photograph, and selecting an image for embedding an already recorded image into a predetermined frame of the combined photograph.

After camera setting has been carried out in step S9, or if the result of determination in step S7 is not camera setting, it is next determined whether the movie button has been pressed (S11). Here, the microcomputer 121 receives the operation state of the movie button from the operation section 123 and makes the judgment.

If the result of determination in step S11 is that the movie button has been pressed, the in-record flag is inverted (S13). During movie shooting the in-record flag is set to on (1), and when no movie is being shot it is reset to off (0). In this step the flag is inverted: if it was set to on (1) it is inverted to off (0), and if it was set to off (0) it is inverted to on (1).

After the in-record flag has been inverted in step S13, it is next determined whether movie recording is in progress (S15). Here, the judgment is made according to whether the in-record flag inverted in step S13 is set to on or to off.

If the result of determination in step S15 is that movie recording is in progress, a movie file is generated (S19). Movie recording itself is carried out in step S61 described later, but in this step the movie file for recording is generated so that the movie image data can be recorded.

On the other hand, if the result of determination is that movie recording is not in progress, the movie file is closed (S17). Since the movie button was operated to end movie shooting, the movie file is closed in this step. When closing the movie file, the number of frames and the like are recorded in the file header so that the movie file is placed in a reproducible state, and file writing is then terminated.

After the movie file has been closed in step S17, or after the movie file has been generated in step S19, or if the result of determination in step S11 is that the movie button has not been pressed, it is next determined whether the operation section has been operated in the combined photograph mode (S21). As mentioned above, the combined photograph mode can be set in the camera settings of step S9. In this step it is determined whether the operation section 123 has been operated in the state where this combined photograph mode is set.

If the result of determination in step S21 is that the operation section has been operated in the combined photograph mode, a combined photograph operation is carried out (S23). This combined photograph operation is carried out when various operations for editing the combined photograph have been performed, for example a shooting frame change operation, a deletion operation, a restore operation, a temporary save operation, or a temporary-save read operation. The detailed behaviour of this combined photograph operation will be described later using Fig. 10 and Fig. 11.

After the microcomputer 121 has carried out the combined photograph operation in step S23, or if the result of determination in step S21 is that the operation section has not been operated in the combined photograph mode, it is determined whether the release button has been half-pressed, in other words whether the first release switch has transitioned from off to on (S31). This judgment is made by detecting, via the operation section 123, the state of the first release switch linked to the release button, and judging from this detection result. If the detection result is that the first release switch has transitioned from off to on, the result of determination is "Yes"; on the other hand, if the on state or the off state is maintained, the result of determination is "No".

If the result of determination in step S31 is that the release button has been half-pressed, that is, it has transitioned to first release, AE and AF operations are carried out (S33). Here, the AE processing section 111 detects the subject brightness from the image data obtained by the image sensor 103, and calculates from this subject brightness the shutter speed, aperture value and the like that give correct exposure.

In addition, the AF operation is carried out in step S33. Here, the driver 205 moves the focus position of the taking lens 201 via the microcomputer 207 in the interchangeable lens 200, so that the focus evaluation value obtained by the AF processing section 113 reaches a peak. Accordingly, when movie shooting is not being performed, half-pressing the release button causes the taking lens 201 to be focused at that moment. Processing then advances to step S35.

If the result of determination in step S31 is that the release button has not transitioned from off to first release, it is next determined whether the release button has been fully pressed and the second release switch has been turned on (S37). In this step the state of the second release switch linked to the release button is detected via the operation section 123, and the judgment is made from this detection result.

If the result of determination in step S37 is that the release button has been fully pressed and the second release switch has been turned on, still image shooting using the mechanical shutter is carried out (S39). Here, the aperture 203 is controlled with the aperture value calculated in step S33, and the shutter speed of the mechanical shutter 101 is controlled with the calculated shutter speed. Then, after the exposure time corresponding to the shutter speed has elapsed, the image signal is read from the image sensor 103, and the RAW data processed by the analog processing section 105 and the A/D conversion section 107 is output to the bus 110.

After the microcomputer 121 has carried out shooting with the mechanical shutter, image processing is next carried out (S41). Here, the RAW data obtained by the image sensor 103 through shooting with the mechanical shutter is read, and image processing is carried out by the image processing section 109. The detailed behaviour of this image processing will be described later using Fig. 4.

After image processing, the microcomputer 121 next carries out still image recording (S43). Here, the image data of the still image that has undergone image processing is recorded in the recording medium 131. In addition, when shooting has been carried out in the combined photograph mode, if it is desired to cancel (delete) a part of the images in the combined photograph, deletion is possible by a deletion operation. The detailed behaviour of this still image recording will be described later using Fig. 9.

After still image recording, the microcomputer 121 next determines whether the combined photograph mode is set (S45). As mentioned above, the combined photograph mode is set on the menu screen or the like, and in this step it is determined whether this combined photograph mode has been set.

If the result of determination in step S45 is that the combined photograph mode has been set, the live view (LV) display is changed (S47). When the combined photograph mode is set, the live view display is changed according to the pattern (template) and the number of images. When live view display is performed for all frames at all times, live view display is performed in the portions other than the captured images. When live view display is performed frame by frame, the live view display is switched to the next frame in sequence after each shot (for example, in the order upper-left → upper-right → lower-left → lower-right). In either case, the captured image is displayed in every frame that has already been shot. In addition, an OSD (on-screen display) may be used so that completed shots and the live view image can be clearly distinguished, and the completed shots may also be displayed in a manner that makes the shooting order recognizable. Processing then advances to step S35.

If the result of determination in step S37 is that the second release operation has not been performed, AE is next carried out (S51). This is the case where no operation has been performed on the release button in the passage from step S21 onward, and in this case live view display is performed in step S55 described later. In step S51, exposure control for live view display is carried out. That is, the shutter speed of the electronic shutter of the image sensor 103 and the ISO sensitivity for performing live view display at correct exposure are calculated.

After AE, the microcomputer 121 next carries out shooting with the electronic shutter (S53). Here, the subject image is converted into image data. That is, charge is accumulated during the exposure time determined by the electronic shutter of the image sensor 103, and after the exposure time has elapsed the accumulated charge is read out to obtain the image data.

After shooting with the electronic shutter, the microcomputer 121 next carries out image processing and live view display on the obtained image data (S55). The image processing here is for live view display, and basic image processing is carried out by the basic image processing section 109a. In addition, when a special effect such as an art filter has been set, the special image processing corresponding to that setting may also be carried out. Furthermore, the image processing for display on the display panel 135 differs from the image processing for display on the EVF 153. As will be described in detail later, only basic image processing is carried out for the image displayed on the display panel 135 (see S233 of Fig. 12), whereas for the image displayed on the EVF 153, special image processing and image processing for generating the combined photograph are carried out in addition to the basic image processing (see S227 of Fig. 12 and Fig. 4).

After image processing, the microcomputer 121 carries out live view display on the display panel 135 or the EVF 153. Since image data was obtained in step S53 and image processing has been applied to it, the processed image is used to update the live view display. By observing this live view display, the photographer can decide on the composition and the shutter timing.

When live view display is carried out, single-frame live view display is performed on the EVF 153. That is, a single frame of the live view image that has undergone basic image processing is displayed (this single-frame live view display is abbreviated as "1-frame live view display"). In addition, when the combined photograph mode is set, the combined photograph image is displayed on the display panel 135. At this time, if not all of the plurality of images constituting the combined photograph have been determined, one of the frames is displayed as a live view image (the live view display when the combined photograph mode is set is abbreviated as "combined photograph live view display"). When the combined photograph mode is not set, a 1-frame live view image is displayed on the display panel 135 in the same manner as on the EVF 153. The detailed behaviour of the image processing/live view display will be described later using Fig. 12.

After the image processing/live view display in step S55, the microcomputer 121 determines whether movie recording is in progress (S59). Here it is judged whether the in-record flag is on. If the result of this judgment is that movie recording is in progress, movie recording is carried out (S61). Here, the shooting data read from the image sensor 103 is subjected to image processing to become movie image data, which is recorded in the recording medium 131 as a movie file, and processing then advances to step S35.

In step S35, the microcomputer 121 determines whether the power is to be turned off. In this step it is determined whether the power button of the operation section 123 has been pressed again. If the result of this determination is that the power is not to be turned off, processing returns to step S3. On the other hand, if the result of determination is power off, the termination operation of the main flow is carried out and the main flow then ends.

Thus, in the main flow of one embodiment of the present invention, the combined photograph mode can be set (S9), and when live view display is performed with the combined photograph mode set, a 1-frame live view image is displayed on the EVF 153 and the combined photograph is displayed on the display panel 135 (S55).

Next, the image processing of step S41 will be described using the flowchart shown in Fig. 4. After entering the image processing flow, the microcomputer 121 first carries out basic image processing (S71). Here, the basic image processing section 109a applies optical black (OB) subtraction, white balance (WB) correction, demosaicing, color reproduction processing, brightness conversion processing, edge enhancement processing, noise reduction (NR) processing and the like to the image data that has been read from the image sensor 103 and A/D-converted by the A/D conversion section 107. The detailed behaviour of this basic image processing will be described later using Figs. 5A and 5B.

After basic image processing, the microcomputer 121 next determines whether an art filter has been set (S73). Since the art filter is set in the camera settings described above (S9), it is judged in this step whether an art filter has been set.

If the result of determination in step S73 is that an art filter has been set, special image processing is carried out (S75). Here, special image processing is applied according to the kind of art filter that has been set. The detailed behaviour of this special image processing will be described later using Fig. 6.

After special image processing has been carried out, or if the result of determination in step S73 is that no art filter has been set, the microcomputer 121 next determines whether the combined photograph mode is set (S77). As mentioned above, the combined photograph mode is set on the menu screen.

If the result of determination in step S77 is that the combined photograph mode has been set, the microcomputer 121 carries out combined photograph generation (S79). In this combined photograph generation, the combined photograph processing section 117 carries out preprocessing such as resizing and rotation, carries out color conversion and brightness conversion, and then composites the images according to the pattern (template) to generate the combined photograph. The combined photograph function may also be disabled during movie shooting. Through the combined photograph processing, a single photograph having a unified look as a whole can be generated from a plurality of images. The detailed behaviour of this combined photograph generation will be described later using Fig. 7.

After combined photograph generation has been carried out, or if the result of determination in step S77 is that the combined photograph mode has not been set, the microcomputer 121 returns to the originating flow.

Thus, in the image processing of the present embodiment, basic image processing is carried out, special image processing is carried out as required, and then, when the combined photograph mode has been set, combined photograph generation is carried out.

Next, the detailed behaviour of the basic image processing of step S71 (Fig. 4) will be described using the flowchart shown in Fig. 5A. After entering the basic image processing flow, the microcomputer 121 first carries out optical black (OB) computation (S81). In this step, the OB computation section in the basic image processing section 109a subtracts, from the pixel value of each pixel constituting the image data, the optical black value caused by the dark current and the like of the image sensor 103.
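The OB subtraction described above can be sketched as follows. This is an illustrative simplification only, not the patent's implementation: the function name, the flat list of pixel values, and the clamping at zero are assumptions.

```python
def subtract_optical_black(pixels, ob_level):
    """Subtract a fixed optical-black offset from each raw pixel value,
    clamping at zero so dark-current noise cannot produce negatives."""
    return [max(p - ob_level, 0) for p in pixels]
```

With an assumed OB level of 64, for example, a raw value of 60 clamps to 0 rather than going negative.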

After OB computation, the microcomputer 121 next carries out white balance (WB) correction (S83). In this step, the WB correction section in the basic image processing section 109a applies WB correction to the image data according to the preset white balance mode. Specifically, the R gain and B gain corresponding to the white balance mode set by the user are read from the flash memory 125 of the camera body, and the Bayer-array image data is multiplied by these values for correction. Alternatively, in the case of auto white balance, the R gain and B gain are calculated from the RAW data or the like, and correction is performed using these values.
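Multiplying the Bayer-array data by the R and B gains can be sketched as below. The RGGB site layout and the function name are assumptions for illustration; the actual sensor layout and gain tables are not specified here.

```python
def apply_wb(bayer, r_gain, b_gain):
    """Apply white balance gains to a 2-D Bayer array assumed to be in
    RGGB layout: R sites at even row/even column, B sites at odd
    row/odd column; the two G sites are left untouched."""
    out = [row[:] for row in bayer]
    for y, row in enumerate(bayer):
        for x, value in enumerate(row):
            if y % 2 == 0 and x % 2 == 0:        # R site
                out[y][x] = value * r_gain
            elif y % 2 == 1 and x % 2 == 1:      # B site
                out[y][x] = value * b_gain
    return out
```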

Next, the microcomputer 121 carries out demosaicing (S85). In this step, the demosaicing section in the basic image processing section 109a converts the image data that has undergone white balance correction into data in which each pixel consists of RGB data. Specifically, data not present at a given pixel is obtained by interpolation from the surrounding pixels and converted into RGB data. However, when the RAW data is already in the same form as RGB data (for example, RAW data obtained by a Foveon sensor), demosaicing is unnecessary.

After demosaicing, the microcomputer 121 next carries out color reproduction processing (S87). In this step, the color reproduction processing section in the basic image processing section 109a corrects the colors of the image data by a linear transformation that multiplies the image data by the color matrix coefficients corresponding to the set white balance mode. Since the color matrix coefficients are stored in the flash memory 125, they are read out for use.
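The linear transformation by the color matrix amounts to a 3x3 matrix multiply per pixel; a minimal sketch follows (the function name and the example coefficients are illustrative assumptions — real coefficients would come from the per-mode tables in flash memory 125):

```python
def apply_color_matrix(rgb, matrix):
    """Correct an RGB triple by multiplying it with a 3x3 color matrix
    (a linear transformation applied to every pixel)."""
    return [sum(matrix[i][j] * rgb[j] for j in range(3)) for i in range(3)]
```

An identity matrix leaves the pixel unchanged, which makes the structure easy to verify.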

After color reproduction processing, the microcomputer 121 next carries out brightness conversion processing (S89). In this step, the gamma processing section in the basic image processing section 109a reads out the gamma table stored in the flash memory 125 and applies gamma correction processing to the image data. That is, gamma conversion is carried out in RGB, and after color conversion to YCbCr, gamma conversion is further carried out using Y.

Fig. 5B shows an example of the RGB gamma and the Y gamma used in the brightness conversion of step S89. In Fig. 5B, γRGB is an example of the RGB gamma, which may also be changed according to the image mode. γY1 is an example of the Y gamma used when the fantastic focus effect is set as the art filter. γY2 is an example of the Y gamma used when the pop art effect or the toy camera effect is set as the art filter. γY3 is an example of the Y gamma used in other cases.

The gamma of γY3 is roughly linear; γY1 rises sharply for low input values and changes gently for high input values. On the other hand, γY2 rises gently for low input values and changes more steeply toward high input values. Thus, in the present embodiment, in the brightness conversion of the basic image processing, the gamma is varied according to the set art filter (special effect processing), so that an optimal gamma is obtained.
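The qualitative shapes of γY1, γY2 and γY3 can be imitated with toy curves on a normalized 0-1 luminance range. The actual tables in the flash memory 125 are not published, so the formulas below are purely illustrative assumptions chosen to match the described shapes:

```python
def y_gamma(y, effect=None):
    """Toy Y-gamma curves on a normalized [0, 1] luminance.
    'fantastic_focus' imitates gamma-Y1 (sharp rise at low input,
    gentle at high input); 'pop_art' imitates gamma-Y2 (gentle at
    low input, steeper toward high input); the default imitates the
    roughly linear gamma-Y3."""
    if effect == "fantastic_focus":
        return y ** 0.5      # lifts shadows quickly
    if effect == "pop_art":
        return y * y         # darkens shadows, steep near highlights
    return y                 # roughly linear
```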

After the gamma conversion of step S89, the microcomputer 121 next carries out edge enhancement (S91). In this step, the edge enhancement section in the basic image processing section 109a extracts edge components from the gamma-converted image data with a band-pass filter, multiplies them by a coefficient according to the edge enhancement level, and adds them to the image data, thereby enhancing the edges of the image data.
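In one dimension, this extract-scale-add structure can be sketched as follows. A Laplacian-style [-1, 2, -1] kernel stands in for the band-pass filter here; the kernel, function name and border handling are assumptions for illustration.

```python
def enhance_edges(signal, strength=1.0):
    """Extract a high-frequency edge component from a 1-D signal with a
    [-1, 2, -1] filter, scale it by the enhancement strength, and add
    it back to the original. Border samples are left unchanged."""
    out = list(signal)
    for i in range(1, len(signal) - 1):
        edge = 2 * signal[i] - signal[i - 1] - signal[i + 1]
        out[i] = signal[i] + strength * edge
    return out
```

A flat signal is unchanged, while a step edge is overshot on both sides, which is the visual "sharpening" effect.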

The microcomputer 121 then carries out NR (noise reduction) (S93). In this step, the image is subjected to frequency decomposition, and noise reduction processing is applied according to the frequency. After the noise reduction processing, processing returns to the originating flow.

Next, the detailed behaviour of the special image processing of step S75 (Fig. 4) will be described using the flowchart shown in Fig. 6.

After entering the special image processing flow, the microcomputer 121 first determines whether the toy camera effect has been set as the art filter mode (S101). If the result of this determination is that the toy camera effect has been set, a shading effect is added (S103). Here, the special image processing section 109b generates a gain map in which the luminance decreases gradually with the distance from the centre (the gain values being 1 or less), and multiplies each pixel by the gain corresponding to that pixel, thereby giving a shading effect to the image periphery.
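Such a gain map can be sketched as a radial falloff. The linear falloff curve and the function name are illustrative assumptions; a real implementation would tune the falloff shape.

```python
def vignette_gain(width, height, strength=1.0):
    """Build a gain map (all values <= 1) that decreases with distance
    from the image centre; multiplying pixel values by it darkens the
    periphery, as in the toy camera shading effect."""
    cx, cy = (width - 1) / 2, (height - 1) / 2
    max_d = (cx * cx + cy * cy) ** 0.5 or 1.0
    return [
        [1.0 - strength * (((x - cx) ** 2 + (y - cy) ** 2) ** 0.5) / max_d
         for x in range(width)]
        for y in range(height)
    ]
```

The centre keeps gain 1.0 and the corners fall toward 0 at full strength.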

After the shading has been added, or if the result of determination in step S101 is not the toy camera effect, the microcomputer 121 next determines whether fantastic focus has been set as the art filter mode (S105). If the result of this determination is that fantastic focus has been set, soft focus processing is carried out (S107). Here, the special image processing section 109b blurs the original image and combines the blurred image with the pre-blur image at a predetermined ratio (for example, blurred image : pre-blur image = 2 : 3).
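The blur-then-blend step can be sketched in one dimension as below. The 3-tap box blur and the function name are assumptions; only the 2:3 blend ratio (blur weight 0.4) comes from the example above.

```python
def soft_focus(pixels, blur_weight=0.4):
    """Soft focus on a 1-D row: blur with a 3-tap box filter (edges
    replicated), then blend blurred and original at a fixed ratio.
    blur_weight = 0.4 corresponds to blurred : original = 2 : 3."""
    n = len(pixels)
    blurred = [
        (pixels[max(i - 1, 0)] + pixels[i] + pixels[min(i + 1, n - 1)]) / 3
        for i in range(n)
    ]
    return [blur_weight * b + (1 - blur_weight) * p
            for b, p in zip(blurred, pixels)]
```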

After the soft focus processing, or if the result of determination in step S105 is not fantastic focus, the microcomputer 121 next determines whether the black-and-white effect has been set as the art filter mode (S109). If the result of this determination is that the black-and-white effect has been set, noise overlay processing is carried out (S111). Here, the special image processing section 109b applies noise addition processing to the image data. The noise addition processing adds a noise map image, generated in advance, to the subject image. The noise map image can be generated from random numbers or the like.
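A minimal sketch of adding a random-number noise map, with clamping to the 8-bit range (the function name, noise amplitude and seeding scheme are illustrative assumptions):

```python
import random

def add_noise(pixels, amount=20, seed=0):
    """Overlay a reproducible random noise map on 8-bit pixel values,
    clamping each result to the 0-255 range."""
    rng = random.Random(seed)
    return [min(max(p + rng.randint(-amount, amount), 0), 255)
            for p in pixels]
```

Seeding makes the noise map reproducible, mirroring the idea of a map generated in advance.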

After the noise overlay processing, or if the result of determination in step S109 is not the black-and-white effect, the microcomputer 121 next determines whether the perspective effect has been set as the art filter mode (S113). If the result of this determination is that the perspective effect has been set, blur processing is carried out (S115). Here, the special image processing section 109b carries out processing that blurs the image gradually, centred on the AF target of the original image, according to the vertical, horizontal or radial distance.
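The distance-dependent part can be sketched as a weight profile along one axis. The linear growth and function name are assumptions; a real diorama effect would use these weights to mix sharp and blurred images.

```python
def diorama_weights(length, focus_index):
    """Blur strength per position along one axis: 0.0 at the AF target
    position, growing linearly to 1.0 at the farthest edge."""
    max_d = max(focus_index, length - 1 - focus_index) or 1
    return [abs(i - focus_index) / max_d for i in range(length)]
```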

After the blur processing, or if the result of determination in step S113 is not the perspective effect, the microcomputer 121 next determines whether Aero Glass has been set as the art filter mode (S117). If the result of this determination is that Aero Glass has been set, cross filter processing is carried out (S119). Here, the special image processing section 109b detects bright points in the image and draws, centred on each bright point, a cross pattern in the shape of a cross or of many radial rays, simulating a sparkling effect.

After the cross filter processing, or if the result of determination in step S117 is that Aero Glass has not been set, the microcomputer 121 ends the special image processing flow and returns to the originating flow.

Next, the detailed behaviour of the combined photograph generation of step S79 (Fig. 4) will be described using the flowchart shown in Fig. 7. As mentioned above, in the image processing flow shown in Fig. 4, basic image processing is carried out first, then special image processing, and finally combined photograph generation. In this combined photograph generation, the images are composited into a preset pattern (template). As the pattern, the screen can, for example, be divided vertically and horizontally into four parts, and the pattern can be set from a plurality of patterns according to the photographer's preference.

After entering the combined photograph generation flow, the microcomputer 121 first carries out image analysis (S121). The original images to be composited (including replacement images) are analysed, and image characteristics (for example, the luminance distribution and the color distribution) are obtained. The analysis may be carried out on images that have completed image processing such as the basic image processing, or alternatively on the RAW data that is their source.

After image analysis, the microcomputer 121 next carries out color conversion (S123). Here, the combined photograph processing section 117 corrects CbCr so that the images to be composited take on roughly the same color. At this time, the data in each frame image area is used for the processing, but the data in frame image area R1 of the SDRAM 127 is not changed. This is in order to later save the data in MPO format.

As the correction method for the color conversion, for example, the CbCr values of each image are shifted so that the peak of the color distribution (the distribution in the CbCr plane) of every image to be composited becomes the mean of the individual peaks. This correction method is explained using Figs. 8D-8I. Fig. 8D shows the color difference (Cb) of image 1 and image 2, and Fig. 8G shows the color difference (Cr). If the color differences of images 1 and 2 are each shifted so that the peaks of their color distributions become the mean value (see Figs. 8E and 8H), the color differences of the two images become roughly the same (see Figs. 8F and 8I).
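The peak-to-mean shift described above can be sketched as follows; the function names and list-based representation are assumptions for illustration.

```python
def match_color_peaks(peaks):
    """Given the colour-distribution peak (e.g. of Cb or Cr) for each
    image, return the per-image offset that moves every peak to the
    mean of all peaks."""
    target = sum(peaks) / len(peaks)
    return [target - p for p in peaks]

def apply_offset(values, offset):
    """Shift every Cb (or Cr) sample of one image by its offset."""
    return [v + offset for v in values]
```

For two images with Cb peaks at 110 and 130, both are shifted toward the mean of 120, so their distributions come to overlap.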

After the color conversion in step S123, the microcomputer 121 next carries out brightness conversion (S125). Here, the combined photograph processing section 117 corrects the images to be composited so that their brightness becomes the same. As the correction method, for example, correction is made so that the means of the luminance distributions of all the images to be composited become roughly equal to one another. At this time, if the brightness is changed with a luminance-only gamma (a gamma applied only to the luminance component), the colors can look unnatural when the brightness is changed significantly. Therefore, for example, RGB conversion may be carried out and the gamma correction performed in the RGB color space.

An example of the brightness conversion is explained using Figs. 8A-8C. Fig. 8A shows the luminance of image 1 and image 2. Both are subjected to table conversion in the RGB color space (see Fig. 8B). As can be seen from Fig. 8A, image 2 has more low-luminance components overall, so the table conversion raises the brightness on the low-luminance side. After conversion, as shown in Fig. 8C, image 1 and image 2 have substantially the same luminance distribution.
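One way to build such a conversion table is a power (gamma) curve that maps a source mean onto a target mean, lifting the low-luminance side smoothly while keeping black and white fixed. The formula and function name are illustrative assumptions, not the table construction the patent actually uses.

```python
import math

def brightness_lut(src_mean, target_mean, max_value=255):
    """Build a lookup table that maps the source mean luminance onto
    the target mean with a power curve, raising or lowering the whole
    tonal range smoothly. Assumes 0 < src_mean < max_value and
    0 < target_mean < max_value."""
    g = math.log(target_mean / max_value) / math.log(src_mean / max_value)
    return [round(max_value * (v / max_value) ** g)
            for v in range(max_value + 1)]
```

For example, a table built with `brightness_lut(64, 128)` lifts a dark image whose mean is 64 so that the mean maps to 128, while 0 and 255 stay in place.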

After the brightness conversion in step S125, the microcomputer 121 next carries out compositing (S127). Here, the combined photograph processing section 117 composites each image onto the background image. That is, the images are embedded and composited at the positions of the preset pattern (template), thereby generating the combined photograph.
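Embedding tiles into a four-part template can be sketched as below. The 2x2 layout, zero-filled background and function name are illustrative assumptions based on the four-part pattern example given earlier.

```python
def compose_2x2(images, tile_w, tile_h):
    """Paste up to four equally sized tiles (2-D lists) into a 2x2
    template in the order upper-left, upper-right, lower-left,
    lower-right, over a zero-filled background canvas."""
    canvas = [[0] * (tile_w * 2) for _ in range(tile_h * 2)]
    anchors = [(0, 0), (0, tile_w), (tile_h, 0), (tile_h, tile_w)]
    for img, (oy, ox) in zip(images, anchors):
        for y in range(tile_h):
            for x in range(tile_w):
                canvas[oy + y][ox + x] = img[y][x]
    return canvas
```

Fewer than four tiles simply leave the remaining slots as background, mirroring the live-view case where some frames are not yet shot.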

After the compositing in step S127, a special effect is next added (S129). Here, a special effect such as shading or blur is added to the composited image. Note that in step S75 of Fig. 4 a special effect is added to each individual image, whereas in this step the special effect is added to the composite image.

After the special effect has been added in step S129, the microcomputer 121 stores the image and then returns to the originating flow.

Thus, in the combined photograph generation flow, color conversion and brightness conversion are applied to the images (S123, S125), the images are composited into the combined photograph (S127), and the resulting image is stored. Since the plurality of images are adjusted so that their color and brightness become the same, a combined photograph having a unified look as a whole is generated.

Next, the detailed behaviour of the still image recording of step S43 (Fig. 3) will be described using the flowchart shown in Fig. 9. This still image recording is processing that is carried out after the shooting with the mechanical shutter (S39) and the image processing (S41) described above, in order to record the image data that has undergone image processing in the recording medium 131.

After entering the still image recording flow, the microcomputer 121 first determines whether the combined photograph mode is set (S141). If the result of this determination is not the combined photograph mode, that is, the normal shooting mode, record view display is carried out (S157). This record view display is a function that temporarily displays, for a predetermined time on the display panel 135 or the EVF 153, the image being recorded in the recording medium 131. This allows the user to confirm the captured image.

After the record view display, the microcomputer 121 next carries out still image recording (S159). Outside the combined photograph mode, the image that has undergone image processing is JPEG-compressed and recorded in the recording medium 131. However, this is not limiting: recording may be in an uncompressed format (TIFF or the like), and other compression formats may also be adopted. In addition, the RAW image may also be recorded.

If the result of determination in step S141 is the combined photograph mode, the microcomputer 121 determines whether the captured image is to be recorded (S143). In the combined photograph mode each image is embedded in the preset pattern (template); however, even when the combined photograph mode is set, images not embedded in the pattern may also be recorded. In this step it is judged whether, in the camera settings of step S9 of Fig. 2, the user has turned on the setting for recording each frame of the combined photograph.

If the result of determination in step S143 is that the captured image is to be recorded, still image recording is carried out (S145). Here, as in step S159, the image is treated not as a combined photograph but as an individual captured image, compressed by JPEG or the like, and recorded in the recording medium 131.

After the still image recording in step S145, or if the result of determination in step S143 is that the captured image is not to be recorded, it is next judged whether the combination is complete (S147). Here it is determined whether images in the number required by the preset pattern have been shot and composited. For example, in a combined photograph composed of four images, it is determined whether four photographs have been shot and composited. In addition, in the case of compositing four images, for example, the case where two images have been shot and two images have been selected from the recorded images is also judged as combination complete.
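The completion check described above reduces to a simple count comparison; a minimal sketch (function and parameter names are illustrative assumptions):

```python
def combination_complete(shot_frames, selected_frames, template_frames):
    """The combination is judged complete once the frames shot plus the
    frames chosen from already recorded images fill every slot of the
    template."""
    return shot_frames + selected_frames >= template_frames
```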

If the result of the determination in step S147 is that combination is complete, the microcomputer 121 performs record browse display (S149). Here, all images of the combined photograph are complete, so this combined photograph is record-browse-displayed on the display panel 135. In the record browse display, enlarged display may also be possible, as in the reproduction flow described later.

After performing the record browse display, the microcomputer 121 next determines whether a deletion operation has been performed (S151). When viewing the record browse display, the user sometimes wishes to delete some of the photographs and retake them. Therefore, in the present embodiment, when a deletion operation is performed within a predetermined time (e.g., 3 seconds) of the record browse display, the specified frame can be deleted and photographed again. As the deletion operation, the frame may be specified via the touch panel, or deleted with a delete button.

If the result of the determination in step S151 is that a deletion operation has been performed, the microcomputer 121 performs the combined photograph operation (S155). In the subroutine of the combined photograph operation, the deletion is carried out (see S165 to S173 of Fig. 10 described later).

On the other hand, if the result of the determination in step S151 is that no deletion operation has been performed, combined photograph recording is performed (S153). Here, the composite image data of the completed combined photograph is recorded in the recording medium 131.

After the combined photograph recording in step S153, after the combined photograph operation in step S155, if the result of the determination in step S147 is that combination is not complete, or after the still image recording in step S159, the microcomputer 121 returns to the original flow.

In this way, in the still image recording flow, the combined photograph is not recorded until the number of images determined by the pattern has been gathered (S147 → No), and the combined photograph is recorded once the required number is complete (S153). Furthermore, after the combined photograph is completed, a specified image in the combined photograph can also be deleted by performing a deletion operation (S151).
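The decision structure of steps S141 to S159 can be summarized with a short sketch. This is an illustration only, not the patent's implementation: the names `CombinedState`, `still_image_record`, and the action strings are assumptions introduced for this example, and the deletion branch (S151/S155) is omitted for brevity.

```python
# Illustrative sketch of the still image record flow S141-S159.
from dataclasses import dataclass, field

@dataclass
class CombinedState:
    required: int                 # number of images the template calls for
    frames: list = field(default_factory=list)

    def is_complete(self):
        # S147: photographed frames plus frames picked from recorded
        # images together satisfy the template.
        return len(self.frames) >= self.required

def still_image_record(state, combined_mode, record_each_frame, image):
    """Returns a list of record actions, mirroring S141-S159."""
    actions = []
    if not combined_mode:                       # S141: normal mode
        actions += ["rec_view", "record_jpeg"]  # S157, S159
        return actions
    if record_each_frame:                       # S143: per-frame setting on
        actions.append("record_jpeg")           # S145
    state.frames.append(image)
    if state.is_complete():                     # S147
        actions.append("rec_view")              # S149: browse the composite
        actions.append("record_combined")       # S153 (no-deletion case)
    return actions
```

For instance, with a 2-image template the composite is recorded only on the call that completes the set.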

Next, the detailed action of the combined photograph operation of step S23 (Fig. 2) and step S155 (Fig. 9) will be described using the flowcharts shown in Fig. 10 and Fig. 11. As described above, this combined photograph operation is carried out when various operations for editing the combined photograph have been performed.

First, after entering the combined photograph operation, the microcomputer 121 determines whether a photographed-frame change operation has been performed (S161), and if the result of this determination is that a photographed-frame change operation has been performed, changes the photographed frame (S163). Which frame of the combined photograph pattern the photographed image is embedded into can be changed as appropriate. When an operation to change the frame to be photographed is performed by operating the cross key and OK button, by a touch panel operation, etc., the frame to be photographed is changed. At this time, in order to make it clear which frame is being photographed, a frame box or an icon may be displayed on the frame, the frames other than the photographed frame may be displayed darker, or the chroma of the frames other than the photographed frame may be reduced. When a frame in the live view display has been touched, a release operation (the first and second release operations, or only the first release operation) is performed and photographing is carried out. When live view display is performed using multiple frames, the release button is used to select the frame to be photographed.

After changing the photographed frame in step S163, or if the result of the determination in step S161 is that there is no photographed-frame change operation, the microcomputer 121 next determines whether a deletion operation has been performed (S165). As explained for the still image recording flow, for a completed combined photograph the user sometimes wishes to delete part of the images and replace them with others. Therefore, in the present embodiment, a photographed frame can be deleted by selecting it with the cross key and pressing the delete button, or by dragging the photographed frame on the touch panel and dropping it on the trash icon, etc. In this step S165, it is determined whether such an operation has been performed.

If the result of the determination in step S165 is that a deletion operation has been performed, the microcomputer 121 next determines whether the frame size in the touch operation is small (S167). When the deletion operation has been performed by a touch operation, it is determined whether the size (vertical/horizontal dimensions or area) of the frame to be deleted is small (for example, an area of 1 square centimeter or less). If the frame size is small, a touch operation with a finger is likely to be an erroneous operation.

If the result of the determination in step S167 is that the frame size in the touch operation is larger than the predetermined value, the microcomputer 121 next saves the image aside (S169) and changes the live view display (S171). Even after an image has been deleted by a deletion operation, the user sometimes wishes to restore it. Therefore, in order to enable restoration (that is, cancellation of the deletion operation), the photographed image specified for deletion (the image of that frame of the combined photograph) is temporarily moved to another region in memory, the frame is regarded as not yet photographed, and the live view display is changed.

After changing the live view display, the microcomputer 121 performs combined photograph generation (S173). The subroutine of this combined photograph generation is described using Fig. 17. Since the image specified by the deletion operation has been deleted, the combined photograph is regenerated in this step using the remaining images.

After the combined photograph generation in step S173, if the result of the determination in step S167 is that the frame size in the touch operation is small, or if the result of the determination in step S165 is that no deletion operation has been performed, the microcomputer 121 next determines whether a restoration operation has been performed (S175). The restoration operation is performed through the operation unit 123 or the touch panel 123a, for example by selecting the frame to be restored with the cross key and operating the delete button, or by a touch operation of dragging the trash icon onto the frame to be restored, etc.

If the result of the determination in step S175 is that a restoration operation has been performed, the microcomputer 121 next restores the image (S177) and changes the live view display (S179). At this time, the saved-aside image is returned to its original position, the frame is regarded as photographed, and the live view display is changed.
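The delete/restore behaviour of S169 and S177 amounts to parking a frame's image in a backup area so the deletion can be cancelled later. The sketch below is an assumption made for illustration (the class name, slot list, and backup dictionary are not from the patent):

```python
# Illustrative sketch of delete (S169) and restore (S177) for combined
# photograph frames.
class CombinedFrames:
    def __init__(self, n_frames):
        self.slots = [None] * n_frames   # images placed in the template
        self._backup = {}                # parked images, keyed by slot index

    def delete(self, i):
        # S169: move the image aside and treat the slot as "not yet photographed"
        if self.slots[i] is not None:
            self._backup[i] = self.slots[i]
            self.slots[i] = None

    def restore(self, i):
        # S177: put the parked image back and treat the slot as photographed
        if i in self._backup:
            self.slots[i] = self._backup.pop(i)

f = CombinedFrames(4)
f.slots[0] = "img0"
f.delete(0)
assert f.slots[0] is None     # live view now shows the frame as empty (S171)
f.restore(0)
assert f.slots[0] == "img0"   # deletion cancelled (S179)
```

The design point is that deletion never discards pixel data immediately; it only changes which memory region holds it.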

After changing the live view display, the microcomputer 121 performs combined photograph generation (S181). The subroutine of this combined photograph generation is described using Fig. 7. Since the image specified by the restoration operation has been restored, the combined photograph is regenerated in this step including the restored image.

After the combined photograph generation in step S181, or if the result of the determination in step S175 is that no restoration operation has been performed, the microcomputer 121 next determines whether a temporary save operation has been performed (S183 of Fig. 11). When photographing is repeated within a short time to generate a combined photograph, photographing can simply be carried out continuously. However, when generating a combined photograph from images taken at intervals, for example in the morning, at noon, and at night, it is convenient to be able to temporarily save the images during creation. Therefore, in the present embodiment, the images constituting the combined photograph can be temporarily saved. The temporary save operation is performed through the menu screen, a touch panel operation, etc., by a selection operation of a temporary save menu, a selection operation of an icon, etc.

If the result of the determination in step S183 is that a temporary save operation has been performed, the microcomputer 121 records the combination data (S185). At this time, the state of the current combined photograph, that is, which pattern is used, which photographs have been taken, and so on, is recorded in the flash memory 125 or the recording medium 131. The recorded combination data includes at least information on the pattern (template), the photographed image data, and information relating the image data to the template.

After recording the combination data, the microcomputer 121 next resets the combined photograph (S187). After recording, the information of the photographed combined photograph is reset: the camera is returned to the same state as for the first shot, for example by resetting the live view display and the SDRAM 127.
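The combination data recorded in S185 and read back in S197 can be pictured as a small serialized record. The JSON layout and field names below are assumptions for this sketch; the patent only requires that the template information, the image data, and the image-to-template relation be saved together.

```python
# Minimal sketch of recording (S185) and reading (S197) combination data.
import json

def save_combination(template_id, slot_to_image):
    """Serialize the in-progress combined photograph (S185)."""
    return json.dumps({
        "template": template_id,                                 # pattern info
        "slots": {str(k): v for k, v in slot_to_image.items()},  # relation
    })

def load_combination(blob):
    """Restore the saved state so shooting can continue (S197)."""
    data = json.loads(blob)
    return data["template"], {int(k): v for k, v in data["slots"].items()}

blob = save_combination("2x2", {0: "morning.jpg", 1: "noon.jpg"})
tpl, slots = load_combination(blob)
assert tpl == "2x2" and slots == {0: "morning.jpg", 1: "noon.jpg"}
```

A round trip like this is what allows morning and evening shots to end up in one combined photograph.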

After the combined photograph is reset in step S187, or if the result of the determination in step S183 is that no temporary save operation has been performed, it is next determined whether a temporary-save read operation has been performed (S189). This operation reads the combination data of a combined photograph that was temporarily saved in steps S183 and S185. The operation is performed, for example, by a selection operation of a temporary-save read menu or icon through the menu screen, the touch panel, etc.

If the result of the determination in step S189 is that a temporary-save read operation has been performed, it is next determined whether photographing is in progress (S191). There are cases where the temporary-save read operation is performed while photographing is in progress in the combined photograph mode. In that case, the photographing currently in progress in the combined photograph mode must be temporarily interrupted, which is handled in steps S191 to S195. In this step, it is determined whether photographing is in the combined photograph mode and one or more images have been photographed into the template.

If the result of the determination in step S191 is that combined photograph shooting is in progress, it is next confirmed whether to temporarily save (S193). In this case, a confirmation screen asking whether to temporarily save the current state of the shooting in progress is displayed on the display panel 135 to inquire of the user. If the result of this determination is to temporarily save, combination data is recorded (S195). Here, to temporarily save the current state, the same combination data as in step S185 is recorded.

After the combination data is recorded in step S195, if the result of the determination in step S193 is not to temporarily save, or if the result of the determination in step S191 is that photographing is not in progress, the microcomputer 121 next reads the combination data (S197), performs the live view display (S199), and performs the combined photograph generation (S201). Here, the combination data temporarily saved in step S185 is read, the live view display is changed in the same way as in steps S171 and S179, and the combined photograph is generated in the same way as in steps S173 and S181.

After the combined photograph is generated in step S201, or if the result of the determination in step S189 is that no temporary-save read operation has been performed, the combined photograph operation ends and the original flow is returned to.

In this way, in the flow of the combined photograph operation, the frame to be photographed can be changed among the frames of the pattern (S163), and frames can be deleted and restored (S169, S177). Furthermore, data can be temporarily saved during combined photograph shooting, and the temporarily saved data can be read to continue the combined photograph shooting.

Next, the detailed action of the image processing/live view display of step S55 (Fig. 3) will be described using the flowchart shown in Fig. 12. In this image processing/live view display, after image processing for live view display is performed, a 1-frame live view display is performed in the EVF 153, or a combined photograph live view display is performed on the display panel 135.

First, after entering this flow, the microcomputer 121 determines whether EVF display is in use (S211). Here, based on the output from the EVF display sensor 155, it is determined whether display is performed in the EVF 153, which functions as the 2nd display unit. As described above, an EVF display sensor 155 such as an eye sensor is provided near the EVF 153, and when the photographer brings an eye near the eyepiece portion of the EVF 153, the EVF display sensor 155 outputs a detection signal. When this detection signal is detected, it is determined that EVF display is in use. In the present embodiment, the display/non-display of the EVF 153 is switched according to the detection output from the EVF display sensor 155, but the EVF 153 may also display at all times.

If the result of the determination in step S211 is that EVF display is in use, it is then determined whether one or more frames have been photographed and a confirmation operation for a photographed frame image has been performed (S213). Here, it is determined whether the number of photographed frames is one or more and the photographer has performed an operation for frame confirmation. In the present embodiment, the images combined into the combined photograph can be changed during live view display. When making this change, the photographer rotates a dial, operates the cross key, or operates another operating member in the operation unit 123. Therefore, in this embodiment, when a dial, cross key, or other operating member has been operated, there is a function of confirming other photographed frame images for a certain period of time.

If the result of the determination in step S213 is that one or more frames have been photographed and a frame confirmation operation has been performed, the microcomputer 121 then determines whether a confirmation-frame change operation has been performed (S215). Here, the change operation is judged from the dial operation direction and rotation amount, which cross key has been operated, and so on.

If the result of the determination in step S215 is that a change operation has been performed, the frame is changed (S217). Here, according to the change operation on the operating member, the frame displayed as a single frame is switched according to the position of the frame or the photographing order. The changed frame image is then displayed in step S223.
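The frame switching of S215/S217 can be pictured as stepping an index through the template's frames by the dial's rotation amount, wrapping around at the ends. This is a sketch under assumed semantics (the patent does not specify wrap-around behaviour):

```python
def change_confirm_frame(current, rotation, n_frames):
    # S215/S217 sketch: the dial's signed rotation amount steps through
    # the frames in shooting order, wrapping around the template.
    return (current + rotation) % n_frames
```

For a 4-frame template, stepping forward from the last frame returns to the first, and a reverse rotation steps backward.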

After the frame change in step S217, or if the result of the determination in step S215 is that no change operation has been performed, the microcomputer 121 then reads the basic image of the frame image (S219). The photographed image undergoes basic image processing (see Fig. 5A) and, when an art filter or the like has been set, special image processing (see Fig. 6). In this step, of the image data of the photographed image of the selected frame, only the image data that has undergone basic image processing is read from the SDRAM 127. When the photographed image is not stored in the SDRAM 127, only the portion of the image combined into the combined photograph corresponding to this frame is read.

If the result of the determination in step S213 is that one or more frames have not been photographed, including when not photographing in the combined photograph mode, or that no frame confirmation has been performed, basic image processing is performed (S233). Here, the image data currently captured by the image sensor 103 undergoes the basic image processing described using Fig. 5A. The image data that has undergone the basic image processing is stored in the SDRAM 127. In the present embodiment, the basic image processing applies the Natural finish effect, but other processing may also be applied, and special image processing may also be applied.

After the basic image processing in step S233, or after the basic image of the frame image is read in step S219, shading is then set (S221). When generating a combined photograph, depending on the assigned position in the pattern (template), not all parts of the selected frame may be displayed. Therefore, when displaying the image of one frame in the EVF 153, the region not combined into the combined photograph is given a shade such as semi-transparent gray, achromatic processing (dull processing), or blur processing, so that the combined and uncombined portions become clearly distinguishable.

For example, the pattern (template) shown in Fig. 14A is composed of 4 vertically long frame images (combined images 1 to 3 and LV). In the figure, LV denotes the live view image, and combined images 1 to 3 denote the already-selected combined images. Although the aspect ratio of the photographed image is 4:3, in the example shown in Fig. 14A only a portion of it is combined. Therefore, as shown in Fig. 14B, image processing is applied that covers the uncombined portion with a black shade 153a.

The example shown in Fig. 14C is a pattern (template) composed of the already-combined images (combined images 1 to 3) and the live view image LV. In this case as well, the aspect ratio of LV differs from that of the photographed image, so the photographer can be made to recognize the uncombined portion. Specifically, image processing is applied that covers the uncombined portion with a dull or blurred shade 153b, and the portion to be combined is displayed as a color live view image.

In this way, when displaying the live view image, image processing that covers with the shades 153a and 153b is applied according to the aspect ratio determined by the pattern of the combined photograph. Therefore, the photographer can intuitively understand which portion is combined into the combined photograph and which is not, and can easily adapt the composition of the combined photograph. Furthermore, when covering with the dull or blurred shade 153b, there is the advantage that the surroundings of the combined portion can be grasped to a degree that does not interfere with photographing.
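The shade setting of S221 is, in essence, masking every pixel outside the region the template will actually use. The sketch below illustrates this under an assumed data type (an image as a nested list of pixel values); the real processing would use dulling or blurring rather than a flat fill:

```python
# Sketch of S221: shade the pixels outside the combined region.
def apply_shade(image, region, shade=0):
    """region = (left, top, width, height) that the template will combine."""
    left, top, w, h = region
    out = []
    for y, row in enumerate(image):
        out.append([px if (left <= x < left + w and top <= y < top + h)
                    else shade
                    for x, px in enumerate(row)])
    return out

# A 4-pixel-wide image of which only the middle 2 columns are combined,
# as with the vertically long frames of Fig. 14A.
img = [[9, 9, 9, 9],
       [9, 9, 9, 9]]
assert apply_shade(img, (1, 0, 2, 2)) == [[0, 9, 9, 0],
                                          [0, 9, 9, 0]]
```

Using a semi-transparent or blurred `shade` instead of black gives the Fig. 14C behaviour, where the surroundings stay faintly visible.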

After setting the shade in step S221, the microcomputer 121 then performs EVF display (S223). Here, for the frame image changed in step S217, or the live view image that underwent basic image processing in step S233, the image with the shade set in step S221 is displayed in the EVF 153. An icon is displayed on the image so that it can be recognized whether it is the live view image or a read-in image. Also, when confirming a photographed frame, the display may distinguish between the case where the image that has undergone only basic image processing is stored in the SDRAM 127 and the case where it is not.

Fig. 13A shows an example of this EVF display. Images P1 to P3 are images selected to be combined into the combined photograph, and "1", "2", "3" are displayed as image numbers. Image P4 is the live view image, and the icon "LV" is displayed to indicate that it is the live view image. Images P1 to P4 are switched according to the change operation of the aforementioned step S215.

After the EVF display in step S223, or if the result of the determination in step S211 is that EVF display is not in use, the microcomputer 121 determines whether moving image recording is in progress or display panel display is in use (S225). Here, whether moving image recording is in progress is determined from the recording flag (see S13 to S19 of Fig. 2). As to whether to display on the display panel 135, the determination is "Yes" when, for example, no display is performed in the EVF 153. The display panel 135 may also display at all times.

If the result of the determination in step S225 is that moving image recording is in progress, or that display is performed on the display panel 135, image processing is performed (S227). Here, the image processing shown in Fig. 4 is performed. In this image processing, basic image processing, special image processing, and combined photograph processing are applied. When the combined photograph mode has been selected, combined photograph processing is applied to the combined images and the live view image.

The microcomputer 121 performs basic image processing both for the 1-frame live view display in the EVF 153 (S233) and for the combined photograph live view display on the display panel 135 (S227, S71 of Fig. 4). However, the basic image processing for the 1-frame live view display and that for the combined photograph live view display are performed with different parameters. For example, the "Natural" parameters are used for the 1-frame live view display, whereas for the combined photograph live view display, parameters that give a sense of unity across multiple images are used.
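The point that the same basic image processing runs with different parameter sets per display target can be sketched as follows. The parameter names and values are invented for illustration; the patent only states that the 1-frame path uses "Natural" parameters and the combined path uses unifying parameters:

```python
# Sketch of per-target parameters for the basic image processing
# (1-frame EVF live view vs. combined photograph live view).
BASIC_PARAMS = {
    "one_frame": {"finish": "natural", "saturation": 1.0},
    "combined":  {"finish": "unified", "saturation": 0.9},
}

def basic_image_processing(pixels, target):
    p = BASIC_PARAMS[target]
    # stand-in for the real pipeline: scale chroma by the saturation value
    return [round(v * p["saturation"], 3) for v in pixels]

assert basic_image_processing([1.0, 0.5], "one_frame") == [1.0, 0.5]
assert basic_image_processing([1.0, 0.5], "combined") == [0.9, 0.45]
```

Keeping one processing routine with swappable parameter tables, rather than two routines, is a natural way to realize this behaviour.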

After the image processing in step S227, it is then determined whether to perform display panel display (S229). Here, using the determination result in step S225, it is determined whether to display on the display panel 135.

If the result of the determination in step S229 is to display on the display panel, the microcomputer 121 displays on the display panel 135 (S231). Here, the image that underwent image processing in step S227 is displayed on the display panel 135. When the combined photograph mode has been set, combined photograph processing is applied to the read-in images and the live view image, and this image is displayed.

Fig. 13B shows an example of this display panel display. Images P11 to P13 are images selected to be combined into the combined photograph, and "1", "2", "3" are displayed as image numbers. Image P14 is the live view image, and the icon "LV" indicating the live view image is displayed. Image P15 is the combined photograph image, displayed as an image with a sense of unity produced by applying combined photograph processing to images P11 to P14. The difference from images P1 to P4 of Fig. 13A is that images P1 to P4 of Fig. 13A have not undergone combined photograph processing, whereas images P11 to P15 of Fig. 13B have undergone combined photograph processing and image processing that gives a sense of unity.

After the display panel display in step S231, if the result of the determination in step S225 is that moving image recording is not in progress and display panel display is not in use, or if the result of the determination in step S229 is not to display on the display panel, the image processing/live view display flow ends and the original flow is returned to.

In this way, in the image processing/live view display of the present embodiment, the live view image of one frame is displayed in the EVF 153 (S223), while the combined photograph is displayed on the display panel 135 (S227, S231). Furthermore, when displaying the combined photograph, the live view image obtained by the image sensor 103 is also used, and an image that has undergone combined photograph processing is displayed (see P14 and P15 of Fig. 13B).

In addition, the live view image of one frame is not an enlargement of a part of the combined photograph, but an image that has undergone image processing different from the combined photograph (a subset of the image processing applied within the combined photograph). Also, in the live view image of one frame, shade display is performed for the portions other than the portion combined into the combined photograph (S221, see Fig. 14B). Furthermore, even a frame that has already been photographed can be displayed as a single frame and confirmed with a simple operation (see S215, S217).

With the control described above, in the present embodiment, the 1-frame live view display can be observed in the EVF 153 while the combined image can be observed in the combined photograph live view display on the display panel 135, so the photographer can shoot in the shooting style he or she prefers.

In addition, since the position combined into the combined photograph is displayed, it is also possible to prevent the photographed image from being combined at an unintended position. Furthermore, since special image processing and combined photograph processing are applied to the image data, a combined photograph with a sense of unity can be observed during live view display, so the finished effect can be envisioned and shooting the combined photograph becomes easy.

In addition, in the 1-frame live view display, an image that has undergone only basic image processing is displayed in the EVF 153, so no unnecessary effects are applied and the photographer can concentrate more on shooting. Moreover, even while shooting a single frame, the photographed frames can easily be confirmed, so shooting that considers the balance with other frames is possible in the EVF 153. In particular, since the screen size of the EVF 153 is small, the photographer can concentrate on shooting with the 1-frame display.

Next, a variation of the operation of the image processing/live view display will be described using Fig. 15. In the image processing/live view display shown in the aforementioned Fig. 12, basic image processing is performed (S233) except when one or more frames have been photographed and frame confirmation is performed. In this variation, on the other hand, not only basic image processing but also image analysis, color change, luminance change, etc. are performed.

Compared with the flowchart shown in Fig. 12, the flowchart of the image processing/live view display of this variation differs in that steps S235 to S239 have been added; the other steps are the same as in Fig. 12.

The image analysis of step S235, the color change of step S237, and the luminance change of step S239 perform the same processing as steps S121, S123, and S125 of Fig. 7, respectively, so detailed description is omitted.

In this way, in the image processing/live view display of this variation, color change and luminance change are performed in addition to the basic image processing. Since the processing here is image processing for the 1-frame live view display, special effect processing is not applied: special effects can hinder concentration on shooting. Color change and luminance change, however, are processing related to color reproduction that affects the final finished effect, and are therefore applied.
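The variation's 1-frame pipeline can be sketched as a short chain of stages, with the special-effect stage deliberately absent. The gain values below are placeholders, not values from the patent:

```python
# Sketch of the Fig. 15 variation: basic processing plus colour change
# (S237) and luminance change (S239), but no special-effect stage.
def one_frame_pipeline(pixels, color_gain=1.1, luma_gain=1.2):
    processed = list(pixels)                          # basic processing (S233)
    processed = [p * color_gain for p in processed]   # colour change (S237)
    processed = [p * luma_gain for p in processed]    # luminance change (S239)
    # intentionally no special-effect stage: effects would distract
    # from shooting, while colour/luminance affect the finished look
    return [round(p, 4) for p in processed]

assert one_frame_pipeline([1.0]) == [1.32]
```

The omission of the effect stage is the design point: only processing that influences the final color reproduction is previewed.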

Next, the 2nd embodiment of the present invention will be described using Fig. 16 to Fig. 19. In the 1st embodiment, the EVF 153 is built into the camera body 100, but in the 2nd embodiment it is an external EVF. Also, in the 1st embodiment, one display panel 135 functioning as the 1st display unit is arranged on the camera body 100, but in the present embodiment two are arranged on the camera body 100.

Fig. 16 is a block diagram showing the main electrical structure of the camera of the 2nd embodiment of the present invention. Compared with Fig. 1 of the 1st embodiment, the differences are that an external EVF 300 is provided, a communication contact 159 for communicating with the camera body 100 is provided, and a 1st display driver 133a, a 1st display panel 135a, a 2nd display driver 133b, and a 2nd display panel 135b are provided; the rest is the same as Fig. 1. Therefore, the description centers on the differences.

The display drivers 133a and 133b are connected to the display panels 135a and 135b, respectively, and, as in the 1st embodiment, display images based on the image data read from the SDRAM 127 or the recording medium 131. The display panels 135a and 135b can perform record browse display, reproduction display of still image or moving image files, and moving image display such as live view display. They can also display a combined photograph that has undergone combined photograph processing in the combined photograph processing unit 117. As in the 1st embodiment, a liquid crystal display panel (LCD), organic EL, or the like can be adopted as the display panels. The arrangement of the 1st and 2nd display panels 135a and 135b will be described below with Figs. 17A to 17D and Figs. 18A to 18E.

The external EVF 300 can be freely attached to and detached from the camera body 100, and when attached it is connected to the communication processing unit 157 in the camera body 100 through the communication contact 159. The EVF 300 can receive image data from the camera body 100 and display images.

The EVF 300 includes an EVF driver 301, an EVF 303, and an EVF display sensor 305. The EVF driver 301 is connected to the EVF 303 and causes the EVF 303 to display images based on the image data input through the communication contact 159. The image on the display unit of the EVF 303 can be observed through the eyepiece portion.

Next, examples of the external appearance of the camera in the present embodiment will be described with Figs. 17A to 17D. Figs. 17A and 17B show the 1st example, and Figs. 17C and 17D show the 2nd example.

In the 1st example of the external appearance of the camera, the external EVF 300 can be mounted on top of the camera body 100. This external EVF 300 functions as the 2nd display unit. In addition, the camera body 100 is provided with two display panels functioning as the 1st display unit: the 1st display panel 135a and the 2nd display panel 135b.

The 2nd display panel 135b is a display panel fixed to the camera body 100, while the 1st display panel 135a is a movable panel. That is, as shown in the figure, the 1st display panel 135a is attached to the camera body 100 by a hinge 161 so as to rotate freely about 2 axes.

In Fig. 17A, the 1st display panel 135a is opened, so that the 1st display panel 135a and the 2nd display panel 135b can display simultaneously. In this case, the 1st display panel 135a is located on the side far from the optical axis 163 of the taking lens 201, and the 2nd display panel 135b on the side near the optical axis 163. In this state, the 1-frame live view display is performed on the 2nd display panel 135b located near the optical axis 163, and the combined photograph image is live-view displayed on the 1st display panel 135a far from the optical axis 163. Thus, although images are displayed on both the 1st and 2nd display panels 135a and 135b, the side nearer to the optical axis 163 is easier to fine-tune during shooting, so the 1-frame live view display is performed on the 2nd display panel 135b, where details of the single frame can be observed.

Fig. 17B shows the state in which the 1st display panel 135a covers the 2nd display panel 135b with the display surface of the 1st display panel 135a facing outward. In this state, the 1-frame live view display is performed on the 1st display panel 135a.

In this way, in the 1st example of the present embodiment, the 1-frame live view display and the combined photograph live view display are switched according to the state of the display surfaces of the 1st display panel 135a and the 2nd display panel 135b. As in the 1st embodiment, an image that has not undergone combined photograph processing is displayed in the 1-frame live view display, and an image that has undergone combined photograph processing is displayed in the combined photograph live view display.
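The display routing of the 1st example (Figs. 17A/17B) reduces to a choice based on whether the movable panel is open. The state names in this sketch are assumptions made for illustration:

```python
# Sketch of the display routing in the 1st example of the 2nd embodiment.
def route_displays(panel1_open):
    if panel1_open:                             # Fig. 17A: both panels visible
        return {"panel2": "one_frame_lv",       # near the optical axis
                "panel1": "combined_lv"}        # far from the optical axis
    return {"panel1": "one_frame_lv"}           # Fig. 17B: panel1 faces outward

assert route_displays(True) == {"panel2": "one_frame_lv",
                                "panel1": "combined_lv"}
assert route_displays(False) == {"panel1": "one_frame_lv"}
```
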

In the second example of the camera's exterior, shown in Figures 17C and 17D, the camera body 100 is provided with two display panels, the first display panel 135a and the second display panel 135b. The first display panel 135a is disposed on the rear side of the camera body 100, and the second display panel 135b on the front side.

In the second example, the external EVF 300 is omitted; combined-photograph live view display is performed on the first display panel 135a, and single-frame live view display on the second display panel 135b. In this example, therefore, the first display panel 135a functions as the first display section and the second display panel 135b as the second display section.

In the second example, single-frame live view display is performed on the front of the camera body, which is convenient for the photographer when taking a self-portrait. During a self-portrait the distance from the photographer to the camera body 100 is large, so a combined-photograph image shown on the second display panel 135b would be hard to see; only single-frame live view display is therefore performed. As in the first embodiment, the single-frame live view shows an image to which combined-photograph processing has not been applied, and the combined-photograph live view shows an image to which it has.

Next, a modification of the exterior of the camera in the second embodiment of the present invention is described with reference to Figures 18A to 18E. The examples shown in Figures 17A-17D have two display panels, the first display panel 135a and the second display panel 135b, whereas the examples shown in Figures 18A-18E have only one display panel.

In the first example of the modification, shown in Figures 18A and 18B, the first display panel 135a is fixed to the camera body 100 and an external EVF 300 can be attached. In this example, single-frame live view display is performed on the external EVF 300, and the combined photograph is displayed on the first display panel 135a. As for the lower-right image area of Figure 18A, live view display is being performed on the EVF 300, so no live view display is performed in that part. To make this recognizable to the photographer and to others, a black image is displayed there together with the indications "LV" and "EVF".

In the second example of the modification, shown in Figures 18C to 18E, the external EVF 300 is omitted and the first display panel 135a is made movable with respect to the camera body 100. In the state of Figure 18C, the display surface of the first display panel 135a faces outward, so the displayed image can be observed. In this state, combined-photograph live view display is performed. It is also possible in this state to switch to single-frame live view display by a switching operation or the like.

When, starting from the state of Figure 18C, the top of the first display panel 135a is pulled out as in Figure 18D and then rotated as in Figure 18E, the display surface of the first display panel 135a can be observed from the front side of the camera body 100. In this case, as in the case of Figure 17C, the camera is in a state suited to a self-portrait, and single-frame live view display is therefore performed.

In the modification shown in Figures 18A-18E as well, the single-frame live view shows an image to which combined-photograph processing has not been applied, and the combined-photograph live view shows an image to which combined-photograph processing has been applied.

Next, connection to external devices in the present embodiment and its modifications is described with reference to Figure 19. In this modification, a smartphone 400 and television sets 500a and 500b can be connected through the wireless/wired communication section 147.

The smartphone 400 and the camera body 100 exchange data wirelessly via the wireless/wired communication section 147. Since the smartphone 400 can also be used as a remote controller for the camera body 100, single-frame live view display is performed on the display section of the smartphone 400 so that the photographer's position can be confirmed. It is also possible to send a plurality of still images over several transmissions and generate the combined-photograph image on the smartphone 400.
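The smartphone-side compositing is not specified in detail in the patent. As one hedged illustration, the combined photograph could be assembled by tiling the received still images into a grid; the function below uses plain nested lists as stand-ins for pixel arrays and is an assumption, not the patent's method:

```python
def combine_into_grid(frames, cols):
    """Tile equally sized frames into one combined-photograph image.

    frames: list of 2-D pixel arrays (lists of rows), all the same shape.
    cols:   number of frames placed side by side in each row of the layout.
    A minimal stand-in for the compositing the smartphone 400 might perform.
    """
    if not frames:
        return []
    height = len(frames[0])
    out = []
    # Take the frames in groups of `cols` and concatenate their rows sideways.
    for row_start in range(0, len(frames), cols):
        row_frames = frames[row_start:row_start + cols]
        for y in range(height):
            combined_row = []
            for frame in row_frames:
                combined_row.extend(frame[y])
            out.append(combined_row)
    return out
```

For example, two 2x2 frames with `cols=2` produce one 2x4 image; with `cols=1` they are stacked vertically into a 4x2 image.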

The television sets 500a and 500b likewise exchange data, by wireless or wired communication, via the wireless/wired communication section 147. In the wired case, the connection to the television set 500b is made with an HDMI cable 600; because the bandwidth is wide in this case, the combined-photograph image is sent. In the wireless case, the capacity for the image data that can be sent is limited, so only the single-frame live view image is sent to the television set 500a.
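The transport-dependent choice described here amounts to a simple rule; a minimal sketch, with illustrative names not taken from the patent, might look like:

```python
def choose_stream(link):
    """Pick which live-view image to send to an external device.

    link: "hdmi" (wired, wide bandwidth) or "wireless" (limited capacity).
    """
    if link == "hdmi":
        return "combined_photograph"   # wide bandwidth: send the composite image
    if link == "wireless":
        return "single_frame"          # limited capacity: send only one frame
    raise ValueError(f"unknown link type: {link}")
```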

In this way, the camera body 100 may be connected to external devices such as the smartphone 400 and the television sets 500a and 500b, and may selectively send either the single-frame live view image or the combined-photograph image.

Next, the data flow during live view display in the embodiments and modifications of the present invention described above is explained with Figure 20A. An image processing section 20 (corresponding to the image processing section 109 in Fig. 1) applies the image processing for single-frame live view display to the image data output from an image pickup section 10 (corresponding to the image sensor 103 in Fig. 1), and single-frame live view display is performed on the second display section (corresponding to the EVF 153 in Fig. 1) on the basis of the processed image data.

In addition, a combined-photograph processing section 30 (corresponding to the combined-photograph processing section 117 in Fig. 1) applies the image processing for combined-photograph live view display to the image data from the image processing section 20. On the basis of the image data thus processed, combined-photograph live view display is performed on the first display section 50 (corresponding to the display panel 135 in Fig. 1). The image processing section 20 may also make the image processing used for single-frame live view display different from that used for combined-photograph live view display.
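The Figure 20A data flow can be summarized in a short sketch. The processing sections are modeled here as plain callables and the function name is invented for illustration; the patent itself contains no code:

```python
def live_view_pipeline(raw_frame, history, image_process, combine):
    """One pass of the Fig. 20A data flow.

    raw_frame:     frame from the image pickup section 10
    history:       previously processed frames kept for the combined photograph
    image_process: stand-in for the image processing section 20
    combine:       stand-in for the combined-photograph processing section 30
    Returns (single_frame_view, combined_view): the former goes to the
    second display section, the latter to the first display section 50.
    """
    processed = image_process(raw_frame)   # image processing section 20
    history.append(processed)
    combined = combine(history)            # combined-photograph section 30
    return processed, combined
```

With toy callables (doubling as "processing", summation as "combining"), each pass yields the current single-frame view plus a combined view built from all frames so far.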

In addition, as shown in Figure 20B, the image processing section 20 may be separated into a first image processing section 20a and a second image processing section 20b. That is, the first image processing section 20a, which receives image data from the image pickup section 10, outputs image data that has undergone basic image processing, particular image processing and combined-photograph processing to the first display section 50, where combined-photograph live view display is performed. The second image processing section 20b, which likewise receives image data from the image pickup section 10, outputs image data that has undergone basic image processing to the second display section 40, where single-frame live view display is performed.
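The Figure 20B variant, with two image processors fed from the same image pickup section, might be sketched like this (again with invented names, and callables standing in for the processing sections):

```python
def split_pipelines(raw_frame, history, basic, special, combine):
    """Fig. 20B variant: two image processors fed from the same pickup section.

    basic:   basic image processing (shared by both paths)
    special: particular-image (special effect) processing, path 20a only
    combine: combined-photograph processing, path 20a only
    """
    # Path 20b -> second display section 40: basic processing only.
    single_frame_view = basic(raw_frame)
    # Path 20a -> first display section 50: basic + special effect, then combine.
    history.append(special(basic(raw_frame)))
    combined_view = combine(history)
    return single_frame_view, combined_view
```

Compared with the Figure 20A sketch, the single-frame path here never feeds the combined path; the two displays are driven by independent processors.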

As described above, in the embodiments and modifications of the present invention, when a live view image is displayed on the first display section (the display panel 135 or the like), an image processed by the combined-photograph processing section 117 is displayed (see S227 and S231 of Figure 12 and S79 of Fig. 4). On the other hand, when a live view image is displayed on the second display section (the EVF 153 or the like), an image that has undergone image processing in the image processing section 109 is displayed (see S233 and S223 of Figure 12). Both the single-frame display and the combined-photograph display can therefore be observed easily.

In the embodiments and modifications of the present invention, the image processing section performs different image processing when generating the image to be displayed on the first display section (the display panel 135 or the like) and when generating the image to be displayed on the second display section (the EVF 153 or the like). For example, when generating the image to be displayed on the first display section, the image is generated with the "natural" effect. Suitable image processing can therefore be performed for each of combined-photograph live view display and single-frame live view display.

In the embodiments and modifications of the present invention, the image processing section performs the image processing for generating the image to be displayed on the second display section (the EVF 153 or the like); the image that has undergone this image processing is then input again and particular image processing is applied to it in order to generate the image to be displayed on the first display section (the display panel 135 or the like). Consequently, single-frame live view display shows an image close to the original image, without particular image processing, while combined-photograph live view display can show an image to which the set special effect has been applied. Furthermore, when generating the image to be displayed on the second display section, the image processing section may perform only part of the image processing that is performed when generating the image to be displayed on the first display section.
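This cascade, in which the second-display image is reused as the input to the particular image processing for the first display, can be expressed compactly. The names below are illustrative assumptions:

```python
def cascaded_processing(raw_frame, base_process, special_process):
    """Cascade described above: the image made for the second display is the
    input to the particular-image processing that makes the first-display image.

    base_process:    processing used for the single-frame (second display) image
    special_process: particular-image (special effect) processing added on top
    """
    second_display_image = base_process(raw_frame)
    first_display_image = special_process(second_display_image)
    return second_display_image, first_display_image
```

The design choice is that the base processing runs once and is shared, so the effect-free single-frame view comes for free on the way to the special-effect image.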

Also, in the embodiments and modifications of the present invention, when generating the image to be displayed on the second display section (the EVF 153 or the like), the image processing section may perform part of the image processing performed when generating the image to be displayed on the first display section (the display panel 135 or the like) with different parameters. For example, when generating the image to be displayed on the first display section, image processing is performed with the parameters used for the "natural" effect.

In the embodiments and modifications of the present invention, the first display section (the display panel 135 or the like) is a display section affected by external light, and the second display section (the EVF 153 or the like) is a display section not affected by external light. The photographer can therefore concentrate on shooting using the second display section; in addition, a large panel can be used as the first display section, making the combined-photograph live view easy to observe.

In the embodiments and modifications of the present invention, single-frame live view display is performed on the EVF 153 and combined-photograph live view display on the display panel 135; the reverse is also possible, with single-frame live view display on the display panel 135 and combined-photograph live view display on the EVF 153.

In the embodiments and modifications of the present invention, a digital camera has been used as an example of the imaging apparatus, but the camera may be a digital single-lens reflex camera, a mirrorless camera or a compact digital camera; it may be a movie camera such as a video camera or camcorder; and it may further be a camera built into a mobile phone, a smartphone, a portable information terminal (PDA: Personal Digital Assistant), a game machine or the like. In any case, the present invention can be applied to any imaging apparatus that shoots combined photographs.

Regarding the operation flows in the claims, the specification and the drawings, words expressing order, such as "first" and "next", have been used for convenience of description, but unless otherwise specified they do not mean that the steps must be carried out in that order.

The present invention is not limited to the above embodiments; in the implementation stage, the constituent elements can be modified and embodied without departing from the gist of the invention. Various inventions can also be formed by appropriately combining the plurality of constituent elements disclosed in the above embodiments. For example, some of the constituent elements shown in an embodiment may be deleted, and constituent elements from different embodiments may be combined as appropriate.

Claims (8)

1. An imaging apparatus capable of shooting a combined photograph in which a plurality of images are combined into one image, and capable of live view display, the imaging apparatus comprising:
a first display section;
a second display section different from the first display section;
an image pickup section that converts a subject image into image data and outputs the image data;
an image processing section that performs image processing on the image data; and
a combined-photograph processing section that combines a plurality of items of image data that have undergone image processing by the image processing section to generate one image,
wherein, when a live view image is displayed on the first display section, an image processed by the combined-photograph processing section is displayed, and when a live view image is displayed on the second display section, an image that has undergone image processing by the image processing section is displayed.
2. The imaging apparatus according to claim 1, wherein
the image processing section performs different image processing when generating the image to be displayed on the first display section and when generating the image to be displayed on the second display section.
3. The imaging apparatus according to claim 2, wherein
the image processing section performs image processing for generating the image to be displayed on the second display section, and applies particular image processing to an image that has undergone that image processing and is input to the image processing section, in order to generate the image to be displayed on the first display section.
4. The imaging apparatus according to claim 2, wherein
when generating the image to be displayed on the second display section, the image processing section performs part of the image processing performed when generating the image to be displayed on the first display section.
5. The imaging apparatus according to claim 2, wherein
when generating the image to be displayed on the second display section, the image processing section performs part of the image processing performed when generating the image to be displayed on the first display section with different parameters.
6. The imaging apparatus according to any one of claims 1 to 5, wherein
the first display section is a rear display panel and the second display section is an electronic viewfinder.
7. The imaging apparatus according to any one of claims 1 to 5, wherein
the first display section is a display section affected by external light, and the second display section is a display section less affected by external light than the first display section.
8. An image pickup method for shooting a combined photograph in which a plurality of images are combined into one image, and for performing live view display, the image pickup method comprising the steps of:
converting a subject image into image data and outputting the image data;
displaying a single-frame live view image on a second display section, the displayed image being an image obtained by performing image processing on the image data; and
displaying a live view image of a combined photograph on a first display section different from the second display section, by performing combined-photograph processing on a plurality of items of image data that have undergone the image processing to generate one combined-photograph image and displaying that combined-photograph image.
CN201310713342.3A 2012-12-21 2013-12-20 Filming apparatus and image pickup method CN103888665B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2012279726A JP2014123896A (en) 2012-12-21 2012-12-21 Imaging apparatus, imaging method and program
JP2012-279726 2012-12-21

Publications (2)

Publication Number Publication Date
CN103888665A true CN103888665A (en) 2014-06-25
CN103888665B CN103888665B (en) 2017-08-25

Family

ID=50957374

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310713342.3A CN103888665B (en) 2012-12-21 2013-12-20 Filming apparatus and image pickup method

Country Status (3)

Country Link
US (1) US9426372B2 (en)
JP (1) JP2014123896A (en)
CN (1) CN103888665B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016048831A (en) * 2014-08-27 2016-04-07 オリンパス株式会社 Imaging device, imaging method, and program
CN105491284A (en) * 2015-11-30 2016-04-13 小米科技有限责任公司 Preview image display method and device

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8199212B2 (en) * 2008-05-03 2012-06-12 Olympus Imaging Corp. Image recording and playback device, and image recording and playback method
JP2014067310A (en) * 2012-09-26 2014-04-17 Olympus Imaging Corp Image editing device, image editing method, and program
US9146394B1 (en) * 2012-12-13 2015-09-29 Optics 1, Inc. Clip-on eye piece system for handheld and device-mounted digital imagers
US20150215530A1 (en) * 2014-01-27 2015-07-30 Microsoft Corporation Universal capture
US9762815B2 (en) * 2014-03-27 2017-09-12 Intel Corporation Camera to capture multiple sub-images for generation of an image
WO2015170521A1 (en) * 2014-05-08 2015-11-12 ソニー株式会社 Imaging device
US9350924B2 (en) 2014-08-25 2016-05-24 John G. Posa Portable electronic devices with integrated image/video compositing
JP6579899B2 (en) * 2015-10-09 2019-09-25 キヤノン株式会社 Imaging device, imaging device control method, program, and storage medium
CN105306830B (en) * 2015-12-08 2018-05-29 广东欧珀移动通信有限公司 A kind of camera control method and device
WO2017163685A1 (en) * 2016-03-24 2017-09-28 シャープ株式会社 Video processing device, display apparatus, video processing method, control program, and recording medium
JP2017184220A (en) * 2016-03-24 2017-10-05 シャープ株式会社 Video processing apparatus, display apparatus, video processing method, control program, and recording medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010024235A1 (en) * 2000-03-16 2001-09-27 Naoto Kinjo Image photographing/reproducing system and method, photographing apparatus and image reproducing apparatus used in the image photographing/reproducing system and method as well as image reproducing method
CN1659876A (en) * 2002-06-05 2005-08-24 精工爱普生株式会社 Digital cameras and image processing equipment
CN1692622A (en) * 2002-12-09 2005-11-02 卡西欧计算机株式会社 Image composing apparatus, electronic camera, and image composing method
CN1929538A (en) * 2005-09-09 2007-03-14 Lg电子株式会社 Image capturing and displaying method and system
US20090009614A1 (en) * 2007-07-03 2009-01-08 Tomoyuki Kawai Digital still camera and method of controlling operation of same

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4282189B2 (en) 1999-12-08 2009-06-17 オリンパス株式会社 Electronic camera with electronic viewfinder
JP4617417B2 (en) 2002-03-13 2011-01-26 カシオ計算機株式会社 Electronic camera, photographing and recording method, and program
US8081205B2 (en) * 2003-10-08 2011-12-20 Cisco Technology, Inc. Dynamically switched and static multiple video streams for a multimedia conference
AU2009243486B2 (en) * 2009-12-02 2012-12-13 Canon Kabushiki Kaisha Processing captured images having geolocations
JP5740826B2 (en) * 2010-03-29 2015-07-01 セイコーエプソン株式会社 Image display device, image information processing device, and image information processing method


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016048831A (en) * 2014-08-27 2016-04-07 オリンパス株式会社 Imaging device, imaging method, and program
CN105491284A (en) * 2015-11-30 2016-04-13 小米科技有限责任公司 Preview image display method and device
US10270975B2 (en) 2015-11-30 2019-04-23 Xiaomi Inc. Preview image display method, apparatus and storage medium

Also Published As

Publication number Publication date
CN103888665B (en) 2017-08-25
US20140176775A1 (en) 2014-06-26
JP2014123896A (en) 2014-07-03
US9426372B2 (en) 2016-08-23


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20151211

Address after: Tokyo, Japan

Applicant after: Olympus Corporation

Address before: Tokyo, Japan

Applicant before: Olympus Imaging Corp.

GR01 Patent grant