CN103002211B - Photographic equipment - Google Patents

Photographic equipment

Info

Publication number
CN103002211B
Authority
CN
China
Prior art keywords
image
processing
mentioned
control part
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201210328345.0A
Other languages
Chinese (zh)
Other versions
CN103002211A (en)
Inventor
木村亮史
新谷浩一
吉津宏和
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp
Publication of CN103002211A
Application granted
Publication of CN103002211B
Expired - Fee Related


Abstract

The invention provides a photographic apparatus capable of easily producing a high dynamic range (HDR) image at the time of a shooting operation. The apparatus has: an image pickup unit (12) that images a subject; a display unit (23) that displays an image based on image data obtained by the image pickup unit; a touch panel (25) disposed on the display unit, which designates coordinates corresponding to an input operation; a control unit (11) that sets a designated area containing the coordinates in the image that correspond to the coordinates designated through the touch panel; an exposure control unit (14) that meters the subject light from the image data obtained by the image pickup unit, sets a correct exposure value, and generates an image based on the set correct exposure value; and an image processing unit (15) that performs image synthesis processing on a plurality of image data obtained by the image pickup unit. The exposure control unit generates a reference image based on a correct exposure value set from the image data as a whole and an image based on a correct exposure value set from the image data within the designated area, and the image processing unit performs synthesis processing on the plurality of images generated by the exposure control unit.

Description

Photographic equipment
Technical field
The present invention relates to a photographic apparatus, and in particular to a photographic apparatus capable of performing high dynamic range (HDR) image synthesis processing within the apparatus.
Background Art
In conventional photographic apparatuses such as digital cameras that employ an imaging element, various forms of images can be obtained by applying various kinds of image processing to the image data obtained by a shooting operation. In particular, in recent years, owing to the increased processing speed of image processing circuits, an image showing the result of applying various image processing can be confirmed in advance during so-called live view display, that is, at the stage before shooting and recording are carried out.
In addition, in recent photographic apparatuses, operation members such as a touch panel are arranged on the display surface of the display unit, and apparatuses in which the various operations related to the shooting action can be performed intuitively through the touch panel have come into practical use.
For example, the photographic apparatus disclosed in Japanese Laid-Open Patent Publication No. H11-355617 allows a touch operation on the touch panel to instruct a shooting action or to designate operations such as the focusing position and the position on which the exposure is to be based.
[Patent Document 1] Japanese Laid-Open Patent Publication No. H11-355617
However, the means disclosed in the above publication merely provide a configuration in which various settings are made with respect to the position designated by the touch operation. Although, for example, predetermined control processing on the image region designated by the touch operation is disclosed, no consideration is given to processing of the image regions that are not designated.
Conventional photographic apparatuses generally have a function of performing automatic exposure control to determine a correct exposure value automatically. However, the exposure range (latitude) that the imaging elements used in conventional photographic apparatuses can handle is narrow compared with the range perceivable by the human eye.
Therefore, when, for example, a plurality of subjects with a large luminance difference coexist in the photographic frame, phenomena such as so-called highlight clipping caused by image saturation and loss of shadow detail occur in the region containing an extremely bright subject (a so-called bright region) and the region containing an extremely dark subject (a so-called dark region), and there is the problem that the image quality of the obtained image deteriorates.
For this reason, photographic apparatuses that can obtain an image with a wider dynamic range by using high dynamic range imaging synthesis techniques have been proposed in recent years and have come into practical use. In these existing photographic apparatuses, however, the high dynamic range image synthesis processing is performed automatically by a control circuit built into the apparatus, so the obtained image is sometimes rather unnatural.
Summary of the invention
The present invention has been made in view of the above circumstances, and its object is to provide a photographic apparatus which obtains images by using high dynamic range imaging synthesis techniques and which can obtain a high dynamic range image that reflects the user's intention and is free from an unnatural impression.
In order to achieve the above object, a photographic apparatus according to one aspect of the present invention has: an image pickup unit that images a subject; a display unit that displays an image based on image data obtained by the image pickup unit; a touch panel that is arranged on the display screen of the display unit and designates coordinates corresponding to an input operation; a control unit that sets a designated area containing the coordinates in the image that correspond to the coordinates designated through the touch panel; an exposure control unit that meters the subject light from the image data obtained by the image pickup unit, sets a correct exposure value, and generates an image based on the set correct exposure value; and an image processing unit that performs image synthesis processing on a plurality of image data obtained by the image pickup unit, wherein the exposure control unit generates a reference image based on a correct exposure value set from the image data and an image based on a correct exposure value set from the image data within the designated area, and the image processing unit performs synthesis processing on the plurality of images generated by the exposure control unit.
According to the present invention, it is possible to provide a photographic apparatus which uses high dynamic range imaging synthesis techniques and which can obtain a high dynamic range image that reflects the user's intention and is free from an unnatural impression.
Brief description of the drawings
Fig. 1 is a block diagram showing the main internal components of the photographic apparatus (camera) of the 1st embodiment of the present invention.
Fig. 2 is an explanatory diagram showing a situation in which photography is performed using the photographic apparatus (camera) of Fig. 1.
Fig. 3 is a diagram mainly showing the display screen of the photographic apparatus (camera) in the state of Fig. 2.
Fig. 4 shows one example of the display image obtained as a result of a touch operation in the state of Figs. 1 and 2, and is a diagram showing the display image when exposure control gives priority to the touched position.
Fig. 5 shows another example of the display image obtained as a result of a touch operation in the state of Figs. 1 and 2, and is a diagram showing the display image when exposure control of the touched position is performed while the state of the non-touched positions is maintained.
Fig. 6 (A) and Fig. 6 (B) are diagrams explaining the relation between the coordinates of the touch panel and the coordinates of the display unit in the photographic apparatus (camera) of Fig. 1.
Fig. 7 (A), Fig. 7 (B), Fig. 7 (C) and Fig. 7 (D) are schematic diagrams for explaining the high dynamic range (HDR) image synthesis processing performed in the photographic apparatus (camera) of Fig. 1.
Fig. 8 is a flow chart showing the processing sequence of the camera control of the photographic apparatus (camera) of Fig. 1.
Fig. 9 is a flow chart showing the subroutine of the addition judgement processing (step S5) performed on a touch operation in Fig. 8.
Fig. 10 is a flow chart showing the processing sequence of the camera control of the photographic apparatus (camera) of the 2nd embodiment of the present invention.
Fig. 11 is a flow chart showing the subroutine of the HDR sequence (step S45) in the processing sequence of Fig. 10.
Fig. 12 is a flow chart showing the subroutine of the area designation (step S47) in the processing sequence of Fig. 11.
Fig. 13 is a flow chart showing the details of example A (step S49) of the HDR sequence in the processing sequence of Fig. 11.
Fig. 14 is a flow chart showing the subroutine of the designated-area photometry L processing (steps S62, S76) in the processing sequence of Fig. 13.
Fig. 15 is a flow chart showing the subroutine of the designated-area photometry H processing (steps S71, S79) in the processing sequence of Fig. 13.
Fig. 16 is a flow chart showing the details of example B (step S50) of the HDR sequence in the processing sequence of Fig. 11.
Fig. 17 is a flow chart showing the details of the dynamic image recording (step S43) in the processing sequence of Fig. 10.
Fig. 18 is a flow chart showing the subroutine of the HDR photometry and exposure control processing (steps S212, S220) in the processing sequence of Fig. 17.
Fig. 19 (A) to Fig. 19 (H) are diagrams explaining the effect of the photographic apparatus (camera) of the 2nd embodiment of the present invention, and are explanatory diagrams showing the sequence when three (or more) images to be added are obtained by repeated touch operations.
Fig. 20 (A) to Fig. 20 (E) are diagrams explaining the effect of the photographic apparatus (camera) of the 2nd embodiment of the present invention, and are explanatory diagrams showing the sequence when a plurality of differently exposed image data are obtained by performing area designation only once.
Fig. 21 (A) to Fig. 21 (E) are diagrams explaining the effect of the photographic apparatus (camera) of the 2nd embodiment of the present invention, and are explanatory diagrams showing the sequence when an HDR image is obtained during dynamic image recording.
Fig. 22 is a diagram explaining the effect of the photographic apparatus (camera) of the 2nd embodiment of the present invention, and is a timing chart showing the sequence when an HDR image is obtained during dynamic image recording.
Label declaration
1 camera; 10 camera body; 11 camera control unit; 12 imaging element; 13 temporary memory; 14 exposure control unit; 15 image processing unit; 16 operation control unit; 17 operation unit; 18 camera communication unit; 20 memory interface; 21 recording medium; 22 display driver; 23 display unit; 23a display screen; 24 touch panel driver; 25 touch panel; 30 photographic lens barrel; 31 photographic lens; 32 lens holding frame; 33 aperture mechanism; 34 driver; 35 lens control unit; 36 lens communication unit; 39a, 39b communication contacts.
Detailed description of the invention
The present invention is described below by way of the illustrated embodiments.
[the 1st embodiment]
Figs. 1 to 9 show the 1st embodiment of the present invention. Fig. 1 is a block diagram showing the main internal components of the photographic apparatus (camera) of the 1st embodiment. Fig. 2 is an explanatory diagram showing a situation in which photography is performed using the photographic apparatus (camera) of the present embodiment. Fig. 3 is a diagram mainly showing the display screen of the photographic apparatus (camera) in the state of Fig. 2. Fig. 4 shows one example of the display image obtained as a result of a touch operation in the state of Figs. 1 and 2, and shows the display image when exposure control gives priority to the touched position. Fig. 5 shows another example of the display image obtained as a result of a touch operation in the state of Figs. 1 and 2, and shows the display image when exposure control of the touched position is performed while the state of the non-touched positions is maintained.
Fig. 6 is a diagram explaining the relation between the coordinates of the touch panel 25 and the coordinates of the display unit in the photographic apparatus (camera) of the present embodiment. Fig. 7 is a schematic diagram for explaining the high dynamic range (HDR) image synthesis processing performed in the photographic apparatus (camera) of the present embodiment.
Fig. 8 is a flow chart showing the processing sequence of the camera control of the photographic apparatus (camera) of the present embodiment. Fig. 9 is a flow chart showing the subroutine of the addition judgement processing (step S5) performed on a touch operation in the processing sequence of Fig. 8.
The 1st embodiment of the present invention takes as an example the following camera serving as the photographic apparatus (hereinafter simply referred to as the camera): the camera is configured so that, for example, an optical image formed by an optical lens is photoelectrically converted by a solid-state imaging element, the image signal obtained thereby is converted into digital image data representing a still image or a dynamic image, the digital image data generated in this way is recorded on a recording medium, and a still image or a dynamic image can be reproduced and displayed on a display unit in accordance with the digital image data recorded on the recording medium.
In the drawings used in the following description, each constituent element is drawn at a size that makes it recognizable in the drawing, and different reduction scales are therefore sometimes used for different constituent elements. Accordingly, the number of constituent elements, the shapes of the constituent elements, the size ratios of the constituent elements and the relative positional relations of the constituent elements shown in these drawings are not limited to the illustrated forms.
First, the main internal configuration of the camera 1 of the 1st embodiment of the present invention is described using Fig. 1.
As shown in Fig. 1, the photographic apparatus of the present embodiment, namely the camera 1, is mainly composed of a camera body 10 and a photographic lens barrel 30.
The camera body 10 is formed by housing the various components described later inside a housing, and constitutes the basic structure of the camera 1. The photographic lens barrel 30 has a photographic optical system and the like, and is provided in order to receive the light flux from the subject and optically form the subject image. The camera 1 is a so-called interchangeable-lens camera in which the photographic lens barrel 30 is detachably attached to the front surface of the camera body 10.
Inside the camera body 10 are arranged a camera control unit 11 (denoted CPU in Fig. 1) that is composed of electronic components such as a CPU and performs overall electronic control of the camera 1, and the various components that operate under the control of the camera control unit 11, namely an imaging element 12, a temporary memory 13 such as a flash memory, an exposure control unit 14, an image processing unit 15, an operation control unit 16, an operation unit 17, a camera communication unit 18, an SDRAM 19, a memory interface (I/F) 20, a recording medium 21, a display driver 22, a display unit 23 such as an LCD, a touch panel driver 24, a touch panel 25 and so on.
The imaging element 12 is an image pickup unit composed of, for example, a photoelectric conversion element such as a CCD or CMOS sensor and its drive circuits. The imaging element 12 receives the optical image formed by the photographic optical system of the photographic lens barrel 30, performs photoelectric conversion processing on it, and generates electronic image data.
The temporary memory 13 is composed of a flash memory or the like, and is an internal storage area that temporarily stores programs, data and the like when various kinds of image processing are performed.
The exposure control unit 14 is a control circuit that meters the subject light on the basis of the output signal from the imaging element 12 or a signal from a separately provided photometry unit (not shown), sets a correct exposure value from the photometry result, and, in accordance with the set correct exposure value, controls a shutter mechanism (not shown; included in the imaging element 12 or arranged separately in front of the imaging element 12), the aperture mechanism (reference numeral 33; described later), the imaging element 12 (sensitivity adjustment and the like) and so on. An image with correct exposure is thereby obtained, and the obtained images are displayed one after another on the display unit 23.
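As general background only, the following sketch shows the standard APEX-style relation by which a metered scene luminance is commonly turned into an exposure value and a shutter time. The patent does not disclose the arithmetic used by the exposure control unit 14, so the constants and function names below are illustrative assumptions.

```python
import math

def exposure_value(scene_luminance, iso=100, k=12.5):
    """EV = log2(L * S / K), the usual reflected-light meter relation."""
    return math.log2(scene_luminance * iso / k)

def shutter_time(ev, f_number):
    """Shutter time in seconds realising the given EV at aperture N: t = N^2 / 2^EV."""
    return (f_number ** 2) / (2.0 ** ev)

# Illustrative values only: a mid-toned scene of about 500 cd/m^2 at ISO 100
# gives EV close to 12, i.e. roughly 1/500 s at f/2.8.
```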
The image processing unit 15 is a circuit unit that, in addition to the high dynamic range (HDR) image synthesis processing performed in the camera 1 of the present embodiment, also performs the various kinds of image signal processing conventionally performed in ordinary cameras.
The operation unit 17 includes the various operation members and the like needed for operating the camera 1.
The operation control unit 16 is a control circuit that receives instruction signals from the operation unit 17 and transfers them to the camera control unit 11, and also transfers instructions from the camera control unit 11 for controlling the operation unit 17 and the like.
The camera communication unit 18 communicates with the photographic lens barrel 30 to transmit and receive control signals and the like. The camera body 10 and the photographic lens barrel 30 are provided with communication contacts 39a and 39b, respectively, and when the two are coupled the communication contacts 39a and 39b touch and conduct.
The SDRAM (Synchronous Dynamic Random Access Memory) 19 is a storage area for temporarily storing the various control programs and data processing programs stored in advance in a ROM (not shown) and the like, and serves as a working memory.
The memory interface 20 acts as an intermediary between the camera control unit 11 and the recording medium 21, and assists the recording of data to the recording medium 21 and the reading of data from the recording medium 21.
The recording medium 21 is a memory unit for recording the image data obtained by the camera 1; small-sized media such as a semiconductor memory card or a card-type HDD can be used, for example. The recording medium 21 is configured to be freely attachable to and detachable from the camera 1. Media configured as an internal memory fixedly arranged in the camera body 10 are also included.
The display driver 22 is a driver that, under the control of the camera control unit 11, drives and controls the display unit 23 so that images, various information and the like are displayed in a visually recognizable manner as required.
The display unit 23 uses, for example, a panel-type display device such as an LCD (Liquid Crystal Display), and is a display section for displaying images, various information and the like in a visually recognizable manner. The display unit 23 is driven and controlled by the display driver 22 under the control of the camera control unit 11.
The touch panel driver 24 is a driver that drives and controls the touch panel 25 under the control of the camera control unit 11, detects touch operations, and determines the corresponding operation input from the detection result.
The touch panel 25 is arranged so as to overlap the display surface of the display unit 23, and is a component included in the operation input unit with which the user performs touch operations, touch-and-slide operations and the like on the panel surface and thereby makes various designation inputs and the like.
The photographic lens barrel 30, on the other hand, mainly has a photographic lens 31, a lens holding frame 32, an aperture mechanism 33, a driver 34, a lens control unit 35 (denoted lens CPU in Fig. 1), a lens communication unit 36 and so on.
The photographic lens 31 is composed of a plurality of optical lenses, and receives the light flux from the subject and forms an optical image.
The lens holding frame 32 is provided in order to hold the respective optical lenses of the photographic lens 31.
The aperture mechanism 33 is provided in order to adjust the amount of the light flux transmitted through the photographic lens 31, and is composed of, for example, aperture blades and a drive motor that drives the aperture blades.
The lens communication unit 36 communicates with the camera body 10 to transmit and receive control signals and the like. As described above, the photographic lens barrel 30 is provided with the communication contact 39b corresponding to the communication contact 39a of the camera body 10, and when the two are coupled the communication contacts 39a and 39b touch and conduct.
The camera 1 of the present embodiment also has, as other functions, a face detection function, a subject tracking function and the like. The face detection function and the subject tracking function are image processing functions that are performed, under the control of the camera control unit 11, on the image data generated from the output signal of the imaging element 12. These functions are the same as the functions used in existing cameras, so their detailed explanation is omitted. The other parts of the configuration are substantially the same as those of existing cameras.
An outline of the operation of the camera 1 of the present embodiment configured as described above is briefly explained below using Figs. 2 to 5 together with Figs. 6 and 7.
First, the power supply of the camera 1 is turned on and the camera 1 is started. After this power-on operation, the camera 1 starts in a mode in which a shooting action can be performed (photography mode) and as a viewer with which the subject can be observed on the display unit 23 (live view mode).
As shown in Fig. 2, the user 100 points the camera 1 in this state towards the desired subject to be photographed and holds it. While observing the display unit 23, the user 100 can at any time perform a touch operation with a finger 101 or the like on a desired position of the panel of the touch panel 25 on the display unit 23. Fig. 3 shows this situation.
When the camera 1 has started and live view display is being performed on the display screen of the display unit 23, the images obtained by the imaging element 12 are continuously switched and displayed on the display screen at, for example, 30 fps. The displayed image at this time is an image obtained on the basis of the exposure value automatically set by the exposure control unit 14. Here, the exposure control unit 14 performs photometry over, for example, roughly the whole region of the image obtained by the imaging element 12 and sets a correct exposure value.
In this way, the correctly exposed image whose exposure value has been set automatically (referred to as the reference exposure image or reference image) is displayed successively on the display unit 23. Depending on the photographic subject, highlight clipping, loss of shadow detail and the like sometimes appear on the display screen of the display unit 23. This occurs because the exposure range (latitude) of the imaging element 12 is narrow.
For example, in the example shown in Fig. 7 (A), the reference exposure image is denoted by symbol (b). Looking at the corresponding histogram image (see Fig. 7 (B)), the portion near the middle of the picture of this reference exposure image (b) is correctly exposed, the left part of the picture is a low-luminance portion, and the right part of the picture is a high-luminance portion (the low-luminance portion and the high-luminance portion are hatched in the figure).
During this live view display, as shown in Fig. 3, the user 100 performs a touch operation on a desired position on the display screen (on the touch panel). The position touched is the point for which correct exposure is desired, for example a position where highlight clipping, loss of shadow detail or the like occurs.
In the example shown in Fig. 3, a person 200 as the main subject is in the foreground, and a high-luminance subject 201 such as the morning or evening sun in the background appears on the same screen. In such a case, if an ordinary exposure-setting action is performed in which the subject in the foreground (the region near the center of the screen) is regarded as the main subject and the exposure value is set with emphasis on it, the person 200 is correctly exposed while the high-luminance subject 201 becomes overexposed, and so-called highlight clipping occurs. In Fig. 3, the subject 201 is drawn with broken lines to show that it is overexposed. Here, ordinary exposure setting means the usual method of setting correct exposure on the basis of an exposure value obtained by the metering methods generally applied in existing photographic apparatuses, such as average metering, center-weighted metering and multi-segment metering.
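Purely as an illustration of the metering methods named above (average, center-weighted and multi-segment), the following sketch reduces a two-dimensional luminance map to a single scene luminance. The weighting curve and the 4 × 4 segmentation are assumptions made for illustration and are not taken from this patent.

```python
import numpy as np

def scene_luminance(image_y, mode="center_weighted"):
    """Reduce a 2-D luminance array to one value using an assumed metering mode."""
    h, w = image_y.shape
    if mode == "average":
        return float(image_y.mean())
    if mode == "center_weighted":
        yy, xx = np.mgrid[0:h, 0:w]
        # Weight falls off with distance from the picture center.
        dist2 = ((yy - h / 2) / h) ** 2 + ((xx - w / 2) / w) ** 2
        weights = np.exp(-8.0 * dist2)
        return float((image_y * weights).sum() / weights.sum())
    if mode == "multi_segment":
        # Split the frame into a coarse grid and average the segment means.
        segments = [seg.mean()
                    for band in np.array_split(image_y, 4, axis=0)
                    for seg in np.array_split(band, 4, axis=1)]
        return float(np.mean(segments))
    raise ValueError(mode)
```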
Thus, in the camera 1 of the present embodiment, while the current exposure state in which the person 200 is correctly exposed (the state of Fig. 3) is maintained, the image of the region that becomes overexposed (the high-luminance portion) can also be displayed in a correctly exposed state within the same screen. To this end, a touch operation is performed on the region containing the subject 201 (index mark P).
When automatic exposure setting processing is performed with emphasis on the image containing the region P on which the touch operation was performed, so that this region becomes correctly exposed, then, as shown in Fig. 4, the region P (subject 201) becomes a correctly exposed image, while the portions such as the region that was correctly exposed in the state of Fig. 3 (the person 200) become underexposed.
A corresponding state is illustrated, for example, in Fig. 7 (A), as follows. When the exposure is set so that the high-luminance portion on the right of the picture becomes correctly exposed, the central portion and the left part of the picture (the hatched portions in the figure) become underexposed (the image indicated by symbol (c)).
Conversely to the above, in the example shown in Fig. 7 (A), when the exposure is set so that the low-luminance portion on the left of the picture becomes correctly exposed, the central portion and the right part of the picture (the hatched portion in the figure) become overexposed (the image indicated by symbol (a)).
In the camera 1 of the present embodiment, the panel of the touch panel 25 is arranged so as to overlap the display surface of the display unit 23. The resolution of the display unit 23 does not necessarily match the resolution of the touch panel 25, so the coordinate system of the display unit 23 and the coordinate system of the touch panel 25 have to be brought into correspondence.
For example, as shown in Fig. 6 (A), suppose that the effective image region of the display unit 23 has a vertical axis Y and a horizontal axis X and a picture size (resolution) of iX × iY (dots), while the effective region of the touch panel 25 has a vertical axis Y and a horizontal axis X and a panel size of tX × tY.
To convert the coordinates (x, y) of an arbitrary point on the touch panel 25 into coordinates (X, Y) on the image of the display unit 23, the formulas X = iX × (x / tX) and Y = iY × (y / tY) are used. This coordinate conversion calculation is carried out, for example, in the camera control unit 11 on the coordinates designated by the operation input signal from the touch operation on the touch panel 25.
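The conversion above can be written directly as code. The sketch below simply transcribes the formulas X = iX × (x / tX) and Y = iY × (y / tY); the function name and the integer rounding are illustrative choices.

```python
def panel_to_image(x, y, panel_size, image_size):
    """Map a touch-panel coordinate (x, y) onto display-image coordinates."""
    tX, tY = panel_size
    iX, iY = image_size
    return int(iX * (x / tX)), int(iY * (y / tY))

# Example: a 640 x 480 touch panel laid over a 1024 x 768 display image.
# panel_to_image(320, 240, (640, 480), (1024, 768)) -> (512, 384)
```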
In the camera 1 of the present embodiment, after a plurality of image data whose correctly exposed portions differ have been obtained as described above (the respective image data of Fig. 3 and Fig. 4), image synthesis processing is performed on the basis of these plural image data, taking into account the image region that is correctly exposed in each image data. As shown in Fig. 5, even for an image containing subjects with a luminance difference within a single screen, every region can thereby be made correctly exposed, and image data representing one image with no overexposed highlights and no loss of shadow detail anywhere in the picture can be obtained. This series of image processing (high dynamic range imaging; hereinafter simply referred to as an HDR image) is called synthesis processing, and the mode in which this processing is performed is called the HDR mode.
That is, in the image synthesis processing of the HDR mode, from the real images (a), (b) and (c) of the example shown in Fig. 7 (A) and the corresponding histogram images (see Fig. 7 (B)), the regions that are correctly exposed in each image are cut out and synthesized. As a result, the composite image shown in Fig. 7 (D) is generated, and the histogram image corresponding to this composite image is as shown in Fig. 7 (C). In this composite image, highlight clipping, loss of shadow detail and the like are eliminated, and an image that is roughly correctly exposed over the whole picture is obtained.
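For orientation only, the sketch below shows one common way such a combination can be realised as an exposure fusion, in which each pixel is weighted towards the frame in which it is closest to a correct mid-tone exposure. The Gaussian well-exposedness weight is an assumption made for illustration and is not the synthesis algorithm claimed by the patent.

```python
import numpy as np

def fuse_exposures(frames, target=0.5, sigma=0.2):
    """Blend differently exposed 8-bit frames of equal size into one HDR-like image."""
    norm = [f.astype(np.float64) / 255.0 for f in frames]
    # Each frame contributes most where its pixels are near the target mid-tone.
    weights = [np.exp(-((f - target) ** 2) / (2 * sigma ** 2)) for f in norm]
    total = np.sum(weights, axis=0) + 1e-12
    fused = np.sum([w * f for w, f in zip(weights, norm)], axis=0) / total
    return (fused * 255.0).clip(0, 255).astype(np.uint8)
```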
The operation of the camera 1 of the present embodiment described above is explained below in terms of the processing sequence of the camera control, using the flow charts of Fig. 8 and Fig. 9.
First, in step S1 of Fig. 8, the camera control unit 11 monitors the output signal from the operation unit 17 via the operation control unit 16 and confirms whether a power-on signal has been detected. Monitoring continues until the power-on signal is detected, and when this signal is detected the processing proceeds to the next step S2.
In step S2, the camera control unit 11 performs predetermined communication processing with the lens control unit 35 of the photographic lens barrel 30 via the camera communication unit 18 and the lens communication unit 36, which are connected through the communication contacts 39a and 39b.
Then, in step S3, the camera control unit 11 confirms whether the operation mode of the camera 1 is set to the photography mode. If it is confirmed here that the photography mode has been set, the processing proceeds to the next step S4, and if it is confirmed that a mode other than the photography mode has been set, the processing proceeds to step S21.
In step S4, the camera control unit 11 controls the driver 34 via the lens control unit 35 to drive the photographic lens 31, the aperture mechanism 33 and so on, controls the imaging element 12, the display unit 23 and so on to perform imaging processing and temporary recording processing, and performs display processing of the same image data on the display unit 23. Live view display is thereby performed on the display unit 23. The result of the above temporary recording processing (reference exposure image data) is temporarily recorded successively in the temporary memory 13 or the like.
Then, in step S5, the camera control unit 11 judges whether addition processing based on a touch operation is to be performed. The details of the processing of step S5 are described below using Fig. 9.
That is, in step S31 of Fig. 9, the camera control unit 11 monitors the input signal from the touch panel 25 via the touch panel driver 24 and confirms whether an input signal based on a touch operation has been detected. If a touch signal is detected here, the processing proceeds to the next step S32, and if no touch signal is detected, the processing proceeds to step S10 of Fig. 8.
When a touch signal has been detected in the processing of step S31 and the processing has proceeded to step S32, the camera control unit 11 confirms in this step S32 whether a face image is contained in the region designated by the touch operation. If a face image has been obtained, the processing proceeds to step S34, and if no face image has been obtained, the processing proceeds to the next step S33.
When a face image has been obtained in the processing of step S32 and the processing has proceeded to step S34, the camera control unit 11 detects the face outline of the obtained face image and monitors the image within the face outline. The processing then proceeds to step S35.
When no face image has been obtained in the processing of step S32 and the processing has proceeded to step S33, the camera control unit 11 monitors the image of a region of a predetermined range centered on the position (coordinates) designated by the touch operation (for example, a rectangular region occupying about 1/10 of the whole picture area). The processing then proceeds to step S35.
In step S35, the camera control unit 11 confirms, for the monitored region, whether the portions whose exposure is 10% or less (low-luminance portions, corresponding for example to about -3 EV) account for more than half of the monitored region. If it is confirmed here that the low-luminance portions exceed half, the processing proceeds to step S6 of Fig. 8. If the low-luminance portions are confirmed to be less than half, the processing proceeds to step S36.
In step S36, the camera control unit 11 confirms whether the portions of the monitored region at or above exposure saturation (high-luminance portions) occupy more than half of the monitored region. If it is confirmed that the high-luminance portions exceed half, the processing proceeds to step S6 of Fig. 8. If the high-luminance portions are confirmed to be less than half, the processing proceeds to step S10 of Fig. 8.
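The judgement of steps S35 and S36 can be summarised as follows. The pixel-level thresholds are assumptions, since the flow chart specifies only the 10% exposure level (about -3 EV), the saturation level and the more-than-half ratios.

```python
import numpy as np

def needs_additional_exposure(monitor_region, low_fraction=0.10, saturation=255):
    """Return True when the monitored region calls for an extra, re-exposed frame."""
    region = np.asarray(monitor_region, dtype=np.float64)
    n = region.size
    low_ratio = np.count_nonzero(region <= low_fraction * saturation) / n   # step S35
    high_ratio = np.count_nonzero(region >= saturation) / n                 # step S36
    return low_ratio > 0.5 or high_ratio > 0.5
```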
As described above, when the judgement result is that addition processing is to be performed and the processing proceeds to step S6 of Fig. 8, the camera control unit 11 confirms in this step S6 whether an exposure correction image can be obtained only by electronic control on the camera body 10 side (a change of the shutter speed value or of the sensitivity value). Here, an exposure correction image means an image obtained by changing the exposure relative to the reference exposure image. To change the exposure there are, besides changing the shutter speed value or the aperture value, also means such as changing the sensitivity value electronically. Usually, the exposure can be raised for low-luminance portions by setting a higher sensitivity (gain amplification), but this may increase gain noise and degrade image quality. On the other hand, the exposure can be lowered for high-luminance portions by setting a lower sensitivity (gain reduction), but this lowers the shutter speed, with the accompanying risk of camera shake. Exposure changes are therefore preferably realized mainly by changing the shutter speed value and the aperture value. Depending on the luminance state of the subject, however, changes of the shutter speed value and the aperture value alone are sometimes not sufficient, and in such a case the change is handled by changing the sensitivity value.
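One possible reading of the step S6 decision is sketched below: the required exposure shift is first absorbed by the body-side controls (shutter speed, then sensitivity), and lens communication for an aperture change (step S7) is requested only for the remainder. The headroom parameters and the split logic are illustrative assumptions, not the patent's control algorithm.

```python
def plan_exposure_shift(ev_shift, shutter_headroom_ev, iso_headroom_ev):
    """Split a required EV shift between shutter, sensitivity and (if needed) aperture."""
    remaining = ev_shift
    by_shutter = max(min(remaining, shutter_headroom_ev), -shutter_headroom_ev)
    remaining -= by_shutter
    by_iso = max(min(remaining, iso_headroom_ev), -iso_headroom_ev)
    remaining -= by_iso
    return {
        "shutter_ev": by_shutter,
        "iso_ev": by_iso,
        "aperture_ev": remaining,                 # handled via lens communication (step S7)
        "lens_required": abs(remaining) > 1e-6,   # body-side control alone is not enough
    }
```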
When it is confirmed in the processing of step S6 that the exposure correction image can be obtained only on the camera body 10 side, the processing proceeds to step S8. When control of the aperture mechanism 33 on the photographic lens barrel 30 side is required, the processing proceeds to step S7.
In step S7, the camera control unit 11 performs lens communication processing with the lens control unit 35 via the camera communication unit 18 and the lens communication unit 36. The lens control unit 35 thereby carries out aperture control processing in which the aperture mechanism 33 is driven and controlled via the driver 34. This aperture control processing is performed under the control of the camera control unit 11 in accordance with the exposure value calculated by the exposure control unit 14.
Then, in step S8, the camera control unit 11 performs shutter speed value and sensitivity value change processing in accordance with the exposure value from the exposure control unit 14, and obtains the addition image. The addition image data obtained in this way is temporarily recorded in the temporary memory 13. Here, the addition image is an image obtained with an exposure different from that of the above-described reference exposure image, and is synonymous with the exposure correction image. The processing then proceeds to step S9.
In step S9, the image processing unit 15, under the camera control unit 11, performs processing (synthesis processing) of adding the addition image data obtained by the processing of step S8 to the reference image data temporarily recorded by the temporary recording processing of step S4. The composite image resulting from the addition obtained in this way is displayed on the display unit 23. The processing then proceeds to step S11.
When it is judged in the judgement processing of step S5 that addition processing is not to be performed and the processing proceeds to step S10, the camera control unit 11 displays, on the display unit 23, the imaging result image to which no addition (synthesis) processing has been applied, that is, the image temporarily recorded in the processing of step S4 (the reference exposure image). The processing then proceeds to step S11.
In step S11, the camera control unit 11 monitors the operation unit 17 via the operation control unit 16 and monitors the touch panel 25 via the touch panel driver 24, and confirms whether an operation for carrying out the shooting action, that is, a release operation, has been performed. Specifically, it is confirmed, for example, whether a release signal has been generated from a release member (not shown) included in the operation unit 17, or whether a release signal has come from the touch panel 25 or the like. If generation of a release signal is detected here, the processing proceeds to step S12 below. If generation of a release signal is not detected, the processing returns to the above step S1 and the subsequent processing is repeated.
In step S12, the camera control unit 11 performs photographing processing and recording processing. The photographing and recording processing performed here is recording processing of the added (synthesized) image data obtained in the processing of step S9, or of the image data whose display processing was performed in the processing of step S10, that is, the reference exposure image data temporarily recorded in step S4. The processing then returns to step S1 and the subsequent processing is repeated.
On the other hand, when it is judged in the processing of step S3 that a mode other than the photography mode has been set and the processing proceeds to step S21, the camera control unit 11 confirms in this step S21 whether the operation mode of the camera 1 is set to the reproduction mode. If it is confirmed here that the reproduction mode has been set, the processing proceeds to step S22 below. If the set operation mode is not the reproduction mode, the processing returns to step S1 and the subsequent processing is repeated.
In general, the operation modes of a camera are broadly divided into the photography mode and the reproduction mode, but there are also cases in which further modes are provided. In such a case, a further branching step can be provided from the processing of step S21 to confirm the operation mode. However, the operation of modes other than the photography mode is not directly related to the present invention, so its explanation and illustration are omitted. In the processing sequence of Fig. 8, when the currently set operation mode is neither the photography mode nor the reproduction mode, the processing simply returns to the initial step S1.
In step S22, the camera control unit 11 controls the recording medium 21 via the memory interface 20 to read image data, and controls the display unit 23 via the display driver 22 to perform predetermined image reproduction processing.
Then, in step S23, the camera control unit 11 confirms whether change processing is to be performed. Here, change processing refers to image processing in which, for example, with respect to image data obtained by a previously performed shooting action and recorded in the recording medium 21, the parameters of the recorded image data are changed to obtain a differently exposed image, and synthesis processing is then performed on this newly obtained image data and the recorded image data. This image processing is performed by the image processing unit 15 under the control of the camera control unit 11.
When an operation instruction signal indicating that change processing is to be performed is detected in the processing of step S23, the processing proceeds to step S24, and in the processing of this step S24 image change processing substantially the same as the image addition (synthesis) processing performed at the time of the shooting action described above is carried out. The processing then returns to the above step S1 and the subsequent processing is repeated.
When no operation instruction signal indicating that change processing is to be performed is detected in the processing of step S23, the processing returns to the above step S1 and the subsequent processing is repeated.
As described above, according to the 1st embodiment, while the camera 1 is performing the live view display operation in the photography mode, the image data used for live view is automatically and successively recorded temporarily as reference exposure image data, and, triggered by the user's touch operation, an image whose exposure gives priority to the region designated by the touch operation (an image with an exposure different from that of the reference exposure image) is obtained. By performing addition (synthesis) processing of the correctly exposed portions from these plural (two) image data, an HDR image that is correctly exposed over the whole picture area can be obtained easily at the time of the shooting action. Moreover, the image processing is performed on the region (the desired subject) designated by the user's touch operation, so the user's intention can be reflected. The user can also confirm the composite result (live view display) before the recording operation; if the result runs counter to the intention, the result can be deleted before recording and the operation redone. The user can therefore reliably obtain an HDR image that reflects his or her own pictorial intention and is free from an unnatural impression.
The respective image data obtained in the series of operations for obtaining the HDR image are temporarily recorded in the temporary memory 13 and at the same time displayed on the display unit 23, so the user can confirm the image processing result in advance on the display of the display unit 23. At this point the image data is only temporarily recorded, so when the user confirms the result and judges it to be unwanted, the recording processing to the recording medium 21 can be omitted. The processing required for recording, such as driving the recording medium 21, can therefore be reduced, which contributes to a more efficient processing sequence and to power saving.
[the 2nd embodiment]
Next, the photographic apparatus, namely the camera, of the 2nd embodiment of the present invention is described below using Figs. 10 to 22.
The basic configuration of the camera of the present embodiment is the same as that of the above 1st embodiment, and only its processing sequence differs somewhat.
In the camera of the above 1st embodiment, the reference exposure image is obtained by ordinary photometry and exposure-setting means, image data of a different exposure is obtained in response to an instruction given by the user's touch operation, and addition (synthesis) processing is performed on these two images to obtain the HDR image.
The camera of the present embodiment, by contrast, has the following three different operations.
(1) In addition to the reference exposure image, three (or more) images are obtained through repeated touch operations by the user, the same addition (synthesis) processing is performed, and an HDR image is obtained.
(2) In addition to the reference exposure image, the user performs area designation only once to obtain a plurality of differently exposed image data, the same addition (synthesis) processing is performed, and an HDR image is obtained.
(3) As an effective means of expression during dynamic image recording, an HDR image is obtained at appropriate moments.
Accordingly, in the description of the present embodiment the explanation of the camera configuration itself is omitted, and identical components are described with the same reference numerals while referring to the drawings used in the above 1st embodiment.
Fig. 10 is a flow chart showing the processing sequence of the camera control of the photographic apparatus (camera) of the 2nd embodiment of the present invention. Fig. 11 is a flow chart showing the subroutine of the HDR sequence (step S45) in the processing sequence of Fig. 10. Fig. 12 is a flow chart showing the subroutine of the area designation (step S47) in the processing sequence of Fig. 11.
Fig. 13 is a flow chart showing the details of example A (step S49) of the HDR sequence in the processing sequence of Fig. 11.
Fig. 14 is a flow chart showing the subroutine of the designated-area photometry L processing (steps S62, S76) in the processing sequence of Fig. 13. Fig. 15 is a flow chart showing the subroutine of the designated-area photometry H processing (steps S71, S79) in the processing sequence of Fig. 13.
Fig. 16 is a flow chart showing the details of example B (step S50) of the HDR sequence in the processing sequence of Fig. 11.
To the area designation (steps S69, S86) in the processing sequence of Fig. 13, the same processing as the subroutine of Fig. 12 can be applied. To the designated-area photometry L (step S111) in the processing sequence of Fig. 16, the same processing as the subroutine of Fig. 14 can be applied, and to the designated-area photometry H processing (step S112) in the processing sequence of Fig. 16, the same processing as the subroutine of Fig. 15 can be applied.
Fig. 17 is a flow chart showing the details of the dynamic image recording (step S43) in the processing sequence of Fig. 10. Fig. 18 is a flow chart showing the subroutine of the HDR photometry and exposure control processing (steps S212, S220) in the processing sequence of Fig. 17.
Figs. 19 to 21 are diagrams explaining the effect of the photographic apparatus (camera) of the present embodiment. Fig. 19 is an explanatory diagram showing the sequence when three (or more) images to be added are obtained by repeated touch operations. Fig. 20 is an explanatory diagram showing the sequence when area designation is performed only once and a plurality of differently exposed image data are obtained. Fig. 21 is an explanatory diagram showing the sequence when an HDR image is obtained during dynamic image recording. Fig. 22 is a timing chart showing the sequence when an HDR image is obtained during dynamic image recording.
First, suppose that the power supply of the camera 1 is in the on state and the camera 1 is in the started state.
The state of the display unit 23 at this time can be illustrated, for example, by the display screen 23a of Fig. 19 (A) or Fig. 20 (A). The live view image being captured is displayed on the display screen 23a. In addition, at a predetermined position on the display screen 23a (the lower-right corner in the figure), an "HDR control" icon 51, described later, is displayed superimposed on the live view image.
At this time, in step S41 of Fig. 10, the camera control unit 11 confirms whether the operation mode of the camera 1 is the photography mode. When it is confirmed that the photography mode has been set, the processing proceeds to step S42 below.
In step S42, the camera control unit 11 monitors the output signal from the operation unit 17 via the operation control unit 16 or the output signal from the touch panel 25 via the touch panel driver 24, and confirms whether an instruction signal for starting dynamic image recording has been generated. If a dynamic image recording start instruction signal is confirmed here, the processing proceeds to step S43 below. If no dynamic image recording start instruction signal is confirmed, the processing proceeds to step S44.
In step S43, the camera control unit 11 executes predetermined dynamic image recording. The details of this dynamic image recording are described later using Fig. 17.
The state of the display unit 23 when dynamic image recording has started here can be illustrated, for example, by the display screen 23a of Fig. 21 (A). The live view image being recorded is displayed on the display screen 23a. In addition, at predetermined positions on the display screen 23a, the later-described "HDR control" icon 51 (the lower-right corner in the figure), a later-described "● record" icon 57 (the upper-left corner in the figure) and the like are displayed superimposed on the live view image.
In step S44, on the other hand, the camera control unit 11 monitors the output signal from the touch panel 25 via the touch panel driver 24 and confirms whether a touch operation has been performed on the "HDR control" icon 51. The "HDR control" icon 51 is an operation indicator used to instruct the start of HDR processing, and is displayed, for example, as a pictograph icon on the display screen of the display unit 23. This icon is configured to cooperate with the touch panel 25 and to function as an operation instruction means. For a concrete example of the "HDR control" icon 51, see symbol 51 on the display screen 23a of Fig. 19 (A) and Fig. 20 (A).
Here, when the user has performed a touch operation on the "HDR control" icon 51 and the corresponding instruction signal has been confirmed, the processing proceeds to step S45 below. When no touch signal for the "HDR control" icon 51 has been confirmed, the processing returns to step S42 and the subsequent processing is repeated.
In step S45, the camera control unit 11 executes the predetermined HDR processing sequence. The details of this HDR processing sequence are described below using Fig. 11.
After the user has performed the touch operation on the "HDR control" icon 51, a guide display 55 such as "please touch" is shown superimposed on the live view image on the display screen 23a of the display unit 23, as shown in Fig. 19 (B) and Fig. 20 (B). The display processing of this guide display 55 can be carried out by the camera control unit 11 controlling the display unit 23 via the display driver 22.
At the same time, in step S46 of Fig. 11, the camera control unit 11 performs information initialization processing. This information initialization processing initializes the various setting values and the like related to the HDR processing.
Then,, in step S47, camera control part 11 execution areas are specified. The details of this Region specificationAs shown in figure 12.
That is, in the step S51 of Figure 12, camera control part 11 monitor via touch panel driver 24 fromThe output signal of touch panel 25, is confirmed whether to have produced based on touching the index signal of operation and producing thisMore than being confirmed whether to have continued the scheduled time in the situation of signal. Repeat to touching index signal in the situation that unconfirmedSame treatment is until confirm this signal. And in the situation that touch index signal being detected, if this touch instruction is newNumber be the touch index signal of the not enough scheduled time, the processing that enters step S52 below. And confirm lastingThe processing that enters step S54 in the situation of the touch index signal more than scheduled time.
That wherein, user touches operation is live view image (the benchmark exposure figure for obtaining and showPicture) different exposure view data position, for example produce that high light overflows (bright portion region) or dark portion disappearance is (darkPortion region) etc. position. For example, in the example shown in Figure 19 (C), Figure 20 (C), in display frame 23aSymbol 201 shown in subject be bright portion region.
In this case, for example the subject of symbol 201 has been carried out to the short time and touched when operation, Figure 12'sIn step S52, camera control part 11 is carried out FX and is specified. It is to touch with the short time that this FX is specifiedIn touching coordinate position on the display frame 23a that coordinate on the panel of the indicated touch panel 25 of operation is corresponding and beingThe heart, specifies the processing of the image-region (FX) of predefined preset range. Specify institute by this FXThe region of specifying is for example the rectangular-shaped region shown in the symbol 61 of Figure 19 (C). After this enter step S53'sProcess.
In step S53, camera control part 11 is carried out and is specified specified to the FX by above-mentioned steps S52The frame Graphics Processing in region. This frame Graphics Processing is for example to show the rectangle shown in Figure 19 (C), Figure 20 (C)The frame of shape shows 61 processing. After this revert to processing sequence originally, the processing that enters the step S48 of Figure 11 (is returnedReturn).
On the other hand, for example when the touch operation on the subject indicated by reference numeral 201 has continued for the predetermined time or more, the processing proceeds to step S54 of Figure 12.
In step S54, the camera control unit 11 likewise monitors the output signal from the touch panel 25 via the touch panel driver 24 and confirms whether a slide instruction signal based on a touch operation has been generated. When no slide instruction signal is confirmed, the processing returns to step S51 and the subsequent processing is repeated. When a slide instruction signal is confirmed, the processing proceeds to step S55 described below.
In step S55, the camera control unit 11 performs specified-region determination processing. This specified-region determination processing specifies an image region (specified region) of a substantially rectangular predetermined range whose diagonal is the line connecting the coordinate position on the display screen 23a corresponding to the coordinate on the touch panel 25 indicated by the touch operation lasting the predetermined time or more and the coordinate position on the display screen 23a corresponding to the coordinate on the touch panel 25 at the end point of the slide operation performed while that touch operation is maintained. In the example shown in Figure 20(C), this is the rectangular region indicated by reference numeral 63. Thereafter the processing proceeds to step S53, where the frame display processing that displays the frame display 61 is likewise performed. The processing then returns to the original processing sequence and proceeds to step S48 of Figure 11 (return).
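For illustration, the specified-region determination of step S55 could be sketched as follows; the clipping behaviour and coordinate handling are assumptions, not details taken from the patent:

```python
def region_from_diagonal(start_xy, end_xy, img_w, img_h):
    """Rectangle whose diagonal joins the long-press start point and the slide
    end point, clipped to the image."""
    (x0, y0), (x1, y1) = start_xy, end_xy
    left, right = sorted((max(0, min(x0, img_w)), max(0, min(x1, img_w))))
    top, bottom = sorted((max(0, min(y0, img_h)), max(0, min(y1, img_h))))
    return left, top, right, bottom
```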
Returning to Figure 11, in step S48 the camera control unit 11 confirms, for the image of the specified region specified and determined in the processing of step S47 (the region specification subroutine of Figure 12), whether the luminance difference within the image of the specified region is small. This confirmation is performed via the exposure control unit 14 on the basis of the reference exposure image data. When it is confirmed that the luminance difference within the specified region is small, it is judged to be the case (1) described above (obtaining a plurality of images by repeated touches), and the processing proceeds to step S49. When it is confirmed that the luminance difference within the specified region is large, it is judged to be the case (2) described above (obtaining a plurality of images by a single wide-range specification), and the processing proceeds to step S50.
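The luminance-difference check of step S48 could, for example, look at the spread of scene luminance inside the specified region on the reference exposure image. The 1 EV threshold and the percentile trimming below are illustrative assumptions only:

```python
import numpy as np

def luminance_range_is_small(luma: np.ndarray, region, threshold_ev=1.0):
    """luma is a 2-D array of scene luminance taken from the reference exposure
    image. Returns True (small range -> sequence A) or False (-> sequence B)."""
    left, top, right, bottom = region
    patch = luma[top:bottom, left:right].astype(np.float64) + 1e-6
    lo, hi = np.percentile(patch, [5, 95])   # ignore isolated outliers
    range_ev = np.log2(hi / lo)              # luminance span in EV
    return range_ev < threshold_ev
```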
In step S49, the camera control unit 11 executes HDR sequence example A. Thereafter the processing returns to the original processing sequence and to step S41 of Figure 10. The details of this HDR sequence example A are described below with reference to Figure 13.
On the other hand, in step S50 the camera control unit 11 executes HDR sequence example B. Thereafter the processing returns to the original processing sequence and to step S41 of Figure 10 (return). The details of this HDR sequence example B are described below with reference to Figure 16.
The details of the above HDR sequence example A (step S49 of Figure 11) are now described using the flowchart of Figure 13.
First, in step S61 of Figure 13, the camera control unit 11 compares, via the exposure control unit 14 and on the basis of the reference exposure image data, the brightness of the image portion of the specified region specified and determined by the processing of step S47 of Figure 11 (the region specification subroutine of Figure 12) with the brightness of the image portion of the reference exposure image outside the specified region. When it is judged that "brightness of specified region < brightness of reference image", the processing proceeds to step S62. When it is not judged that "brightness of specified region < brightness of reference image", the processing proceeds to step S79.
In step S62, the camera control unit 11 performs specified-region photometry L via the exposure control unit 14. The details of the specified-region photometry L are shown in Figure 14.
The subroutine of the specified-region photometry L shown in Figure 14 is described in detail here. In step S101 of Figure 14, the exposure control unit 14 of the camera control unit 11 selects the dark portion (dark area) within the specified region using the live view (LV) image evaluation values.
Then, in step S102, the exposure control unit 14 of the camera control unit 11 temporarily records the outline data of the image of the dark portion selected in the processing of step S101.
Then, in step S103, the exposure control unit 14 of the camera control unit 11 performs photometry so that the dark portion becomes correctly exposed. Thereafter the processing returns to the original processing sequence and proceeds to step S63 of Figure 13.
On the other hand, in step S79 the camera control unit 11 performs specified-region photometry H processing via the exposure control unit 14. The details of the specified-region photometry H processing are shown in Figure 15.
The subroutine of the specified-region photometry H processing shown in Figure 15 is described in detail here. In step S106 of Figure 15, the exposure control unit 14 of the camera control unit 11 selects the bright portion (bright area) within the specified region using the live view (LV) image evaluation values.
Then, in step S107, the exposure control unit 14 of the camera control unit 11 temporarily records the outline data of the image of the bright portion selected in the processing of step S106.
Then, in step S108, the exposure control unit 14 of the camera control unit 11 performs photometry so that the bright portion becomes correctly exposed. Thereafter the processing returns to the original processing sequence and proceeds to step S63 of Figure 13 (return).
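As a rough illustration of the photometry L and photometry H subroutines, the sketch below picks the dark or bright part of the specified region and returns the EV shift that would bring it to a mid-grey target; the 18% target and the quartile split are assumptions, not values from the patent:

```python
import numpy as np

def meter_for_area(luma: np.ndarray, region, pick="dark", target=0.18):
    """Return the EV shift that makes the selected part of the region correctly
    exposed: positive brightens (photometry L), negative darkens (photometry H)."""
    left, top, right, bottom = region
    patch = luma[top:bottom, left:right].astype(np.float64) + 1e-6
    q1, q3 = np.percentile(patch, [25, 75])
    area = patch[patch <= q1] if pick == "dark" else patch[patch >= q3]
    return np.log2(target / area.mean())
```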
In step S63, the camera control unit 11 performs subject tracking processing. This subject tracking processing continuously tracks the position in the screen of the specified subject, for example the subject within the specified region, in accordance with its movement, even when that subject moves within the screen. For this subject tracking processing, a processing sequence already in general practical use in existing cameras can be used, and its description is therefore omitted.
Then, in step S64, the camera control unit 11 performs frame rate control processing. When a live view image is displayed on the display screen 23a of the display unit 23, display is normally controlled at a frame rate of, for example, 30 fps. However, when HDR processing is performed as described above, a plurality of image data sets need to be obtained, and they are preferably obtained within as short a time as possible. By controlling the frame rate, a plurality of image data sets can therefore be obtained within the unit time; for example, display is controlled at 90 fps, three times the normal 30 fps.
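The frame-rate control of step S64 amounts to multiplying the display frame rate by the number of exposures needed per display cycle; a trivial sketch, with the function name chosen here for illustration:

```python
def capture_frame_rate(display_fps=30, exposures_per_cycle=3):
    """Raise the read-out rate so that all HDR exposures fit inside one normal
    display frame period. With the values in the text, 30 fps x 3 = 90 fps."""
    return display_fps * exposures_per_cycle

# capture_frame_rate() -> 90: the reference, dark-adjusted and bright-adjusted
# exposures all fit into one 1/30 s display period.
```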
In step S65, the camera control unit 11 performs the first HDR processing and outputs (displays) its result. Here, the first HDR processing is processing that obtains image data captured with an exposure different from that of the reference exposure image (in this case, image data with the exposure adjusted for the dark portion; referred to as the first HDR image).
Through the processing so far, the display state of the display screen 23a transitions from Figure 19(C) to Figure 19(D). That is, as described above, when the user specifies the specified region by a touch operation and a frame is displayed in that specified region (see Figure 19(C)), the processing of "performing photometry to calculate an appropriate brightness", namely the first HDR processing described above, is performed on that specified region (referred to as the first specified region). The image of that result (the first HDR image) is displayed on the display screen 23a as shown in Figure 19(D). After this first HDR processing has been performed, a "return" icon 52, an "add" icon 53, an "OK" icon 54 and the like are displayed superimposed on the live view image at a predetermined position (the lower-right corner in the figure) of the display screen 23a, as shown in Figure 19(D).
Returning to Figure 13, in step S66 the camera control unit 11 monitors the output signal from the touch panel 25 via the touch panel driver 24 and confirms whether a touch operation has been performed at the position corresponding to the "return" icon 52. When a touch operation on the "return" icon 52 is confirmed, the processing proceeds to step S67.
Then, in step S67, the camera control unit 11 performs processing that cancels the first HDR processing. Thereafter the processing returns to the original processing sequence and to step S41 of Figure 10. At this time, the display of the display screen 23a returns from the state of Figure 19(D) to the state of Figure 19(A).
On the other hand, when no touch operation on the "return" icon 52 is confirmed in the processing of step S66, the processing proceeds to step S68.
Then, in step S68, the camera control unit 11 likewise monitors the output signal from the touch panel 25 via the touch panel driver 24 and confirms whether a touch operation has been performed at a position other than the position corresponding to the "return" icon 52, for example on the "add" icon 53 shown in Figure 19(D). This "add" icon 53 is an operation unit for generating an instruction signal that instructs additional HDR processing. When a touch operation on the "add" icon 53 is confirmed here, the processing proceeds to step S69. When no touch operation on the "add" icon 53 is confirmed, the processing proceeds to step S75.
When a touch operation on the "add" icon 53 is confirmed, a guidance display 55 such as "please touch" is first shown superimposed on the live view image on the display screen 23a, as shown in Figure 19(E). At this time, the live view image returns to the image based on the reference exposure image data.
Then, in step S69 of Figure 13, the camera control unit 11 executes a second region specification (see Figure 12). The specified region specified here is referred to as the second specified region.
In the second region specification, for example, the person 200 is specified in the display screen 23a shown in Figure 19(F). A frame display 62 is thereby shown in the region containing this person 200.
Then, in step S70, the exposure control unit 14 of the camera control unit 11 compares the brightness of the image portion of the second specified region with the brightness of the image portion of the reference exposure image outside the specified region. When it is judged that "brightness of specified region > brightness of reference image", the processing proceeds to step S71. When it is not judged that "brightness of specified region > brightness of reference image", the processing proceeds to step S76.
In step S71, the camera control unit 11 performs specified-region photometry H processing via the exposure control unit 14 (specifically, the subroutine of Figure 15). The processing then proceeds to step S72.
In step S76, the camera control unit 11 performs specified-region photometry L (specifically, the subroutine of Figure 14). The processing then proceeds to step S72.
In step S72, the camera control unit 11 performs the second HDR processing and outputs (displays) its result. Here, the second HDR processing is processing that obtains image data captured with an exposure different from that of the reference exposure image (in this case, image data with the exposure adjusted for the bright portion (step S71) or the dark portion (step S76); referred to as the second HDR image).
Through the processing up to this point, the display state of the display screen 23a transitions from Figure 19(F) to Figure 19(G). That is, as described above, when the user specifies the specified region by a touch operation and a frame is displayed in that specified region (see Figure 19(F)), the processing of "performing photometry to calculate an appropriate brightness", namely the second HDR processing described above, is performed on the second specified region. An image resulting from the HDR image composition processing performed by the image processing unit 15 on the image data of this result, the image data resulting from the first HDR processing, and the reference exposure image data is then displayed on the display screen 23a as shown in Figure 19(G). Further, as shown in Figure 19(G), the "return" icon 52, the "add" icon 53, the "OK" icon 54 and the like are displayed superimposed on the HDR-composited live view image at a predetermined position (the lower-right corner in the figure) of the display screen 23a.
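One possible form of the composition performed by the image processing unit on the reference image and the region-adjusted exposures (the first and second HDR images) is the exposure-weighted blend sketched below; the Gaussian "well-exposedness" weighting is an illustrative choice and is not presented as the specific method of the patent:

```python
import numpy as np

def merge_hdr(base: np.ndarray, extras: list, sigma=0.25):
    """Blend aligned float images in [0, 1], favouring well-exposed pixels."""
    frames = [base] + list(extras)
    acc = np.zeros_like(base, dtype=np.float64)
    wsum = np.zeros_like(base, dtype=np.float64)
    for f in frames:
        w = np.exp(-((f - 0.5) ** 2) / (2 * sigma ** 2))  # favour mid-tones
        acc += w * f
        wsum += w
    return acc / np.maximum(wsum, 1e-6)
```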
In step S73, the camera control unit 11 monitors the output signal from the touch panel 25 via the touch panel driver 24 and confirms whether a touch operation has been performed at the position corresponding to the "return" icon 52. When a touch operation on the "return" icon 52 is confirmed here, the processing proceeds to step S74.
Then, in step S74, the camera control unit 11 performs processing that cancels the second HDR processing. Thereafter the processing returns to step S66. At this time, the display of the display screen 23a returns from the state of Figure 19(G) to the state of Figure 19(D).
On the other hand, when a touch operation other than on the "return" icon 52 is confirmed in the processing of step S73, the processing proceeds to step S77.
In step S77, the camera control unit 11 monitors the output signal sent from the operation unit 17 via the operation control unit 16 or the output signal sent from the touch panel 25 via the touch panel driver 24 and confirms whether a release instruction signal has been generated. In the camera 1 of the present embodiment, in addition to mechanical operation members such as a release button included in the operation unit 17 and a switch member interlocked with it, the touch panel 25 is also used as an operation unit that generates the release instruction signal. The user can therefore generate a release instruction signal at any time by pressing the release operation member of the operation unit 17 or by touching a predetermined position on the touch panel 25 (for example the "OK" icon 54). When the camera control unit 11 confirms the generation of a release instruction signal, the processing proceeds to step S78 described below, in which HDR photography processing is performed. By this HDR photography processing, the result image data temporarily stored in the series of processing sequences described above is recorded on the recording medium 21 via the memory interface 20.
At this time, as shown in Figure 19(H), the image on which the HDR photography processing has been performed is displayed on the display screen 23a, and the display of the various icons is cleared. Thereafter the processing returns to the original processing sequence and to step S41 of Figure 10.
As shown in Figure 19(G), the "add" icon 53 is displayed again on the display screen 23a after the second HDR processing of step S72 has finished. At this point, a (third) HDR processing can therefore be performed again by touching the "add" icon 53. The processing sequence of the third HDR processing is identical to that of the second HDR processing described above; to avoid redundant description, the illustration and description of the case where the third HDR processing is performed are omitted.
On the other hand, when no touch operation on the "add" icon 53 is confirmed in the processing of step S68 and the processing proceeds to step S75, the camera control unit 11 in this step S75 monitors the output signal sent from the operation unit 17 via the operation control unit 16 and confirms whether a release instruction signal has been generated. When the generation of a release instruction signal is confirmed, the processing proceeds to step S78, in which HDR photography processing is performed. Thereafter the processing returns to the original processing sequence and to step S41 of Figure 10.
The details of the above HDR sequence example B (step S50 of Figure 11) are described below using Figure 16.
This HDR sequence example B is, as described above, the processing performed when the luminance difference within the specified region is large (the branch of the processing of step S48 of Figure 11). A case where the luminance difference within the specified region is large can be envisioned, for example, as the situation shown in Figure 20(C). That is, region specification is performed in the processing of step S47 of Figure 11 (see Figure 12), and a slide operation is performed in addition to the touch operation, so that a relatively wide region is specified as shown by the frame display 63 of Figure 20(C). The specified region at this time contains both a high-luminance subject 201 such as the sun and a subject 200 of relatively low luminance.
In step S111 of Figure 16, the camera control unit 11 performs specified-region photometry L (specifically, the subroutine of Figure 14). The processing then proceeds to step S112.
In step S112, the camera control unit 11 performs specified-region photometry H processing (specifically, the subroutine of Figure 15). The processing then proceeds to step S113.
In this way, in the processing of steps S111 and S112, on the basis of the image data within the specified region specified by a single region specification operation, an image based on an exposure suited to the high-luminance subject 201 within that specified region and an image based on an exposure suited to the low-luminance subject 200 within that specified region are obtained automatically in succession.
In step S113, the camera control unit 11 performs subject tracking processing within the specified region. The processing then proceeds to step S114.
In step S114, the camera control unit 11 performs HDR processing on the basis of the acquired reference exposure image data and the image data obtained as a result of the processing of steps S111 and S112. The processing then proceeds to step S115.
As a result of this HDR processing, the display shown in Figure 20(D) is shown on the display screen 23a. Here, a live view image reflecting the HDR processing result is displayed on the display screen 23a, and a "remove HDR" icon 56, the "OK" icon 54 and the like are displayed superimposed on this live view image at a predetermined position (the lower-right corner in the figure) of the display screen 23a.
In step S115, the camera control unit 11 monitors the output signal from the touch panel 25 via the touch panel driver 24 and confirms whether a touch operation has been performed at the position corresponding to the "return" icon 52. When a touch operation on the "return" icon 52 is confirmed, the processing proceeds to step S116.
Then, in step S116, the camera control unit 11 performs processing that cancels the HDR processing. Thereafter the processing returns to the original processing sequence and to step S41 of Figure 10.
On the other hand, when a touch operation other than on the "return" icon 52 is confirmed in the processing of step S115, the processing proceeds to step S117.
In step S117, the camera control unit 11 monitors the output signal sent from the operation unit 17 via the operation control unit 16 or the output signal sent from the touch panel 25 via the touch panel driver 24 and confirms whether a release instruction signal has been generated. When a release instruction signal is confirmed here, the processing proceeds to step S118 described below, in which HDR photography processing is performed. Thereafter the processing returns to the original processing sequence and to step S41 of Figure 10.
As described above, when a release instruction signal has been generated by the pressing operation of the release operation member or the touch operation on the "OK" icon 54, the photographing action is executed. The image on which the HDR photography processing has been performed is thereby displayed on the display screen 23a, the display of the various icons is cleared, and the state shown in Figure 20(E) is reached.
The details of the processing of step S43 of Figure 10 (moving image recording) are described below using Figure 17.
As described above, when an instruction signal indicating that moving image recording should be started is confirmed in the processing of step S42 of Figure 10, moving image recording is performed. A display example of the display screen 23a at this time is shown in Figure 21(A). As described above, a live view image is displayed on the display screen 23a, and the "HDR control" icon 51 (the lower-right corner in the figure), a "● record" icon 57 (the upper-left corner in the figure) and the like are displayed at predetermined positions superimposed on this live view image. The "● record" icon 57 is an icon displayed while this camera 1 is executing moving image recording.
First, in step S201 of Figure 17, the camera control unit 11 confirms whether the mode of this camera 1 is the photographing mode and is the HDR mode. When the HDR mode is not set, the processing proceeds to step S202. When the HDR mode is set, the processing proceeds to step S207.
In step S202, the camera control unit 11 monitors the output signal sent from the touch panel 25 via the touch panel driver 24 and confirms whether a touch operation has been performed at the position corresponding to the "HDR control" icon 51. When a touch operation on the "HDR control" icon 51 is confirmed here, the processing proceeds to step S206. When no touch operation on the "HDR control" icon 51 is confirmed, the processing proceeds to step S203.
In step S203, the camera control unit 11 performs normal photometry processing and, at the same time, normal exposure control processing.
Then, in step S204, the camera control unit 11 performs normal image processing and then proceeds to step S205, in which it performs normal moving image recording.
In step S222, the camera control unit 11 monitors the output signal sent from the operation unit 17 via the operation control unit 16 or the output signal sent from the touch panel 25 via the touch panel driver 24 and confirms whether an end instruction signal for the moving image recording operation has been generated. When a moving image recording end instruction signal is confirmed here, the processing returns to the original processing sequence and to step S41 of Figure 10. When no moving image recording end instruction signal is confirmed, the processing returns to step S201 and the subsequent processing is repeated.
On the other hand, when a touch operation on the "HDR control" icon 51 is confirmed in the processing of step S202 and the processing proceeds to step S206, the camera control unit 11 in this step S206 switches the mode to the HDR mode and starts the operation processing based on this mode. The processing then proceeds to step S207.
When the touch operation on the "HDR control" icon 51 has been performed, the display screen 23a changes as shown in Figure 21(B). In this state, a guidance display 55 such as "please touch" is shown superimposed on the live view image of the display screen 23a, and at the same time the display of the "HDR control" icon 51 disappears.
Returning to Figure 17, when the setting of the HDR mode is confirmed in the processing of step S201 or the HDR mode processing is started in the processing of step S206, the processing proceeds to step S207.
In step S207, the camera control unit 11 confirms whether a specified region for HDR processing has been specified. When no specified region has been specified, the processing proceeds to step S208 described below. When a specified region has been specified, the processing proceeds to step S211.
In step S208, the camera control unit 11 monitors the output signal sent from the touch panel 25 via the touch panel driver 24 and confirms whether an operation instruction signal specifying a specified region for HDR processing has been generated. When no operation instruction signal is confirmed, the processing proceeds to step S203 described above and the subsequent processing is repeated. When an operation instruction signal is confirmed, the processing proceeds to step S209 described below.
In step S209, the camera control unit 11 executes region specification (step S47 of Figure 11; specifically, the subroutine of Figure 12). The processing then proceeds to step S210.
When the region specification operation has been performed, that is, when the region specification of step S209 has been performed, a rectangular frame display 63 indicating the specified region is shown on the display screen 23a in the manner shown in Figure 21(B). In the example shown in Figure 21(B), a plurality of subjects of different brightness are contained within the frame display 63 serving as the specified region.
Then, in step S210, the camera control unit 11 performs subject tracking processing within the specified region. Thereafter the processing proceeds to step S203 and the subsequent processing is repeated.
On the other hand, when a specified region has already been specified in step S207 and the processing proceeds to step S211, the camera control unit 11 in this step S211 confirms whether recording processing based on the HDR mode is in progress. When HDR recording processing is not in progress, the processing proceeds to step S212 described below. When it is confirmed that HDR recording processing is in progress, the processing proceeds to step S218.
Here, HDR recording processing refers to recording processing that includes images on which HDR processing has been performed, that is, the state in which the moving image recording of step S205 is performed after passing through the HDR processing of step S221 described later.
In step S212, the camera control unit 11 performs HDR photometry processing and exposure control processing for the specified region. The details of this HDR photometry processing and exposure control processing are shown in Figure 18.
In step S301 of Figure 18, the camera control unit 11 performs specified-region photometry L (specifically, the subroutine of Figure 14). The processing then proceeds to step S302.
Then, in step S302, the camera control unit 11 performs specified-region photometry H processing (specifically, the subroutine of Figure 15). The processing then proceeds to step S303.
Then, in step S303, the camera control unit 11 performs dynamic range (D-range) measurement processing within the specified region.
In step S304, the camera control unit 11 performs selection and setting processing of the number of exposure frames and the shooting frame rate.
In step S305, the camera control unit 11 performs exposure calculation processing for each frame.
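Steps S303 to S305 could be sketched, under assumed sensor and bracketing parameters, as follows; the sensor range, the 2 EV spacing and the symmetric bracket are assumptions made for illustration:

```python
import math

def plan_exposures(scene_range_ev, sensor_range_ev=8.0, step_ev=2.0, display_fps=30):
    """From the dynamic range measured in the specified region, choose how many
    frames to bracket, the shooting frame rate, and the EV offset of each frame."""
    extra_ev = max(0.0, scene_range_ev - sensor_range_ev)
    n_frames = 1 + 2 * math.ceil(extra_ev / (2 * step_ev))  # symmetric bracket
    offsets = [step_ev * (i - (n_frames - 1) / 2) for i in range(n_frames)]
    return {"frames": n_frames,
            "frame_rate": display_fps * n_frames,
            "ev_offsets": offsets}

# plan_exposures(11.0) -> 3 frames at 90 fps with offsets [-2.0, 0.0, +2.0]
```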
Then, in step S306, the camera control unit 11 confirms whether a moving image recording operation is in progress. When it is confirmed that no moving image recording operation is in progress (corresponding to the sequence of step S212), the processing returns to the original processing sequence and proceeds to step S213 of Figure 17. When it is confirmed that a moving image recording operation is in progress (corresponding to the sequence of step S220), the processing proceeds to step S307.
In step S307, the camera control unit 11 sets a limit value on the basis of the HDR range information from the previous HDR image processing.
When HDR image processing is performed during moving image recording, an unnatural image would result if the display switched instantaneously from the normal image to the HDR-processed image. Therefore, when HDR image recording is performed, if the switch from the normal image to the HDR-processed image is made gradually, taking for example about 2 to 3 seconds, a natural-looking image is obtained. That is, switching can be performed so that the change of the processed image occurs gradually, for example a so-called fade-in effect in which the processing result is changed in steps of 0.1 EV. Likewise, a switch with a similar fade-out effect can be provided when HDR recording processing ends. The processing of step S307 is the setting performed for this purpose, and the limit value mentioned above refers to a value (for example an EV value) set so that the processed image changes gradually.
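The limit value of step S307 can be pictured as a per-frame cap on the applied EV change; the sketch below, with an assumed cap of 0.1 EV per recorded frame, generates such a fade-in schedule:

```python
def fade_in_ev(target_ev_shift, fps=30, step_limit_ev=0.1):
    """Advance toward the HDR exposure shift by at most step_limit_ev per frame,
    spreading the visible change over many frames instead of one."""
    ev, schedule = 0.0, []
    sign = 1.0 if target_ev_shift >= 0 else -1.0
    while abs(target_ev_shift - ev) > 1e-9:
        ev += sign * min(step_limit_ev, abs(target_ev_shift - ev))
        schedule.append(round(ev, 3))
    return schedule  # one entry per frame; len(schedule) / fps seconds in total

# fade_in_ev(3.0) gives 30 frames, about one second at 30 fps; a 2 to 3 second
# fade corresponds to a smaller per-frame step or a larger EV span.
```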
Then, in step S308, the camera control unit 11 stores this limit value as HDR range information. Thereafter the processing returns to the original processing sequence and proceeds to step S213 of Figure 17.
In step S213, the camera control unit 11 performs HDR preview image processing. The processing then proceeds to step S214.
When HDR processing is performed during moving image recording, applying the HDR processing result immediately may lead to an undesirable moving image recording. It is therefore most convenient if, when HDR processing is performed during moving image recording, the HDR processing result can be previewed and confirmed in advance before it is applied. The above HDR preview image processing is performed for this reason.
This HDR preview image processing is processing that allows the HDR processing result to be confirmed in advance on the display screen 23a before it is applied to the moving image data being recorded. For this purpose, in the camera 1 of the present embodiment, as shown for example in Figure 21(D), a sub-screen 64 for displaying the HDR processing result is shown in a partial area of the display screen 23a, superimposed on the live view image of the display screen 23a. At the same time, a "start" icon 58, the "return" icon 52 and the like are displayed at a predetermined position (the lower-right corner in the figure) of the display screen 23a.
Returning to Figure 17, in step S214 the camera control unit 11 monitors the output signal sent from the touch panel 25 via the touch panel driver 24 and confirms whether a touch operation has been performed at the position corresponding to the "start" icon 58. When a touch operation on the "start" icon 58 is confirmed here, the processing proceeds to step S215.
In step S215, the camera control unit 11 starts HDR recording processing. As shown in Figure 21(E), the display of the sub-screen 64 thereby disappears from the display screen 23a, and an "end" icon 59 is displayed in place of the "start" icon 58, the "return" icon 52 and the like. The processing then proceeds to step S216.
When no touch operation on the "start" icon 58 is confirmed in the processing of step S214, the processing proceeds directly to step S216.
In step S216, the camera control unit 11 monitors the output signal sent from the touch panel 25 via the touch panel driver 24 and confirms whether a touch operation has been performed at the position corresponding to the "return" icon 52. When no touch operation on the "return" icon 52 is confirmed here, the processing proceeds to step S205 and the subsequent processing is repeated. When a touch operation on the "return" icon 52 is confirmed, the processing proceeds to step S217.
In step S217, the camera control unit 11 performs processing that cancels the HDR preview processing. Thereafter the processing proceeds to step S205 and the subsequent processing is repeated.
By this cancellation of the HDR preview processing, the display screen 23a returns to the state of Figure 21(A).
On the other hand, when it is confirmed in step S211 that HDR recording processing is in progress and the processing proceeds to step S218, the camera control unit 11 in this step S218 monitors the output signal sent from the touch panel 25 via the touch panel driver 24 and confirms whether a touch operation has been performed at the position corresponding to the "end" icon 59. When a touch operation on the "end" icon 59 is confirmed, the processing proceeds to step S219.
In step S219, the camera control unit 11 performs processing that cancels the HDR recording processing. The processing then proceeds to step S220.
When no touch operation on the "end" icon 59 is confirmed in the processing of step S218, the processing proceeds to step S220.
In step S220, the camera control unit 11 performs HDR photometry and exposure control processing (specifically, the subroutine of Figure 18). The processing then proceeds to step S221.
In step S221, the camera control unit 11 performs HDR image processing (recording image processing). Thereafter the processing proceeds to step S205 and the subsequent processing is repeated.
The signal processing timing related to the HDR processing performed during moving image recording and during live view (LV) image display is briefly described below using the timing chart of Figure 22.
Figure 22A shows the vertical synchronization signal. During normal photographing, the frame rate is assumed to be 30 fps. In this case, HDR processing starts at the time indicated by symbol S1. That is, at the time of symbol S1, region specification is first performed, and then photometry and exposure processing related to the specified region are performed. Frame rate control is performed at the same time, and the frame rate is switched to 90 fps, three times the normal rate.
As a result, as shown in Figure 22B, images of three frames (reference A, dark portion B, bright portion C) can be obtained within the time used to expose the image of one frame (reference image A) during normal photographing.
Here, the reference image A is the reference exposure image described above. The dark portion B is the image in which the low-luminance portion described above becomes correctly exposed, and the bright portion C is the image in which the high-luminance portion described above becomes correctly exposed.
First, as shown in Figure 22D, the composite image obtained by performing composite image processing (HDR composition) using the images of the three frames (A, B, C) obtained as above is displayed in the sub-screen within the display screen 23a (the sub-screen 64 described above).
HDR recording processing then starts at the time indicated by symbol S2 of Figure 22A. That is, at the time of symbol S2, a touch operation has been performed on the "start" icon 58.
With this as a trigger, the HDR-processed images are used for the moving image data to be recorded. However, if the image were switched instantaneously at the same moment as the instruction from the "start" icon 58 or the like during moving image recording or live view display, the display would become unnatural.
Therefore, as shown in Figure 22E, when HDR recording starts at the time of symbol S2, the switch from the displayed reference image to the HDR-processed image is made gradually, taking for example about 2 to 3 seconds. At this time, a switching display with a so-called fade-in effect is provided so that the change of the processed image also appears gradually, for example by changing the processing result in steps of 0.1 EV (see HDR composition (1), HDR composition (2), and HDR composition (3) shown in Figure 22E). A similar switching display with a fade-out effect can also be provided when the HDR recording processing is ended by a touch operation on the "end" icon 59.
As described above, the second embodiment described above can obtain the same effects as the first embodiment described above. Furthermore, in the present embodiment, a plurality of means are available when the specified region is specified, so that HDR image processing can be realized more simply.
In addition, in the present embodiment, HDR processing can be inserted at a desired time even during moving image recording and stopped at a desired time, so that an effective image with a wider range of expressive means can be obtained.
Furthermore, in the case of moving image recording, measures are taken for the display mode at the start and end of HDR processing, so that a moving image that includes effectively HDR-processed images can be obtained and displayed in a more natural manner.
In addition, in the present embodiment, the camera control unit 11 performs frame rate control processing in the HDR mode so that operation is performed at a higher frame rate, which enables live view image display and moving image recording without any sense of awkwardness.
Regarding each processing sequence described in the embodiments above, changes of steps are permitted as long as the nature of the sequence is not violated. It is therefore possible, for example, to change the execution order of the processing steps, to execute a plurality of processing steps simultaneously, or to have the order of the processing steps differ each time a series of processing sequences is executed. In addition, in each of the embodiments above, a so-called interchangeable-lens camera in which the photographing lens barrel is configured to be detachable from the camera body is illustrated as an example, but the invention is not limited to this form; it can equally be applied, for example, to a camera of the type in which the photographing lens barrel is integrally fixed to the camera body.
Furthermore, the present invention is not limited to the embodiments described above, and various modifications and applications can of course be implemented within a scope that does not depart from the spirit of the invention. Moreover, the embodiments described above include inventions of various stages, and various inventions can be extracted by appropriately combining the plurality of disclosed constituent features. For example, when certain constituent features are deleted from all the constituent features shown in one embodiment and the problem the invention is intended to solve can still be solved and the effect of the invention can still be obtained, the configuration from which those constituent features have been deleted can be extracted as an invention.
The present invention is not limited to photographic equipment, that is, electronic apparatuses dedicated to the photographing function such as digital cameras; it can also be applied to other forms of electronic apparatuses provided with a photographing function, for example mobile phones, audio recorders, electronic organizers, personal computers, game machines, television sets, alarm clocks, navigation devices using GPS (Global Positioning System), and various other electronic apparatuses with a photographing function.

Claims (8)

1. Photographic equipment, characterized by comprising:
an image pickup unit that photographs a subject;
a display unit that displays an image based on image data obtained by the image pickup unit;
a touch panel that is disposed on the display screen of the display unit and specifies coordinates corresponding to an input operation;
a control unit that sets a specified region containing the coordinates in the image that correspond to the coordinates specified via the touch panel;
an exposure control unit that performs photometry of subject light on the basis of the image data obtained by the image pickup unit to set a correct exposure value and generates an image based on the set correct exposure value; and
an image processing unit that performs image composition processing on the basis of a plurality of image data sets obtained by the image pickup unit,
wherein the exposure control unit generates a reference image based on a correct exposure value set from the image data and an image based on a correct exposure value set from the image data within the specified region,
the image processing unit performs composition processing of the plurality of images generated by the exposure control unit during a moving image recording operation, and
a sub-screen is displayed within the live view display screen of the display unit during moving image recording, and, before the composition result of the image processing unit is recorded, a live view image based on the image data of the composition result of the image processing unit is displayed in the sub-screen.
2. The photographic equipment according to claim 1, characterized in that
the control unit sets a plurality of the specified regions corresponding respectively to a plurality of coordinates specified via the touch panel,
the exposure control unit generates a plurality of images based on a plurality of correct exposure values set from the respective image data within the plurality of specified regions, and
the image processing unit performs composition processing on the reference image and the plurality of images.
3. The photographic equipment according to claim 1, characterized in that
the control unit sets a specified-region frame specified by a plurality of coordinates specified via the touch panel,
the exposure control unit generates a plurality of images including an image of a correct exposure value corresponding to a low-luminance portion of the image data within the specified-region frame and an image of a correct exposure value corresponding to a high-luminance portion, and
the image processing unit performs composition processing on the reference image and the plurality of images.
4. The photographic equipment according to any one of claims 1 to 3, characterized in that, in a mode in which the image composition processing of the image processing unit is performed, the control unit controls the image pickup unit so that it operates at a high frame rate.
5. The photographic equipment according to any one of claims 1 to 3, characterized in that the image processing unit performs image composition processing that generates a high dynamic range image.
6. The photographic equipment according to any one of claims 1 to 3, characterized in that
the image processing unit performs the composition processing of the plurality of images during a live view display operation, and
the display unit performs live view display on the basis of the image data of the composition result of the image processing unit.
7. The photographic equipment according to claim 1, characterized in that, after accepting a recording instruction signal, the image processing unit performs image processing that reflects the image data of the composition result in the moving image data being recorded.
8. The photographic equipment according to claim 7, characterized in that, when reflecting the image data of the composition result in the moving image data being recorded, the image processing unit performs image processing that gradually expands the dynamic range frame by frame.
CN201210328345.0A 2011-09-08 2012-09-06 Photographic equipment Expired - Fee Related CN103002211B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011196258A JP5683418B2 (en) 2011-09-08 2011-09-08 Photographing equipment and photographing method
JP2011-196258 2011-09-08

Publications (2)

Publication Number Publication Date
CN103002211A CN103002211A (en) 2013-03-27
CN103002211B true CN103002211B (en) 2016-05-25

Family

ID=47930305

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210328345.0A Expired - Fee Related CN103002211B (en) 2011-09-08 2012-09-06 Photographic equipment

Country Status (2)

Country Link
JP (1) JP5683418B2 (en)
CN (1) CN103002211B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6152009B2 (en) * 2013-08-02 2017-06-21 キヤノン株式会社 Imaging apparatus, imaging method, program, and recording medium
WO2015081562A1 (en) * 2013-12-06 2015-06-11 华为终端有限公司 Terminal, image processing method, and image acquisition method
JP5825401B1 (en) * 2014-06-30 2015-12-02 カシオ計算機株式会社 Imaging apparatus, imaging method, and program
WO2016035423A1 (en) * 2014-09-05 2016-03-10 富士フイルム株式会社 Moving image editing device, moving image editing method, and moving image editing program
CN105812645B (en) * 2014-12-29 2019-12-24 联想(北京)有限公司 Information processing method and electronic equipment
WO2017002544A1 (en) * 2015-06-30 2017-01-05 オリンパス株式会社 Image processing apparatus and imaging system
WO2017208991A1 (en) * 2016-06-01 2017-12-07 シャープ株式会社 Image capturing and processing device, electronic instrument, image capturing and processing method, and image capturing and processing device control program
JP6914633B2 (en) 2016-09-30 2021-08-04 キヤノン株式会社 Imaging device and imaging method
EP3454547A1 (en) * 2017-09-11 2019-03-13 Canon Kabushiki Kaisha Imaging apparatus, image processing apparatus, imaging method, image processing method, and storage medium
CN108540729A (en) * 2018-03-05 2018-09-14 维沃移动通信有限公司 Image processing method and mobile terminal
CN109194855A (en) * 2018-09-20 2019-01-11 Oppo广东移动通信有限公司 Imaging method, device and electronic equipment
CN108881701B (en) * 2018-09-30 2021-04-02 华勤技术股份有限公司 Shooting method, camera, terminal device and computer readable storage medium
KR20210101941A (en) * 2020-02-11 2021-08-19 삼성전자주식회사 Electronic device and method for generating high dynamic range images

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101969534A (en) * 2009-07-27 2011-02-09 鸿富锦精密工业(深圳)有限公司 Method and system for realizing regional exposure of picture in photographic equipment
CN102129148A (en) * 2010-01-20 2011-07-20 鸿富锦精密工业(深圳)有限公司 Camera and photo shooting and processing method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5929908A (en) * 1995-02-03 1999-07-27 Canon Kabushiki Kaisha Image sensing apparatus which performs dynamic range expansion and image sensing method for dynamic range expansion
JP3639627B2 (en) * 1995-02-03 2005-04-20 キヤノン株式会社 Image synthesizer
JP2002232777A (en) * 2001-02-06 2002-08-16 Olympus Optical Co Ltd Imaging system
JP3974799B2 (en) * 2002-03-07 2007-09-12 オリンパス株式会社 Digital camera
JP5105139B2 (en) * 2006-08-03 2012-12-19 ソニー株式会社 Imaging apparatus, display method, and program
JP4306752B2 (en) * 2007-03-19 2009-08-05 ソニー株式会社 Imaging device, photometry method, luminance calculation method, program

Also Published As

Publication number Publication date
CN103002211A (en) 2013-03-27
JP2013058922A (en) 2013-03-28
JP5683418B2 (en) 2015-03-11

Legal Events

C06, PB01: Publication
C10, SE01: Entry into substantive examination
C41, TA01: Transfer of patent application right (effective date of registration: 20151224; address after and before: Tokyo, Japan; applicant after: OLYMPUS Corp.; applicant before: Olympus Imaging Corp.)
C14, GR01: Patent grant
CF01: Termination of patent right due to non-payment of annual fee (granted publication date: 20160525)