Embodiment
Modes for carrying out the present invention (hereinafter referred to as "embodiments") will be described below with reference to the accompanying drawings. The invention is not limited to the following embodiments. In the drawings, identical parts are denoted by identical reference numerals.
Fig. 1 is a diagram showing the configuration of the side of an imaging apparatus according to a first embodiment of the present invention that faces the subject (the front side). Fig. 2 is a diagram showing the configuration of the side of the imaging apparatus according to the first embodiment that faces the photographer (the rear side). Fig. 3 is a block diagram showing the functional configuration of the imaging apparatus according to the first embodiment.
The imaging apparatus 1 shown in Figs. 1 to 3 includes: a main body 2; a lens unit 3 that is freely attachable to and detachable from the main body 2, collects light from a predetermined field of view, and is capable of optical zoom; and an external device 4 that is freely attachable to and detachable from the main body 2.
First, the main body 2 will be described. The main body 2 includes a shutter 201, a shutter drive unit 202, an imaging element 203, an imaging element drive unit 204, a signal processing unit 205, an A/D conversion unit 206, an image processing unit 207, an AE processing unit 208, an AF processing unit 209, an image compression/decompression unit 210, an input unit 211, an accessory communication unit 212, an eyepiece display unit 213, an eye sensor 214, a movable unit 215, a display unit 216, a touch panel 217, a rotation determination unit 218, a state detection unit 219, a clock 220, a recording medium 221, a memory I/F 222, an SDRAM (Synchronous Dynamic Random Access Memory) 223, a flash memory 224, a main-body communication unit 225, a bus 226, and a main-body control unit 227.
The shutter 201 sets the state of the imaging element 203 to an exposure state or a light-blocking state. The shutter drive unit 202 is configured using a stepping motor, a DC motor, or the like, and drives the shutter 201 in accordance with an instruction signal input from the main-body control unit 227.
The imaging element 203 is configured using a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor) sensor, or the like that receives the light collected by the lens unit 3 and converts it into an electrical signal, and generates image data of the subject. The imaging element drive unit 204 causes the imaging element 203 to output image data (an analog signal) to the signal processing unit 205 at predetermined timing. In this sense, the imaging element drive unit 204 functions as an electronic shutter.
The signal processing unit 205 applies analog processing to the image data input from the imaging element 203 and outputs the result to the A/D conversion unit 206. For example, the signal processing unit 205 reduces reset noise in the image data, performs waveform shaping, and then applies gain so as to reach a target brightness.
The A/D conversion unit 206 performs A/D conversion on the analog image data input from the signal processing unit 205 to generate digital image data (RAW data), and outputs it to the SDRAM 223 via the bus 226.
The image processing unit 207 acquires image data (RAW data) from the SDRAM 223 via the bus 226, applies various kinds of image processing to the acquired image data, and generates processed image data. Specifically, the image processing unit 207 performs basic image processing including optical black subtraction, white balance (WB) adjustment, color matrix computation, gamma correction, color reproduction, and edge enhancement. When the imaging element 203 has a Bayer arrangement, the image processing unit 207 also performs demosaicing of the image data. The image processing unit 207 outputs the processed image data to the SDRAM 223 or the display unit 216 via the bus 226.
The detailed configuration of the image processing unit 207 will now be described. The image processing unit 207 includes a cropping unit 207a, a feature detection unit 207b, and a tracking-subject setting unit 207c.
Under the control of the main-body control unit 227, the cropping unit 207a cuts out a predetermined region of the image corresponding to the image data to generate a cropped image. Further, in accordance with an instruction signal input from the input unit 211 or a position signal input from the touch panel 217, the cropping unit 207a cuts out, from the image corresponding to the image data generated by the imaging element 203, a shooting region corresponding to a predetermined zoom magnification of the lens unit 3, and generates a cropped image. For example, the cropping unit 207a cuts out from the image data a region containing the position indicated by the position signal input from the touch panel 217 as the shooting region corresponding to the predetermined zoom magnification of the lens unit 3, and generates a cropped image. In this sense, the cropping unit 207a functions as an electronic zoom.
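The cropping operation described above can be illustrated by the following minimal sketch (hypothetical Python, not part of the embodiment; the function name, the NumPy array representation, and the border-clamping policy are assumptions):

```python
import numpy as np

def crop_for_electronic_zoom(image, center_xy, zoom):
    """Cut out the region around center_xy whose size corresponds to the
    given zoom magnification (zoom=2.0 keeps half the width and half the
    height), clamping the window so it never leaves the frame."""
    h, w = image.shape[:2]
    crop_w, crop_h = max(1, int(w / zoom)), max(1, int(h / zoom))
    cx, cy = center_xy
    # Clamp the top-left corner so the window stays inside the image.
    x0 = min(max(cx - crop_w // 2, 0), w - crop_w)
    y0 = min(max(cy - crop_h // 2, 0), h - crop_h)
    return image[y0:y0 + crop_h, x0:x0 + crop_w]
```

Upscaling the returned window back to the display resolution then yields the electronically zoomed view.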
Under the control of the main-body control unit 227, the feature detection unit 207b detects a feature quantity of the subject contained in a predetermined region of the image corresponding to the image data. Specifically, the feature detection unit 207b detects, as the feature quantity, the luminance component of the subject contained in the region set by the instruction signal input from the input unit 211 or the position signal input from the touch panel 217.
The tracking-subject setting unit 207c sets the tracking subject to be tracked based on the feature quantity detected by the feature detection unit 207b, and tracks the set tracking subject between adjacent images. The tracking-subject setting unit 207c may also track the tracking subject between adjacent images using a known technique such as pattern matching.
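Tracking by pattern matching between adjacent images can be sketched as follows (an illustrative sum-of-squared-differences search in Python; the function name and the search-window size are assumptions, not the embodiment's implementation):

```python
import numpy as np

def track_by_pattern_matching(prev_frame, box, next_frame, search=8):
    """Locate the template cut from prev_frame at box=(x, y, w, h) in
    next_frame by minimizing the sum of squared differences (SSD) over
    a small search window around the previous position."""
    x, y, w, h = box
    template = prev_frame[y:y + h, x:x + w].astype(np.float64)
    best, best_xy = None, (x, y)
    H, W = next_frame.shape[:2]
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            nx, ny = x + dx, y + dy
            if nx < 0 or ny < 0 or nx + w > W or ny + h > H:
                continue  # candidate window leaves the frame
            patch = next_frame[ny:ny + h, nx:nx + w].astype(np.float64)
            ssd = np.sum((patch - template) ** 2)
            if best is None or ssd < best:
                best, best_xy = ssd, (nx, ny)
    return best_xy + (w, h)  # updated box in the next frame
```

Running this once per frame pair keeps the box attached to the subject as it moves between adjacent images.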
The AE processing unit 208 acquires the image data recorded in the SDRAM 223 via the bus 226, and sets the exposure conditions of the imaging apparatus 1 for still-image or moving-image shooting based on the acquired image data. Specifically, the AE processing unit 208 calculates luminance from the image data and, based on the calculated luminance, determines the aperture value, shutter speed, ISO sensitivity, and the like, thereby performing automatic exposure for the imaging apparatus 1.
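The feedback from the calculated luminance to the exposure conditions can be sketched as follows (a deliberately simplified Python illustration that adjusts only the shutter time; the mid-gray target value and the function name are assumptions, not the embodiment's metering algorithm):

```python
import math

def adjust_shutter(mean_luma, current_shutter_s, target_luma=118.0):
    """One feedback step of a simple AE loop: measure the mean luminance
    of the frame (0-255 scale), compute the exposure error in stops, and
    scale the shutter time so the next frame moves toward mid-gray."""
    error_stops = math.log2(mean_luma / target_luma)  # > 0: too bright
    # One stop of error halves (or doubles) the exposure time.
    return current_shutter_s / (2.0 ** error_stops)
```

A fuller implementation would distribute the correction across aperture, shutter speed, and ISO sensitivity according to a program line.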
The AF processing unit 209 acquires the image data recorded in the SDRAM 223 via the bus 226, and adjusts the autofocus of the imaging apparatus 1 based on the acquired image data. For example, the AF processing unit 209 extracts the high-frequency component signal from the image data and performs AF (Auto Focus) computation on this signal to determine a focus evaluation value for the imaging apparatus 1, thereby adjusting the autofocus. Alternatively, the AF processing unit 209 may adjust the autofocus of the imaging apparatus 1 using a pupil-division phase-difference method.
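The focus evaluation value computed from the high-frequency component can be sketched as follows (an illustrative contrast-AF measure in Python; the specific high-pass kernel is an assumption):

```python
import numpy as np

def focus_measure(gray):
    """Contrast-AF evaluation value: sum of squared responses of a
    horizontal high-pass (1-D Laplacian [1, -2, 1]) filter.  Sharper
    focus -> stronger high-frequency content -> larger value."""
    g = gray.astype(np.float64)
    hp = g[:, :-2] - 2.0 * g[:, 1:-1] + g[:, 2:]
    return float(np.sum(hp * hp))
```

A contrast-AF loop would step the focus lens along the optical axis and keep the position that maximizes this value (hill climbing).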
The image compression/decompression unit 210 acquires image data or processed image data from the SDRAM 223 via the bus 226, compresses the acquired image data in a predetermined format, and outputs the compressed image data to the recording medium 221 via the memory I/F 222. Predetermined formats here include the JPEG (Joint Photographic Experts Group) format, the Motion JPEG format, and the MP4 (H.264) format. The image compression/decompression unit 210 also acquires image data (compressed image data) recorded in the recording medium 221 via the bus 226 and the memory I/F 222, decompresses (expands) the acquired image data, and outputs it to the SDRAM 223.
The input unit 211 includes: a power switch 211a that switches the power state of the imaging apparatus 1 to an on state or an off state; a release switch 211b that accepts input of a still-image release signal giving an instruction for still-image shooting; an operation switch 211c that switches among the various settings of the imaging apparatus 1; a menu switch 211d that causes the display unit 216 to display the various settings of the imaging apparatus 1; a moving-image switch 211e that accepts input of a moving-image release signal giving an instruction for moving-image shooting; and a playback switch 211f that causes the display unit 216 to display an image corresponding to image data recorded in the recording medium 221. The release switch 211b can be advanced and retracted by external pressing; when half-pressed, it accepts input of a first release signal instructing a shooting preparation operation, and when fully pressed, it accepts input of a second release signal instructing still-image shooting.
The accessory communication unit 212 is a communication interface for communicating with the external device 4 mounted on the main body 2.
The eyepiece display unit 213 is configured using a display panel composed of liquid crystal, organic EL (Electro Luminescence), or the like, and a driver. Under the control of the main-body control unit 227, the eyepiece display unit 213 displays, via the bus 226, an image corresponding to the image data recorded in the SDRAM 223. In this sense, the eyepiece display unit 213 functions as an electronic viewfinder (EVF).
The eye sensor 214 is configured using a contact sensor or the like. The eye sensor 214 detects whether the user has brought an eye close to the eyepiece display unit 213, and outputs the detection result to the main-body control unit 227. In other words, the eye sensor 214 detects whether the user is checking the image through the eyepiece display unit 213.
The movable unit 215 is provided with the display unit 216 and the touch panel 217, and is attached to the main body 2 in a movable manner via a hinge 215a. For example, the movable unit 215 may be attached to the main body 2 so that the display unit 216 can be tilted upward or downward relative to the vertical direction of the main body 2 (see Fig. 2).
The display unit 216 is configured using a display panel composed of liquid crystal, organic EL, or the like, and a driver. Under the control of the main-body control unit 227, the display unit 216 acquires, via the bus 226, the image data recorded in the SDRAM 223 or the image data recorded in the recording medium 221, and displays an image corresponding to the acquired image data. Image display here includes: record-view display, in which image data is displayed for a predetermined time (for example, 3 seconds) immediately after shooting; playback display, in which image data recorded in the recording medium 221 is played back; and live-view display, in which live-view images corresponding to the image data continuously generated by the imaging element 203 are displayed successively in time series. The display unit 216 also displays operation information of the imaging apparatus 1 and information related to shooting as appropriate.
The touch panel 217 is provided so as to overlap the display screen of the display unit 216. The touch panel 217 detects a touch by an external object and outputs a position signal corresponding to the detected touch position to the main-body control unit 227. The touch panel 217 may also detect a position at which the user touches based on information displayed on the display unit 216, for example an icon image or a thumbnail image, and accept, according to the detected position, input of an instruction signal instructing an operation to be performed by the imaging apparatus 1 and input of a selection signal for selecting an image. In general, touch panels include resistive, capacitive, and optical types; in this first embodiment, a touch panel of any of these types may be used. The movable unit 215, the display unit 216, and the touch panel 217 may also be formed integrally.
The rotation determination unit 218 determines the rotation state of the movable unit 215 and outputs the determination result to the main-body control unit 227. For example, the rotation determination unit 218 determines whether the movable unit 215 has been rotated relative to the main body 2, and outputs the determination result to the main-body control unit 227.
The state detection unit 219 is configured using an acceleration sensor and a gyro sensor, detects the acceleration and angular velocity occurring in the imaging apparatus 1, and outputs the detection results to the main-body control unit 227.
The clock 220 has a timekeeping function and a function of determining the shooting date and time. The clock 220 outputs date-and-time data to the main-body control unit 227 so that the date and time can be appended to the image data captured by the imaging element 203.
The recording medium 221 is configured using a memory card or the like inserted from the outside of the imaging apparatus 1, and is removably attached to the imaging apparatus 1 via the memory I/F 222. Image data processed by the image processing unit 207 or the image compression/decompression unit 210 is written to the recording medium 221, and the image data recorded in the recording medium 221 is read out by the main-body control unit 227.
The SDRAM 223 is configured using a volatile memory. The SDRAM 223 temporarily records, via the bus 226, the image data input from the A/D conversion unit 206, the image data input from the image processing unit 207, and information being processed by the imaging apparatus 1. For example, the SDRAM 223 temporarily records the image data output frame by frame by the imaging element 203 via the signal processing unit 205, the A/D conversion unit 206, and the bus 226.
The flash memory 224 is configured using a nonvolatile memory, and has a program recording unit 224a. The program recording unit 224a records the various programs for operating the imaging apparatus 1, the various data used during program execution, and the image processing parameters required for each image processing operation performed by the image processing unit 207.
The main-body communication unit 225 is a communication interface for communicating with the lens unit 3 mounted on the main body 2.
The bus 226 is configured using transmission paths or the like that connect the components of the imaging apparatus 1, and transfers the various data generated inside the imaging apparatus 1 to each component of the imaging apparatus 1.
The main-body control unit 227 is configured using a CPU (Central Processing Unit) or the like. In accordance with an instruction signal from the input unit 211 or a position signal from the touch panel 217, the main-body control unit 227 transfers instructions, data, and the like to the corresponding components of the imaging apparatus 1, and performs overall control of the operation of the imaging apparatus 1.
The detailed configuration of the main-body control unit 227 will now be described. The main-body control unit 227 includes a touch detection unit 227a, a shooting control unit 227b, and a display control unit 227c.
The touch detection unit 227a detects the touch position on the touch panel 217 based on the position signal input from the touch panel 217.
When the second release signal is input from the release switch 211b, the shooting control unit 227b starts control of the shooting operation of the imaging apparatus 1. The shooting operation of the imaging apparatus 1 here refers to the operation in which the signal processing unit 205, the A/D conversion unit 206, and the image processing unit 207 apply predetermined processing to the image data output by the imaging element 203 through the driving of the shutter drive unit 202 and the imaging element drive unit 204. The image data thus processed is compressed by the image compression/decompression unit 210 under the control of the shooting control unit 227b, and is recorded in the recording medium 221 via the bus 226 and the memory I/F 222. Further, when a position signal is input from the touch panel 217, the shooting control unit 227b controls the optical zoom of the lens unit 3 according to the zoom magnification corresponding to the cropped image generated by the cropping unit 207a. Specifically, the shooting control unit 227b controls the lens unit 3 via the main-body communication unit 225 so as to drive the optical zoom of the lens unit 3 until it reaches the zoom magnification of the cropped image generated by the cropping unit 207a.
The display control unit 227c causes the display unit 216 and/or the eyepiece display unit 213 to display an image corresponding to the image data. Specifically, when the power of the eyepiece display unit 213 is on, the display control unit 227c causes the eyepiece display unit 213 to display the live-view image corresponding to the image data; when the power of the eyepiece display unit 213 is off, it causes the display unit 216 to display the live-view image corresponding to the image data. The display control unit 227c also superimposes, on the live-view image, zoom information relating to the shooting region corresponding to the predetermined zoom magnification of the lens unit 3 and the cropped image generated by the cropping unit 207a, and causes the eyepiece display unit 213 and/or the display unit 216 to display them. The shooting region here refers to the angle of view (field of view) determined by the focal length of the lens unit 3 and the sensor size of the imaging element 203. The display control unit 227c may also superimpose an index relating to the optical axis O of the lens unit 3 on the live-view image displayed on the display unit 216.
The main body 2 configured as described above may further be provided with a voice input/output function, a flash function, a communication function capable of bidirectional communication with the outside, and the like.
Next, the lens unit 3 will be described. The lens unit 3 includes a zoom lens 301, a zoom drive unit 302, a zoom position detection unit 303, a diaphragm 304, a diaphragm drive unit 305, an aperture value detection unit 306, a focus lens 307, a focus drive unit 308, a focus position detection unit 309, a lens operating unit 310, a lens flash memory 311, a lens communication unit 312, and a lens control unit 313.
The zoom lens 301 is configured using one or more lenses. The zoom lens 301 moves along the optical axis O of the lens unit 3, thereby changing the optical zoom magnification of the imaging apparatus 1. For example, the zoom lens 301 may change its focal length in the range of 12 mm to 50 mm.
The zoom drive unit 302 is configured using a DC motor, a stepping motor, or the like, and, under the control of the lens control unit 313, moves the zoom lens 301 along the optical axis O to change the optical zoom of the imaging apparatus 1.
The zoom position detection unit 303 is configured using a photo-interrupter or the like, detects the position of the zoom lens 301 on the optical axis O, and outputs the detection result to the lens control unit 313.
The diaphragm 304 adjusts the exposure by limiting the amount of incident light collected by the zoom lens 301.
The diaphragm drive unit 305 is configured using a stepping motor or the like, and drives the diaphragm 304 under the control of the lens control unit 313, thereby changing the aperture value (F-number) of the imaging apparatus 1.
The aperture value detection unit 306 is configured using a photo-interrupter, an encoder, or the like, detects the current aperture value from the state of the diaphragm 304, and outputs the detection result to the lens control unit 313.
The focus lens 307 is configured using one or more lenses. The focus lens 307 moves along the optical axis O of the lens unit 3, thereby changing the focus position of the imaging apparatus 1.
The focus drive unit 308 is configured using a DC motor, a stepping motor, or the like, and, under the control of the lens control unit 313, moves the focus lens 307 along the optical axis O to adjust the focus position of the imaging apparatus 1.
The focus position detection unit 309 is configured using a photo-interrupter or the like, detects the position of the focus lens 307 on the optical axis O, and outputs the detection result to the lens control unit 313.
As shown in Fig. 1, the lens operating unit 310 is a ring provided around the lens barrel of the lens unit 3, and accepts input of an instruction signal instructing a change of the optical zoom of the lens unit 3 or an instruction signal instructing adjustment of the focus position of the lens unit 3. The lens operating unit 310 may also be a push switch, a lever switch, or the like.
The lens flash memory 311 records control programs for determining the positions and operation of the zoom lens 301, the diaphragm 304, and the focus lens 307, as well as the lens characteristics and various parameters of the lens unit 3. The lens characteristics here refer to the chromatic aberration, angle-of-view information, brightness information (f-number), and focal-length information (for example, 50 mm to 300 mm) of the lens unit 3.
The lens communication unit 312 is a communication interface for communicating with the main-body communication unit 225 of the main body 2 when the lens unit 3 is mounted on the main body 2.
The lens control unit 313 is configured using a CPU or the like. The lens control unit 313 controls the operation of the lens unit 3 in accordance with an instruction signal from the lens operating unit 310 or an instruction signal from the main body 2. Specifically, in accordance with the instruction signal from the lens operating unit 310, the lens control unit 313 drives the focus drive unit 308 to adjust the focus of the focus lens 307, and drives the zoom drive unit 302 to change the optical zoom magnification of the zoom lens 301. The lens control unit 313 may also transmit the lens characteristics of the lens unit 3 and identification information for identifying the lens unit 3 to the main body 2 when the lens unit 3 is mounted on the main body 2.
Next, the external device 4 will be described. The external device 4 is freely attachable to and detachable from the main body 2. The external device 4 is, for example, a flash device, a recording device capable of sound input and output, or a communication device that connects to an external network in a predetermined manner and transmits the image data recorded in the recording medium 221 to the outside. The external device 4 also has a communication unit 401. The communication unit 401 is a communication interface that communicates with the accessory communication unit 212 of the main body 2 when the external device 4 is mounted on the main body 2.
The processing executed by the imaging apparatus 1 configured as described above will now be described. Fig. 4 is a flowchart showing an overview of the processing executed by the imaging apparatus 1.
First, the case in which the imaging apparatus 1 is set to shooting mode will be described (step S101: Yes). In this case, when the lens unit 3 has been replaced with another lens unit 3 (step S102: Yes), the main-body control unit 227 performs lens communication with the replaced lens unit 3 and acquires the lens characteristics from the lens unit 3 (step S103). Thereafter, the imaging apparatus 1 proceeds to step S104. On the other hand, when the lens unit 3 has not been replaced with another lens unit 3 (step S102: No), the imaging apparatus 1 proceeds to step S104.
Next, the display control unit 227c causes the display unit 216 or the eyepiece display unit 213 to display the live-view image corresponding to the image data generated by the imaging element 203, a zoom frame indicating a candidate for the region to be enlarged in the live-view image, and a sub-image corresponding to the zoom frame that the cropping unit 207a has obtained by cutting out the corresponding region from the image data (step S104). For example, as shown in Fig. 5, when the user photographs a celestial body with the imaging apparatus 1, the display control unit 227c superimposes, on the live-view image W1 displayed on the display unit 216 or the eyepiece display unit 213, the zoom frame K1 indicating the enlargement region and the sub-image K2 that the cropping unit 207a generates by cutting out the region corresponding to the zoom frame K1 from the image data. In Fig. 5, the initial state of the zoom frame K1 indicates the shooting region corresponding to a predetermined zoom magnification of the lens unit 3, for example a magnification of 2×. The display position and size of the sub-image K2 may be changed as appropriate.
Thereafter, the touch detection unit 227a detects whether a touch has been made within the screen of the live-view image displayed on the display unit 216 (step S105). Specifically, as shown in Fig. 6, the touch detection unit 227a detects whether a position signal has been input from the touch panel 217 as a result of the user touching within the screen of the live-view image W1 displayed on the display unit 216. When the touch detection unit 227a detects a touch within the screen of the live-view image W1 displayed on the display unit 216 (step S105: Yes), the imaging apparatus 1 proceeds to step S122 described later. On the other hand, when the touch detection unit 227a does not detect a touch within the screen of the live-view image W1 via the touch panel 217 (step S105: No), the imaging apparatus 1 proceeds to step S106 described later.
In step S106, the touch detection unit 227a detects whether a pinch operation has been performed on the zoom frame K1. Specifically, as shown in Fig. 7, the touch detection unit 227a detects whether the user has touched two places within the region bounded by the zoom frame K1, that is, whether two position signals indicating different touch positions have been input from the touch panel 217. In Fig. 7, the touch detection unit 227a determines a pinch operation from two touches within the frame, but, for example, a touch on the border of the zoom frame K1 may also be detected as a pinch operation. When the touch detection unit 227a detects a pinch operation on the zoom frame K1 (step S106: Yes), the imaging apparatus 1 proceeds to step S107 described later. On the other hand, when the touch detection unit 227a does not detect a pinch operation on the zoom frame K1 (step S106: No), the imaging apparatus 1 proceeds to step S110 described later.
In step S107, when the optical zoom of the lens unit 3 has reached its limit (step S107: Yes), the display control unit 227c displays, on the live-view image W1 shown on the display unit 216, a warning indicating that the optical zoom of the lens unit 3 has reached its limit (step S108). For example, the display control unit 227c highlights the display of the zoom frame K1, thereby causing the display unit 216 to show the warning that the optical zoom of the lens unit 3 has reached its limit. Alternatively, the display control unit 227c may cause the display unit 216 to show this warning using an icon image or text. After step S108, the imaging apparatus 1 proceeds to step S110.
In step S107, when the optical zoom of the lens unit 3 has not reached its limit (step S107: No), the display control unit 227c enlarges or reduces the display region of the zoom frame K1 on the live-view image W1 in accordance with the two position signals input from the touch panel 217 (step S109). Specifically, when the two touch positions indicated by the two different position signals move apart over time (a pinch-out operation), the display control unit 227c enlarges the display region of the zoom frame K1. Conversely, when the two touch positions indicated by the two different position signals move closer together over time (a pinch-in operation), it reduces the display region of the zoom frame K1. After step S109, the imaging apparatus 1 proceeds to step S110.
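The pinch-out/pinch-in decision of steps S106 to S109 can be sketched as follows (hypothetical Python; the function name and the distance threshold are assumptions, not part of the embodiment):

```python
import math

def classify_pinch(p1_start, p2_start, p1_end, p2_end, threshold=10.0):
    """Classify a two-finger gesture from the start/end positions of the
    two touch points: 'pinch-out' when the fingers move apart over time
    (enlarge the zoom frame), 'pinch-in' when they move closer together
    (shrink it), 'none' when the change stays below the threshold."""
    d0 = math.dist(p1_start, p2_start)  # initial finger separation
    d1 = math.dist(p1_end, p2_end)      # final finger separation
    if d1 - d0 > threshold:
        return "pinch-out"
    if d0 - d1 > threshold:
        return "pinch-in"
    return "none"
```

The display region of the zoom frame would then be scaled by a factor proportional to `d1 / d0`.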
Next, the touch detection unit 227a detects whether a slide operation has been performed on the zoom frame K1 (step S110). Specifically, the touch detection unit 227a detects whether the user has moved the touch position while touching within the region bounded by the zoom frame K1, that is, whether the position signal indicating the touch position from the touch panel 217 has changed over time. When the touch detection unit 227a detects a slide operation on the zoom frame K1 (step S110: Yes), the display control unit 227c moves the zoom frame K1 on the live-view image W1 in accordance with the slide operation and displays it on the display unit 216 (step S111). After step S111, the imaging apparatus 1 proceeds to step S112. On the other hand, when the touch detection unit 227a does not detect a slide operation on the zoom frame K1 (step S110: No), the imaging apparatus 1 proceeds to step S112.
Thereafter, when a touch operation on the sub-image K2 is made (step S112: Yes) and the zoom frame K1 is located at a position containing the screen center on the live-view image W1 (step S113: Yes), the shooting control unit 227b changes the optical zoom magnification of the lens unit 3 so that the optical zoom of the lens unit 3 is adjusted to the size corresponding to the zoom frame K1 (step S114). Specifically, as shown in Fig. 8, when the user performs a touch operation on the sub-image K2 via the touch panel 217 while the zoom frame K1 is located at the screen center of the live-view image W1 at a position at least partly containing the optical axis O of the lens unit 3 (Fig. 8(a)), the shooting control unit 227b controls the lens control unit 313 via the main-body communication unit 225 and the lens communication unit 312. By driving the zoom drive unit 302, the zoom lens 301 is moved along the optical axis O, so that the optical zoom magnification of the lens unit 3 is gradually changed to the magnification corresponding to the size of the zoom frame K1 (Fig. 8(a) → Fig. 8(b)). In this way, the imaging apparatus 1 can enlarge or reduce the region desired by the user by changing the optical zoom magnification. It may also be arranged that, while the imaging apparatus 1 is performing the optical zoom, if the input of the position signal from the touch panel 217 stops, the shooting control unit 227b stops the driving of the zoom lens 301 at the current zoom magnification. It may further be arranged that, while the lens unit 3 is performing the optical zoom, the shooting control unit 227b causes the cropping unit 207a to generate cropped images successively until the lens unit 3 reaches the magnification corresponding to the size of the zoom frame K1, and the display control unit 227c causes the display unit 216 to display the cropped images generated by the cropping unit 207a. In this way, during the period until the optical zoom of the lens unit 3 reaches the magnification corresponding to the size of the zoom frame K1, the display can virtually present the view at a magnification equivalent to that of the zoom frame K1, so the region desired by the user can be enlarged or reduced apparently instantaneously.
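The idea of masking the slow optical zoom with successively generated cropped images can be sketched as follows (an illustrative Python sketch under the assumption that whatever magnification the lens has not yet supplied is made up digitally; the function names are hypothetical):

```python
def digital_crop_factor(target_zoom, optical_zoom_now):
    """While the zoom lens is still travelling toward target_zoom, crop
    digitally by the remaining factor so the displayed view already
    shows the target framing; as the optical zoom catches up, the crop
    factor falls back to 1.0 (no cropping)."""
    return max(1.0, target_zoom / optical_zoom_now)

def preview_factors(target=4.0, steps=(1.0, 2.0, 4.0)):
    """Simulated drive: the optical zoom steps from 1x toward a 4x
    target while each displayed frame is cropped by the remainder."""
    return [digital_crop_factor(target, z) for z in steps]
```

With a 4× target, the displayed crop factor shrinks from 4.0 through 2.0 to 1.0 as the lens arrives, so the on-screen framing stays constant throughout the zoom drive.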
Next, when a shooting operation has been performed by fully pressing the release switch 211b to input the second release signal (step S115: Yes), the shooting control unit 227b causes the imaging element 203 to perform still-image shooting (step S116) and records the image data generated by the imaging element 203 in the recording medium 221 (step S117). At this time, the display control unit 227c may cause the display unit 216 to perform record-view display of the image corresponding to the image data for a predetermined time (for example, 3 seconds). On the other hand, when no shooting operation is performed via the release switch 211b (step S115: No), the imaging apparatus 1 proceeds to step S118.
After this, in the situation that make the power supply of camera head 1 disconnect (step S118: be) by operating power switch 211a, camera head 1 finishes this processing.On the other hand, in the situation that do not make the power supply of camera head 1 disconnect (step S118: no) by mains switch 211a, camera head 1 returns to step S101.
When a touch operation on the sub-image K2 is detected at step S112 (step S112: Yes) but the zoom frame K1 is not located at the center of the screen in the live view image W1 (step S113: No), the imaging control unit 227b causes the trimming unit 207a to generate a trimmed image whose size corresponds to the zoom magnification of the zoom frame K1, thereby performing an electronic zoom adjusted to the size corresponding to the zoom magnification of the zoom frame K1 (step S119). Specifically, as shown in Fig. 9, when the user touches the sub-image K2 through the touch panel 217 while the zoom frame K1 is away from the center of the screen in the live view image W1 (Fig. 9(a)), the imaging control unit 227b causes the trimming unit 207a to generate a trimmed image whose size corresponds to the zoom magnification of the zoom frame K1. The display control unit 227c then displays the trimmed image W2 generated by the trimming unit 207a on the entire screen of the display unit 216 (Fig. 9(a) → Fig. 9(b)). In this way, even when the optical zoom of the lens unit 3 cannot be performed, the imaging apparatus 1 can enlarge or reduce the region desired by the user by the electronic zoom. After step S119, the imaging apparatus 1 proceeds to step S115.
When no touch operation on the zoom frame K1 is performed through the touch panel 217 at step S112 (step S112: No) and the release switch 211b is half-pressed (step S120: Yes), the imaging apparatus 1 proceeds to step S113. On the other hand, when no touch operation on the zoom frame K1 is performed through the touch panel 217 (step S112: No) and the release switch 211b is not half-pressed (step S120: No), the imaging apparatus 1 proceeds to step S121.
At step S121, the main body control unit 227 determines whether the tracking subject is located at the center of the screen on the live view image W1 displayed by the display unit 216. Specifically, the main body control unit 227 determines whether the tracking subject set by the tracking-subject setting unit 207c is located in the central region of the screen on the live view image W1 and within the region of the zoom frame K1. When the tracking subject is located at the center of the screen on the live view image W1 displayed by the display unit 216 (step S121: Yes), the imaging apparatus 1 proceeds to step S113. On the other hand, when the tracking subject is not located at the center of the screen on the live view image W1 displayed by the display unit 216 (step S121: No), the imaging apparatus 1 proceeds to step S115.
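The branching among steps S112 through S121 can be sketched as follows. This is an illustrative sketch only, not part of the embodiment; the function and argument names are hypothetical.

```python
def handle_zoom_input(touched_sub_image, frame_at_center, half_pressed,
                      tracking_at_center):
    """Sketch of the step S112-S121 branching (hypothetical names)."""
    if touched_sub_image:                     # step S112: Yes
        if frame_at_center:                   # step S113: Yes
            return "optical_zoom"             # step S114
        return "electronic_zoom"              # step S119
    if half_pressed:                          # step S120: Yes -> step S113
        return "optical_zoom" if frame_at_center else "electronic_zoom"
    if tracking_at_center:                    # step S121: Yes -> step S113
        return "optical_zoom" if frame_at_center else "electronic_zoom"
    return "wait"                             # proceed to step S115
```

Either the optical or the electronic zoom path is thus reached from three different triggers: a touch on the sub-image, a half-press of the release switch, or the tracking subject arriving at the screen center.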
At step S122, the imaging control unit 227b adjusts the focus of the lens unit 3 to the region corresponding to the touched position. Specifically, the imaging control unit 227b controls the lens control unit 313 via the main-body communication unit 225 and the lens communication unit 312 to drive the focus drive unit 308, thereby moving the focus lens 307 along the optical axis O and adjusting the focus of the lens unit 3 to the region corresponding to the touched position. After step S122, the imaging apparatus 1 proceeds to step S116.
The following describes the case where the imaging apparatus 1 is not set to the shooting mode at step S101 (step S101: No) but is set to the playback mode (step S123: Yes). In this case, the display control unit 227c causes the display unit 216 to display and play back the image data recorded in the recording medium 221 (step S124).
Next, when there is a change operation for changing the displayed image (step S125: Yes), the display control unit 227c changes the image (step S126). Thereafter, the imaging apparatus 1 returns to step S124. On the other hand, when there is no change operation for changing the image (step S125: No), the imaging apparatus 1 proceeds to step S118.
When the imaging apparatus 1 is not set to the playback mode at step S123 (step S123: No), the imaging apparatus 1 proceeds to step S118.
According to the first embodiment described above, the trimming unit 207a cuts out, from the image data generated by the image sensor 203, the shooting region corresponding to a predetermined zoom magnification of the lens unit 3 to generate a trimmed image, and the imaging control unit 227b controls the optical zoom of the lens unit 3 according to the zoom magnification corresponding to the trimmed image generated by the trimming unit 207a. As a result, the region desired by the user can be enlarged and displayed by an intuitive operation without reducing the image resolution.
Further, according to the first embodiment, the display control unit 227c superimposes the zoom frame K1 on the live view image W1 as information on the zoom area of the lens unit 3, superimposes the trimmed image generated by the trimming unit 207a on the live view image W1 as the sub-image, and displays them on the display unit 216. The user can thus change to the desired zoom magnification while checking an image at that magnification.
Further, according to the first embodiment, since the position and size of the zoom frame can be changed through the touch panel 217, the optical zoom of the lens unit 3 can be performed by an intuitive operation.
Further, according to the first embodiment, when the sub-image K2 is touched through the touch panel 217, or when an instruction signal instructing the optical zoom of the lens unit 3 is input from the input unit 211, the imaging control unit 227b causes the lens unit 3 to perform the optical zoom until the size corresponding to the zoom magnification of the zoom frame K1 is reached. The user can thus obtain the desired optical zoom by an intuitive operation.
Further, according to the first embodiment, when the zoom frame is not located on the optical axis O of the lens unit 3, the display control unit 227c displays the trimmed image K3 generated by the trimming unit 207a on the entire screen of the display unit 216. Thus, even when the optical zoom of the lens unit 3 cannot be performed, the image can be displayed virtually at the zoom magnification desired by the user by the electronic zoom.
Further, according to the first embodiment, when the zoom frame K1 is larger than the shooting region based on the optical zoom of the lens unit 3, the display control unit 227c displays the trimmed image generated by the trimming unit 207a on the entire screen of the display unit 216. Thus, even when the optical zoom of the lens unit 3 cannot be performed, the image can be displayed on the display unit 216 virtually at the zoom magnification desired by the user by the electronic zoom.
(Second Embodiment)
Next, a second embodiment of the present invention will be described. The imaging apparatus of the second embodiment has the same structure as the imaging apparatus of the first embodiment described above. The processing executed by the imaging apparatus of the second embodiment is therefore described below. The same components as those of the first embodiment are denoted by the same reference numerals.
Fig. 10 is a flowchart showing an overview of the processing executed by the imaging apparatus 1 of the second embodiment.
First, the case where the imaging apparatus 1 is set to the shooting mode (step S201: Yes) will be described. In this case, the main body control unit 227 performs lens communication with the lens control unit 313 via the main-body communication unit 225 and the lens communication unit 312 (step S202). At this time, the main body control unit 227 acquires the lens characteristics from the lens unit 3.
Next, the imaging control unit 227b drives the image sensor drive unit 204, thereby causing the image sensor 203 to perform shooting (step S203).
Thereafter, the display control unit 227c causes the display unit 216 to display the live view image corresponding to the image data generated by the image sensor 203 (step S204).
Next, when a zoom frame is displayed on the live view image displayed by the display unit 216 (step S205: Yes), the imaging apparatus 1 proceeds to step S210 described later. On the other hand, when no zoom frame is displayed on the live view image displayed by the display unit 216 (step S205: No), the imaging apparatus 1 proceeds to step S206 described later.
At step S206, the touch detection unit 227a detects whether the touch panel 217 has been touched. When the touch detection unit 227a detects a touch on the touch panel 217 (step S206: Yes), the imaging apparatus 1 proceeds to step S207 described later. On the other hand, when the touch detection unit 227a detects no touch on the touch panel 217 (step S206: No), the imaging apparatus 1 proceeds to step S210 described later.
At step S207, the display control unit 227c acquires the coordinates of the touched position (step S207), causes the display unit 216 to display the zoom frame on the live view image (step S208), and superimposes the sub-image on the live view image displayed by the display unit 216 (step S209). Specifically, as shown in Fig. 11, when a touch is performed through the touch panel 217 on the live view image W10 displayed by the display unit 216, the display control unit 227c displays the zoom frame K10 in the region whose diagonal is defined by the two touched positions. The display control unit 227c then superimposes, on the right portion of the live view image W10, the sub-image K11 generated by the trimming unit 207a cutting out the region of the zoom frame K10 from the image data. The user can thus grasp the shooting region after the zoom and the image after the zoom. After step S209, the imaging apparatus 1 proceeds to step S210.
Here, the method by which the display control unit 227c calculates the display area when superimposing the zoom frame K10 on the live view image W10 will be described. Fig. 12 is a diagram explaining this calculation method. In Fig. 12, consider a coordinate system whose origin is the lower left of the display unit 216, whose X-axis is the horizontal direction of the display unit 216, and whose Y-axis is the vertical direction of the display unit 216.
As shown in Fig. 12, where X(f) is the size of the display unit 216 in the X-axis direction, X(fmax) is the size of the zoom frame in the X-axis direction, F is the current focal length of the lens unit 3, and Fmax is the maximum focal length of the lens unit 3, the display range of the zoom frame in the X-axis direction can be calculated by the following formula (1).
X(fmax) = (X(f) × F) / Fmax … (1)
Similarly, where Y(f) is the size of the display unit 216 in the Y-axis direction, Y(fmax) is the size of the zoom frame in the Y-axis direction, F is the current focal length of the lens unit 3, and Fmax is the maximum focal length of the lens unit 3, the display range of the zoom frame in the Y-axis direction can be calculated by the following formula (2).
Y(fmax) = (Y(f) × F) / Fmax … (2)
In this way, the display control unit 227c uses formulas (1) and (2) to calculate the sizes X(fmax) and Y(fmax) of the display area of the zoom frame K10 on the live view image W10.
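Formulas (1) and (2) amount to scaling the screen dimensions by the ratio of the current focal length to the maximum focal length; the zoom frame therefore shrinks as the current focal length falls further below the maximum. The following is an illustrative sketch only, with hypothetical names, not part of the embodiment.

```python
def zoom_frame_size(display_w, display_h, focal_len, focal_len_max):
    """Display size of the zoom frame per formulas (1) and (2):
    both dimensions scale by the factor F / Fmax."""
    x_fmax = (display_w * focal_len) / focal_len_max  # formula (1)
    y_fmax = (display_h * focal_len) / focal_len_max  # formula (2)
    return x_fmax, y_fmax
```

At F = Fmax the frame fills the whole screen; at half the maximum focal length it covers half of each dimension.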
Returning to Fig. 10, the description continues from step S210. At step S210, the touch detection unit 227a detects whether a slide operation has been performed on the zoom frame K10. When the touch detection unit 227a detects a slide operation on the zoom frame K10 (step S210: Yes), the imaging apparatus 1 proceeds to step S211 described later. On the other hand, when the touch detection unit 227a detects no slide operation on the zoom frame K10 (step S210: No), the imaging apparatus 1 proceeds to step S214 described later.
At step S211, the display control unit 227c acquires the slide coordinates representing the path from the position where the slide operation started to the position where the user's finger left the touch panel 217 at the end of the slide operation (step S211), and causes the display unit 216 to display the zoom frame K10 corresponding to the slide coordinates on the live view image W10 (step S212). The display control unit 227c then superimposes the sub-image K11 on the live view image W10 displayed by the display unit 216 (step S213). For example, as shown in Fig. 13, when a slide operation that increases the zoom magnification has been performed on the zoom frame K10, the display control unit 227c reduces the display area of the zoom frame K10 according to the slide operation and displays it on the display unit 216. The display control unit 227c also superimposes, on the live view image W10 displayed by the display unit 216, the sub-image K11 generated by the trimming unit 207a cutting out the region of the zoom frame K10 from the image data. The user can thus grasp the shooting region after the zoom and the image after the zoom. After step S213, the imaging apparatus 1 proceeds to step S214.
Returning to Fig. 10, the description continues from step S214. At step S214, when the release switch 211b has been half-pressed (step S214: Yes) and the zoom frame K10 is displayed on the live view image W10 displayed by the display unit 216 (step S215: Yes), the display control unit 227c deletes the sub-image K11 from the live view image W10 (step S216).
Next, the imaging control unit 227b causes the lens unit 3 to perform the optical zoom until the size corresponding to the zoom frame K10 is reached (step S217). Specifically, the imaging control unit 227b converts the display area of the zoom frame K10 on the live view image W10 into the corresponding focal length of the lens unit 3, and drives the lens unit 3 until this converted focal length is reached.
Here, the method by which the imaging control unit 227b converts the display area of the zoom frame K10 into the corresponding focal length will be described. Fig. 14 is a diagram explaining an overview of this conversion method. In Fig. 14, the origin is the lower left of the display unit 216, and the X-axis is the horizontal direction of the display unit 216.
As shown in Fig. 14, where X(f) is the size of the display unit 216 in the X-axis direction, X(fmin) is the size of the zoom frame K10 after the slide operation in the X-axis direction, and F is the current focal length of the lens unit 3, the focal length Fmin of the lens unit 3 after the slide operation can be calculated by the following formula (3).
Fmin = (X(f) × F) / X(fmin) … (3)
In this way, the imaging control unit 227b uses formula (3) to convert the display area of the zoom frame K10 on the live view image W10 into the focal length Fmin of the lens unit 3. Then, as shown in Fig. 15, the imaging control unit 227b controls the lens control unit 313 via the main-body communication unit 225 and the lens communication unit 312, and drives the zoom drive unit 302 according to the focal length Fmin calculated with formula (3), thereby moving the zoom lens 301 along the optical axis O and changing to the zoom magnification of the optical zoom corresponding to the zoom frame K10.
After step S217, or when the zoom frame K10 is not displayed on the live view image W10 displayed by the display unit 216 (step S215: No), the imaging apparatus 1 proceeds to step S218.
Next, the imaging control unit 227b causes the AE processing unit 208 to perform AE processing and causes the AF processing unit 209 to perform AF processing (step S218). After step S218, or when the release switch 211b is not half-pressed (step S214: No), the imaging apparatus 1 proceeds to step S219.
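Formula (3) is the inverse relation of formulas (1) and (2): the smaller the zoom frame after the slide operation relative to the screen, the longer the target focal length becomes. A minimal sketch, with hypothetical names, not part of the embodiment:

```python
def target_focal_length(display_w, frame_w, focal_len):
    """Formula (3): convert the slid zoom-frame width into the
    target focal length Fmin of the lens unit."""
    return (display_w * focal_len) / frame_w
```

For example, halving the frame width relative to the screen doubles the target focal length.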
Thereafter, when a shooting operation has been performed by fully pressing the release switch 211b (step S219: Yes), the imaging control unit 227b causes the image sensor 203 to capture a still image (step S220).
Next, the imaging control unit 227b records the image data generated by the image sensor 203 in the recording medium 221 (step S221).
Thereafter, when the power of the imaging apparatus 1 is turned off by operating the power switch 211a (step S222: Yes), the imaging apparatus 1 ends this processing. On the other hand, when the power of the imaging apparatus 1 is not turned off via the power switch 211a (step S222: No), the imaging apparatus 1 returns to step S201.
When no shooting operation is performed with the release switch 211b at step S219 (step S219: No), the imaging apparatus 1 proceeds to step S222.
The following describes the case where the imaging apparatus 1 is not set to the shooting mode at step S201 (step S201: No). In this case, the imaging apparatus 1 executes image playback processing that causes the display unit 216 to display and play back the image corresponding to the image data recorded in the recording medium 221 (step S223). Thereafter, the imaging apparatus 1 proceeds to step S222.
According to the second embodiment described above, the trimming unit 207a cuts out, from the image data generated by the image sensor 203, the shooting region corresponding to a predetermined zoom magnification of the lens unit 3 to generate a trimmed image, and the imaging control unit 227b controls the optical zoom of the lens unit 3 according to the zoom magnification of the trimmed image generated by the trimming unit 207a. The user can thus enlarge and display the desired region by an intuitive operation without reducing the image resolution.
(Third Embodiment)
Next, a third embodiment of the present invention will be described. The imaging apparatus of the third embodiment has the same structure as the imaging apparatus of the first embodiment described above. The processing executed by the imaging apparatus of the third embodiment is therefore described below. The same components as those of the first embodiment are denoted by the same reference numerals.
Fig. 16 is a flowchart showing an overview of the processing executed by the imaging apparatus 1 of the third embodiment.
In Fig. 16, steps S301 to S309 correspond to steps S201 to S209 of Fig. 10 described above, respectively.
At step S310, the tracking-subject setting unit 207c starts tracking the subject as a target between adjacent images, based on the feature amount of the region including the touched position detected by the feature detection unit 207b. Specifically, as shown in Fig. 17, the tracking-subject setting unit 207c starts tracking the subject as a target between adjacent images based on the feature amount of the target T1 designated by the touch, detected by the feature detection unit 207b. After step S310, the imaging apparatus 1 proceeds to step S311.
Steps S311 to S314 correspond to steps S210 to S213 of Fig. 10, respectively.
At step S315, the main body control unit 227 determines whether the target T1 is located within the zoom frame K20. Specifically, as shown in Fig. 18, the main body control unit 227 determines whether the target T1 is located within the zoom frame K20. When the main body control unit 227 determines that the target T1 is located within the zoom frame K20 (step S315: Yes), the imaging apparatus 1 proceeds to step S316. On the other hand, when the main body control unit 227 determines that the target T1 is not located within the zoom frame K20 (step S315: No), the imaging apparatus 1 proceeds to step S320.
At step S316, the display control unit 227c deletes the sub-image K21 from the live view image W20 displayed by the display unit 216.
Next, the imaging control unit 227b causes the lens unit 3 to perform the optical zoom until the size corresponding to the current target T1 is reached (step S317).
Thereafter, when the target T1 leaves the zoom frame K20 (step S318: Yes), the imaging control unit 227b stops the optical zoom of the lens unit 3 (step S319). After step S319, the imaging apparatus 1 proceeds to step S320. On the other hand, when the target T1 does not leave the zoom frame K20 (step S318: No), the imaging apparatus 1 proceeds to step S320.
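The start-and-stop behavior of steps S315 through S319 can be sketched as a small state check: zoom while the target stays inside the zoom frame, and stop if it leaves mid-zoom. This is an illustrative sketch with hypothetical names, not part of the embodiment.

```python
def tracking_zoom_step(target_in_frame, zooming):
    """Sketch of steps S315-S319 (hypothetical names)."""
    if target_in_frame:
        return "zoom"      # steps S316-S317: delete sub-image, zoom in
    if zooming:
        return "stop"      # step S319: target left the frame mid-zoom
    return "idle"          # proceed to step S320
```

Calling this once per frame reproduces the described behavior: the zoom follows the target only as long as the user keeps the frame on it.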
Next, when the release switch 211b has been half-pressed (step S320: Yes), the imaging control unit 227b causes the AE processing unit 208 to perform AE processing and causes the AF processing unit 209 to perform AF processing (step S321). After step S321, or when the release switch 211b is not half-pressed (step S320: No), the imaging apparatus 1 proceeds to step S322.
Steps S322 to S326 correspond to steps S219 to S223 of Fig. 10, respectively.
According to the third embodiment described above, when the target T1 of the tracking subject coincides with the zoom frame K20, the imaging control unit 227b drives the optical zoom of the lens unit 3 to the size corresponding to the zoom frame K20. Thus, simply by aligning the zoom frame K20 with the target T1 of the tracking subject, the user can shoot at the desired zoom magnification merely by operating the shooting direction of the imaging apparatus 1.
Further, according to the third embodiment, when the target T1 of the tracking subject leaves the zoom frame K20, the imaging control unit 227b stops the optical zoom of the lens unit 3. This prevents a subject not desired by the user from being enlarged.
(Fourth Embodiment)
Next, a fourth embodiment of the present invention will be described. The imaging apparatus of the fourth embodiment has the same structure as the imaging apparatus of the first embodiment described above, and only the processing executed by the imaging apparatus differs. The processing executed by the imaging apparatus of the fourth embodiment is therefore described below. The same components as those of the first embodiment are denoted by the same reference numerals.
Fig. 19 is a flowchart showing an overview of the processing executed by the imaging apparatus 1 of the fourth embodiment.
Steps S401 to S411 correspond to steps S201 to S211 of Fig. 10, respectively.
At step S412, the main body control unit 227 determines whether the coordinates of the slide position are within the region of the optical zoom of the lens unit 3. When the main body control unit 227 determines that the coordinates of the slide position are within the region of the optical zoom of the lens unit 3 (step S412: Yes), the imaging apparatus 1 proceeds to step S413 described later. On the other hand, when the main body control unit 227 determines that the coordinates of the slide position are not within the region of the optical zoom of the lens unit 3 (step S412: No), the imaging apparatus 1 proceeds to step S417 described later.
At step S413, when the optical zoom of the lens unit 3 has reached its limit (step S413: Yes), the display control unit 227c superimposes the minimum electronic zoom frame corresponding to the limit value of the optical zoom on the live view image displayed by the display unit 216 (step S414). On the other hand, when the optical zoom of the lens unit 3 has not reached its limit (step S413: No), the display control unit 227c superimposes the optical zoom frame based on the optical zoom on the live view image displayed by the display unit 216 (step S415).
Next, the display control unit 227c superimposes, on the live view image W30, the sub-image K31 generated by the trimming unit 207a cutting out the region of the zoom frame K30 or the electronic zoom frame from the image data (step S416). Specifically, as shown in Fig. 21(a), the display control unit 227c displays the zoom frame K30 on the live view image W30 displayed by the display unit 216. After step S416, the imaging apparatus 1 proceeds to step S420.
At step S417, the display control unit 227c displays, on the live view image W30 displayed by the display unit 216, the minimum zoom frame corresponding to the limit value of the optical zoom of the lens unit 3.
Next, the display control unit 227c displays the electronic zoom frame based on the electronic zoom on the live view image W30 displayed by the display unit 216 (step S418), and displays the sub-image on the live view image W30 (step S419). Specifically, as shown in Fig. 21, the display control unit 227c displays the minimum zoom frame K33 corresponding to the limit value of the optical zoom of the lens unit 3 on the live view image W30 displayed by the display unit 216, and then displays the electronic zoom frame K34 on the live view image W30 (Fig. 21(a) → Fig. 21(b)). At this time, the display control unit 227c superimposes, on the live view image W30, the sub-image K31 generated by the trimming unit 207a cutting out the region corresponding to the electronic zoom frame K34 from the image data. After step S419, the imaging apparatus 1 proceeds to step S420.
Steps S420 to S423 correspond to steps S214 to S217 of Fig. 10, respectively.
At step S424, when the optical zoom of the lens unit 3 has reached its limit (step S424: Yes), the imaging control unit 227b switches from the optical zoom performed by the lens unit 3 to the electronic zoom performed by the trimming unit 207a (step S425). Thereafter, the imaging apparatus 1 proceeds to step S426. On the other hand, when the optical zoom of the lens unit 3 has not reached its limit (step S424: No), the imaging apparatus 1 proceeds to step S426.
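One way to model the handover at step S424 is to let the optical zoom cover the requested magnification up to its limit and assign the residual magnification to the electronic zoom (trimming). The sketch below is illustrative only, with hypothetical names; the embodiment itself only describes the switchover, not this particular factorization.

```python
def split_zoom(requested_mag, optical_limit_mag):
    """Cover the request with optical zoom up to its limit,
    then with electronic zoom (trimming) for the remainder."""
    optical = min(requested_mag, optical_limit_mag)
    electronic = requested_mag / optical  # residual magnification
    return optical, electronic
```

Below the optical limit the electronic factor stays at 1.0, i.e. no trimming is needed.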
Steps S426 and S427 correspond to steps S218 and S219 of Fig. 10, respectively.
At step S428, the imaging control unit 227b causes the image sensor 203 to capture a still image. In this case, when the optical zoom of the lens unit 3 has reached its limit, the imaging control unit 227b causes the trimming unit 207a to cut out the region corresponding to the electronic zoom frame K34 from the image data generated by the still-image shooting of the image sensor 203, thereby generating a trimmed image.
Steps S429 to S431 correspond to steps S221 to S223 of Fig. 10, respectively.
According to the fourth embodiment described above, the trimming unit 207a cuts out, from the image data generated by the image sensor 203, the shooting region corresponding to a predetermined zoom magnification of the lens unit 3 to generate a trimmed image, and the imaging control unit 227b controls the optical zoom of the lens unit 3 according to the zoom magnification of the trimmed image generated by the trimming unit 207a. As a result, the region desired by the user can be enlarged and displayed by an intuitive operation without reducing the image resolution.
Further, according to the fourth embodiment, when the zoom frame K31 is not located on the optical axis O of the lens unit 3, the display control unit 227c displays the trimmed image generated by the trimming unit 207a on the entire screen of the display unit 216. Thus, even when the optical zoom of the lens unit 3 cannot be performed, the image can be displayed virtually at the zoom magnification desired by the user by the electronic zoom.
Further, according to the fourth embodiment, when the zoom frame is larger than the shooting region based on the optical zoom of the lens unit 3, the display control unit 227c displays the trimmed image generated by the trimming unit 207a on the entire screen of the display unit 216. Thus, even when the optical zoom of the lens unit 3 cannot be performed, the image can be displayed on the display unit 216 virtually at the zoom magnification desired by the user by the electronic zoom.
(Fifth Embodiment)
Next, a fifth embodiment of the present invention will be described. The imaging apparatus of the fifth embodiment has the same structure as the imaging apparatus of the first embodiment described above, and further includes the structure shown in Fig. 22. Therefore, only the structure of the imaging apparatus of the fifth embodiment and the processing it executes are described below. The same components as those of the first embodiment are denoted by the same reference numerals.
The external device 4 may be not only a flash device but also an EVF (Electronic View Finder) device. In addition to the EVF communication unit 401, the EVF includes an eyepiece lens 402, a half mirror 403, a liquid crystal 404, and an EVF line-of-sight detection sensor 405. Under the control of the main body control unit 227, the EVF line-of-sight detection sensor 405 detects the user's line of sight from the motion of the user's pupil.
Alternatively, a line-of-sight detection sensor may be disposed around the touch panel 217 of the main body 2 as a rear touch-panel line-of-sight detection sensor 228.
The processing executed by the imaging apparatus 1 having the above structure is described below. Fig. 23 is a flowchart showing an overview of the processing executed by the imaging apparatus 1.
Steps S101 to S126 correspond to steps S101 to S126 of Fig. 4, respectively.
When the result of step S106 is No, the processing of the operation switch 211c at step S127 is performed. Specifically, at step S127, it is detected whether a change operation for the position or size of the zoom area of the zoom frame K1 has been performed with the operation switch 211c; when the operation switch 211c is, for example, a dial switch, this operation is turning the dial, and when the operation switch 211c is, for example, a cross-key switch, this operation is pressing the up, down, left, or right key. When an operation for changing the zoom area of the zoom frame K1 has been performed (step S127: Yes), the imaging apparatus 1 proceeds to step S107. On the other hand, when no operation for changing the zoom area of the zoom frame K1 has been performed (step S127: No), the imaging apparatus 1 proceeds to step S128.
When the result of step S127 is No, the line-of-sight-based processing of step S128 is performed. Specifically, at step S128, the rear touch-panel line-of-sight detection sensor 228 or the EVF line-of-sight detection sensor 405 in the external device such as the EVF detects whether a change operation for the position or size of the zoom area has been performed, for example by a change of the gaze position toward one of the four corner vertices of the zoom frame K1. When an operation for changing the zoom area of the zoom frame K1 has been performed (step S128: Yes), the imaging apparatus 1 proceeds to step S107. On the other hand, when no operation for changing the zoom area of the zoom frame K1 has been performed (step S128: No), the imaging apparatus 1 proceeds to step S110.
When the result of step S121 is No, the line-of-sight-based processing of step S129 is performed. Specifically, at step S129, the rear touch-panel line-of-sight detection sensor 228 or the EVF line-of-sight detection sensor 405 in the external device such as the EVF detects whether a line-of-sight-based zoom execution operation has been performed by gazing at the sub-image K2 for a predetermined time or longer. When a line-of-sight-based zoom execution operation has been performed (step S129: Yes), the imaging control unit 227b causes the lens unit 3 to perform the optical zoom until the size corresponding to the zoom magnification of the zoom frame K1 is reached, and the imaging apparatus 1 proceeds to step S113. On the other hand, when no line-of-sight-based zoom execution operation has been performed (step S129: No), the imaging apparatus 1 proceeds to step S115.
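The gaze-dwell check of step S129 can be sketched as follows: the zoom is triggered when every recent gaze sample over the required dwell time falls inside the sub-image region. This is an illustrative sketch with hypothetical names and a hypothetical sampling model, not part of the embodiment.

```python
def gaze_zoom_triggered(gaze_samples, sub_image_rect, dwell_s, rate_hz):
    """Return True when the gaze has stayed on the sub-image for at
    least dwell_s seconds, given samples taken at rate_hz."""
    x0, y0, x1, y1 = sub_image_rect
    need = int(dwell_s * rate_hz)        # samples required for the dwell
    recent = gaze_samples[-need:]
    if len(recent) < need:
        return False                     # not enough history yet
    return all(x0 <= x <= x1 and y0 <= y <= y1 for x, y in recent)
```

A single sample outside the region resets the effective dwell, matching the "gazing for a predetermined time or longer" condition.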
According to the 5th execution mode described above, the cutting portion 207a cuts out, from the view data generated by the imaging apparatus 203, the photography region corresponding to a predetermined zoom magnification of the camera lens part 3 to generate a cutting image, and the photography control part 227b controls the optical zoom of the camera lens part 3 according to the zoom ratio corresponding to the cutting image generated by the cutting portion 207a. It is thereby possible to enlarge and display a region desired by the user through an intuitive operation, without reducing the image resolution.
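The relation between the cutting image and the zoom ratio handed to the optical zoom control can be sketched as follows. This is an illustrative sketch only: the real cutting portion 207a operates on image data from the imaging apparatus 203, and the exact relation between the frame size and the commanded zoom ratio is an assumption here.

```python
def cutting_image_and_ratio(image, frame):
    """Given the full image as a list of pixel rows and the zoom frame K1 as
    (left, top, width, height), return the cutting image and the zoom ratio
    that a control part like 227b could use to drive the optical zoom.
    Assumed relation: ratio = full width / frame width, so the optical zoom
    magnifies the framed region to fill the field of view."""
    left, top, w, h = frame
    cut = [row[left:left + w] for row in image[top:top + h]]  # crop, no resampling
    ratio = len(image[0]) / w   # e.g. 2.0 when the frame is half the image width
    return cut, ratio
```

Because the cutting image is produced by cropping rather than digital upscaling, and the optical zoom then supplies the magnification, the displayed region keeps its native resolution, which matches the stated effect of this execution mode.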
In addition, according to the 5th execution mode, the position or the size of the zoom frame can be changed by the console switch 211c, by the back-side touch panel with the line-of-sight detection transducer 228, or by the line-of-sight detection transducer 405 for the EVF in an external equipment such as an EVF; therefore, the optical zoom of the camera lens part 3 can be carried out through an intuitive operation.
In addition, according to the 5th execution mode, in the case where an index signal instructing the optical zoom of the camera lens part 3 has been input from the back-side touch panel with the line-of-sight detection transducer 228, or from the line-of-sight detection transducer 405 for the EVF in an external equipment such as an EVF, the photography control part 227b makes the camera lens part 3 carry out optical zoom until the size corresponding to the zoom ratio of the zoom frame K1 is reached. The user can thereby perform the desired optical zoom through an intuitive operation.
(other execution modes)
In the above-described execution modes, the explanation has concerned a digital single-lens reflex camera having a camera lens part that is freely detachable with respect to the main part; however, for example, the main part and the camera lens part including the optical system may also be formed as one body.
In addition, in the above-described execution modes, a ring-shaped operation ring is provided on the camera lens part; however, a push switch, a lever-type selector switch, or the like may be provided instead.
In addition, in the above-described execution modes, the eyepiece display part is formed as one body with the main part; however, the eyepiece display part may also be freely detachable with respect to the main part.
In addition, the camera head of the present invention can be applied not only to compact digital cameras and digital single-lens reflex cameras, but also, for example, to electronic equipment such as digital video cameras to which accessories can be attached, mobile phones having a photographing function, and tablet-type mobile devices.
In addition, in the explanations of the flowcharts in this specification, expressions such as "first", "thereafter", and "subsequently" are used to clearly indicate the order of the processing between steps; however, the order of the processing required for implementing the present invention is not uniquely determined by these expressions. That is, the processing order in the flowcharts described in this specification can be changed within a range free of contradiction.
As described above, the present invention can include various execution modes not described here, and various design alterations and the like can be made within the scope of the technological thought determined by the claims.