CN102843512A - Photographic device, photographic method - Google Patents

Photographic device, photographic method

Info

Publication number
CN102843512A
CN102843512A (application CN2012102104699A / CN201210210469A)
Authority
CN
China
Prior art keywords
photograph
taken
camera head
mainly
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012102104699A
Other languages
Chinese (zh)
Other versions
CN102843512B (en)
Inventor
石原晴之
福谷佳之
野中修
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Imaging Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Imaging Corp
Priority to CN201610326097.4A (CN105827985B)
Publication of CN102843512A
Application granted
Publication of CN102843512B
Legal status: Expired - Fee Related
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H04N23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N5/00 Details of television systems
    • H04N5/14 Picture signal circuitry for video frequency region
    • H04N5/144 Movement detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Indication In Cameras, And Counting Of Exposures (AREA)

Abstract

The invention provides an imaging device and an imaging method that help the photographer obtain a picture with the intended composition even when the subject being photographed is moving. The imaging device comprises: an imaging unit that continuously generates image data; a moving-direction determination unit that determines the direction in which the imaging device is moving; a movement detection unit that detects movement of a subject appearing in the plurality of images represented by the image data; and a main-subject determination unit that determines the main subject based on the moving direction determined by the moving-direction determination unit and the subject movement detected by the movement detection unit.

Description

Imaging device and imaging method
Technical field
The present invention relates to an imaging device and an imaging method that generate electronic image data by photoelectrically converting light from a subject.
Background art
In recent years, imaging devices such as digital still cameras and digital video cameras have been known to photograph automatically when the subject assumes a specific expression or pose (see Patent Document 1). In this technique, the subject's face is extracted from live view images generated successively by an imaging unit, and a picture is taken when the extracted face matches a specific pattern.
[Patent Document 1] Japanese Laid-Open Patent Publication No. 2004-294498
With this technique, however, when the subject moves back and forth during shooting, the position of the subject's face on the live view image and the shooting composition change from frame to frame. The composition at the moment the face extracted from the live view image matches the specific pattern therefore also changes, and the main subject is not necessarily captured with the intended composition.
Summary of the invention
The present invention was made in view of the above. Its object is to provide an imaging device and an imaging method with which the photographer can capture an image of the main subject with the intended composition even when the main subject to be photographed is moving during shooting.
To solve the above problem and achieve this object, the imaging device according to the present invention comprises: an imaging unit that continuously generates image data of a subject; a moving-direction determination unit that determines the direction in which the imaging device is moving; a movement detection unit that detects movement of the subject within the images represented by the plurality of image data; and a main-subject determination unit that determines the subject to be the main subject based on the moving direction determined by the moving-direction determination unit and the in-image subject movement detected by the movement detection unit.
In the imaging device according to the present invention, the imaging device further comprises a photographing control unit that captures an image in which the main subject determined by the main-subject determination unit is located in a predetermined region.
In the imaging device according to the present invention, when the moving direction of a moving main-subject candidate repeatedly coincides with the moving direction determined by the moving-direction determination unit, the main-subject determination unit sets the main-subject candidate as the main subject.
In the imaging device according to the present invention, the imaging device further comprises a display unit that displays, in an identifiable manner, the images in which the subject determined to be the main subject by the main-subject determination unit is located in a predetermined portion of the frame.
In the imaging device according to the present invention, the imaging device further comprises: an input unit that accepts input of a selection signal for selecting an image from at least some of the plurality of images displayed on the display unit; and an image data storage unit that stores the image data corresponding to the image selected by the selection signal accepted by the input unit.
In the imaging device according to the present invention, the imaging device further comprises a contrast detection unit that detects contrast from the images, and the moving-direction determination unit determines the moving direction based on changes in the contrast detected by the contrast detection unit.
The imaging method according to the present invention is executed by an imaging device and comprises the following steps: an imaging step of continuously generating image data of a subject; a moving-direction determination step of determining the direction in which the imaging device is moving; a movement detection step of detecting movement of the subject within the images represented by the plurality of image data; and a main-subject determination step of determining the subject to be the main subject based on the moving direction determined in the moving-direction determination step and the in-image subject movement detected in the movement detection step.
According to the present invention, when the main-subject candidate detected by the main-subject candidate detection unit moves in the moving direction of the imaging device determined by the moving-direction determination unit, the main-subject setting unit sets the main-subject candidate as the main subject, and the image detection unit detects, from the group of images temporarily stored in the temporary storage unit, images in which the main subject set by the main-subject setting unit is located in a predetermined region. As a result, even when the photographer shoots while the main subject to be photographed is moving, images in which the main subject is captured with the intended composition can be extracted.
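As a rough illustration of this determination, the following Python sketch (hypothetical names and thresholds, not the patented implementation) promotes a subject candidate to main subject once its on-screen motion has repeatedly agreed with the direction in which the device itself is being moved:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    last_dx: float = 0.0      # horizontal on-screen displacement in the current frame
    match_count: int = 0      # frames in which candidate and device moved the same way

def direction(dx, dead_zone=0.05):
    """Quantize a horizontal displacement into -1 (left), 0 (still) or +1 (right)."""
    if dx > dead_zone:
        return 1
    if dx < -dead_zone:
        return -1
    return 0

def update_main_subject(candidates, device_dx, required_matches=3):
    """candidates: detected moving-subject candidates for the current frame.
    device_dx: horizontal displacement of the device itself (e.g. integrated
    from the accelerometer). Returns the main subject once one qualifies."""
    for cand in candidates:
        if direction(cand.last_dx) != 0 and direction(cand.last_dx) == direction(device_dx):
            cand.match_count += 1
            if cand.match_count >= required_matches:
                return cand               # repeated agreement -> treat as main subject
    return None

# Example: the flower candidate has matched the pan direction three frames in a row
flower = Candidate("flower", last_dx=0.4, match_count=2)
print(update_main_subject([flower], device_dx=0.3))
```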
Description of drawings
Fig. 1 shows the structure of the subject-facing side of the imaging device according to Embodiment 1 of the present invention.
Fig. 2 shows the structure of the photographer-facing side of the imaging device according to Embodiment 1 of the present invention.
Fig. 3 is a block diagram showing the structure of the imaging device according to Embodiment 1 of the present invention.
Fig. 4 is a flowchart outlining the processing performed by the imaging device according to Embodiment 1 of the present invention.
Fig. 5 illustrates a situation in which the photographer shoots using the imaging device.
Fig. 6 is a flowchart outlining the main-subject candidate determination processing of Fig. 4.
Fig. 7 schematically illustrates a situation in which the main-subject determination unit determines the main subject.
Fig. 8 shows an example of images corresponding to the image data generated by the imaging unit in the situation shown in Fig. 7.
Fig. 9 shows the relationship between the subject and the shooting area of the imaging device in the situation shown in Fig. 7.
Fig. 10 is a flowchart outlining the sliding-image display processing shown in Fig. 4.
Fig. 11 shows an example of images displayed on the display unit.
Fig. 12 is a block diagram showing the structure of the imaging device according to Embodiment 2 of the present invention.
Fig. 13 is a flowchart outlining the main-subject candidate determination processing performed by the imaging device according to Embodiment 2 of the present invention.
Fig. 14 schematically illustrates a situation in which the moving-direction determination unit determines the movement state of the imaging device.
Fig. 15 is a top view of Fig. 14 as seen from direction A.
Fig. 16 is a side view of Fig. 14 as seen from direction B.
Fig. 17 schematically illustrates the relationship between the speed of the imaging device and the acceleration detected by the acceleration detection unit when the photographer moves the imaging device.
Fig. 18 schematically illustrates a situation in which the moving-direction determination unit determines the movement state of the imaging device.
Fig. 19 schematically illustrates the relationship between the speed of the imaging device and the detection result of the azimuth detection unit when the photographer moves the imaging device.
Fig. 20 schematically illustrates a situation in which the moving-direction determination unit determines the movement state of the imaging device.
Fig. 21 schematically illustrates the relationship between the optical-axis direction and the vertical gravitational acceleration in the imaging device when the photographer moves the imaging device.
Fig. 22 is a block diagram showing the structure of the imaging device according to Embodiment 3 of the present invention.
Fig. 23 is a flowchart outlining the operation performed by the imaging device according to Embodiment 3 of the present invention.
Fig. 24 schematically shows a situation in which the contrast detection unit detects the contrast of the image data generated by the imaging unit.
Fig. 25 shows the relationship between the contrast of the image data detected by the contrast detection unit and the shooting distance from the imaging device to the subject in the situation shown in Fig. 24.
Fig. 26 is a flowchart outlining the main-subject candidate determination processing of Fig. 23.
Reference numerals
1, 100, 200: imaging device; 2: imaging unit; 3: acceleration detection unit; 4: timer; 5: light emitting unit; 6: operation input unit; 7: display unit; 8: touch panel; 9, 209: storage unit; 10, 210: control unit; 21: lens unit; 22: lens drive unit; 23: aperture; 24: aperture drive unit; 25: shutter; 26: shutter drive unit; 27: image sensor; 28: image sensor drive unit; 29: signal processing unit; 61: power switch; 62: release switch; 63: shooting mode changeover switch; 64: menu switch; 91: image data storage unit; 92: program storage unit; 93: temporary storage unit; 94: contrast storage unit; 101: image processing unit; 102: moving-direction determination unit; 103: main-subject candidate detection unit; 104: main-subject setting unit; 105: image detection unit; 106: information appending unit; 107: photographing control unit; 108: display control unit; 110: azimuth detection unit; 211: contrast detection unit.
Embodiment
Modes for carrying out the present invention (hereinafter referred to as "embodiments") are described below with reference to the drawings. The present invention is not limited to the embodiments described below. In the drawings, the same parts are given the same reference numerals.
(Embodiment 1)
Fig. 1 shows the structure of the subject-facing side (front side) of the imaging device 1 according to Embodiment 1 of the present invention. Fig. 2 shows the structure of the photographer-facing side (rear side) of the imaging device 1 according to Embodiment 1 of the present invention. Fig. 3 is a block diagram showing the structure of the imaging device 1 according to Embodiment 1 of the present invention.
As shown in Figs. 1 to 3, the imaging device 1 comprises an imaging unit 2, an acceleration detection unit 3, a timer 4, a light emitting unit 5, an operation input unit 6, a display unit 7, a touch panel 8, a storage unit 9 and a control unit 10.
The imaging unit 2 photographs a predetermined field of view and generates image data. The imaging unit 2 comprises a lens unit 21, a lens drive unit 22, an aperture 23, an aperture drive unit 24, a shutter 25, a shutter drive unit 26, an image sensor 27, an image sensor drive unit 28 and a signal processing unit 29.
The lens unit 21 is composed of a plurality of lens groups capable of focusing and zooming, and collects light from the predetermined field of view. The lens drive unit 22 is constituted by a stepping motor or a DC motor, and moves the lens groups of the lens unit 21 along the optical axis O1 to change the focus position, the focal length and so on of the lens unit 21.
The aperture 23 adjusts the exposure by limiting the amount of light collected by the lens unit 21. The aperture drive unit 24 is constituted by a stepping motor or the like and drives the aperture 23.
The shutter 25 sets the state of the image sensor 27 to an exposed state or a light-blocked state. The shutter drive unit 26 is constituted by a stepping motor or the like and drives the shutter 25 in response to a release signal.
The image sensor 27 is constituted by a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor) sensor or the like. The image sensor 27 photoelectrically converts the light collected by the lens unit 21 into an electrical signal (analog signal). The image sensor drive unit 28 generates the timing pulses that drive the image sensor 27, and outputs the electrical signal photoelectrically converted by the image sensor 27 to the signal processing unit 29.
The signal processing unit 29 is constituted by an analog amplifier, an A/D converter and the like. The signal processing unit 29 applies signal processing such as amplification (gain adjustment) to the electrical signal output from the image sensor 27, then A/D converts it into digital raw image data and outputs the data to the control unit 10.
The acceleration detection unit 3 is constituted by capacitive acceleration sensors formed by MEMS (Micro Electro Mechanical Systems) technology, with three sensors whose detection directions are mutually orthogonal. Specifically, taking as the coordinate system intrinsic to the imaging device 1 an x axis parallel to the width direction of the imaging device 1, a y axis parallel to its vertical direction and a z axis parallel to the optical axis O1 of the imaging unit 2, three acceleration sensors that respectively detect the acceleration components along these axes are mounted at predetermined positions in the imaging device 1. With this structure, when the photographer moves the imaging device 1 toward the subject (z direction), in the horizontal direction (x direction) or in the vertical direction (y direction), the acceleration detection unit 3 can accurately detect the acceleration produced by the movement. The acceleration detection unit 3 detects the acceleration in each direction of the imaging device 1 when the lateral direction of the image displayed on the display unit 7 is roughly aligned with the horizontal direction of the imaging device 1.
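The following sketch (hypothetical values; it assumes per-axis acceleration samples are already available) illustrates how the device's motion could be classified from the x, y and z components, treating gravity on the y axis as a constant offset, in the spirit of the moving-direction determination unit 102 described later:

```python
GRAVITY = 9.81        # constant gravitational acceleration on the y axis (m/s^2)
THRESHOLD = 0.5       # minimum acceleration treated as intentional movement (assumed value)

def classify_motion(ax, ay, az):
    """Classify device motion from one accelerometer sample.
    ax: width (horizontal) axis, ay: vertical axis (includes gravity),
    az: optical-axis direction. Returns a coarse direction label."""
    ay_motion = ay - GRAVITY                     # remove the static gravity component
    components = {"horizontal": ax, "vertical": ay_motion, "optical": az}
    axis, value = max(components.items(), key=lambda kv: abs(kv[1]))
    if abs(value) < THRESHOLD:
        return "stationary"
    if axis == "horizontal":
        return "right" if value > 0 else "left"
    if axis == "vertical":
        return "up" if value > 0 else "down"
    return "toward subject" if value > 0 else "away from subject"

# Example: a pan to the right dominated by horizontal acceleration
print(classify_motion(ax=1.2, ay=9.9, az=0.1))   # -> "right"
```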
The timer 4 has a clock function and a function of determining the shooting date and time. The timer 4 outputs date-and-time data to the control unit 10 so that the date and time can be added to the captured image data.
The light emitting unit 5 is constituted by a xenon lamp, an LED (Light Emitting Diode) or the like, and emits flash light as auxiliary illumination toward the field of view photographed by the imaging device 1.
The operation input unit 6 comprises: a power switch 61 that switches the power state of the imaging device 1 on or off; a release switch 62 that accepts input of a release signal instructing shooting; a shooting mode changeover switch 63 that switches among the various shooting modes set in the imaging device 1; and a menu switch 64 for setting the various parameters of the imaging device 1.
The display unit 7 is realized by a display panel made of liquid crystal, organic EL (Electro Luminescence) or the like. The display unit 7 displays images corresponding to the image data, as well as information on the operation of the imaging device 1 and shooting information.
The touch panel 8 is provided on the display screen of the display unit 7. The touch panel 8 detects the position where the user touches the screen based on the information displayed on the display unit 7, and accepts input of an instruction signal corresponding to the detected contact position. Touch panels of resistive, capacitive, optical and other types exist, and any of them can be used in Embodiment 1. In Embodiment 1 the touch panel 8 functions as the input unit.
The storage unit 9 is realized by semiconductor memory such as flash memory or DRAM (Dynamic Random Access Memory) fixed inside the imaging device 1. The storage unit 9 comprises: an image data storage unit 91 that stores image data; a program storage unit 92 that stores the various programs executed by the imaging device 1 and the various data and parameters used while the programs run; and a temporary storage unit 93 that temporarily stores the plurality of image data continuously generated by the imaging unit 2 and various contents used during processing. The storage unit 9 may also include a computer-readable storage medium such as an externally mounted memory card.
The control unit 10 is constituted by a CPU (Central Processing Unit) or the like. In accordance with instruction signals and switching signals from the operation input unit 6 and the touch panel 8, the control unit 10 performs instruction and data transfer to each part of the imaging device 1 and centrally controls the operation of the imaging device 1.
The detailed structure of the control unit 10 is described next. The control unit 10 comprises an image processing unit 101, a moving-direction determination unit 102, a main-subject candidate detection unit 103, a main-subject setting unit 104, an image detection unit 105, an information appending unit 106, a photographing control unit 107 and a display control unit 108.
The image processing unit 101 applies various kinds of image processing to the image data, specifically edge emphasis, white balance processing and gamma correction. The image processing unit 101 also compresses and decompresses the image data according to the JPEG (Joint Photographic Experts Group) format or the like.
The moving-direction determination unit 102 determines the direction in which the imaging device 1 is moving based on the detection result of the acceleration detection unit 3. Specifically, the moving-direction determination unit 102 determines whether the imaging device 1 is moving in the horizontal direction from the change in the horizontal acceleration detected by the acceleration detection unit 3.
The main-subject candidate detection unit 103 detects, from the changes in the image information contained in each of the plurality of continuously generated image data, a moving object that moves within the frame over time as a main-subject candidate, i.e. a candidate for the main subject determined from the moving direction. Here, the image information is edge information, color information, luminance information and gradation information. The main-subject candidate detection unit 103 applies predetermined processing, for example edge detection and binarization, to each of the continuously generated image data, and detects the moving object that moves within the frame over time as the main-subject candidate. Specifically, the main-subject candidate detection unit 103 detects a moving subject that moves within the frame over time from the position or distance by which the image information of each pixel, for example the color information, moves over time. The main-subject candidate detection unit 103 may also use pattern matching or other known techniques to detect the moving object as the main-subject candidate.
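As a rough illustration of this kind of candidate detection (a sketch under assumed thresholds, not the patented algorithm), consecutive frames can be differenced and binarized to find the region whose pixels change over time:

```python
import numpy as np

def detect_moving_candidate(prev_frame, curr_frame, diff_threshold=20, min_pixels=200):
    """prev_frame, curr_frame: consecutive grayscale frames as 2-D uint8 arrays.
    Returns the bounding box (top, left, bottom, right) of the changed region,
    or None if too few pixels changed. Threshold values are assumed."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = diff > diff_threshold            # binarization of the frame difference
    if changed.sum() < min_pixels:
        return None
    ys, xs = np.nonzero(changed)
    return ys.min(), xs.min(), ys.max(), xs.max()
```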
When the main-subject candidate detected by the main-subject candidate detection unit 103 moves in the moving direction of the imaging device 1 determined by the moving-direction determination unit 102, the main-subject setting unit 104 sets the main-subject candidate as the main subject.
The image detection unit 105 detects, from the group of images temporarily stored in the temporary storage unit 93, images in which the main subject set by the main-subject setting unit 104 is located in a predetermined region, for example the approximate center of the frame.
The information appending unit 106 adds, to the image data corresponding to a center image detected by the image detection unit 105, a flag indicating that the image is a center image.
When a release signal has been input, the photographing control unit 107 starts control of the shooting operation in the imaging device 1. The shooting operation here means that the signal processing unit 29 and the image processing unit 101 apply predetermined processing to the image data output by the image sensor 27 driven by the image sensor drive unit 28. The image data processed in this way is stored in the image data storage unit 91 under the control of the photographing control unit 107. When no release signal has been input via the release switch 62, the photographing control unit 107 stores the image data continuously output by the imaging unit 2 in the temporary storage unit 93 in output order.
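A bounded ring buffer is one straightforward way to realize this temporary storage; the following sketch (hypothetical capacity and field names, not drawn from the patent) keeps only the most recent frames together with any flag later added by the information appending unit:

```python
from collections import deque
from dataclasses import dataclass
import numpy as np

@dataclass
class BufferedFrame:
    image: np.ndarray          # frame data from the imaging unit
    timestamp: float           # from the timer
    flagged: bool = False      # True when the main subject sat near the frame center

class TemporaryStore:
    """Keeps only the most recent live-view frames while no release signal arrives."""
    def __init__(self, capacity=60):           # capacity is an assumed value
        self.frames = deque(maxlen=capacity)   # oldest frames are dropped automatically

    def push(self, image, timestamp):
        frame = BufferedFrame(image=image, timestamp=timestamp)
        self.frames.append(frame)
        return frame

    def flagged_frames(self):
        return [f for f in self.frames if f.flagged]

    def clear(self):                           # e.g. after the slide display finishes
        self.frames.clear()
```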
The display control unit 108 makes the display unit 7 display images corresponding to the image data generated by the imaging unit 2. The display control unit 108 makes the display unit 7 display the photograph corresponding to the captured image data, and also makes the display unit 7 display in sequence at least some of the images contained in the image data group stored in the temporary storage unit 93.
The imaging device 1 with the above structure may further be provided with an audio input/output function, a detachable electronic viewfinder (EVF), a communication unit capable of two-way communication with external processing devices (not shown) such as personal computers via the Internet, and so on.
Next, the processing performed by the imaging device 1 according to Embodiment 1 is described. Fig. 4 is a flowchart outlining the processing performed by the imaging device 1.
In Fig. 4, the case in which the imaging device 1 is set to shooting mode (step S101: Yes) is described first. In this case, under the control of the photographing control unit 107, the imaging device 1 makes the imaging unit 2 photograph the predetermined field of view and generate image data (step S102), and temporarily stores the generated image data in the temporary storage unit 93 (step S103).
Next, the display control unit 108 makes the display unit 7 display the live view image corresponding to the image data generated by the imaging unit 2 (step S104).
Fig. 5 illustrates a situation in which the photographer shoots with the imaging device 1. As shown in Fig. 5, the photographer K1 decides the shooting composition for a subject A1 (a flower), for example, while watching the live view image shown on the display unit 7.
After step S104, the moving-direction determination unit 102 determines whether the imaging device 1 is moving (step S105). Specifically, when the acceleration detection unit 3 detects a change in the vertical (y-axis) or horizontal (x-axis) acceleration caused by the photographer moving the device, beyond the constantly applied gravitational acceleration, the moving-direction determination unit 102 determines that the imaging device 1 is moving. When the moving-direction determination unit 102 determines that the imaging device 1 is moving (step S105: Yes), the imaging device 1 proceeds to step S106 described later. When the moving-direction determination unit 102 determines that the imaging device 1 is not moving (step S105: No), the imaging device 1 proceeds to step S111 described later.
In step S106, the imaging device 1 detects candidates for the main subject and performs the main-subject candidate determination processing, which determines the characteristics related to the movement of the detected candidates.
Fig. 6 is a flowchart outlining the main-subject candidate determination processing of step S106 in Fig. 4.
As shown in Fig. 6, the moving-direction determination unit 102 determines the moving direction from the change in the acceleration produced in the imaging device 1, based on the detection result of the acceleration detection unit 3 (step S201), and determines whether the imaging device 1 is moving in the horizontal direction (step S202). Specifically, the moving-direction determination unit 102 determines whether the acceleration detection unit 3 detects horizontal acceleration. When the moving-direction determination unit 102 determines that the imaging device 1 is moving in the horizontal direction (step S202: Yes), the imaging device 1 proceeds to step S203 described later. When the moving-direction determination unit 102 determines that the imaging device 1 is not moving in the horizontal direction (step S202: No), the imaging device 1 returns to the main routine shown in Fig. 4. Note that a photographer with quick reflexes may move the camera exactly in step with the movement of the subject; in that case the subject appears stationary within the frame.
In step S203, the main-subject candidate detection unit 103 detects a moving object that moves in the live view images displayed on the display unit 7 as a main-subject candidate. Specifically, the main-subject candidate detection unit 103 detects, from the changes in the image information contained in each of the consecutive live view images, a region that varies roughly periodically between consecutive images as a moving-subject region. For example, the main-subject candidate detection unit 103 divides the live view image into predetermined regions (for example 9 divisions), and detects, among the divided regions, those whose image information changes within a predetermined frequency band as moving-subject regions. This predetermined frequency band is 2 Hz to 5 Hz.
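The following sketch (assumed frame rate and thresholds) illustrates one way to test whether a divided region's image information varies in the 2-5 Hz band mentioned above, using an FFT of the region's mean brightness over recent frames:

```python
import numpy as np

def varies_in_band(brightness_history, frame_rate=30.0, low_hz=2.0, high_hz=5.0,
                   power_ratio_threshold=0.4):
    """brightness_history: mean brightness of one divided region over recent frames.
    Returns True if a dominant share of the non-DC spectral power lies in
    the 2-5 Hz band, i.e. the region sways roughly periodically."""
    samples = np.asarray(brightness_history, dtype=float)
    samples = samples - samples.mean()                 # remove the DC component
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / frame_rate)
    total = spectrum[1:].sum()
    if total == 0:
        return False
    in_band = spectrum[(freqs >= low_hz) & (freqs <= high_hz)].sum()
    return in_band / total >= power_ratio_threshold

# Example: a region swaying at about 3 Hz sampled at 30 fps
t = np.arange(60) / 30.0
print(varies_in_band(100 + 10 * np.sin(2 * np.pi * 3.0 * t)))   # -> True
```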
Next, the main-subject setting unit 104 determines whether the main-subject candidate detected by the main-subject candidate detection unit 103 is moving in the moving direction of the imaging device 1 determined by the moving-direction determination unit 102 (step S204). When the main-subject candidate is moving in the moving direction of the imaging device 1 determined by the moving-direction determination unit 102 (step S204: Yes), the main-subject setting unit 104 stores the characteristics of the main-subject candidate in the temporary storage unit 93 (step S205), and the imaging device 1 then returns to the main routine of Fig. 4. When the main-subject candidate is not moving in the moving direction of the imaging device 1 determined by the moving-direction determination unit 102 (step S204: No), the imaging device 1 returns to the main routine shown in Fig. 4.
Fig. 7 schematically illustrates a situation in which the main-subject setting unit 104 determines the main subject. Fig. 8 shows an example of images corresponding to the image data generated by the imaging unit 2 in the situation of Fig. 7. Fig. 9 shows the relationship between the subject A1 and the shooting area of the imaging device 1 in the situation of Fig. 7. In Fig. 7, the shooting area of the imaging device 1 is indicated by dash-dot lines. In Fig. 9, the horizontal axis t represents time and the vertical axis D represents the displacement of the subject A1 and the imaging device 1 from their stationary positions; movement toward the right in Fig. 7 is taken as positive. Curve L1 represents the displacement of the center of the subject A1, and curve L2 represents the displacement of the center of the shooting area of the imaging device 1.
As shown in Fig. 7, the photographer K1 wants to photograph the subject A1 as the main subject (Fig. 7(a)). When a gust of wind blows to the right (arrow Y1), the subject A1 leans to the right and sways (Fig. 7(b)). To keep the subject A1 in the approximate center of the shooting area, the photographer K1 moves the position of the imaging device 1 (a pan operation) in accordance with the movement of the subject A1. As a result, the shooting area of the imaging device 1 changes from shooting area F1 to shooting area F2.
When the gust stops, the subject A1 moves to the left (arrow Y2) to return to its original position (Fig. 7(c)), so the photographer K1 moves the imaging device 1 in the opposite direction (to the left). As a result, the shooting area of the imaging device 1 changes from shooting area F2 to shooting area F3.
The subject A1 then moves further to the left (arrow Y2) past its original position because of inertia (Fig. 7(d)). The photographer K1 therefore moves the imaging device 1 further, so the shooting area of the imaging device 1 changes from shooting area F3 to shooting area F4.
In such an operation in which the photographer K1 tracks the subject A1 while shooting, a time difference arises before the photographer K1 acts on what is shown in the live view image on the display unit 7. Specifically, as shown in Fig. 9, there is a time delay from the moment the subject A1 moves until the photographer K1 moves the imaging device 1 (time T1 in Fig. 9). Moreover, the photographer K1 finds it difficult to track the wind-blown subject A1 while keeping it fixed at the center of the shooting area, so the moment at which the shooting area reaches its maximum displacement is also delayed (T2 in Fig. 9). For example, as shown in Figs. 8(a) and 8(b), because the subject A1 moves to the right (image W1 to image W2), the photographer K1 moves the imaging device 1 to track the subject A1 and the shooting area moves (times t1 to t2 in Fig. 9).
Then, as shown in Figs. 8(b) and 8(c), when the gust stops and the subject A1 moves to the left to return to its original position (times t2 to t3 in Fig. 9), if the displacement of the shooting area of the imaging device 1, which the photographer K1 had moved to the right, coincides with the displacement of the subject A1 (their directions of travel being opposite to each other), the subject A1 appears in an image located at the approximate center (image W3).
Next, as shown in Figs. 8(d) to 8(e), the subject A1 moves further to the left and then back to the right (arrow b2) (image W3 to image W4 to image W5), so the photographer K1 moves the imaging device 1 to track the subject A1 and the shooting area moves (times t3 to t6 in Fig. 9). At this point the center position of the subject A1 and the center position of the shooting area of the imaging device 1 cross again (time t6 in Fig. 9), so an image in which the subject A1 is located at the approximate center (image W5) is captured (see Fig. 8(e)). The displacement of the subject A1 and the displacement of the shooting area caused by the photographer K1 moving the imaging device 1 are thus both roughly periodic. Here, "roughly periodic" is a general expression for a reciprocating motion whose displacement alternates between positive and negative.
In this way, when the photographer K1 moves the imaging device 1 back and forth with a predetermined amplitude and the subject A1 moves in the moving direction of the imaging device 1, the main-subject setting unit 104 sets the subject A1 as the main subject that the photographer K1 intends to photograph. Thus, even when the subject A1 is moved violently by wind or the like and the shooting composition has not yet been decided, the main-subject setting unit 104 can recognize the photographer K1's tracking action and set the main subject intended by the photographer K1 as the object to be tracked during shooting.
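To make this tracking relationship concrete, the sketch below (illustrative only; the thresholds and the lagged-correlation approach are assumptions, not taken from the patent) checks whether the subject's displacement and the camera's displacement reciprocate and agree in direction, allowing for the reaction delay T1 discussed above:

```python
import numpy as np

def is_tracked_subject(subject_disp, camera_disp, max_lag=15, min_corr=0.6,
                       min_zero_crossings=2):
    """subject_disp, camera_disp: displacement histories (one value per frame).
    Returns True if the camera motion follows the subject motion with some lag
    and both motions reciprocate (cross zero repeatedly)."""
    s = np.asarray(subject_disp, float) - np.mean(subject_disp)
    c = np.asarray(camera_disp, float) - np.mean(camera_disp)
    # reciprocating motion: the displacement alternates between positive and negative
    crossings = np.sum(np.diff(np.sign(s)) != 0)
    if crossings < min_zero_crossings:
        return False
    # best normalized correlation over small positive lags (camera lags the subject)
    best = 0.0
    for lag in range(max_lag + 1):
        a, b = s[: len(s) - lag], c[lag:]
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        if denom > 0:
            best = max(best, float(np.dot(a, b) / denom))
    return best >= min_corr
```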
Returning to Fig. 4, the description continues from step S107. In step S107, the main-subject setting unit 104 determines whether the same main-subject candidate has been detected repeatedly by the main-subject candidate detection unit 103. When the same main-subject candidate has been detected repeatedly by the main-subject candidate detection unit 103 (step S107: Yes), the main-subject setting unit 104 sets the main-subject candidate detected by the main-subject candidate detection unit 103 as the main subject (step S108).
Next, the image detection unit 105 determines whether the main subject is captured in the central region of the live view image (step S109). When the image detection unit 105 determines that the main subject is captured in the central region of the live view image (step S109: Yes), the information appending unit 106 adds a flag to the image data corresponding to the live view image currently displayed on the display unit 7, as information by which it can be identified among the other image data sets stored in the temporary storage unit 93 (step S110). The imaging device 1 then proceeds to step S111 described later. When the image detection unit 105 determines that the main subject is not captured in the central region of the live view image (step S109: No), the imaging device 1 proceeds to step S111 described later.
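Steps S109-S110 amount to a center-region test plus flagging of the current frame; a minimal sketch follows (the margin value and the frame/box representation are assumptions, reusing the BufferedFrame-like object from the earlier storage sketch):

```python
def main_subject_is_central(box, frame_width, frame_height, margin=0.25):
    """box: (top, left, bottom, right) of the main subject in pixels.
    Returns True if the subject's center lies inside the central region,
    defined here as the middle (1 - 2*margin) fraction of each axis."""
    top, left, bottom, right = box
    cx = (left + right) / 2.0
    cy = (top + bottom) / 2.0
    return (margin * frame_width <= cx <= (1 - margin) * frame_width and
            margin * frame_height <= cy <= (1 - margin) * frame_height)

def flag_if_central(frame, box, frame_width, frame_height):
    """frame: an object with a 'flagged' attribute (step S110)."""
    if box is not None and main_subject_is_central(box, frame_width, frame_height):
        frame.flagged = True
    return frame.flagged
```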
Next, when a release signal is input by operating the release switch 62 (step S111: Yes), the imaging device 1 takes a photograph under the control of the photographing control unit 107 (step S112), and the image data storage unit 91 stores the image data generated by the imaging unit 2 (step S113).
The control unit 10 then determines whether there is image data in the temporary storage unit 93 (step S114). When the control unit 10 determines that there is image data in the temporary storage unit 93 (step S114: Yes), the imaging device 1 proceeds to step S115 described later. When the control unit 10 determines that there is no image data in the temporary storage unit 93 (step S114: No), the imaging device 1 proceeds to step S118 described later.
In step S115, the control unit 10 determines whether the image data group stored in the temporary storage unit 93 contains flagged image data. When the control unit 10 determines that the image data group stored in the temporary storage unit 93 contains flagged image data (step S115: Yes), the imaging device 1 performs the sliding-image display processing, in which the image data group stored in the temporary storage unit 93 is displayed on the display unit 7 in sequence while sliding across the display screen of the display unit 7 (step S116). The details of the sliding-image display processing are described later. After step S116, the imaging device 1 proceeds to step S117.
Next, the control unit 10 determines whether the power switch 61 has been operated to turn off the power of the imaging device 1 (step S117). When the control unit 10 determines that the power of the imaging device 1 is off (step S117: Yes), the imaging device 1 ends this processing. When the control unit 10 determines that the power of the imaging device 1 is not off (step S117: No), the imaging device 1 returns to step S101.
The cases in which the control unit 10 determines in step S114 that no image data is temporarily stored in the temporary storage unit 93 (step S114: No) and in which the control unit 10 determines in step S115 that the image data group stored in the temporary storage unit 93 contains no flagged image data (step S115: No) are described next. In these cases, the display control unit 108 makes the display unit 7 show the image corresponding to the captured image data as a record review for a predetermined time (for example 2 seconds) (step S118). The imaging device 1 then proceeds to step S117.
When no release signal is input via the release switch 62 in step S111 (step S111: No), the imaging device 1 proceeds to step S117.
When the main-subject setting unit 104 determines in step S107 that the same main-subject candidate has not been detected repeatedly by the main-subject candidate detection unit 103 (step S107: No), the imaging device 1 proceeds to step S111.
Next, the case in which the imaging device 1 is not set to shooting mode (step S101: No) but is set to playback mode (step S119: Yes) is described. In this case, the display control unit 108 makes the display unit 7 show a list of reduced images (thumbnail images) of the images corresponding to the image data stored in the image data storage unit 91 (step S120).
Then, when an image to be enlarged is selected from the list of reduced images via the operation input unit 6 or the touch panel 8 (step S121: Yes), the display control unit 108 makes the display unit 7 show the selected image full-screen (step S122).
After that, when an image switching operation is performed via the operation input unit 6 or the touch panel 8 (step S123: Yes), the imaging device 1 returns to step S120. When no image switching operation is performed via the operation input unit 6 or the touch panel 8 (step S123: No), the imaging device 1 returns to step S122.
The case in which no image to be enlarged is selected from the list of reduced images via the operation input unit 6 or the touch panel 8 in step S121 (step S121: No) is described next. In this case, the control unit 10 determines whether a predetermined time (for example 3 seconds) has elapsed since the display unit 7 started displaying the list of reduced images (step S124). When the control unit 10 determines that the predetermined time has elapsed since the display unit 7 started displaying the list of reduced images (step S124: Yes), the imaging device 1 proceeds to step S117. When the control unit 10 determines that the predetermined time has not elapsed since the display unit 7 started displaying the list of reduced images (step S124: No), the imaging device 1 returns to step S120.
When the imaging device 1 is not set to playback mode in step S119 (step S119: No), the imaging device 1 proceeds to step S117.
Next, the sliding-image display processing of step S116 in Fig. 4 is outlined. Fig. 10 is a flowchart outlining the sliding-image display processing.
As shown in Fig. 10, the display control unit 108 makes the display unit 7 show the photograph corresponding to the captured image data as a record review (step S301). Specifically, as shown in Fig. 11(a), the display control unit 108 makes the display unit 7 show the photograph W11 as a record review in the right-hand area of the screen.
Next, the display control unit 108 starts displaying the image group corresponding to the image data group temporarily stored in the temporary storage unit 93 (step S302). Specifically, as shown in Figs. 11(b) and 11(c), the display control unit 108 makes each image Wn (n being a natural number) corresponding to the image data temporarily stored in the temporary storage unit 93 slide to the left (arrow (a)) through the area below the record-review photograph W11, and makes the display unit 7 show them in sequence (image Wn, image Wn+1, image Wn+2, ..., image Wn+3).
The control unit 10 then determines whether a flagged image, to which the information appending unit 106 has added a flag, is currently being slid through the display on the display unit 7 (step S303). When the control unit 10 determines that a flagged image to which the information appending unit 106 has added a flag is being slid through the display on the display unit 7 (step S303: Yes), the display control unit 108 makes the display unit 7 display the flagged image with emphasis (step S304). Specifically, as shown in Fig. 11(d), the display control unit 108 enlarges the flagged image Wn+3 in the display area at the top of the display unit 7 so that it can be compared with the photograph W11. The photographer K1 can thus compare the image Wn+3, in which the subject A1 is captured in the approximately central region of the frame, with the photograph W11 captured by the photographer K1's own shooting operation. After step S304, the imaging device 1 proceeds to step S305 described later.
When the control unit 10 determines in step S303 that no flagged image to which the information appending unit 106 has added a flag is being slid through the display on the display unit 7 (step S303: No), the imaging device 1 proceeds to step S305 described later.
In step S305, the control unit 10 determines whether a selection operation has been performed on the image group sliding across the display unit 7 to select an image to be stored in the image data storage unit 91. Specifically, the control unit 10 determines whether a selection signal selecting an image has been input from the menu switch 64 or the touch panel 8. When the control unit 10 determines that a selection operation has been performed on the image group sliding across the display unit 7 (step S305: Yes), the control unit 10 stores the image data of the selected image, for example image Wn+3, in the image data storage unit 91 (step S306). The imaging device 1 then proceeds to step S307 described later.
When the control unit 10 determines in step S305 that no selection operation has been performed via the menu switch 64 or the touch panel 8 on the image group sliding across the display unit 7 within a predetermined time (for example 5 seconds) (step S305: No), the imaging device 1 proceeds to step S307 described later.
In step S307, the control unit 10 determines whether the display of the image data group temporarily stored in the temporary storage unit 93 has been completed. When the control unit 10 determines that the display of the image data group temporarily stored in the temporary storage unit 93 has been completed (step S307: Yes), the control unit 10 deletes all the image data groups stored in the temporary storage unit 93 (step S308). The imaging device 1 then returns to the main routine of Fig. 4.
When the control unit 10 determines in step S307 that the display of the image data group temporarily stored in the temporary storage unit 93 has not been completed (step S307: No), the imaging device 1 returns to step S303.
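The following sketch summarizes the flow of this sliding-image display processing. The display API (show_record_review, slide_in, emphasize, ask_selection) is entirely hypothetical and stands in for the display control unit; the store argument matches the earlier TemporaryStore sketch:

```python
def sliding_image_display(store, captured_image, display, selection_timeout=5.0):
    """store: TemporaryStore-like buffer of frames, some flagged (step S110).
    captured_image: the photograph taken at the release (shown as record review).
    display: object providing the assumed methods show_record_review, slide_in,
    emphasize and ask_selection."""
    display.show_record_review(captured_image)            # step S301
    saved = []
    for frame in list(store.frames):                      # steps S302-S307
        display.slide_in(frame.image)
        if frame.flagged:
            display.emphasize(frame.image)                # step S304: enlarge flagged image
        choice = display.ask_selection(timeout=selection_timeout)
        if choice == "keep":
            saved.append(frame.image)                     # step S306: store selected image
    store.clear()                                         # step S308: delete temporary images
    return saved
```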
According to Embodiment 1 of the present invention described above, when the main-subject candidate detected by the main-subject candidate detection unit 103 moves in the moving direction of the imaging device 1 determined by the moving-direction determination unit 102, the main-subject setting unit 104 sets the main-subject candidate as the main subject, and the image detection unit 105 detects, from the group of images stored in the temporary storage unit 93, images in which the main subject set by the main-subject setting unit 104 is located in a predetermined region. Thus, even when the photographer shoots while the main subject to be photographed is moving, images in which the main subject is captured with the intended composition can be extracted.
According to Embodiment 1, the display control unit 108 also makes the display unit 7 show the photograph and makes the display unit 7 show the images corresponding to the image data group temporarily stored in the temporary storage unit 93. Images taken before and after the photograph can therefore be presented to the photographer.
According to Embodiment 1, the display control unit 108 further makes the display unit 7 successively display, in an identifiable manner, the flagged center images in which the main subject is captured at the approximate center, the flag having been added by the information appending unit 106. Images in which the main subject is captured with the composition intended by the photographer can therefore be presented.
(Embodiment 2)
Embodiment 2 of the present invention is described next. The imaging device according to Embodiment 2 additionally has an azimuth detection unit that detects azimuth, and its main-subject candidate determination processing differs from that of Embodiment 1. The structure of the azimuth detection unit is therefore described first, followed by the main-subject candidate determination processing in the operation of the imaging device according to Embodiment 2. In the drawings, the same parts are given the same reference numerals.
Fig. 12 is a block diagram showing the structure of the imaging device 100 according to Embodiment 2 of the present invention. As shown in Fig. 12, the imaging device 100 comprises an imaging unit 2, an acceleration detection unit 3, a timer 4, a light emitting unit 5, an operation input unit 6, a display unit 7, a touch panel 8, a storage unit 9, a control unit 10 and an azimuth detection unit 110.
The azimuth detection unit 110 is constituted by a geomagnetic sensor and detects a reference azimuth preset in the imaging device 100. Specifically, by detecting the vertical and horizontal components of the geomagnetic field when the lateral direction of the image displayed on the display unit 7 is roughly aligned with the horizontal direction of the imaging device 100, the azimuth detection unit 110 detects the azimuth of the imaging device 100 with the optical axis O1 of the imaging unit 2 as the reference direction.
Then, the main body candidate determination processing of being taken the photograph that the camera head 100 that this execution mode 2 is related to carries out describes.Figure 13 illustrates the main flow chart of being taken the photograph the summary of body candidate determination processing (the step S106 of Fig. 4) that camera head 100 that this execution mode 2 relates to carries out.
Shown in figure 13, moving direction detection unit 102 is judged the acceleration (step S401) that in camera head 100, produces according to acceleration detecting section 3 detected testing results.Being judged to be in z direction (optical axis O1 direction) at moving direction detection unit 102 does not have under the big situation of the acceleration of acceleration ratio x direction (horizontal direction) of acceleration (step S402: be), y direction (vertical direction) (step S403: be); When moving direction detection unit 102 is judged to be the testing result cyclic variation of acceleration of x direction or y direction (step S404: be), mainly being taken the photograph body couple candidate detection portion 103 will have mobile mobile object to survey to being taken the photograph body candidate (step S405) in the live view image that display part 7 shows.
Figure 14 is a diagram schematically illustrating the situation when the movement direction determination unit 102 determines the movement state of the imaging apparatus 100. Figure 15 is a top view of the subject viewed in the direction of arrow A in Figure 14. Figure 16 is a side view of the subject viewed in the direction of arrow B in Figure 14. Figure 17 is a diagram schematically illustrating the relationship between the velocity of the imaging apparatus 100 and the acceleration detected by the acceleration detection unit 3 when the photographer K1 moves the imaging apparatus 100. In Figure 17(a), the horizontal axis t represents time, the vertical axis v represents the velocity of the imaging apparatus 100, and the curve L11 represents the change in velocity of the imaging apparatus 100. In Figure 17(b), the horizontal axis t represents time and the vertical axis a represents acceleration; the curve Lx1 represents the acceleration of the imaging apparatus 100 in the horizontal direction, the curve Ly1 the acceleration in the vertical direction, and the curve Lz1 the acceleration in the direction of the optical axis. In Figures 14 and 15, movement of the imaging apparatus 100 to the right is taken as positive and movement to the left as negative.
As shown in Figures 14 to 16, when the subject A1 moves irregularly, the photographer K1 tracks the subject A1 while watching the live view image displayed on the display unit 7, so that the acceleration in the vertical direction generated in the imaging apparatus 100 shows a larger value than the acceleration in the horizontal direction. Specifically, as shown in Figure 17, when the photographer K1 moves the imaging apparatus 100 in accordance with the movement of the subject A1 (times t1 to t2 and t5 to t6), the acceleration detection unit 3 detects the accelerations in the horizontal and vertical directions, and the detected acceleration in the vertical direction is larger than that in the horizontal direction. Likewise, when the photographer K1 stops moving the imaging apparatus 100 (times t3 to t4 and t7 to t8), the acceleration detection unit 3 detects the accelerations in the horizontal and vertical directions, and the detected acceleration in the vertical direction is larger than that in the horizontal direction. Further, since the photographer K1 moves the imaging apparatus 100 in accordance with the approximately periodic movement of the subject A1, the accelerations in the horizontal and vertical directions detected by the acceleration detection unit 3 vary periodically.
As described above, when there is no acceleration in the direction of the optical axis O1, the acceleration in the vertical direction is larger than the acceleration in the horizontal direction, and the detected accelerations in the horizontal and vertical directions are periodic, the main subject setting unit 104 sets the main subject candidate detected by the main subject candidate detection unit 103 in the live view image displayed on the display unit 7 as the main subject.
Steps S406 and S407 following step S405 perform the same processing as steps S204 and S205 described above, and their description is therefore omitted. After step S407, the imaging apparatus 100 returns to the main routine shown in Figure 4.
Next, the case where the movement direction determination unit 102 determines in step S404 that the detected acceleration of the imaging apparatus 100 in the x direction or the y direction is not periodic (step S404: No) will be described. In this case, the movement direction determination unit 102 determines whether the detection result of the azimuth detection unit 110 is periodic (step S408). When the movement direction determination unit 102 determines that the detection result of the azimuth detection unit 110 is periodic (step S408: Yes), the imaging apparatus 100 proceeds to step S405. On the other hand, when the movement direction determination unit 102 determines that the detection result of the azimuth detection unit 110 is not periodic (step S408: No), the imaging apparatus 100 returns to the main routine shown in Figure 4.
Figure 18 is a diagram schematically illustrating the situation when the movement direction determination unit 102 determines the movement state of the imaging apparatus 100. Figure 19 is a diagram schematically illustrating the relationship between the velocity of the imaging apparatus 100 and the detection result of the azimuth detection unit 110 when the photographer K1 moves the imaging apparatus 100. In Figure 19(a), the horizontal axis t represents time, the vertical axis represents the velocity of the imaging apparatus 100, and the curve L11 represents the change in velocity of the imaging apparatus 100. In Figure 19(b), the horizontal axis represents time, the vertical axis θ represents the detection result of the azimuth detection unit 110, and the curve L12 represents the change in the detection result of the azimuth detection unit 110. In Figure 18, movement of the imaging apparatus 100 to the right is taken as positive.
As shown in Figures 18 and 19, when the subject A1 moves approximately periodically, the photographer K1 tracks the subject A1 while watching the live view image displayed on the display unit 7, so that the detection result of the azimuth detection unit 110 also varies approximately periodically. Specifically, as shown in Figure 19, when the photographer K1 moves the imaging apparatus 100 approximately periodically in accordance with the movement of the subject A1 (times t1 to t2 and t5 to t6), the values detected by the azimuth detection unit 110 vary approximately periodically.
In this way, when the detected acceleration of the imaging apparatus 100 in the x direction or the y direction is not approximately periodic but the detection result of the azimuth detection unit 110 varies approximately periodically, the main subject setting unit 104 sets the main subject candidate detected by the main subject candidate detection unit 103 in the live view image displayed on the display unit 7 as the main subject.
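To summarize how steps S402 to S408 fit together, the following Python sketch reproduces the decision flow of Figure 13 as described above. The periodicity test, the threshold values, and the function names are assumptions made for illustration; the patent only states that the detection results "vary periodically" and does not specify how that is judged.

    import math
    from typing import Sequence

    def is_periodic(samples: Sequence[float], threshold: float = 0.5) -> bool:
        # Rough periodicity test: the normalized autocorrelation of the
        # mean-centred signal has a strong peak at some non-zero lag.
        # Both the test and the threshold are illustrative assumptions.
        n = len(samples)
        if n < 8:
            return False
        mean = sum(samples) / n
        centered = [s - mean for s in samples]
        energy = sum(c * c for c in centered)
        if energy == 0.0:
            return False
        best = 0.0
        for lag in range(2, n // 2):
            corr = sum(centered[i] * centered[i + lag] for i in range(n - lag)) / energy
            best = max(best, corr)
        return best > threshold

    def moving_object_becomes_candidate(ax: Sequence[float], ay: Sequence[float],
                                        az: Sequence[float], azimuth: Sequence[float],
                                        accel_eps: float = 0.05) -> bool:
        # Steps S402 to S408 of Figure 13 as described for Embodiment 2:
        # no acceleration along the optical axis (S402), vertical acceleration
        # larger than horizontal (S403), and a periodic x/y acceleration (S404)
        # or, failing that, a periodic azimuth reading (S408).
        if max(abs(a) for a in az) > accel_eps:                  # S402
            return False
        if max(abs(a) for a in ay) <= max(abs(a) for a in ax):   # S403
            return False
        if is_periodic(ax) or is_periodic(ay):                   # S404 -> S405
            return True
        return is_periodic(azimuth)                              # S408

    # Example: the photographer tracks a jumping subject, so the vertical
    # acceleration is a noticeable, roughly sinusoidal trace.
    ay = [math.sin(2 * math.pi * i / 16) for i in range(48)]
    ax = [0.01] * 48
    az = [0.0] * 48
    print(moving_object_becomes_candidate(ax, ay, az, azimuth=[0.0] * 48))  # True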
Next, the case where the movement direction determination unit 102 makes a negative determination in step S402 regarding the acceleration in the z direction detected by the acceleration detection unit 3 (step S402: No), and the case where the movement direction determination unit 102 determines in step S403 that the acceleration in the y direction detected by the acceleration detection unit 3 is not larger than the acceleration in the x direction (step S403: No), will be described. In either case, the imaging apparatus 100 returns to the main routine of Figure 4.
According to Embodiment 2 of the present invention described above, when the main subject candidate detected by the main subject candidate detection unit 103 moves in the movement direction of the imaging apparatus 100 determined by the movement direction determination unit 102, the main subject setting unit 104 sets the main subject candidate as the main subject to be tracked during shooting, and images in which the main subject set by the main subject setting unit 104 is located in a predetermined area are detected from the image group stored in the temporary storage unit 93. Thus, even when the main subject to be photographed moves during shooting, images in which the main subject is captured with the composition intended by the photographer can be extracted.
Further, according to Embodiment 2, when the detected acceleration in the horizontal or vertical direction is not periodic but the detection result of the azimuth detection unit 110 varies periodically, the main subject setting unit 104 sets the main subject candidate detected by the main subject candidate detection unit 103 in the live view image displayed on the display unit 7 as the main subject, and images in which the main subject set by the main subject setting unit 104 is located in a predetermined area are detected from the image group stored in the temporary storage unit 93. Thus, an image with the shooting composition desired by the photographer can be reliably obtained.
(Variation 1 of Embodiment 2)
In Embodiment 2 described above, the photographer K1 moves the imaging apparatus 100 parallel to the horizontal and vertical directions; however, the case where, for example, the photographer K1 moves the imaging apparatus 100 periodically along an arc without changing how the arms are extended can also be handled, as described below.
Figure 20 is a diagram schematically illustrating the situation when the movement direction determination unit 102 determines the movement state of the imaging apparatus 100. In Figure 20, as coordinate axes fixed to the earth, the X axis is taken in the horizontal direction, the Y axis in the vertical direction (with vertically downward as positive), and the Z axis in the direction perpendicular to the X and Y axes. Figure 21 is a diagram schematically illustrating the relationship between the accelerations in the direction of the optical axis O1 and in the vertical direction of the imaging apparatus 100 when the photographer K1 moves the imaging apparatus 100. In Figure 21, the horizontal axis t represents time and the vertical axis a represents the acceleration of the imaging apparatus 100; the curve Ly2 represents the acceleration in the vertical direction with respect to the earth-fixed coordinate axes, and the curve Lz2 represents the acceleration in the direction of the optical axis O1 with respect to the earth-fixed coordinate axes.
As shown in Figures 20 and 21, when the subject A1 moves approximately periodically in the vertical direction at a certain height, the photographer K1 tracks the subject A1 while watching the live view image displayed on the display unit 7 and thus moves the imaging apparatus 100 up and down, so that the accelerations in the direction of the optical axis O1 and in the vertical direction vary periodically.
In this way, when the accelerations in the direction of the optical axis O1 and in the vertical direction vary periodically, the main subject setting unit 104 sets the main subject candidate detected by the main subject candidate detection unit 103 in the live view image displayed on the display unit 7 as the main subject.
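As a rough illustration of this variation, the sketch below treats the candidate as the main subject when both acceleration traces vary periodically, using a simple sign-change count as the periodicity check. Both the check and its parameter are assumptions; the patent does not specify how periodicity is judged.

    from typing import Sequence

    def varies_periodically(samples: Sequence[float], min_cycles: int = 2) -> bool:
        # Very rough periodicity check: count sign changes about the mean and
        # require at least min_cycles full oscillations (an assumed criterion).
        mean = sum(samples) / len(samples)
        signs = [s - mean >= 0 for s in samples]
        crossings = sum(1 for a, b in zip(signs, signs[1:]) if a != b)
        return crossings >= 2 * min_cycles

    def arc_motion_selects_main_subject(a_optical_axis: Sequence[float],
                                        a_vertical: Sequence[float]) -> bool:
        # Variation 1 of Embodiment 2: when the apparatus is swung on an arc,
        # the optical-axis and vertical accelerations both vary periodically,
        # and the detected candidate is then set as the main subject.
        return varies_periodically(a_optical_axis) and varies_periodically(a_vertical)

    # A camera bobbing up and down on an arc produces oscillating traces on
    # both axes, so the candidate is kept.
    up_down = [0.0, 1.0, 0.0, -1.0] * 4
    print(arc_motion_selects_main_subject(up_down, up_down))  # True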
According to Variation 1 of Embodiment 2 of the present invention described above, when the detected accelerations in the optical axis direction and the vertical direction vary periodically, the main subject setting unit 104 sets the main subject candidate detected by the main subject candidate detection unit 103 in the live view image displayed on the display unit 7 as the main subject, and images in which the main subject set by the main subject setting unit 104 is located in a predetermined area are detected from the image group stored in the temporary storage unit 93. Thus, an image with the shooting composition desired by the photographer can be reliably obtained.
(Embodiment 3)
Next, Embodiment 3 of the present invention will be described. The imaging apparatus according to Embodiment 3 of the present invention differs from the imaging apparatuses described above in the configuration of the storage unit and the control unit. The operation performed by the imaging apparatus according to Embodiment 3 also differs from that of the above embodiments. Therefore, in the following, the configuration that differs from the above embodiments is described first, and then the operation of the imaging apparatus according to Embodiment 3 is described. In the drawings, the same parts are given the same reference numerals.
Figure 22 is a block diagram illustrating the configuration of the imaging apparatus 200 according to Embodiment 3 of the present invention. As shown in Figure 22, the imaging apparatus 200 includes a storage unit 209 that has an image data storage unit 91, a program storage unit 92, a temporary storage unit 93, and a contrast storage unit 94.
The contrast storage unit 94 stores AF evaluation values in which, for each displacement by which the imaging apparatus 200 moves, the contrast of the image data detected by the contrast detection unit 211 described later is associated with the focus position of the imaging unit 2. Here, the AF evaluation value is a value in which the maximum contrast of the image data generated by the imaging unit 2 is associated with the focus position of the imaging unit 2.
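As a concrete picture of what such an association could look like, the following sketch defines one possible record for the contrast storage unit 94. The field names and types are illustrative assumptions; the patent only states that contrast and focus position are stored in association for each displacement of the apparatus.

    from dataclasses import dataclass

    @dataclass
    class AFEvaluation:
        # One possible record layout for the contrast storage unit 94
        # (field names are assumptions made for illustration).
        displacement: float    # how far the apparatus has moved along the optical axis O1
        contrast: float        # contrast detected for the frame captured at that point
        focus_position: float  # focus position of the imaging unit 2

    contrast_store = []  # list of AFEvaluation records
    contrast_store.append(AFEvaluation(displacement=0.10, contrast=812.0, focus_position=0.35))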
The control unit 210 has an image processing unit 101, a movement direction determination unit 102, a main subject candidate detection unit 103, a main subject setting unit 104, an image detection unit 105, an information adding unit 106, an imaging control unit 107, a display control unit 108, and a contrast detection unit 211.
The contrast detection unit 211 detects the contrast of the image data generated by the imaging unit 2 whenever the imaging apparatus 200 moves. Specifically, the contrast detection unit 211 detects the contrast of the image data generated by the imaging unit 2 at every fixed period (60 fps) or whenever the imaging apparatus 200 moves. The contrast detection unit 211 outputs the detected contrast of the image data to the contrast storage unit 94.
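The patent does not state how the contrast detection unit 211 computes contrast. The sketch below uses a simple gradient-energy measure that is commonly used as an AF evaluation value; it is an assumption offered only to make the later peak detection concrete.

    def frame_contrast(gray):
        # Contrast of one frame as the summed absolute difference between
        # horizontally and vertically adjacent pixels of a grayscale image
        # (a gradient-energy style measure; the actual measure used by the
        # contrast detection unit 211 is not specified in the patent).
        h, w = len(gray), len(gray[0])
        total = 0
        for y in range(h):
            for x in range(w):
                if x + 1 < w:
                    total += abs(gray[y][x + 1] - gray[y][x])
                if y + 1 < h:
                    total += abs(gray[y + 1][x] - gray[y][x])
        return float(total)

    # A sharper patch scores higher than a flat one.
    print(frame_contrast([[0, 255], [255, 0]]) > frame_contrast([[128, 128], [128, 128]]))  # True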
The operation performed by the imaging apparatus 200 having the above configuration will now be described. Figure 23 is a flowchart illustrating an outline of the operation performed by the imaging apparatus 200 according to Embodiment 3 of the present invention.
Steps S501 to S505 correspond to steps S101 to S105 shown in Figure 4, respectively.
In step S506, the movement direction determination unit 102 determines whether the imaging apparatus 200 is moving in the direction of the optical axis O1. Specifically, the movement direction determination unit 102 determines whether the acceleration detection unit 3 detects acceleration in the direction of the optical axis O1. When the movement direction determination unit 102 determines that the imaging apparatus 200 has moved in the direction of the optical axis O1 (step S506: Yes), the imaging apparatus 200 proceeds to step S507 described later. On the other hand, when the movement direction determination unit 102 determines that the imaging apparatus 200 is not moving in the direction of the optical axis O1 (step S506: No), the imaging apparatus 200 proceeds to step S514 described later.
In step S507, the contrast detection unit 211 detects the contrast of the image data generated by the imaging unit 2 and stores the detected contrast in the contrast storage unit 94 (step S508).
Figure 24 is a diagram schematically illustrating the situation when the contrast detection unit 211 detects the contrast of the image data generated by the imaging unit 2. Figure 25 is a diagram illustrating the relationship between the contrast of the image data detected by the contrast detection unit 211 in the situation shown in Figure 24 and the shooting distance from the imaging apparatus 200 to the subject A2. In Figure 24, it is assumed that the focus lens of the lens unit 21 of the imaging unit 2 is stopped (fixed) at a predetermined position on the optical axis O1 of the imaging unit 2, for example at the position on the near side. In Figure 25, the horizontal axis d represents the shooting distance between the imaging apparatus 200 and the subject A2, and the vertical axis c represents the contrast; the curve L21 represents the contrast.
In the situation shown in Figure 24, the photographer K1 brings the imaging apparatus 200, whose imaging unit 2 is fixed in focus at the distance D2, closer to the subject A2 (Figures 24(a) to 24(c)), so that the contrast detected by the contrast detection unit 211 varies along the curve L21 shown in Figure 25. Specifically, as shown in Figure 25, when the shooting distance between the imaging apparatus 200 and the subject A2 becomes D2, the contrast of the image data detected by the contrast detection unit 211 reaches its peak C2 at the point P2. By shooting at this peak C2, the imaging apparatus 200 can obtain image data in which the subject A2 is in focus. That is, with the lens unit 21 of the imaging unit 2 fixed on the optical axis O1, the photographer K1 only needs to move the imaging apparatus 200 toward or away from the subject A2 to capture image data that is in focus.
Further, whenever the imaging apparatus 200 moves in the direction of the optical axis O1, the imaging control unit 107 compares the contrast of the image data detected by the contrast detection unit 211 with the contrast stored in the contrast storage unit 94, and when the contrast of the image data decreases continuously, the imaging control unit 107 determines the position immediately before the contrast started to decrease as the peak of the contrast of the image data. Specifically, in Figure 25, when the contrast decreases at the points P2 (contrast C2) and P3 (contrast C3) detected after the point P1, the imaging control unit 107 determines the point P2 to be the peak of the contrast of the image data.
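A minimal sketch of this peak decision, following the P1, P2, P3 example above: the peak is reported as the sample taken just before the contrast started to fall. The history representation and the drops parameter are assumptions made for illustration.

    from typing import Optional, Sequence, Tuple

    def find_contrast_peak(history: Sequence[Tuple[float, float]],
                           drops: int = 1) -> Optional[Tuple[float, float]]:
        # history holds (position_or_distance, contrast) pairs recorded while
        # the apparatus moves along the optical axis, oldest first. Returns the
        # pair judged to be the peak: the last sample before the contrast
        # started to fall, once it has fallen over `drops` consecutive
        # detections. drops=1 matches the P1 -> P2 -> P3 example (P2 is
        # reported); treating it as a parameter is an assumption.
        n = len(history)
        if n < drops + 2:
            return None
        contrasts = [c for _, c in history]
        rising_into_peak = contrasts[-(drops + 2)] <= contrasts[-(drops + 1)]
        falling_after = all(contrasts[i] > contrasts[i + 1]
                            for i in range(n - drops - 1, n - 1))
        if rising_into_peak and falling_after:
            return history[-(drops + 1)]
        return None

    # Contrast rises to C2 at P2 and then falls at P3, so P2 is reported.
    print(find_contrast_peak([(1.0, 0.4), (0.8, 0.9), (0.6, 0.7)]))  # (0.8, 0.9)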
In step S509, the imaging control unit 107 determines whether the contrast detected by the contrast detection unit 211 is the peak (maximum). When the imaging control unit 107 determines that the contrast detected by the contrast detection unit 211 is the peak (step S509: Yes), the imaging control unit 107 stores the peak of the contrast in the contrast storage unit 94 (step S510). Thereafter, the imaging apparatus 200 proceeds to step S511 described later. On the other hand, when the imaging control unit 107 determines that the contrast detected by the contrast detection unit 211 is not the peak (step S509: No), the imaging apparatus 200 proceeds to step S514 described later.
In step S511, the imaging apparatus 200 executes main subject candidate determination processing for determining the main subject candidate that becomes the main subject in the shooting composition set by the photographer.
Figure 26 is a flowchart illustrating an outline of the main subject candidate determination processing in step S511 of Figure 23.
As shown in Figure 26, the movement direction determination unit 102 determines, from the change in acceleration detected by the acceleration detection unit 3, whether the imaging apparatus 200 is moving in the direction of the optical axis O1 (step S601). When the movement direction determination unit 102 determines that the imaging apparatus 200 is moving in the direction of the optical axis O1 (step S601: Yes), the imaging apparatus 200 proceeds to step S602 described later. On the other hand, when the movement direction determination unit 102 determines that the imaging apparatus 200 is not moving in the direction of the optical axis O1 (step S601: No), the imaging apparatus 200 returns to the main routine of Figure 23.
In step S602, the main subject candidate detection unit 103 detects an object whose region changes within the live view image displayed on the display unit 7 as a main subject candidate. Specifically, the main subject candidate detection unit 103 detects, as a main subject candidate, an object whose region in the successive live view images shrinks or enlarges.
Next, the main subject setting unit 104 determines whether the main subject candidate detected by the main subject candidate detection unit 103 changes in accordance with the movement direction of the imaging apparatus 200 determined by the movement direction determination unit 102 (step S603). When the main subject candidate detected by the main subject candidate detection unit 103 changes in accordance with the movement direction of the imaging apparatus 200 determined by the movement direction determination unit 102 (step S603: Yes), the main subject setting unit 104 stores the features of the main subject candidate in the temporary storage unit 93 (step S604). Thereafter, the imaging apparatus 200 returns to the main routine of Figure 23. On the other hand, when the main subject candidate detected by the main subject candidate detection unit 103 does not change in accordance with the movement direction of the imaging apparatus 200 determined by the movement direction determination unit 102 (step S603: No), the imaging apparatus 200 returns to the main routine of Figure 23.
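The following sketch illustrates the idea behind steps S602 and S603: while the apparatus moves along the optical axis, a candidate region should enlarge when the camera approaches the subject and shrink when it moves away. The bounding-box representation and the tolerance threshold are assumptions; the patent does not specify how the change in the region is measured.

    from typing import Tuple

    # A tracked object region is represented here by an axis-aligned bounding
    # box (x, y, width, height) per live-view frame (an assumed representation).
    Box = Tuple[float, float, float, float]

    def scale_consistent_with_motion(prev: Box, curr: Box,
                                     moving_forward: bool,
                                     min_ratio: float = 1.05) -> bool:
        # While the apparatus moves along the optical axis, a candidate region
        # should enlarge when the camera approaches the subject and shrink when
        # it backs away. min_ratio is an assumed tolerance against noise.
        prev_area = prev[2] * prev[3]
        curr_area = curr[2] * curr[3]
        if prev_area <= 0:
            return False
        ratio = curr_area / prev_area
        return ratio >= min_ratio if moving_forward else ratio <= 1.0 / min_ratio

    # Moving toward the subject: the box grows, so the object stays a candidate.
    print(scale_consistent_with_motion((10, 10, 40, 40), (8, 8, 48, 48), True))  # True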
Returning to Figure 23, the description continues from step S512. In step S512, when the live view image displayed by the display unit 7 is image data in the vicinity of the contrast peak and matches the features of the main subject set by the main subject setting unit 104 (step S512: Yes), the imaging control unit 107 stores the image data of the live view image displayed by the display unit 7 in the image data storage unit 91 (step S513). Thereafter, the imaging apparatus 200 proceeds to step S514 described later. On the other hand, when the live view image displayed by the display unit 7 is image data in the vicinity of the contrast peak but does not match the features of the main subject set by the main subject setting unit 104 (step S512: No), the imaging apparatus 200 proceeds to step S514 described later.
Since steps S514 to S518 correspond to steps S111 to S113, S117, and S118 shown in Figure 4, respectively, their description is omitted.
Next, the case where the imaging apparatus 200 is not set to the shooting mode (step S501: No) but is set to the playback mode (step S519: Yes) will be described. In this case, the imaging apparatus 200 executes steps S520 to S524. Since steps S520 to S524 correspond to steps S120 to S124 shown in Figure 4, respectively, their description is omitted.
The case where the imaging apparatus 200 is not set to the playback mode in step S519 (step S519: No) will now be described. In this case, the imaging apparatus 200 proceeds to step S518.
According to Embodiment 3 of the present invention described above, when the peak of the contrast detected by the contrast detection unit 211 is detected, the main subject setting unit 104 sets the main subject candidate detected by the main subject candidate detection unit 103 in the live view image displayed on the display unit 7 as the main subject, and images in which the main subject set by the main subject setting unit 104 is located in a predetermined area are detected from the image group stored in the temporary storage unit 93. Thus, an image with the shooting composition desired by the photographer can be reliably obtained, and an image in which the main subject is in focus can be obtained.
(Other Embodiments)
In the claims, the specification, and the drawings, the flow of operations of the above embodiments has been described using expressions such as "first" and "next" for convenience; however, this does not mean that the operations must be performed in that order.
In the above embodiments, a digital camera has been described as the imaging apparatus; however, the invention can also be applied to, for example, a digital single-lens reflex camera, a digital video camera, or an electronic device having an imaging function, such as a mobile phone or a tablet portable device.

Claims (7)

1. An imaging apparatus, characterized in that the imaging apparatus has:
an imaging unit that continuously generates image data of a subject;
a movement direction determination unit that determines a movement direction in which the imaging apparatus moves;
a movement detection unit that detects movement, within the image, of the subject included in a plurality of pieces of the image data; and
a main subject determination unit that determines the subject to be a main subject in accordance with the movement direction determined by the movement direction determination unit and the in-image movement of the subject detected by the movement detection unit.
2. The imaging apparatus according to claim 1, characterized in that the imaging apparatus has an imaging control unit that captures an image in which the main subject determined by the main subject determination unit is located in a predetermined area.
3. The imaging apparatus according to claim 1, characterized in that, when the direction of the movement of the subject repeatedly coincides with the movement direction determined by the movement direction determination unit, the main subject determination unit determines the main subject candidate to be the main subject.
4. The imaging apparatus according to claim 2, characterized in that the imaging apparatus has a display unit that displays, in an identifiable manner, an image in which the subject determined to be the main subject by the main subject determination unit is located in a predetermined portion of the frame.
5. The imaging apparatus according to claim 4, characterized in that the imaging apparatus further has:
an input unit that accepts input of a selection signal for selecting an image from at least a part of the plurality of images displayed by the display unit; and
an image data storage unit that stores the image data of the image corresponding to the selection signal accepted by the input unit.
6. The imaging apparatus according to claim 1, characterized in that
the imaging apparatus further has a contrast detection unit that detects contrast from the image, and
the movement direction determination unit determines the movement direction in accordance with a change in the contrast detected by the contrast detection unit.
7. An imaging method executed by an imaging apparatus, characterized in that the imaging method executes the following steps:
an imaging step of continuously generating image data of a subject;
a movement direction determination step of determining a movement direction in which the imaging apparatus moves;
a movement detection step of detecting movement, within the image, of the subject included in a plurality of pieces of the image data; and
a main subject determination step of determining the subject to be a main subject in accordance with the movement direction determined in the movement direction determination step and the in-image movement of the subject detected in the movement detection step.
CN201210210469.9A 2011-06-24 2012-06-20 Camera head, image capture method Expired - Fee Related CN102843512B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610326097.4A CN105827985B (en) 2011-06-24 2012-06-20 Photographic device, image capture method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011141146A JP5800600B2 (en) 2011-06-24 2011-06-24 Imaging apparatus, imaging method, and program
JP2011-141146 2011-06-24

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201610326097.4A Division CN105827985B (en) 2011-06-24 2012-06-20 Photographic device, image capture method

Publications (2)

Publication Number Publication Date
CN102843512A true CN102843512A (en) 2012-12-26
CN102843512B CN102843512B (en) 2016-05-25

Family

ID=47370543

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201210210469.9A Expired - Fee Related CN102843512B (en) 2011-06-24 2012-06-20 Camera head, image capture method
CN201610326097.4A Expired - Fee Related CN105827985B (en) 2011-06-24 2012-06-20 Photographic device, image capture method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201610326097.4A Expired - Fee Related CN105827985B (en) 2011-06-24 2012-06-20 Photographic device, image capture method

Country Status (2)

Country Link
JP (1) JP5800600B2 (en)
CN (2) CN102843512B (en)

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN104796597A (en) * 2014-01-17 2015-07-22 奥林巴斯映像株式会社 Display equipment and display method
CN105704393A (en) * 2014-12-12 2016-06-22 卡西欧计算机株式会社 Image-capturing apparatus and image-capturing direction control method

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
JP5929774B2 (en) * 2013-02-08 2016-06-08 カシオ計算機株式会社 Image acquisition method, apparatus, and program

Citations (8)

Publication number Priority date Publication date Assignee Title
US20040189829A1 (en) * 2003-03-25 2004-09-30 Fujitsu Limited Shooting device and shooting method
JP2008278480A (en) * 2007-04-02 2008-11-13 Sharp Corp Photographing apparatus, photographing method, photographing apparatus control program and computer readable recording medium with the program recorded thereon
CN101489037A (en) * 2008-01-16 2009-07-22 佳能株式会社 Imaging apparatus and its control method
CN101547310A (en) * 2008-03-24 2009-09-30 奥林巴斯映像株式会社 Pick-up device
CN101770157A (en) * 2008-12-26 2010-07-07 奥林巴斯映像株式会社 Camera and camera system
CN101867725A (en) * 2009-01-23 2010-10-20 卡西欧计算机株式会社 Camera head and reference object tracking
CN101931746A (en) * 2009-06-18 2010-12-29 奥林巴斯映像株式会社 Camera head and image capture method
CN102075674A (en) * 2009-11-25 2011-05-25 奥林巴斯映像株式会社 Imaging apparatus

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
EP1536633A1 (en) * 2003-11-27 2005-06-01 Sony Corporation Photographing apparatus and method, supervising system, program and recording medium
JP4398346B2 (en) * 2004-10-29 2010-01-13 オリンパス株式会社 Camera, photographing sensitivity control method, shutter speed control method, and photographing sensitivity control program
JP4290164B2 (en) * 2006-01-31 2009-07-01 キヤノン株式会社 Display method for displaying display showing identification area together with image, program executed by computer apparatus, and imaging apparatus
JP2010021598A (en) * 2008-07-08 2010-01-28 Victor Co Of Japan Ltd Image capturing apparatus and method
CN101739551B (en) * 2009-02-11 2012-04-18 北京智安邦科技有限公司 Method and system for identifying moving objects
JP5205337B2 (en) * 2009-06-18 2013-06-05 富士フイルム株式会社 Target tracking device, image tracking device, operation control method thereof, and digital camera
JP5397078B2 (en) * 2009-08-11 2014-01-22 株式会社ニコン Imaging device
JP5483953B2 (en) * 2009-08-18 2014-05-07 キヤノン株式会社 Focus adjustment device, focus adjustment method and program
JP2011082770A (en) * 2009-10-06 2011-04-21 Canon Inc Data generation apparatus, method of controlling the same, and program

Patent Citations (9)

Publication number Priority date Publication date Assignee Title
US20040189829A1 (en) * 2003-03-25 2004-09-30 Fujitsu Limited Shooting device and shooting method
JP2008278480A (en) * 2007-04-02 2008-11-13 Sharp Corp Photographing apparatus, photographing method, photographing apparatus control program and computer readable recording medium with the program recorded thereon
CN101489037A (en) * 2008-01-16 2009-07-22 佳能株式会社 Imaging apparatus and its control method
CN101547310A (en) * 2008-03-24 2009-09-30 奥林巴斯映像株式会社 Pick-up device
CN101770157A (en) * 2008-12-26 2010-07-07 奥林巴斯映像株式会社 Camera and camera system
CN101867725A (en) * 2009-01-23 2010-10-20 卡西欧计算机株式会社 Camera head and reference object tracking
CN101867725B (en) * 2009-01-23 2013-06-05 卡西欧计算机株式会社 Imaging apparatus and subject tracking method
CN101931746A (en) * 2009-06-18 2010-12-29 奥林巴斯映像株式会社 Camera head and image capture method
CN102075674A (en) * 2009-11-25 2011-05-25 奥林巴斯映像株式会社 Imaging apparatus

Cited By (4)

Publication number Priority date Publication date Assignee Title
CN104796597A (en) * 2014-01-17 2015-07-22 奥林巴斯映像株式会社 Display equipment and display method
CN104796597B (en) * 2014-01-17 2018-02-13 奥林巴斯株式会社 Display device and display methods
CN105704393A (en) * 2014-12-12 2016-06-22 卡西欧计算机株式会社 Image-capturing apparatus and image-capturing direction control method
CN105704393B (en) * 2014-12-12 2018-01-19 卡西欧计算机株式会社 The control method and recording medium of camera, photography direction

Also Published As

Publication number Publication date
CN105827985A (en) 2016-08-03
JP2013009204A (en) 2013-01-10
JP5800600B2 (en) 2015-10-28
CN105827985B (en) 2019-02-05
CN102843512B (en) 2016-05-25

Similar Documents

Publication Publication Date Title
CN103731595B (en) Electronic equipment and its driving method
US7623774B2 (en) Devices and methods for determining orientation of a camera
JP6219017B2 (en) OLED display device with light detection function
CN101453564B (en) Method and apparatus for image capturing
CN103813093B (en) Photographic device and its image capture method
CN110463185B (en) Image pickup apparatus, image pickup method, and storage medium
CN103248813A (en) Photographic equipment and operating control method thereof
CN103220457A (en) Image photographing device and display method
CN110445914A (en) A kind of terminal device
CN102498712A (en) Control device, image-capturing system, control method, and program
CN103999447A (en) Pre-setting the foreground view of a photograph
CN104902187A (en) Self-portrait control method and control system for mobile terminal
CN104702848B (en) Show the method and device of framing information
CN102843512A (en) Photographic device, photographic method
EP2653918A2 (en) Photographing apparatus for recognizing type of external device, method of controlling the photographing apparatus, and the external device
CN105824569A (en) Method and device for adjusting photographing focus and wearable equipment
CN109640075A (en) A kind of image pick-up detection device, method and mobile terminal
EP3885723A1 (en) Imager, user equipment, method for acquiring ambient light, and storage medium
CN102164243B (en) Method and apparatus for reducing continuous autofocus power consumption
CN102316256B (en) Portable device and reproduction display method
JP2015114880A (en) Display device with touch panel
KR101514510B1 (en) Portable electronic device
JP5558233B2 (en) Imaging apparatus and direction setting method
KR101643611B1 (en) A digital photographing apparatus, a method for controlling the same, and a computer-readable storage medium
KR101653269B1 (en) A digital photographing device and a method for controlling a digital photographing device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20151130

Address after: Tokyo, Japan

Applicant after: Olympus Corporation

Address before: Tokyo, Japan

Applicant before: Olympus Imaging Corp.

C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160525

Termination date: 20210620

CF01 Termination of patent right due to non-payment of annual fee