CN103312979B - Shooting device and control method thereof - Google Patents


Info

Publication number
CN103312979B
CN103312979B (application CN201310082769.8A)
Authority
CN
China
Prior art keywords
mentioned
subject
region
touch
touch operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201310082769.8A
Other languages
Chinese (zh)
Other versions
CN103312979A (en)
Inventor
本间伸祐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Publication of CN103312979A publication Critical patent/CN103312979A/en
Application granted granted Critical
Publication of CN103312979B publication Critical patent/CN103312979B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)
  • Focusing (AREA)
  • Indication In Cameras, And Counting Of Exposures (AREA)

Abstract

The invention provides a shooting device and a control method thereof that can focus on a subject even when the subject has moved by the time a touch is made. In the control method, the shooting device displays the subject image on a display section 7 and detects touch operations on the display section 7. The method detects whether a touch operation has been performed (S1). When a touch operation has been performed, it is determined whether the subject in the touched region was stationary or moving during the period before the touch. When the subject is determined to have been moving, the subject that was in the touched region just before is tracked (S11), the region where the tracked subject is present is focused, and the shutter is then released.

Description

Imaging device and control method of imaging device
Technical field
The present invention relates to an imaging device, and more specifically to an imaging device that performs shooting when the display screen showing a live view of the subject image is touched, and to a control method for such an imaging device.
Background art
Conventionally, cameras and other imaging devices release the shutter in response to the operation of members such as a release button; it has also been proposed, however, to release the shutter by touching the display screen of an LCD (see Patent Document 1). In the camera disclosed in Patent Document 1, to prevent shake when the screen is touched, the shutter is released after a fixed delay from the moment the touch panel is touched.
[Patent Document 1] Japanese Unexamined Patent Publication No. 11-252427
When the shutter is released upon a touch of the display screen, focusing on the touched region allows the subject the user wants to be brought into focus, which makes the device easy to use. However, a delay of about 0.5 seconds arises between the moment the user decides to touch the display screen and the moment the touch actually occurs. Consequently, when the subject is moving, the subject the user wants may no longer be in the touched region at the time of the touch. In that case, even if the touched region is focused, a subject the user does not want may be brought into focus instead.
Summary of the invention
The present invention has been made in view of the above problem, and its object is to provide an imaging device and a control method for an imaging device that can focus on the subject the user wants even when the subject has moved by the time the touch is made.
To achieve the above object, an imaging device according to a first aspect of the invention has: an imaging section that photoelectrically converts a subject image and outputs image data; a display section that displays the subject image based on the image data output from the imaging section; a touch panel that detects touches on the display screen of the display section; a determining section that determines the subject to be the focusing target of the photographing lens based on a touch operation on the touch panel and on the motion of the subject image detected from the image data; and an automatic focus adjustment section that focuses on the subject determined by the determining section.
In an imaging device according to a second aspect, in the first aspect, the touch panel is divided into a plurality of regions, and the determining section takes as the focusing target the subject that was present in the touched region a fixed time before the touch operation.
In an imaging device according to a third aspect, in the first aspect, the touch panel is divided into a plurality of regions, and the determining section takes as the focusing target the subject that has been continuously present in the touched region from a fixed time ago until now.
In an imaging device according to a fourth aspect, in the first aspect, the touch panel is divided into a plurality of regions, and the determining section takes as the focusing target the subject that was present in the touched region a fixed time ago.
In an imaging device according to a fifth aspect, in the first aspect, the touch panel is divided into a plurality of regions, and the determining section takes as the focusing target the subject that was present in the touched region for the longest time during a certain fixed period.
In an imaging device according to a sixth aspect, in the first aspect, the determining section takes as the focusing target a subject moving within a rectangle whose diagonal is defined by two touched points.
In an imaging device according to a seventh aspect, in the first aspect, when a slide operation is performed on the touch panel, the determining section takes as the focusing target the subject present in the region at the end point of the slide operation.
In an imaging device according to an eighth aspect, in the first aspect, when the touch operation continues for a third fixed time or longer, the determining section takes the touched region itself as the focusing target.
An imaging device according to a ninth aspect has: an imaging section that photoelectrically converts a subject image and outputs image data; a display section that displays the subject image based on the image data output from the imaging section; a touch panel that is divided into a plurality of regions and detects touches on the display screen of the display section; a determining section that determines the AF region to be the focusing target of the photographing lens based on the motion of the subject from a fixed time before the touch operation on the touch panel; and an automatic focus adjustment section that focuses on the region determined by the determining section.
In an imaging device according to a tenth aspect, in the ninth aspect, when the subject in the region has been stationary since the fixed time ago, the determining section sets the touched area of the touch operation as the AF region; when the subject that was present in the region at the fixed time ago has moved, the determining section tracks that subject to determine the AF region.
A control method according to an eleventh aspect is for an imaging device that displays a subject image on a display section and can detect touch operations on the display section, and has the following steps: a detecting step of detecting whether a touch operation has been performed; a determining step of judging, when a touch operation has been detected in the detecting step, whether the subject in the region of the touch operation was stationary during a fixed time before the touch; a tracking step of tracking, when the determining step finds that the subject in the region of the touch operation was not stationary, the subject that was present in the region of the touch operation just before; and a shooting step of focusing on the region where the subject tracked in the tracking step is present and releasing the shutter.
According to the invention, it is possible to provide an imaging device and a control method for an imaging device that can focus on the subject the user wants even when the subject has moved by the time the touch is made.
Description of the drawings
Fig. 1 shows the structure of a digital camera according to a first embodiment of the present invention; Fig. 1(a) is a block diagram showing the main electrical structure, and Fig. 1(b) is a plan view showing the arrangement of the regions into which the imaging surface of the image sensor is divided.
Fig. 2 is a flowchart showing the shooting operation of the digital camera of the first embodiment of the present invention.
Fig. 3 is a diagram showing changes in the motion of the subject in each region in the digital camera of the first embodiment of the present invention.
Fig. 4 is a diagram showing changes in the motion of the subject in each region in the digital camera of the first embodiment of the present invention.
Fig. 5 is a flowchart showing the shooting operation of the digital camera of a second embodiment of the present invention.
Fig. 6 is a diagram showing changes in the motion of the subject in each region in the digital camera of the second embodiment of the present invention.
Fig. 7 is a flowchart showing the shooting operation of the digital camera of a third embodiment of the present invention.
Fig. 8 is a flowchart showing the search operation of the digital camera of the third embodiment of the present invention.
Fig. 9 is a timing chart for calculating the focus position from vectors in the digital camera of the third embodiment of the present invention.
Fig. 10 is a diagram showing changes in the motion of the subject in each region in the digital camera of the third embodiment of the present invention.
Fig. 11 is a flowchart showing the shooting operation of the digital camera of a fourth embodiment of the present invention.
Description of reference numerals
1: photographing lens; 3: imaging section; 3A: image sensor; 3Aa: region; 3Ab: aggregate region; 5: image processing section; 7: display section; 9: recording section; 11: bus; 13: lens control section; 15: tracking section; 17: touch panel section; 21: CPU; 23: timer; 25: memory.
Specific embodiment
Hereinafter, a preferred embodiment is described using a digital camera to which the present invention is applied. The digital camera of the preferred embodiment has an imaging section that converts a subject image into image data and, based on the converted image data, displays a live view of the subject image on a display section arranged on the back of the camera body or the like. The photographer determines the composition and the shutter timing by observing the live view display.
In addition, a touch panel section is provided on the display section, and the user touches the touch panel section once the composition and the shutter timing have been decided. When the touch is made, the subject present in the touched area is focused and exposed. At this time, taking into account the delay until the user's touch actually occurs and the motion of the subject, control is performed using the image data of the AF region where the subject the user wants is present. The shutter release operation is then performed, image processing is applied to the image data output from the imaging section, and the processed image data is recorded on a recording medium. When playback mode is selected, the image data recorded on the recording medium can be reproduced and displayed on the display section.
(the 1st embodiment)
Fig. 1(a) is a block diagram showing the main electrical structure of the digital camera of the first embodiment of the present invention. In Fig. 1(a), the photographing lens 1 is an optical lens that forms the subject image, and includes a focus lens, a zoom lens, and the like.
An imaging section 3 containing an image sensor 3A is arranged on the optical axis of the photographing lens 1, near the position where the subject image is formed. The imaging section 3 photoelectrically converts the subject image with the image sensor, outputs image data in digital form to a bus 11, and also receives control signals and the like from a CPU 21. In this embodiment, the image data is output at 60 frames per second; of course, the rate may be higher or lower than 60 frames.
Fig. 1(b) shows the plurality of regions into which the imaging surface of the image sensor 3A is divided. In this embodiment, the imaging surface of the image sensor 3A is divided into 30 regions 3Aa, (A, a) to (F, f). Each region 3Aa is a small rectangle containing a two-dimensional array of many pixels, each of which outputs a photoelectric conversion signal. Each region 3Aa is a local-vector calculation region, and its rectangle is sized to suit AF (Auto Focus). A tracking section 15, described later, calculates the amount and direction of motion of a subject moving within a region 3Aa (for example, a person's face) as a local vector. The calculation is based on the following known method: finding the shift direction and shift amount needed to make the images of successive frames output from each region of the image sensor 3A coincide. Specifically, the same region is cut out from two temporally adjacent frames; the cut-out region is shifted by one pixel and the per-pixel differences are accumulated, then shifted by two pixels and the differences accumulated again, and so on, integrating the per-pixel differences while shifting the pixels step by step. The shift direction and shift amount at which the integrated value becomes minimum give the amount and direction of motion of the subject. Further, by repeating this calculation while advancing through successive pairs of temporally adjacent frames, the subject can be tracked.
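The shift-and-accumulate search described above can be sketched in a few lines of Python. This is only an illustrative reconstruction of the known block-matching method the paragraph outlines, not the patent's implementation; the function name, the use of a mean absolute difference over the overlapping pixels, and the ±2-pixel search window are all assumptions.

```python
def local_vector(prev, curr, max_shift=2):
    """Estimate a region's motion vector by block matching: try each
    candidate shift of the previous frame, accumulate per-pixel absolute
    differences against the current frame over the overlapping pixels,
    and return the shift whose mean difference is minimal."""
    h, w = len(prev), len(prev[0])
    best = None
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            total, count = 0, 0
            for y in range(h):
                for x in range(w):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < h and 0 <= sx < w:
                        total += abs(prev[y][x] - curr[sy][sx])
                        count += 1
            mean = total / count
            if best is None or mean < best[0]:
                best = (mean, dx, dy)
    return best[1], best[2]  # (dx, dy) with the minimum mean difference
```

For a bright spot that moves one pixel to the right between two frames, the function returns (1, 0); repeating the search over successive frame pairs, as the paragraph describes, yields a track of the subject's motion.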
The aggregate 3Ab of the regions 3Aa forms a large rectangle and serves as the global-vector calculation region. The global-vector calculation computes the amount and direction of motion of the whole screen, that is, the motion of the entire picture caused when the whole digital camera moves due to the user's hand shake or the like.
In calculating the local vectors and the global vector described above, the luminance information and color-difference information from the image data output from the imaging section 3 are used, and the calculation is performed by the tracking section 15 described later or by the CPU 21. The rectangle size of each region 3Aa and the number of rectangles used in the local-vector calculation are set according to factors such as the software processing capability of the CPU 21 and the hardware circuit scale and DRAM bandwidth of the image processing section 5. In addition to these factors, the rectangle size may also be determined according to the subject the photographer wants; for example, an optimal size may be chosen for subjects such as a face, a bird, an insect, or a mountain.
Returning to Fig. 1(a), the image processing section 5 receives the image data output from the imaging section 3 and the image data temporarily stored in the memory 25 and the like, performs various kinds of image processing, and outputs the result to the bus 11. For example, it performs basic image processing such as white balance, extraction of contrast information, compression and decompression of image data, and image processing for live view display.
The display section 7 has a liquid crystal display (LCD), an organic EL display, or the like arranged on the back of the camera body, receives image data from the imaging section 3 via the bus 11, and performs live view display, playback display, and menu display. The recording section 9 includes a built-in or removable storage medium, records image data, and also reads out image data that has been recorded.
The lens control section 13 is connected to the CPU 21, the image processing section 5, and the like via the bus 11, and drives the focus lens in the photographing lens 1. Under the control of the CPU 21, the lens control section 13 drives the focus lens so that the contrast of the image data extracted by the image processing section 5 reaches a peak, thereby focusing the photographing lens 1.
The tracking section 15 is connected to the image processing section 5, the CPU 21, and the like via the bus 11, and tracks the subject. Here, the tracking operation is the following known operation: even when the tracked subject moves, the device keeps following it so that the destination of the motion is focused and the exposure remains correct. As a tracking method, for example, the images from each region 3Aa are compared frame by frame and the destination of the motion is followed. For this tracking, the local-vector and global-vector calculations described above are performed (part or all of them may be performed by the CPU 21). Further, based on the local vector of each region 3Aa, or, as described later using Fig. 3, based on a comparison between the current image data and the image data from a fixed time before in each region 3Aa, it is judged whether the subject in each region 3Aa is moving or stationary, and the judgment result is stored in the memory 25.
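The per-region moving/stationary judgment can be illustrated with a minimal sketch. The thresholded mean absolute difference used here is an assumption; the text only states that the current image data of a region is compared with its image data from a fixed time before.

```python
def region_is_moving(frame_a, frame_b, threshold=10.0):
    """Judge one region 3Aa as moving when the mean absolute per-pixel
    difference between its image data in two frames exceeds a threshold."""
    total = sum(abs(a - b)
                for row_a, row_b in zip(frame_a, frame_b)
                for a, b in zip(row_a, row_b))
    pixels = len(frame_a) * len(frame_a[0])
    return total / pixels > threshold
```

Running this judgment for every region on each new frame, and appending the boolean results to a history, gives exactly the per-region stationary/moving record that step S3 of the Fig. 2 flow stores in the memory 25.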
The touch panel section 17 is arranged in front of, or formed integrally with, the display of the display section 7 such as the liquid crystal display (LCD) or organic EL display. It detects that the user has touched the display screen of the display section 7 with a finger or the like, and outputs a touch detection signal to the bus 11. Since the touch detection signal indicates the touch position, the section can detect which area the user touched, and can also detect the start point, end point, and the like when a slide operation is performed. Further, the touch surface of the touch panel section 17 is divided into a plurality of regions, so that it can detect which region the user's touch operation fell in.
The CPU 21 is a central processing unit that controls the entire digital camera according to a program stored in the memory 25. For example, it functions as the following determining section: when the user performs a touch operation on the touch panel section 17, it detects the motion of the subject image from the image data from the imaging section 3 and determines the subject to be focused and exposed. It can also function as a determining section that determines the AF region to be the focusing target of the photographing lens 1 based on the motion of the subject from a fixed time before the touch operation on the touch panel section 17. The CPU 21 further functions as the automatic focus adjustment section, which drives the focus lens of the photographing lens 1 so as to focus on the determined subject according to the contrast information extracted by the image processing section 5.
The timer 23 performs timing operations. For example, when a predetermined fixed time has elapsed after a timing start signal is received from the CPU 21, it outputs elapsed-time information to the CPU 21.
The memory 25 includes electrically rewritable nonvolatile memory such as flash memory, and electrically rewritable volatile memory such as DRAM (Dynamic Random Access Memory) and SDRAM (Synchronous Dynamic Random Access Memory). Besides storing the program that makes the CPU 21 operate as described above, the memory 25 also stores various adjustment values set at factory shipment. It also temporarily stores the image data output from the imaging section 3; in this embodiment, the image data of the most recent 0.5 seconds is stored (at 60 fps, 30 frames' worth of image data). Further, as described above, the tracking section 15 judges for each region 3Aa whether the subject is moving or stationary, so the history of these judgment results is also stored.
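The 0.5-second store of frames and judgment history behaves like a ring buffer. The class and its interface below are illustrative assumptions; only the 30-frame depth at 60 fps and the stored per-region moving/stationary history come from the text.

```python
from collections import deque

class FrameHistory:
    """Sketch of the 0.5 s buffer in the memory 25: at 60 fps, keep the
    last 30 frames and, per region, the last 30 moving/stationary
    judgments (True = the region's subject moved in that frame)."""
    def __init__(self, depth=30):
        self.frames = deque(maxlen=depth)
        self.motion = {}  # region id -> deque of booleans

    def push(self, frame, region_motion):
        self.frames.append(frame)
        for region, moving in region_motion.items():
            self.motion.setdefault(
                region, deque(maxlen=self.frames.maxlen)).append(moving)

    def was_stationary(self, region):
        """True when the region's subject never moved in the stored history."""
        return not any(self.motion.get(region, ()))
```

Pushing a 31st frame silently discards the oldest, so the buffer always holds at most the last 0.5 seconds, matching the continuous update described for step S3.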
Next, the operation of this embodiment is described using Figs. 2 to 4. First, one example of a subject and the focusing performed in that case are described using Fig. 3. In Fig. 3, the vertical axis represents the passage of time: time T0 is the present, time T1 is 0.1 seconds before, time T2 is 0.3 seconds before, and time T3 is 0.5 seconds before. The image sensor 3A is divided into 20 regions, (A, a) to (E, e) (Fig. 1(b) shows a division into 30 regions; here an example divided into 20 is shown).
Taking a certain time T3, a fixed time earlier, as the reference, the tracking section 15 compares the image data at time T3 with that at time T1 for each region and judges whether the subject in each region is moving or stationary. Likewise, for each region it compares the image data at time T3 with that at time T2, and also with that at time T0, and makes the same judgment. The reference time for judging whether the subject in each region is moving or stationary is not limited to time T3; for example, time T0 (the present) may be used as the reference and compared with times T1, T2, and T3.
In the example shown in Fig. 3, at time T3 the subjects in all the regions (A, a) to (E, e) of the image sensor 3A are stationary; at time T2 only region (B, b) is moving and the other regions are stationary; and at time T1 only regions (C, c), (D, c), and (B, b) are moving and the other regions are stationary.
Now suppose that at time T0 the user performs a touch operation on region (D, b). Since there is a delay of about 0.5 seconds between recognizing the subject by eye and actually making the touch, it is generally considered that the user touched with the subject that was present in region (D, b) at time T3 (0.5 seconds before) as the intended target. In the example shown in Fig. 3, the subject present in region (D, b) at time T3 does not move at times T2, T1, or T0 and remains stationary; therefore the focusing operation is performed on the coordinates (D, b) of the touched region, the exposure is controlled to the correct value, and the shutter release operation is performed.
Thus, in the example shown in Fig. 3, it is judged whether the subject in the touched region is moving or stationary, and when the judgment result is stationary, the touched region is focused and the shutter release operation is performed directly.
Next, another example of a subject and the focusing in that case are described using Fig. 4. In the example of Fig. 3 the subject in the touched area is stationary, whereas in the example of Fig. 4 the subject in the touched area moves. In Fig. 4, as in Fig. 3, the vertical axis represents the passage of time (times T0 to T3), and the image sensor 3A consists of regions (A, a) to (E, e).
In the example of Fig. 4 as well, the image processing section 5 takes time T3 as the reference, compares for each region the image data at time T3 with that at time T1, the image data at time T3 with that at time T2, and the image data at time T3 with that at time T0, and judges whether the subject in each region is moving or stationary.
Now suppose that at time T0 the user performs a touch operation on region (C, b). As described above, there is a delay of about 0.5 seconds between recognizing the subject by eye and actually making the touch, so it is generally considered that the user touched with the subject that was present in region (C, b) at time T3 (0.5 seconds before) as the intended target. In the example shown in Fig. 4, the subject present in region (C, b) at time T3 is stationary at times T2 and T1 (at least during the period from 0.5 seconds before to 0.1 seconds before), but moves during the period from T1 to T0 (from 0.1 seconds before to the present) and moves to region (D, b). Therefore, the focusing operation is performed on the coordinates (D, b) of the destination region (also called the AF region), the exposure is controlled to the correct value, and the shutter release operation is performed.
Thus, in the example shown in Fig. 4, it is judged whether the subject in the touched region was stationary until just before, the focusing operation and so on are performed on the region where that subject now exists, and the shutter release operation is performed. That is, in order to perform the focusing operation and so on on the subject that was stationary in the touch area (C, b) until just before, at time T1, the region in which AF is detected is changed from the touch area (C, b) to the AF region (D, b), taking into account the visual delay and the motion of the subject.
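The decision illustrated by Figs. 3 and 4, namely focus on the touched region when its subject stayed still for the whole delay window and otherwise follow the subject to its destination, reduces to a small selection rule. This sketch uses assumed data structures: `motion_history` maps a region to its recent moving/stationary booleans, and `track` stands in for the tracking section 15.

```python
def select_af_region(touched, motion_history, track):
    """Fig. 3 case: the touched region's subject never moved in the
    window, so the AF region is the touched region itself.
    Fig. 4 case: it moved, so the AF region is wherever tracking says
    the subject went (e.g. (C, b) -> (D, b))."""
    if not any(motion_history[touched]):
        return touched
    return track(touched)
```

With a history showing (C, b) moving in the last frame and a tracker reporting the destination (D, b), the rule reproduces the AF-region change described above.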
Next, the operation of this embodiment is described using the flowchart shown in Fig. 2. This flow is executed by the CPU 21 according to the program stored in the memory 25, and starts when the shooting operation starts. First, it is judged whether the touch panel section 17 has been touched (S1). When the shooting operation starts, image data is output from the imaging section 3 at a rate of 60 frames per second, and live view display is performed on the display section 7 based on that image data. In this step, whether the user has touched the display screen of the display section 7 is judged from the touch detection signal from the touch panel section 17.
When the judgment result in step S1 is that no touch has been made, the monitoring result of each local region is recorded (S3). As described above, whether the subject is moving or stationary is judged for each region 3Aa, and the judgment result is recorded in the memory 25. At the same time, the image data corresponding to the most recent 0.5 seconds (30 frames' worth) in the memory 25 is updated and temporarily stored. When the recording of the image data and of the stationary/moving state of the subject in each region has been performed in step S3, the flow returns to step S1.
When the judgment result in step S1 is that a touch has been made, it is next judged whether the subject in the touched region has moved (S7). In step S3, the history of the stationary/moving state of the subject in each region has been recorded frame by frame. In this step, using that history, it is judged whether the subject in the area corresponding to the touched coordinates is moving or has remained stationary. In this judgment, the result is No if the subject did not move at any of the times remaining in the history, and Yes if it moved at at least one time.
When the judgment result in step S7 is that the subject in the touched region has moved, it is next judged whether the subject was stationary for a first fixed time before the touch (S9). Here, using the image data of the most recent predetermined period temporarily stored in step S3, it is judged whether the subject was stationary during the first fixed time before the touch. In this embodiment, the first fixed time is 0.5 seconds, and it is judged whether the subject in the touched region was stationary from the present (T0) back to 0.5 seconds before (T3). In the example described using Fig. 3, the subject was stationary from 0.5 seconds before until now (the first fixed time), so the result is Yes. On the other hand, in the example described using Fig. 4, the subject was stationary from 0.5 seconds before until 0.1 seconds before but moved during the period from 0.1 seconds before until now, so the result is No.
When the judgment result in step S9 is that the subject was not stationary for the fixed time before the touch, the subject that was present in the touched region until just before is tracked (S11). Here, since the subject that was present in the region touched by the user has moved, that subject is tracked. For example, as in the example shown in Fig. 4, the subject that was stationary in the touch area (C, b) during times T3 to T1 is tracked. As the tracking method, the local vectors may be calculated by the tracking section 15 and used for tracking; alternatively, the direction and amount of motion may be obtained by comparing the image data of the touched region with the image data of the adjacent regions, and the subject tracked accordingly. The region where the tracked subject is present is set as the AF region.
When the judgment result in step S7 is No, when the judgment result in step S9 is Yes, or after the subject has been tracked in step S11, focusing is performed and the shutter is released (S21). When the judgment in step S7 is No and when the judgment in step S9 is Yes, the subject the user intends is present in the touched region in a stationary state, so the focusing operation is performed on that region. On the other hand, when the subject has been tracked in step S11, the focusing operation is performed on the AF region where the tracked subject is present.
The focusing operation performed in step S21 is contrast-AF focusing using the image data from the selected region. In addition, the subject luminance is calculated using the image data from the selected region, and the exposure control values are calculated so as to obtain the correct exposure. When the focusing operation and the calculation of the exposure control values are finished, the shutter release operation is performed. When the shutter release operation ends, this flow ends.
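The branch structure of steps S7 to S21 can be condensed into one function. This is a sketch under stated assumptions: `history` holds per-region moving/stationary booleans filled by the S1/S3 monitoring loop (True = moved in that frame, newest last), and `tracker` stands in for step S11; the real flow then focuses the returned region and releases the shutter.

```python
def shooting_flow(touched_region, history, tracker):
    """Condensed sketch of steps S7-S21 of the Fig. 2 flowchart."""
    moved_ever = any(history[touched_region])                  # S7
    if not moved_ever:
        return touched_region                  # stationary: focus the touch area
    stationary_last_half_second = not any(
        history[touched_region][-30:])         # S9: 30 frames = 0.5 s at 60 fps
    if stationary_last_half_second:
        return touched_region
    return tracker(touched_region)             # S11: AF region follows the subject
```

A subject that never moved, or moved only outside the 0.5-second window, yields the touched region itself; a subject that moved within the window yields the tracker's destination region.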
Thus, in the first embodiment of the present invention, taking into account the delay (the first fixed time) between the moment the user decides to touch and the moment the touch actually occurs, the motion of the subject during that delay is judged for the touched region, and the region on which the focusing operation is performed is selected accordingly. Therefore, even for a moving subject, the subject the user intends can be focused.
Additionally, the step of the flow process of present embodiment in S3, every time during the view data of the amount of one frame of output, pressing The result of determination of the static/motion of subject is stored according to each region in memorizer 25, in the step s 7, is adopted The history for storing in step s3, whether there is the subject with motion in judging touched region.Therefore, When touch surface plate portion 17 is touched, promptly can be judged.
Additionally, the step of the flow process of present embodiment in S9, the static/motion of the subject in determinating area 3Aa When, respectively each view data now and before 0.1 second, before 0.3 second, before 0.5 second is compared, so as to judge Whether moved before 0.1 second, whether moved before 0.3 second, whether moved before 0.5 second.Therefore, it is possible to shorten The calculating time.
In addition, as described above, the CPU 21 has a function as a determining section that determines the subject to be focused; the determining section focuses on the subject that was present in the touched region before the delay time of the touch operation (the 1st certain time) (see S9 of Fig. 2). Furthermore, the determining section focuses on the subject that has been present in the touched region from the certain time before until now (see S9 of Fig. 2, and see Fig. 3).
(the 2nd embodiment)
Next, the 2nd embodiment of the present invention will be described using Fig. 5 and Fig. 6. In the 1st embodiment, whether a stationary subject was present in the touch area during the 1st certain time is determined, and when the subject has moved, that subject is tracked. In contrast, in the 2nd embodiment, it is further determined whether a subject that was stationary until the 2nd certain time before (1st certain time > 2nd certain time) is present in the touch area, and the tracking method is changed according to that result.
Since the structure of the present embodiment is the same as the block diagram shown in Fig. 1 of the 1st embodiment, detailed description is omitted. First, the operation of the present embodiment will be described using Fig. 6. The example shown in Fig. 6 illustrates the tracking method in the case where a subject has been moving since before the 2nd certain time. In Fig. 3 and Fig. 4 the vertical axis is time, but in Fig. 6 the horizontal axis is time. Time T0 indicates the present moment (the moment of the touch), time T11 is the moment corresponding to 28 frames before, time T12 is the moment corresponding to 29 frames before, and time T13 is the moment corresponding to 30 frames before (about 0.5 second before).
At time T0, the region (B, b) (the position P in the figure) is touched. In the present embodiment, the delay time from when the user intends to touch until the touch is actually performed is assumed to be about 0.5 second. The subject that was present in region (B, b) at time T13 (0.5 second before the touch), which corresponds to this delay time, moves to region (C, c) around times T12 and T11, and by time T0 (the present) it has moved to region (E, c) (the position Q in the figure). The motion of this subject can be tracked by calculating local motion vectors in the tracking part 15.
In the present embodiment, when the subject was not stationary within the 1st certain time (for example, 0.5 second) before the touch but was stationary until the 2nd certain time (for example, 0.1 second) before the touch, tracking is performed within the adjacent regions. On the other hand, when the subject was moving even before the 2nd certain time before the touch, the tracking of the subject is performed using local motion vectors, as described using Fig. 6.
Next, the operation of the present embodiment will be described using the flowchart shown in Fig. 5. As in the 1st embodiment, this flow is executed by the CPU 21 in accordance with the program stored in the memory 25.
This flow starts when the shooting operation starts. First, whether a touch has been performed is determined (S1); when the determination result is that no touch has been performed, the monitoring result of each local region is recorded (S3), and the flow returns to step S1. When the determination result in S1 is that a touch has been performed, whether the subject in the touched region has moved is determined (S7). When this determination result is Yes, whether the subject was stationary during the 1st certain time before the touch is determined (S9). Since the processes in steps S1 to S9 are the same as in the flow shown in Fig. 2 of the 1st embodiment, detailed description is omitted.
When the determination result in S9 is that the subject was not stationary during the 1st certain time before the touch, it is then determined whether the subject was stationary until the 2nd certain time before the touch (S13). In the present embodiment, since the 1st certain time is 0.5 second and the 2nd certain time is 0.1 second, it is determined whether the subject in the touched region was stationary during the period from 0.5 second before the touch to 0.1 second before. This determination can be made using the monitoring results stored in step S3; alternatively, it can be made by comparing the current image data, as described using Figs. 3 and 4, with each of the image data from 0.1 second before, 0.3 second before, and 0.5 second before.
When the determination result in step S13 is Yes, the subject that was present in the touched region until the 2nd certain time before is tracked (S15). Since the subject was stationary in the touched region from 0.5 second before until 0.1 second before, that subject is tracked. For the tracking, local motion vectors may be used as described using Fig. 6; in the present embodiment, however, the image data of the touched region from 0.5 second before the touch is compared with the current image data of the regions adjacent to the touched region, and the tracking is performed by determining in which adjacent region the subject is present. Since the subject moved only immediately before the touch, the probability that it is present in a region adjacent to the touch area is high; moreover, compared with tracking using local motion vectors, the tracking can be performed in a short time.
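The adjacent-region tracking of S15 could be sketched as a comparison of the old touched-region patch against each of the neighbouring cells (and the touched cell itself) in the current frame, keeping the best match. The grid layout as a dict of cell coordinates and the SAD matching are illustrative assumptions.

```python
def region_sad(a, b):
    """Sum of absolute differences between two flat region patches."""
    return sum(abs(x - y) for x, y in zip(a, b))

def find_in_neighbours(old_patch, current_regions, row, col):
    """current_regions maps (row, col) to that region's current pixel data.
    Returns the cell in the 3x3 neighbourhood of the touched cell whose
    current data best matches the old touched-region patch."""
    best_cell, best_cost = None, None
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            cell = (row + dr, col + dc)
            if cell not in current_regions:
                continue  # neighbour falls outside the divided picture
            cost = region_sad(old_patch, current_regions[cell])
            if best_cost is None or cost < best_cost:
                best_cell, best_cost = cell, cost
    return best_cell
```

Only nine candidate cells are examined, which is why this path is faster than a full local-vector search over the whole picture.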
On the other hand, when the determination result in step S13 is No, the subject that was present in the touched region before the 1st certain time is tracked (S17). Since the determination result in step S13 is No and the subject has been moving since before the 2nd certain time (0.1 second before in the present embodiment), there is a probability that it has moved to a region distant from the regions adjacent to the touch area. Therefore, as described using Fig. 6, the tracking part 15 tracks the subject using the image data of each frame and obtains in which region the subject is currently present.
When the determination result in step S7 is No, or when the determination result in step S9 is Yes, or when the tracking of the subject has been performed in step S15 or S17, focusing is performed and the shutter is released (S21), as in the 1st embodiment. When the focusing operation and the shutter release operation have been performed, this flow is terminated.
Thus, in the 2nd embodiment of the present invention, the tracking method is changed between steps S15 and S17 depending on whether the subject in the touch area was stationary from the 1st certain time before (0.5 second in the present embodiment) until the 2nd certain time before (0.1 second in the present embodiment), or the subject in the touch area was moving even before the 2nd certain time. Therefore, an appropriate tracking method can be selected according to the motion of the subject during the period from the 1st certain time before the touch up to the moment of the touch (the present).
In addition, as described above, the CPU 21 has a function as a determining section that determines the subject to be focused, and the determining section focuses on the subject that was present in the touched region until the certain time before (see S15).
Note that, in the present embodiment, the 1st certain time is set to 0.5 second and the 2nd certain time to 0.1 second; however, the times are not limited to these values and may be other times, as long as the 1st certain time is roughly the delay time from when the user intends to touch until the touch is actually performed, and the 2nd certain time is shorter than the 1st certain time. The 1st and 2nd certain times may also be changed to suit the characteristics of the user. Furthermore, in the tracking method described using Fig. 6 (S17 in Fig. 5), the tracking is performed using the image data output for every frame; however, this is not limiting, and the tracking may be performed every two frames or every three frames, and may be changed as appropriate. In addition, when the subject was stationary until the 2nd certain time before, the tracking is not limited to the adjacent regions; tracking that goes beyond the adjacent regions based on accumulated local motion vectors, as described using Fig. 6, may of course also be performed.
In addition, as a 1st modification of the present embodiment, in step S13, a subject that has been stationary for the certain time or longer during the period from the present back to the 1st certain time before may be tracked. In this case, the tracking part 15 tracks the subject present in the touch area, and the determination is made according to the tracking result.
(the 3rd embodiment)
Next, the 3rd embodiment of the present invention will be described using Fig. 7 to Fig. 10. In the 1st and 2nd embodiments, the subject that was present in the touch area before the 1st certain time, which corresponds to the delay of the user's touch, is tracked and focusing is controlled onto it. In contrast, in the 3rd embodiment, the subject that has been present in the touched region for the longest time is tracked, and focusing is controlled onto it.
The structure of the present embodiment is the same as the block diagram shown in Fig. 1 of the 1st embodiment, and therefore detailed description is omitted. First, the operation of the present embodiment will be described using Fig. 9 and Fig. 10.
In the example shown in Fig. 9, time T0 indicates the moment at which the user touches the touch panel 17. Time T25 is the moment earlier than time T0 by the time corresponding to the 1st certain time. Frame image F1 is output at time T25, and thereafter frame images are output sequentially in the order of frame image F2 (time T24), frame image F3 (time T23), and frame image F4 (time T22).
In the present embodiment, local motion vectors are calculated for frame images F1, F2, F3, F4, ... with the touched region in the frame image F1 of time T25 as the reference. That is, to which region the subject that was present in the touch area at time T25 has moved in frame images F2, F3, F4, ... is obtained. Similarly, with the touched region in the frame image F2 of time T24 as the reference, local motion vectors are calculated for frame images F2, F3, F4, ...; that is, to which region the subject that was present in the touch area at time T24 has moved is obtained. Next, the motion of the subject present in the touch area is likewise obtained for frame images F3, F4, ... .
An example of the motion of the subject present in the touch area will be described using Fig. 10. Assume that the region (B, b) is touched at time T0. The 1st search is tracking with frame image F1 as the reference, and tracks the motion of the subject in the touched region (B, b) of time T25. Subject A is present in region (B, b) at time T25, in region (B, b) at time T24, in region (C, c) at time T23, in region (C, c) at time T22, and in region (D, c) at time T21. Therefore, the result of the 1st search is that the staying time of subject A in the touch area corresponds to the time of 2 frames.
The 2nd search is tracking with frame image F2 as the reference, and tracks the motion of the subject in the region (B, b) of time T24. Subject A is present in region (B, b) at time T24, in region (C, c) at time T23, in region (C, c) at time T22, and in region (D, c) at time T21. Therefore, the result of the 2nd search is that the staying time of subject A in the touch area corresponds to the time of 1 frame.
The 3rd search is tracking with frame image F3 as the reference, and tracks the motion of the subject in the region (B, b) of time T23. Subject B is present in region (B, b) at time T23, in region (C, b) at time T22, and in region (C, c) at time T21. The figure also shows the position of subject A at the same times. Although Fig. 9 shows the specific example of frame images F1 to F3 only, the tracking of the subject present in the touch area is performed in the same way up to immediately before time T0. Therefore, the result of the 3rd search is that the staying time of subject B in the touch area corresponds to the time of 1 frame.
Returning to Fig. 9, after the staying time of the subject present in the touch area has been obtained for all frame images before the touched time T0, the AF region to be focused is then determined according to the vector calculation in each frame. That is, with the above frame images F1, F2, F3, F4, ... as references, the region in which, among the subjects that have stayed in the touch area, the subject with the longest staying time is present at time T0 is determined. In the example shown in Fig. 10, there are only the three frames F1, F2, and F3; in this case, subject A has stayed in the touch area longer than subject B, and therefore the region in which subject A is present at time T0 is determined as the AF region.
When the AF region is determined, that AF region is focused. That is, the focusing of the lens 1 is performed according to the contrast signal of the image data of the determined AF region.
Next, the operation of the present embodiment will be described using the flowcharts shown in Fig. 7 and Fig. 8. In the flowchart shown in Fig. 7, steps S1 to S15 are the same as in the flow shown in Fig. 5 of the 2nd embodiment; therefore, detailed description is omitted, and only the differing step S19 is described.
When the determination result in step S7 is that the subject moved during the 1st certain time before the touch, and the determination result in S13 is that the subject was also moving before the 2nd certain time before the touch, the subject that has been present in the touch area for a long time is searched for (S19). Here, as described using Fig. 9 and Fig. 10, the subject with the longest staying time in the touch area is searched for. The detailed operation of this search for the subject will be described using Fig. 8.
Next, the flow of the search in step S19 will be described using Fig. 8. When the search flow shown in Fig. 8 is entered, first, n is set to the number of frames corresponding to the certain time (S31). In the present embodiment, the 1st certain time is 0.5 second and the frame rate is 60 frames per second; therefore, n = 30. As described above, the 1st certain time and the number of frames are merely examples, and may be determined according to the design values of the camera.
Next, the motion track with the touched region as a base point is calculated from the touched moment back to n frames before (S33). Here, as described using Fig. 9, the subject present in the touched region is tracked sequentially through frame images F1, F2, ... according to n.
Then, the result calculated in step S33 is saved to the memory area (memory 25) (S35). When the result has been saved, n = n - 1 is calculated (S37), and whether the calculated n is 0 is determined (S39). When this determination result is not 0, the flow returns to step S33, the motion track with the touched region of the next frame image as a base point is calculated, and the result is saved to the memory area.
When the determination result in step S39 is n = 0, the region in which, among the searched subjects, the subject that has been present in the touch area for the longest time is located is determined as the AF region (S41). When the AF region is determined, the search flow ends and the original flow is resumed.
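The S31 to S41 search could be condensed into the following sketch: for each base frame before the touch, the path of the subject that occupied the touched cell is examined, its consecutive staying time in the touch area is counted, and the current cell of the longest-staying subject becomes the AF region. The `tracks` structure (subject id mapped to its list of cells, oldest first) is an assumed stand-in for the per-frame vector tracking of S33.

```python
def dwell_in_touch_area(path, touch_cell):
    """Count how many consecutive frames, from the start of the path,
    the subject stayed in the touched cell."""
    count = 0
    for cell in path:
        if cell != touch_cell:
            break
        count += 1
    return count

def select_af_region(tracks, touch_cell):
    """Return the current (last) cell of the subject with the longest stay."""
    best_id = max(tracks,
                  key=lambda sid: dwell_in_touch_area(tracks[sid], touch_cell))
    return tracks[best_id][-1]
```

With the Fig. 10 example, subject A (2 frames in the touch area) beats subject B (1 frame), so subject A's current cell is returned.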
Thus, in the 3rd embodiment of the present invention, with each of the image frames from the 1st certain time before until now as a base point, the subjects present in the touch area at the touch (the present) are tracked up to now, and the region in which the subject with the longest existence time in the touch area at the touch (the present) is present is selected as the AF region. When the subject is, for example, an ear of green foxtail swaying in the wind, the ear and the background sometimes sway into and out of the touch area in the picture. In this case, in general, the ear of the foxtail is in the touch area for the longer time in most cases; therefore, the subject intended by the user can be brought into focus.
In addition, as described above, the CPU 21 has a function as a determining section that determines the subject to be focused, and the determining section focuses on the subject with the longest existence time in the touched region during a certain time (see S19).
Note that, in the present embodiment, the tracking of the subject has been performed with all frame images included in the 1st certain time as base points; however, the tracking of the subject may of course also be performed skipping frames, for example every 1 frame or every 2 frames.
(the 4th embodiment)
Next, the 4th embodiment of the present invention will be described using Fig. 11. In the 1st to 3rd embodiments, the touch is a single point; in the 4th embodiment, however, a moving body is searched for within the rectangular range formed by 2 points touched by the user, and an AF region that takes into account the motion of the moving body and the delay time of the touch is set. In addition, in the 1st to 3rd embodiments, one photographing operation is performed by setting the AF region, focusing, and releasing the shutter; in the 4th embodiment, however, after the 1st photographing has been performed, when a slide operation has been performed, focusing is performed on the slide end position and the 2nd photographing is performed.
The structure of the present embodiment is the same as the block diagram shown in Fig. 1 of the 1st embodiment; therefore, detailed description is omitted. The operation of the present embodiment will be described using Fig. 11. When shooting has started, first, whether a single-point touch has been performed is determined (S1A). Here, whether the user has touched one point of the display picture of the display part 7 is determined according to the touch detection from the touch panel 17.
When the determination result in step S1A is that no single-point touch has been performed, whether a 2-point touch has been performed is then determined (S2). When the subject moves around, it is sometimes difficult to specify the subject by a single touch. In this case, by touching 2 diagonal points of a rectangle that includes the subject moving around, the subject can be specified relatively easily. Therefore, in the present embodiment, the subject can be specified by a 2-point touch. In this step, whether the user has touched 2 points of the display picture of the display part 7 is determined according to the touch detection from the touch panel 17.
When the determination result in step S2 is that no 2-point touch has been performed, the monitoring result of each local region is recorded (S3), as in step S3 of Fig. 2 of the 1st embodiment. Here, as described above, whether the subject in each region 3Aa is moving or stationary is determined, and the determination result is stored in the memory 25. When the monitoring result has been recorded, imaging is then performed (S4). Here, the image data of the subject image for 1 frame is acquired by the image pickup part 3, and the acquired image data is temporarily stored in the memory 25. This temporary storage is kept at least during the 1st certain time.
When the determination result in step S2 is that a 2-point touch has been performed, a moving subject is searched for among the subjects entering the rectangle obtained by the 2-point touch (S5). Here, using the monitoring results of step S3, a moving subject in the region included in the rectangle obtained by the 2-point touch is retrieved.
The processes in steps S7, S9, and S11 are the same as in the flow of Fig. 2 of the 1st embodiment; therefore, detailed description is omitted, but the path from step S5 to step S9 is supplemented here.
When a moving subject is found in step S5 within the rectangle determined by the 2-point touch during the period from the 1st certain time before until now, whether the subject was stationary during the 1st certain time before the touch is determined in step S9. If this determination result is No, the moving subject is tracked in step S11, and the region in which the moving body is now located is set as the AF region. On the other hand, when no moving subject is found in step S5 within the rectangle determined by the 2-point touch, the determination result of step S9 is Yes, and the region within the rectangle is set as the AF region.
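The rectangle handling of S2/S5 could be sketched as follows: the two touched points are normalized into a rectangle, and the per-region monitoring results recorded in S3 are scanned for regions inside that rectangle flagged as moving. The data layout (a dict mapping region coordinates to a moving flag) is an illustrative assumption.

```python
def rect_from_points(p1, p2):
    """Normalize two diagonal corners into (left, top, right, bottom)."""
    return (min(p1[0], p2[0]), min(p1[1], p2[1]),
            max(p1[0], p2[0]), max(p1[1], p2[1]))

def moving_regions_in_rect(monitoring, p1, p2):
    """Return, sorted, the region coordinates inside the rectangle whose
    most recent monitoring result says the subject there is moving."""
    left, top, right, bottom = rect_from_points(p1, p2)
    return sorted((x, y) for (x, y), moving in monitoring.items()
                  if moving and left <= x <= right and top <= y <= bottom)
```

If the returned list is non-empty, the flow would proceed to the tracking of S11; if it is empty, the rectangle itself would serve as the AF region, matching the two branches described above.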
When the determination result in step S7 is No, or the determination result in step S9 is Yes, or when the tracking has been performed in step S11, the subject is then tracked while focusing, and the shutter is released (S23). Here, a moving subject is continuously tracked and focused, the exposure is made appropriate, and the shutter release operation is executed. When the shutter release operation is executed, image data is acquired from the image pickup part 3.
When the release operation has been performed, whether a slide operation has been performed is then determined (S25). When the touch panel 17 is touched, the AF region is set in consideration of the delay time of the user's touch; however, besides this automatically set AF region, the user sometimes also wants to set an AF region manually. In this case, the finger that performed the touch slides, as it is, to the region that the user wants to set. In step S25, whether a slide operation has been performed is determined according to the detection output from the touch panel 17.
When the determination result in step S25 is that a slide operation has been performed, focusing is then performed on the slide end position and the shutter release operation is performed (S27). Here, the end position of the slide operation is determined according to the detection output from the touch panel 17, and the region including that end position is determined. Then, according to the image data from the determined region, the focusing operation and the exposure control operation are performed, and the shutter release operation is performed, so that the image data of the subject image is acquired.
When the determination result in step S25 is that no slide operation has been performed, or when focusing has been performed on the slide end position and the shutter has been released in step S27, the shooting flow is terminated.
Thus, in the 4th embodiment of the present invention, a moving subject is searched for within the rectangle having the touched 2 points as its diagonal, and the region in which the subject intended by the user is located is selected in consideration of the motion of the retrieved moving subject and the delay time of the touch. By performing a 2-point touch, even a moving subject can be specified simply.
In addition, in the present embodiment, when the user has performed a slide operation after the touch, the subject at the slide end is focused and the 2nd photographing is performed. Therefore, in addition to photographing in the automatically determined AF region, photographing in an AF region determined by the user's manual operation can also be performed.
Note that, in the present embodiment, the processes of steps S7 to S11 are the same as in the 1st embodiment; however, they may of course be replaced with steps S7 to S17 of the 2nd embodiment or steps S7 to S19 of the 3rd embodiment.
As described above, in the embodiments and modifications of the present invention, the subject serving as the focusing target of the photographing lens is determined according to the touch operation on the touch panel 17 and the motion of the subject image detected from the image data from the image pickup part 3. In addition, the AF region serving as the focusing target of the photographing lens is determined according to the motion of the subject from the certain time before the touch operation on the touch panel 17. Therefore, even when the subject is moving at the time of the touch, the subject intended by the user can be brought into focus.
In addition, in the embodiments and modifications of the present invention, the stationary/moving determination of the subject and the tracking of the subject are performed in hardware by the tracking part 15; however, this is not limiting, and they may be performed by other hardware such as the image processing part 5, and may of course also be performed by software in the CPU 21.
In addition, in the embodiments and modifications of the present invention, whether the subject is stationary or moving is obtained by comparing the current image data with the past image data; however, it may of course also be detected by other methods such as the motion vector of the subject.
In addition, in the embodiments and modifications of the present invention, when the subject in the region of the touch operation is moving at the time of the touch operation, the focusing region is changed according to the motion of the subject before the certain time before the touch. However, this is not limiting; it may be configured so that, when a long touch has been performed on 1 point, AF is performed directly on the region of the touch operation and photographing is performed. In order to perform this kind of operation, for example, in the flow of Fig. 2, when it is determined in step S1 that a touch operation has been performed, whether the touch operation has continued for a predetermined time is determined before the determination of step S7; when the touch operation has continued for the predetermined time, the flow proceeds to step S21, AF is performed, and photographing is performed.
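The long-touch variation described here amounts to one extra branch before the motion determination. The following sketch of that branch selection, including the function name and the 1.0-second hold threshold, is an illustrative assumption rather than the patent's flow.

```python
def choose_af_path(touch_duration, subject_moving, hold_time=1.0):
    """Return which branch of the flow would run for this touch."""
    if touch_duration >= hold_time:
        return "direct_af"      # long touch: AF directly on the touched region (S21)
    if not subject_moving:
        return "focus_touched"  # stationary subject: focus on the region as-is (S21)
    return "track_subject"      # moving subject: track it first (S11)
```

The long-touch check is placed first so that a deliberate, held touch bypasses both the S7 motion history check and any tracking, matching the behavior described in the paragraph above.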
In addition, in the embodiments and modifications of the present invention, the description has been given using a digital camera as the photographing equipment; however, the camera may of course be a digital single-lens reflex camera or a compact digital camera, a camera for moving images such as a video camera or a movie camera, or a camera built into a mobile phone, a smartphone, a portable information terminal (PDA: Personal Digital Assistant), game equipment, or the like.
In addition, regarding the operation flows in the claims, the description, and the drawings, even when described using words expressing order such as "first" and "next", unless otherwise specified, this does not mean that implementation in that order is essential.
The present invention is not limited to the above embodiments; in the implementation stage, the structural elements can be modified and embodied within a range not departing from the gist of the invention. In addition, various inventions can be formed by appropriately combining the plural structural elements disclosed in the above embodiments. For example, several structural elements may be deleted from all the structural elements shown in an embodiment. Furthermore, structural elements of different embodiments may be appropriately combined.

Claims (11)

1. A shooting device, characterized by comprising:
an image pickup part which performs photoelectric conversion on a subject image and outputs image data;
a display part which displays the above-mentioned subject image according to the image data output from the above-mentioned image pickup part;
a touch panel which detects a touch on the display picture of the above-mentioned display part;
a determining section which determines a subject serving as the focusing target of a photographing lens, according to a touch operation on the above-mentioned touch panel and the motion of the above-mentioned subject image detected from the above-mentioned image data; and
an automatic focus adjustment part which focuses on the subject determined by the above-mentioned determining section.
2. The shooting device according to claim 1, characterized in that
the above-mentioned touch panel is divided into a plurality of regions, and
in a case where the subject present in the touched region was not stationary during the certain time before the touch operation, the above-mentioned determining section takes the subject that was present in the touched region before the certain time of the touch operation as the above-mentioned focusing target.
3. The shooting device according to claim 1, characterized in that
the above-mentioned touch panel is divided into a plurality of regions, and
the above-mentioned determining section takes, as the focusing target, the subject that has been present in the touched region continuously from the certain time before the touch operation until the touch operation is performed.
4. The shooting device according to claim 1, characterized in that
the above-mentioned touch panel is divided into a plurality of regions, and
in a case where the subject was not stationary during a 1st certain time before the touch operation and the subject was not stationary during a 2nd certain time before the touch operation, the above-mentioned determining section takes, as the focusing target, the subject that was present in the touched region before the 1st certain time before the above-mentioned touch operation, wherein the 2nd certain time is shorter than the 1st certain time.
5. The shooting device according to claim 1, characterized in that
the above-mentioned touch panel is divided into a plurality of regions, and
the above-mentioned determining section takes, as the focusing target, the subject with the longest existence time in the touched region during the period from the certain time before the touch operation until the touch operation is performed.
6. The shooting device according to claim 1, characterized in that
the above-mentioned determining section takes, as the focusing target, a moving subject within a rectangle having the touched 2 points as its diagonal.
7. The shooting device according to claim 1, characterized in that
when a slide operation has been performed on the above-mentioned touch panel, the above-mentioned determining section takes, as the focusing target, the subject present in the region at the end of the slide operation.
8. The shooting device according to claim 1, characterized in that
in a case where the above-mentioned touch operation continues for a certain time or more, the above-mentioned determining section takes the touched region as the focusing target.
9. A shooting device, characterized by comprising:
an image pickup part which performs photoelectric conversion on a subject image and outputs image data;
a display part which displays the above-mentioned subject image according to the image data output from the above-mentioned image pickup part;
a touch panel which detects which of a plurality of divided regions of the display picture of the above-mentioned display part has been touched;
a determining section which determines an AF region serving as the focusing target of a photographing lens, according to the motion of the subject from the certain time before the touch operation on the above-mentioned touch panel; and
an automatic focus adjustment part which focuses on the region determined by the above-mentioned determining section.
10. The shooting device according to claim 9, characterized in that
in a case where the subject within the above-mentioned region has been stationary since the above-mentioned predetermined time before, the above-mentioned determining section determines the touch area of the above-mentioned touch operation as the above-mentioned AF region, and in a case where the subject present within the above-mentioned region since the above-mentioned predetermined time before has moved, the determining section tracks that subject to determine the above-mentioned AF region.
11. A control method of a shooting device that displays a subject image on a display section and is capable of detecting a touch operation on the above-mentioned display section, characterized in that
the control method has the following steps:
a detection step of detecting whether the above-mentioned touch operation has been performed;
a determination step of determining, in a case where it is detected in the above-mentioned detection step that the above-mentioned touch operation has been performed, whether the subject in the region of the above-mentioned touch operation was stationary during a predetermined period before the touch;
a tracking step of tracking, in a case where the determination result of the above-mentioned determination step is that the subject in the region of the above-mentioned touch operation was not stationary, the subject that was present in the region of the above-mentioned touch operation before the predetermined period preceding the touch operation; and
a photographing step of focusing on the region where the subject tracked in the above-mentioned tracking step is located and releasing the shutter.
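The four steps of claim 11 — detect the touch, judge whether the subject in the touched region was stationary during the pre-touch period, track it if not, then focus and release — might be sketched as follows. This is a hypothetical illustration only: the sample format, region representation, and all function names are assumptions, and real tracking and shutter control are reduced to pure region arithmetic.

```python
# Hypothetical sketch of the control method of claim 11. A subject's
# pre-touch motion is given as (timestamp, (x, y)) samples; a region is
# (x, y, w, h). Shutter release is omitted; choose_af_region() returns
# the region that the photographing step would focus on.

PRE_TOUCH_PERIOD = 1.0  # the claimed predetermined pre-touch period (s)


def in_region(point, region):
    x, y, w, h = region
    px, py = point
    return x <= px < x + w and y <= py < y + h


def is_stationary(samples, region, touch_time, period):
    """Determination step: every pre-touch sample in the window lies in region."""
    window = [p for t, p in samples if touch_time - period <= t <= touch_time]
    return bool(window) and all(in_region(p, region) for p in window)


def choose_af_region(samples, touch_region, touch_time, period=PRE_TOUCH_PERIOD):
    """Return the touched region for a stationary subject; otherwise 'track'
    the moving subject by re-centering the AF box on its latest position."""
    if is_stationary(samples, touch_region, touch_time, period):
        return touch_region
    _, (px, py) = max(samples, key=lambda s: s[0])  # tracking step (simplified)
    x, y, w, h = touch_region
    return (px - w // 2, py - h // 2, w, h)
```

A stationary subject keeps the touched box as the AF region; a subject that has left the box gets a box re-centered on its latest position, mirroring the tracking step before the photographing step focuses and releases the shutter.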
CN201310082769.8A 2012-03-15 2013-03-15 Shooting device and control method thereof Expired - Fee Related CN103312979B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012058420A JP5635546B2 (en) 2012-03-15 2012-03-15 Imaging device and method for controlling imaging device
JP2012-058420 2012-03-15

Publications (2)

Publication Number Publication Date
CN103312979A CN103312979A (en) 2013-09-18
CN103312979B true CN103312979B (en) 2017-04-26

Family

ID=49137721

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310082769.8A Expired - Fee Related CN103312979B (en) 2012-03-15 2013-03-15 Shooting device and control method thereof

Country Status (2)

Country Link
JP (1) JP5635546B2 (en)
CN (1) CN103312979B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6531397B2 (en) * 2014-03-07 2019-06-19 Jsr株式会社 Pattern forming method and composition used therefor
CN107076961B (en) 2014-07-30 2019-08-02 宇龙计算机通信科技(深圳)有限公司 Focusing method and focusing mechanism
CN107124556B (en) * 2017-05-31 2021-03-02 Oppo广东移动通信有限公司 Focusing method, focusing device, computer readable storage medium and mobile terminal

Citations (4)

Publication number Priority date Publication date Assignee Title
JP2002033949A (en) * 2000-07-17 2002-01-31 Casio Comput Co Ltd Electronic camera
JP2005347891A (en) * 2004-05-31 2005-12-15 Sharp Corp Camera device and portable electronic information equipment with camera function
CN1893561A (en) * 2005-07-06 2007-01-10 索尼株式会社 Image pickup apparatus, control method, and program
CN102244727A (en) * 2010-05-10 2011-11-16 佳能株式会社 Image pickup apparatus

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP2004117490A (en) * 2002-09-24 2004-04-15 Fuji Photo Optical Co Ltd Autofocus system
JP2011030008A (en) * 2009-07-27 2011-02-10 Canon Inc Imaging apparatus
JP2012010061A (en) * 2010-06-24 2012-01-12 Nikon Corp Imaging apparatus


Also Published As

Publication number Publication date
JP2013190735A (en) 2013-09-26
CN103312979A (en) 2013-09-18
JP5635546B2 (en) 2014-12-03


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20151208

Address after: Tokyo, Japan

Applicant after: OLYMPUS Corp.

Address before: Tokyo, Japan

Applicant before: Olympus Imaging Corp.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170426