JP5325626B2 - Camera and subject tracking method


Info

Publication number
JP5325626B2
JP5325626B2
Authority
JP
Japan
Prior art keywords
camera
subject
unit
movement
moving
Prior art date
Legal status
Active
Application number
JP2009073125A
Other languages
Japanese (ja)
Other versions
JP2010226554A (en)
JP2010226554A5 (en)
Inventor
伸祐 本間
Original Assignee
Olympus Imaging Corp. (オリンパスイメージング株式会社)
Priority date
Filing date
Publication date
Application filed by Olympus Imaging Corp.
Priority to JP2009073125A
Publication of JP2010226554A
Publication of JP2010226554A5
Application granted
Publication of JP5325626B2
Application status: Active
Anticipated expiration


Description

  The present invention relates to a camera and a subject tracking method, and more particularly to a camera and a subject tracking method capable of tracking a fast-moving subject without losing sight of it.

  In a photographing apparatus such as a camera, the subject is observed through an optical finder, a liquid crystal monitor, or the like, and the composition is determined before photographing. With a moving subject such as a bird, however, the subject may move out of the screen while the photographer is pointing the camera and deciding on the composition, and may be lost from sight. Once the subject is lost, the photographer does not know where it has moved and often turns the camera in various directions, yet in the end the subject frequently cannot be found.

  Various cameras have been proposed that include an automatic tracking device capable of automatically tracking a subject. For example, Patent Document 1 discloses a camera that expands the tracking range of the subject image by widening the shooting angle of view when the subject being tracked is about to go out of the shooting screen.

Japanese Patent No. 2622237

  In the camera disclosed in Patent Document 1, the subject can be tracked by zooming to the wide side when it is about to leave the shooting screen, but it cannot be tracked beyond the shortest focal length of the shooting lens. In addition, zooming to the wide side may result in a focal length the photographer did not intend.

  The present invention has been made in view of such circumstances, and an object of the present invention is to provide a camera and a subject tracking method that can track a moving subject without forcing a focal length the photographer did not intend.

In order to achieve the above object, a camera according to a first aspect of the present invention includes: a photographing lens for forming a subject image; an imaging unit that picks up the subject image formed by the photographing lens and outputs image data; a display unit that displays the subject image based on the image data from the imaging unit; a focal length output unit that outputs focal length information of the photographing lens; a subject movement determination unit that determines movement of the subject based on the image data from the imaging unit; a subject movement position prediction unit that predicts the movement position of the moving subject when the subject movement determination unit detects movement of the subject; a camera movement detection unit that detects movement of the camera body; a camera movement determination unit that determines, based on the detection result of the camera movement detection unit, whether or not the camera body has moved so that the captured image is positioned at the subject movement position predicted by the subject movement position prediction unit; a notification unit that notifies that the camera movement determination unit has determined that the camera body has moved to the subject movement position; and a holding determination unit that determines that the camera body has not vibrated for a predetermined time and/or that a predetermined position is held. The subject movement determination unit determines the presence or absence of movement of the subject based on the image data from the imaging unit when the holding determination unit determines that the camera is held.

In the camera according to a second aspect, in the first aspect, the subject movement position prediction unit predicts the movement position based on the time required for the subject to move between two points on the screen and the distance between the two points.
In the camera according to a third aspect, in the first aspect, the camera movement determination unit makes the determination based on the movement position predicted by the subject movement position prediction unit and on the movement direction, movement amount, and movement time detected by the camera movement detection unit.

In the camera according to a fourth aspect of the present invention, in the first aspect, the notification unit displays on the display unit the direction in which the moving body exists.

A camera according to a fifth aspect of the invention includes: an imaging unit that converts a subject image into image data; a moving object determination unit that determines whether a moving object exists based on the image data; a position prediction unit that predicts the current position of the moving object based on the moving speed and moving direction determined by the moving object determination unit; a motion determination unit that detects the movement of the camera and determines whether the camera has reached the current position; a notification unit that notifies the determination result of the motion determination unit; and a holding determination unit that determines that the camera body has not vibrated for a predetermined time and/or that a predetermined position is held. The moving object determination unit determines whether the moving object is present based on the image data from the imaging unit when the holding determination unit determines that the camera is held.

In the subject tracking method according to a sixth aspect of the invention, a subject image is converted into image data by an imaging unit; the subject image is displayed on a display unit based on the image data; whether or not the camera is held is determined based on whether or not the camera body has vibrated for a predetermined time and/or whether or not a predetermined position is held; when it is determined that the camera is held, a moving body that moves within the screen of the display unit is determined, and the moving direction of the moving body is determined; and, once it is determined that the camera is held, the change in the shooting direction of the camera is compared with the direction in which the moving body exists, and the comparison result is announced.

  According to the present invention, it is possible to provide a camera and a subject tracking method that can track a moving subject without forcing a focal length the photographer did not intend.

FIG. 1 is a block diagram showing the configuration of a camera according to an embodiment of the present invention.
FIG. 2 is a perspective view showing the structure of the acceleration detection unit in the camera according to the embodiment.
FIG. 3 illustrates the angular acceleration detection unit in the camera according to the embodiment, in which (a) is a perspective view showing the structure of the angular acceleration sensor and (b) is a graph showing the integration of its detection output.
FIG. 4 shows the detection directions of acceleration and angular acceleration in the camera according to the embodiment.
FIG. 5 illustrates the photoelectric-sensor touch panel of the camera according to the embodiment, in which (a) is a sectional view of the touch panel and (b) is a sectional view of the touch panel being touched with a finger.
FIG. 6 shows a state in which the photographer is holding the camera according to the embodiment.
FIG. 7 illustrates use states of the camera according to the embodiment, in which (a) shows a state in which shooting is being attempted and (b) shows a state in which the subject has moved and is being searched for.
FIG. 8 illustrates use states of the camera according to the embodiment, in which (a) shows the subject moving at the time of shooting and (b) shows the photographer searching for the subject.
FIG. 9 shows a use state of the camera according to the embodiment in which the position of the moved subject is notified.
FIG. 10 illustrates how the moving body position is obtained in the camera according to the embodiment.
FIG. 11 shows a use state of the camera according to the embodiment in which the subject is tracked based on the obtained moving body position.
FIG. 12 is a flowchart showing the camera control operation of the camera according to the embodiment.
FIG. 13 is a flowchart showing the determination that the camera according to the embodiment is held.
FIG. 14 is a flowchart showing the moving body position determination operation of the camera according to the embodiment.

  Hereinafter, a preferred embodiment using a digital camera to which the present invention is applied will be described with reference to the drawings. FIG. 1 is a block diagram showing the configuration of a camera 10 according to an embodiment of the present invention. The camera 10 includes an image processing and control unit 1, an imaging unit 2, a zoom lens 2a, a vibration unit 3, a recording unit 4, an acceleration detection unit 5, an operation determination unit 6, an angular acceleration detection unit 7, a display unit 8, a touch panel 8b, and a clock unit 9.

  The imaging unit 2 includes an imaging element that photoelectrically converts the subject image formed by the zoom lens 2a, and outputs image data to the image processing and control unit 1 and the like. The zoom lens 2a has a variable focal length and can be changed toward the wide side or the tele side; its focal length information is output to the image processing and control unit 1. The image processing and control unit 1 performs image processing on the image data output from the imaging unit 2, and controls the entire camera 10 according to a program stored in advance in a storage unit (not shown). Note that both still image processing and moving image processing are possible.

  The image processing and control unit 1 includes a moving object determination unit 1a, a position prediction unit 1b, and a motion determination unit 1c. The moving object determination unit 1a determines whether or not a moving object exists in the subject image based on the image data input from the imaging unit 2, recognizes the shape of the moving object when one exists, and determines the direction and speed of the moving object. The position prediction unit 1b predicts the subject position using the determination result of the moving object determination unit 1a. The position prediction result is shown on the display unit 8 described later, notifying the photographer of the predicted position of the subject and guiding the photographer to point the camera 10 in that direction.

  The motion determination unit 1c determines, based on the detection output from the angular acceleration detection unit 7 described later, in which direction and by how much the photographer has moved the camera 10. The vibration unit 3 vibrates the camera 10 to notify the photographer: when the movement direction and movement amount detected by the motion determination unit 1c match the position predicted by the position prediction unit 1b, the vibration unit 3 vibrates the camera 10 to announce that the camera is now pointed at the predicted position of the moved subject. In addition to the vibration unit 3, the notification to the photographer may be a visual display on the display unit 8 or an audible notification from a sounding member such as a buzzer.

  The acceleration detection unit 5 detects acceleration applied to the camera 10. Based on this acceleration, it is determined whether the camera 10 has moved in its longitudinal direction or in a direction orthogonal thereto, and whether the camera 10 is firmly held. The angular acceleration detection unit 7 detects angular acceleration applied to the camera 10; based on this angular acceleration, rotation about the longitudinal axis of the camera 10 or about an axis orthogonal thereto is determined. The structure of the acceleration detection unit 5 will be described later with reference to FIG. 2, and the structure of the angular acceleration detection unit 7 with reference to FIG. 3.

  The operation determination unit 6 determines the operation state of operation members provided on the camera 10, such as a release button, a zoom button, a playback button, and a menu button. In the menu mode, various modes such as a moving image shooting mode and a still image shooting mode can be set. The clock unit 9 outputs date and time information and is also used to measure the movement time of the subject when the position prediction unit 1b predicts the position. The recording unit 4 is a recording medium that records image data. When the image processing and control unit 1 determines, via the operation determination unit 6, that a shooting instruction has been issued, it records the shooting date and time information in the recording unit 4 in association with the image data. This enables image management based on the shooting date and time.

  The display unit 8 includes a display device such as a liquid crystal monitor disposed on the back surface of the camera 10. The display unit 8 can display the subject image in live view as a moving image based on the image data from the imaging unit 2, and can also play back captured images recorded in the recording unit 4. A touch panel 8b is disposed on the surface of the display unit 8 and can detect the touch of a photographer's finger or the like. When the menu mode is set, a menu screen is displayed on the display unit 8, and settings such as moving image shooting or still image shooting can be made by touching the displayed menu items. Further, when the playback mode is selected, a desired image can be displayed enlarged by touching it in a list of captured images.

  Next, the structure of the acceleration sensor 50 of the acceleration detection unit 5, which detects the acceleration applied to the camera 10, will be described with reference to FIG. 2. The acceleration sensor 50 includes a metal part 52 on the chip surface and a bridged metal part 51. The metal part 51 includes four base points 51a, an H-shaped bridge part 51b held by the base points 51a, and a detection part 51c facing the metal part 52. The acceleration sensor 50 detects the capacitance of the capacitor formed by the detection part 51c and the metal part 52. When the metal part 51 moves in the direction of the arrow in FIG. 2, this capacitance changes; by obtaining the amount of change, the acceleration in the direction of the arrow can be detected.

  The structure of the angular acceleration sensor of the angular acceleration detection unit 7, which detects angular acceleration applied about an axis of the camera 10, will be described with reference to FIG. 3. FIG. 3(a) shows the structure of the angular acceleration sensor 55, which is composed of a pair of piezoelectric ceramic elements 56. When the pair of piezoelectric ceramic elements 56 is vibrated and an angular velocity arises, the resulting Coriolis force deforms the piezoelectric ceramic elements 56 and generates a voltage. If the output voltage is integrated (gyro integration) as shown in FIG. 3(b), the rotation amount can be obtained.
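
  As an illustration of the gyro integration just described, the following sketch accumulates sampled angular-velocity output into a rotation amount. It is a minimal illustration only; the function name, sampling rate, and trapezoidal rule are assumptions, not taken from the patent.

    # Minimal sketch of "gyro integration": accumulating sampled angular-velocity
    # readings (deg/s) into a rotation amount (deg). All names are hypothetical;
    # the patent only specifies that the sensor output is integrated.

    def integrate_rotation(angular_velocity_samples, dt):
        """Trapezoidal integration of angular velocity (deg/s) sampled every dt seconds."""
        rotation = 0.0
        for v0, v1 in zip(angular_velocity_samples, angular_velocity_samples[1:]):
            rotation += 0.5 * (v0 + v1) * dt
        return rotation

    # Example: a 0.5 s pan sampled at 100 Hz with a constant 30 deg/s yields ~15 deg.
    samples = [30.0] * 51
    print(integrate_rotation(samples, dt=0.01))  # ~15.0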

  In the camera 10, as shown in FIG. 4, a lateral slide of the camera 10 (in the direction of arrow 61) is detected by the acceleration sensor 50, and a sideways (horizontal) panning movement of the camera 10 (rotation about arrow 62) is detected by the angular acceleration sensor 55. An angular acceleration sensor 55 that detects vertical panning of the camera 10 is also provided.

  Next, the structure of the touch panel 8b is shown in FIG. 5. The touch panel 8b is a photoelectric-sensor touch panel: as shown in FIG. 5(a), optical sensors 80b are arranged two-dimensionally at predetermined intervals in a liquid crystal 80, and a backlight 80a is provided behind the liquid crystal 80. Note that this photoelectric-sensor touch panel is not laid over the liquid crystal but integrated into it. As the touch panel, a type other than the photoelectric-sensor type, such as a resistive-film type, may also be used.

  When the user's finger 81 approaches the touch panel 8b, light from the backlight 80a is reflected by the finger 81 and detected by an optical sensor 80b, as shown in FIG. 5(b). The touch position can be determined from which optical sensor 80b detects the reflected light from the finger 81.
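
  To illustrate how a touch position can be derived from such a two-dimensional sensor array, the sketch below scans the grid for the cell with the strongest reflected-light reading. All names and the threshold value are hypothetical; the patent only states that the position follows from which sensor detects the reflection.

    # Hypothetical sketch: locate a touch on a photoelectric-sensor panel by finding
    # the grid cell whose reflected-light reading exceeds a threshold and is maximal.

    def find_touch(readings, threshold=0.5):
        """readings: 2-D list of reflected-light intensities in [0, 1].
        Returns (row, col) of the strongest reading above threshold, or None."""
        best, pos = threshold, None
        for r, row in enumerate(readings):
            for c, value in enumerate(row):
                if value > best:
                    best, pos = value, (r, c)
        return pos

    grid = [[0.1, 0.1, 0.1],
            [0.1, 0.9, 0.2],   # finger reflects backlight over the center cell
            [0.1, 0.2, 0.1]]
    print(find_touch(grid))  # (1, 1)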

  By using the detection output of the acceleration detection unit 5 and the detection output from the touch panel 8b, it is possible to determine whether or not the camera 10 is firmly held. As shown in FIG. 6, when the photographer holds the camera 10 firmly, vibration due to camera shake disappears and the detection output of the acceleration sensor 50 of the acceleration detection unit 5 becomes substantially zero. Further, since the photographer's fingers hold the upper right portion of the touch panel 8b, a touch signal is detected in that portion. Therefore, when both detection signals are present, it can be determined that the camera 10 is correctly held.

  Next, a method of using the camera 10 according to the present embodiment will be described with reference to FIGS. 7 to 11. An example will be described in which a moving subject, here a bird such as a white-eye, is photographed with the camera 10 held in hand. FIG. 7(a) shows a state in which a moving subject such as a white-eye has been found and the camera 10 has been taken out of a bag or the like and held. FIG. 7(b) shows a state in which the white-eye flew up and escaped while the camera 10 was held and the composition was being determined, and the photographer is searching for it while looking at the display unit 8.

  FIG. 8 shows the relationship between the display unit 8 and the subject in the situation of FIG. 7. FIG. 8(a) shows a state in which the subject 15a, such as a white-eye, was displayed on the display unit 8 and then flew off, moving outside the display unit 8 as subject 15b. Since the subject 15b is outside the screen of the display unit 8, the photographer cannot observe it. Therefore, as shown in FIG. 8(b), the photographer searches for the subject (15c, 15d), but since it is a moving body it cannot easily be found and cannot be observed on the display unit 8.

  Therefore, in this embodiment, the position prediction unit 1b and the like predict the movement destination of the subject 15 and display it on the display unit 8 as shown in FIG. 9, so that the camera 10 can be directed toward the subject 15. In the example shown in FIG. 9, a notification display 18a with an arrow indicates the direction, though the display method may be changed as appropriate. When the motion determination unit 1c and the like detect that the camera has moved in the predicted direction by the predicted amount and has reached the predicted position, a notification display 18b is shown on the display unit 8, or notification is given by vibration from the vibration unit 3; this display method, too, may be changed as appropriate.

Next, a method for predicting the moving object position will be described with reference to FIG. 10. Let the angle of view of the imaging unit 2 be θ0 (θ0 being the angle from the center of the screen to its edge), and let the screen range be 1. Assume that at time T = 0 the subject 15g is at a distance 1/A from the edge of the screen, and that at time T = T0 the subject 15h reaches the edge of the screen. If the current time is T, the angle θ of the subject 15j is obtained from the angle θ1 swept between time T = 0 and time T = T0 and the distance 1/A as

θ = θ1 × (T / T0)
  = θ0 × (1/A) × (T / T0)   (Formula 1)
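
  Formula 1 translates directly into code. The sketch below uses the same symbols as FIG. 10; the function name and the example values are illustrative assumptions.

    # Formula 1: theta = theta0 * (1/A) * (T / T0)
    # theta0: angle of view from screen center to screen edge, in degrees
    # one_over_a: initial distance of the subject from the screen edge (screen range = 1)
    # t0: time at which the subject reached the screen edge; t: current time

    def predicted_angle(theta0, one_over_a, t, t0):
        return theta0 * one_over_a * (t / t0)

    # Example: theta0 = 20 deg, subject started 1/4 of the screen range from the
    # edge, left the screen at T0 = 1.0 s; at T = 2.0 s it is predicted ~10 deg away.
    print(predicted_angle(20.0, 0.25, 2.0, 1.0))  # 10.0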

  Therefore, as shown in FIG. 11, the photographer can capture the subject 15j on the display unit 8 again by rotating the camera by the angle θ obtained from Formula 1. The rotation angle (angle θ) at this time may be determined by the motion determination unit 1c.

  Next, the operation of the present embodiment will be described with reference to the camera control flowchart shown in FIG. 12. When the camera control flow is entered, it is first determined whether or not the power switch is on (S101). If the operation determination unit 6 determines that the power switch is not on, this camera control flow is terminated and the camera enters a sleep state; when the power switch is turned on, interrupt processing is performed and execution resumes from step S101.

  If the result of determination in step S101 is that the power switch is on, it is next determined whether or not it is in shooting mode (S102). If the result of this determination is that shooting mode has been set, then image capture is performed (S103) and image display is performed (S104). That is, the image data is taken in from the imaging unit 2 and the subject image is displayed in live view on the display unit 8 as a moving image. The photographer determines the composition by looking at the live view display.

  Once the image is displayed, vibration and the like are determined (S105). In this step, the detection output of the acceleration detection unit 5 and the touch detection output from the touch panel 8b are read in. Subsequently, it is determined whether or not the camera is held (S106). Here, whether or not the camera 10 is correctly held is determined based on the detection outputs from the acceleration detection unit 5 and the touch panel 8b. Details of this determination will be described later with reference to FIG. 13.

  If the result of determination in step S106 is that the camera is held, a moving object tracking operation is performed from step S111 onward. First, it is determined whether or not a moving object exists (S111). In this step, the moving object determination unit 1a determines, based on the image data from the imaging unit 2, whether or not a moving subject 15 exists in the subject image. If a moving object exists, direction determination is performed (S112): the moving object determination unit 1a determines the moving direction of the moving subject 15. Subsequently, the direction position is displayed (S113): the notification display 18a shown in FIG. 9 is presented on the display unit 8 to show the photographer the direction in which to point the camera 10.
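
  The patent does not specify how the moving object determination unit 1a detects motion; one simple approach consistent with the description (a portion of the image that changes between two points in time while the camera itself is still, cf. S301 below) is frame differencing, sketched here with hypothetical names and thresholds.

    # Hypothetical frame-difference sketch of moving-object detection (cf. S111/S301):
    # a moving object is flagged when enough pixel values change between two frames
    # captured while the camera itself is not moving.

    def detect_motion(frame_a, frame_b, pixel_threshold=16, count_threshold=8):
        """frames: 2-D lists of 8-bit luminance values of equal size.
        Returns True when enough pixels changed to suggest a moving object."""
        changed = 0
        for row_a, row_b in zip(frame_a, frame_b):
            for p_a, p_b in zip(row_a, row_b):
                if abs(p_a - p_b) > pixel_threshold:
                    changed += 1
        return changed >= count_threshold

    frame1 = [[10] * 8 for _ in range(8)]
    frame2 = [row[:] for row in frame1]
    for r in range(2, 6):            # a bright object enters the upper-left area
        for c in range(0, 4):
            frame2[r][c] = 200
    print(detect_motion(frame1, frame2))  # True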

  If the result of determination in step S106 is that the camera is not held, if the result of determination in step S111 is that there is no moving object, or if the direction position display has been performed in step S113, it is next determined whether or not to perform shooting (S121). In this step, the operation determination unit 6 determines whether or not the release button has been operated.

  If the result of determination in step S121 is that shooting is not to be performed, the moving object position is determined next (S124). In this step, the position prediction unit 1b obtains the predicted position of the moving subject 15 based on Formula 1. Details of this moving object position determination will be described later with reference to FIG. 14. Subsequently, direction determination, timing, and gyro integration are started (S125). Here, the motion determination unit 1c determines the movement of the camera 10: based on the detection output of the angular acceleration sensor 55 of the angular acceleration detection unit 7, the direction of movement of the camera 10 is detected, and the movement amount is obtained by integrating the detection output. Further, the clock unit 9 measures the time elapsed since the camera 10 was determined to be firmly held in step S106.

  Next, it is determined whether the camera orientation has reached the moving object position (S126). Here, whether or not the camera 10 has reached the position predicted in step S124 is determined using the movement direction, movement amount, and movement time obtained in step S125. If it has, an OK display, vibration, and sounding are performed (S127): as with the notification display 18b shown in FIG. 9, the display unit 8 indicates that the subject 15 has been reached, and notification is also given by vibration from the vibration unit 3 and by a buzzer or the like.
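
  Read this way, the check in step S126 amounts to comparing the rotation accumulated by gyro integration against the predicted angle θ. A minimal sketch under that reading follows; the tolerance and all names are assumptions.

    # Hypothetical check for step S126: has the camera panned (in the predicted
    # direction) far enough to cover the predicted angle theta?

    def reached_predicted_position(integrated_rotation, predicted_theta,
                                   tolerance=2.0):
        """Angles in degrees; signs encode direction (positive = predicted direction)."""
        if integrated_rotation * predicted_theta < 0:
            return False                       # panned the wrong way
        return abs(integrated_rotation - predicted_theta) <= tolerance

    print(reached_predicted_position(9.1, 10.0))   # True: within 2 deg of target
    print(reached_predicted_position(-3.0, 10.0))  # False: wrong direction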

  If the result of determination in step S126 is that the camera has not reached the moving object position, or once the OK display and the like have been performed in step S127, it is next determined whether or not a predetermined time has elapsed (S128). Here, using the timing function of the clock unit 9, it is determined whether the predetermined time has elapsed since the camera was determined to be held in step S106, during which steps S124 to S127 have been processed.

  If the result of determination in step S128 is that the predetermined time has not elapsed, processing returns to step S121. On the other hand, if the predetermined time has elapsed, initialization is performed (S129): the timing started when the camera was determined to be held in step S106 and information such as the moving direction of the moving subject are initialized. After initialization, processing returns to step S101.

  If the result of determination in step S121 is that shooting is to be performed, shooting is carried out (S122): image data is acquired from the imaging unit 2. Subsequently, recording is performed together with the date (S123): the acquired image data is image-processed in the image processing and control unit 1, and the processed image data is recorded in the recording unit 4. At the time of recording, the shooting date and time information acquired from the clock unit 9 is also recorded in association with the image. After recording, processing returns to step S101.

  If the result of determination in step S102 is that the shooting mode is not set, it is determined whether or not the playback mode is set (S131). Here, it is determined, via the operation determination unit 6, whether or not the playback button has been operated. If the playback mode is not set, processing returns to step S101. On the other hand, if the playback mode is set, playback is performed (S132): the image data recorded in the recording unit 4 is read out and displayed on the display unit 8.

  Once playback has been carried out, it is next determined whether or not playback has ended (S133). Here, it is determined, via the operation determination unit 6, whether or not an operation for ending playback has been performed. If playback has not ended, processing returns to step S132 and playback continues; if playback has ended, processing returns to step S101.

  Thus, in the camera control flow according to the present embodiment, when a moving subject such as a white-eye exists, the subject can be tracked. That is, it is detected that the camera 10 is firmly held (S106); if a moving object exists in the subject image (S111), its direction is determined and displayed (S112, S113). Then the position of the moving object is predicted and the movement of the camera 10 is detected (S124, S125), and the prediction of the moving object position and the detection of the movement amount of the camera 10 are repeated until the predicted position is reached (S124, S125, S126 → No). When the predicted position is reached (S126 → Yes), the photographer is notified (S127).

  In this embodiment, if it is not determined in step S106 that the camera 10 is correctly held, moving body tracking is not started and the normal shooting mode is executed as it is. This is because, if the camera 10 is not held firmly, the moving object direction determination (see S112) cannot be performed correctly and the moving object tracking accuracy decreases. In step S128, whether the predetermined time has elapsed is determined so that tracking is stopped when the subject cannot be captured even after tracking for the predetermined time.

  Next, the determination in step S106 of whether or not the camera is held will be described using the flowchart shown in FIG. 13. When the flow shown in FIG. 13 is entered, it is first determined whether or not the composition is horizontal (S201); if it is not, it is determined whether or not the composition is vertical (S202). Whether the composition is horizontal or vertical is determined based on the output of the acceleration detection unit 5. Besides the acceleration detection unit 5, for example, a center-of-gravity detector may be provided and the determination made based on its output.

  If YES is determined in step S201 or step S202, that is, if the composition is horizontal or vertical, it is next determined whether or not there has been no vibration for one second (S203). In this step, it is determined, based on the detection output of the acceleration detection unit 5, whether or not the camera has been free of vibration for one second. Note that one second is only an example; the time may be longer or shorter, so long as the camera is vibration-free for a period long enough to conclude that it is being held correctly.

  If the result of determination in step S203 is that there has been no vibration for one second, it is next determined whether or not a predetermined position is touched (S204). This determination is made based on the detection output from the touch panel 8b. In the case of the horizontal composition, the upper right portion is held as shown in FIG. 6, so it is determined whether or not there is a detection output corresponding to that portion. In the case of the vertical composition, the corresponding held portion is likewise checked based on the detection output of the touch panel 8b.

  If the result of determination in step S204 is that the predetermined position is touched, a Yes determination is made: the camera is judged to be correctly held, because there has been no vibration for the predetermined time in a horizontal or vertical composition and the photographer is holding the predetermined position. On the other hand, if the composition is neither horizontal nor vertical (S202), if there is vibration (S203), or if the predetermined position is not touched (S204), a No determination is made: the camera is not correctly held.

  Thus, the flow shown in FIG. 13 can determine whether or not the camera is firmly held. As described above, in the camera control flow, moving object tracking is started with reference to the time at which the camera is firmly held.
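
  The hold determination of FIG. 13 thus combines three checks: a horizontal or vertical composition, no vibration for about one second, and a touch at the predetermined grip position. The sketch below condenses that logic; the input representation and zone names are assumptions.

    # Hypothetical sketch of the FIG. 13 hold determination (S201-S204).

    def camera_held(orientation, vibration_free_seconds, touched_positions,
                    grip_zone="upper-right", required_still_time=1.0):
        """orientation: 'horizontal', 'vertical', or other (from the acceleration unit).
        vibration_free_seconds: time since the acceleration output became ~zero.
        touched_positions: set of touch-panel zones currently being touched."""
        if orientation not in ("horizontal", "vertical"):
            return False                                  # S201/S202: No
        if vibration_free_seconds < required_still_time:
            return False                                  # S203: vibration present
        return grip_zone in touched_positions             # S204: grip touched?

    print(camera_held("horizontal", 1.2, {"upper-right"}))  # True
    print(camera_held("horizontal", 0.3, {"upper-right"}))  # False: still shaking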

  Next, the moving body position determination in step S124 will be described using the flowchart shown in FIG. 14. In this flowchart, the position of the moving object is predicted moment by moment. When the flow shown in FIG. 14 is entered, a moving object determination is first performed (S301). In this step, the moving object determination unit 1a determines from the image data whether or not a moving subject exists. A moving object is judged to exist when some portion of the image data changes between at least two points in time even though the acceleration detection unit 5 indicates that the camera 10 is not moving.

  If it is determined in step S301 that a moving object exists, the count of time T is started (S302); here, the time T shown in FIG. 10 is measured. Subsequently, it is determined whether or not the moving object has gone outside the screen (S303). In this step, it is determined whether the moving object identified by the moving object determination unit 1a has left the screen; while it remains within the screen, the flow waits.

  If the result of determination in step S303 is that the moving object has gone out of the screen, the time T at that moment is set to T0 (S304). Subsequently, the angle of view θ0 is detected from the zoom position (S305): the focal length (zoom position) of the zoom lens 2a is detected, and the angle of view θ0 determined by that focal length is obtained. Once θ0 is known, the predicted position (angle θ) of the moving object at the current time T is obtained from Formula 1, that is, θ = θ0 × (1/A) × (T/T0) (S306). When the predicted position (angle θ) of the moving object has been obtained, processing returns to the originating flow.

  As described above, in the moving object position determination, when a moving object exists, the predicted position (angle θ) of the moving object at the current time T can be obtained from the angle of view θ0 of the imaging unit 2, the distance (1/A) from the initial position of the moving object to the screen edge, and the time (T0) taken for the moving object to leave the screen. This makes it possible to predict the position of a moving object that moves from moment to moment. In the flow of FIG. 12, the moving object position determination flow is executed repeatedly even after the moving object has left the screen; in that case, steps S301 to S305 may be skipped and only step S306 executed.
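
  Putting the steps of FIG. 14 together, the sketch below walks S301 to S306 for a simulated subject: T0 is latched when the subject leaves the screen, after which only the Formula 1 update (S306) runs, as noted above. Class and variable names are illustrative, not from the patent.

    # Hypothetical end-to-end sketch of the FIG. 14 flow (S301-S306).

    class MovingObjectPredictor:
        def __init__(self, theta0, one_over_a):
            self.theta0 = theta0          # S305: angle of view from the zoom position
            self.one_over_a = one_over_a  # initial distance from the screen edge
            self.t0 = None                # S304: time when the subject left the screen

        def update(self, t, on_screen):
            if on_screen:                 # S303: still inside -> keep waiting
                return None
            if self.t0 is None:
                self.t0 = t               # S304: latch T0 at screen exit
            # S306: Formula 1 gives the predicted off-screen angle at time t
            return self.theta0 * self.one_over_a * (t / self.t0)

    predictor = MovingObjectPredictor(theta0=20.0, one_over_a=0.25)
    for t, visible in [(0.5, True), (1.0, False), (2.0, False)]:
        print(t, predictor.update(t, visible))
    # 0.5 None | 1.0 5.0 | 2.0 10.0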

  As described above, in the embodiment of the present invention, when it is determined that a moving subject exists, the change in the shooting direction of the camera is compared with the direction in which the moving subject exists, and the comparison result is notified. A moving subject can therefore be tracked without forcing a focal length the photographer did not intend.

  In the embodiment of the present invention, the presence of a moving subject is determined while the camera is correctly held, so its presence can be determined accurately. Further, when a moving subject is found, its moving direction is determined and displayed, so that even if the photographer momentarily loses sight of it, the camera can be pointed in the direction in which the moving subject exists and the subject can be recaptured in a short time.

  Furthermore, in the embodiment of the present invention, the determination of whether a moving subject exists is performed with the camera fixed in space (that is, at the position where the camera is correctly held), so the presence of the moving subject can be determined accurately. In addition, since the moving speed of the moving subject is measured with reference to this point in time, the movement position of the moving subject can be predicted accurately.

  In the embodiment of the present invention, still image shooting was described as an example, but moving images may also be shot. In that case, if a moving subject exists during moving image shooting, the moving direction is indicated as in the embodiment, and when the predicted position is reached, a visual indication may be given on the display unit 8.

  In the embodiment of the present invention, the movement position of a moving subject is predicted based on the time required to move from the initial position to the edge of the screen and the distance between them. However, the movement position may instead be predicted based on the time required to move between two arbitrary positions on the screen and the distance between them.

  Furthermore, in the embodiment of the present invention, whether the camera is correctly held (see FIG. 13) is determined from three conditions: the composition, the presence or absence of vibration, and a touch at a predetermined position. The determination may instead use only one or two of these three conditions. Further, when the predicted position is reached, notification is given by three methods, namely vibration, visual display, and auditory indication, but any one of them, or any combination, may be used.

  Furthermore, in the embodiment of the present invention, a digital camera was used as the photographing device, but the camera may be a digital single-lens reflex camera or a compact digital camera, a moving-image camera such as a video camera or a movie camera, or a camera built into a mobile phone, a personal digital assistant (PDA), a game device, or the like.

  The present invention is not limited to the above-described embodiment as it is; at the implementation stage, the constituent elements can be modified and embodied without departing from the scope of the invention. Various inventions can also be formed by appropriately combining the plurality of constituent elements disclosed in the embodiment; for example, some of the constituent elements shown in the embodiment may be deleted. Furthermore, constituent elements across different embodiments may be combined as appropriate.

DESCRIPTION OF SYMBOLS: 1 ... image processing and control unit; 1a ... moving object determination unit; 1b ... position prediction unit; 1c ... motion determination unit; 2 ... imaging unit; 2a ... zoom lens; 3 ... vibration unit; 4 ... recording unit; 5 ... acceleration detection unit; 6 ... operation determination unit; 7 ... angular acceleration detection unit; 8 ... display unit; 8b ... touch panel; 9 ... clock unit; 10 ... camera; 15a-15j ... subject; 18a, 18b ... notification display; 50 ... acceleration sensor; 51 ... metal part; 51a ... base point; 51b ... bridge part; 51c ... detection part; 52 ... metal part; 55 ... angular acceleration sensor; 56 ... piezoelectric ceramic element; 61, 62 ... arrows; 80 ... liquid crystal; 80a ... backlight; 80b ... optical sensor; 81 ... finger

Claims (6)

  1. A camera comprising:
    a photographing lens for forming a subject image;
    an imaging unit that captures the subject image formed by the photographing lens and outputs image data;
    a display unit that displays the subject image based on the image data from the imaging unit;
    a focal length output unit that outputs focal length information of the photographing lens;
    a subject movement determination unit that determines movement of a subject based on the image data from the imaging unit;
    a subject movement position prediction unit that predicts a subject movement position of the moving subject when the subject movement determination unit detects movement of the subject;
    a camera movement detection unit that detects movement of the camera body;
    a camera movement determination unit that determines, based on the detection result of the camera movement detection unit, whether or not the camera body has moved so that the captured image is positioned at the subject movement position predicted by the subject movement position prediction unit;
    a notification unit that notifies that the camera movement determination unit has determined that the camera body has moved to the subject movement position; and
    a holding determination unit that determines that the camera body has not vibrated for a predetermined time and/or that a predetermined position is held,
    wherein the subject movement determination unit determines whether or not the subject moves based on the image data from the imaging unit when the holding determination unit determines that the camera is held.
  2. The camera according to claim 1, wherein the subject movement position prediction unit predicts the movement position based on the time and the distance over which the subject moves within the screen.
  3. The camera according to claim 1, wherein the camera movement determination unit makes the determination based on the movement position predicted by the subject movement position prediction unit and on the movement direction, movement amount, and movement time detected by the camera movement detection unit.
  4. The camera according to claim 1, wherein the notification unit displays on the display unit the direction in which the moving body exists.
  5. A camera comprising:
    an imaging unit that converts a subject image into image data;
    a moving object determination unit that determines whether a moving object exists based on the image data;
    a position prediction unit that predicts the current position of the moving object based on the moving speed and moving direction determined by the moving object determination unit;
    a motion determination unit that detects the movement of the camera and determines whether the camera has reached the current position;
    a notification unit that notifies the determination result of the motion determination unit; and
    a holding determination unit that determines that the camera body has not vibrated for a predetermined time and/or that a predetermined position is held,
    wherein the moving object determination unit determines whether or not the moving object exists based on the image data from the imaging unit when the holding determination unit determines that the camera is held.
  6. A subject tracking method comprising:
    converting a subject image into image data with an imaging unit;
    displaying the subject image on a display unit based on the image data;
    determining whether or not the camera is held based on whether or not the camera body has vibrated for a predetermined time and/or whether or not a predetermined position is held;
    determining, when it is determined that the camera is held, a moving body that moves within the screen of the display unit;
    determining the moving direction of the moving body; and
    comparing, once it is determined that the camera is held, the change in the shooting direction of the camera with the direction in which the moving body exists, and announcing the comparison result.
JP2009073125A 2009-03-25 2009-03-25 Camera and subject tracking method Active JP5325626B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009073125A JP5325626B2 (en) 2009-03-25 2009-03-25 Camera and subject tracking method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2009073125A JP5325626B2 (en) 2009-03-25 2009-03-25 Camera and subject tracking method

Publications (3)

Publication Number Publication Date
JP2010226554A (en) 2010-10-07
JP2010226554A5 (en) 2012-05-10
JP5325626B2 (en) 2013-10-23

Family

ID=43043237

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009073125A Active JP5325626B2 (en) 2009-03-25 2009-03-25 Camera and subject tracking method

Country Status (1)

Country Link
JP (1) JP5325626B2 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4970716B2 (en) * 2004-09-01 2012-07-11 株式会社ニコン Electronic camera
JP4586709B2 (en) * 2005-11-02 2010-11-24 オムロン株式会社 Imaging device

Also Published As

Publication number Publication date
JP2010226554A (en) 2010-10-07


Legal Events

Date / Code / Title / Description

2012-03-15  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2012-03-15  A621  Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2013-03-19  A977  Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2013-04-10  A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2013-05-29  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
TRDD  Decision of grant or rejection written
2013-06-24  A01  Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2013-07-22  A61  First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
S111  Request for change of ownership or part of ownership (JAPANESE INTERMEDIATE CODE: R313111)
R350  Written notification of registration of transfer (JAPANESE INTERMEDIATE CODE: R350)
S531  Written request for registration of change of domicile (JAPANESE INTERMEDIATE CODE: R313531)
R350  Written notification of registration of transfer (JAPANESE INTERMEDIATE CODE: R350)