US9554035B2 - Image pickup device, method of controlling image pickup device, and computer program for automatically achieving composition specified by user - Google Patents
- Publication number: US9554035B2 (application US14/763,091)
- Authority: US (United States)
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N5/23222
- G03B9/64 — Mechanism for delaying opening of shutter
- H04N23/64 — Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
- H04N23/62 — Control of parameters via user interfaces
- H04N23/73 — Circuitry for compensating brightness variation in the scene by influencing the exposure time
- H04N5/23216
- H04N5/2353
Definitions
- the present disclosure relates to an image pickup device, a method of controlling an image pickup device, and a computer program.
- Patent Literature 1 discloses a technology in which a digital camera detects a composition to automatically execute image pickup processing and, when a composition specified by a user is detected, the detection is notified to the user.
- Patent Literature 2 discloses a technology in which an image inputted from an optical finder is displayed on the screen of a digital camera in order to prevent a photographing failure caused by the time lag after the shutter release button is pressed but before the operation of exposing the image pickup element is started.
- Patent Literature 1: JP 2011-139498A
- Patent Literature 2: JP 2012-99984A
- the present disclosure provides an image pickup device, a method of controlling the image pickup device, and a computer program, each of which is novel and improved and is capable of executing detection of a composition and automatic image pickup processing in consideration of a time lag from the detection of the composition to start of image pickup operation.
- an image pickup device including: a composition detection unit configured to calculate a time at which an object that is specified by a user and is included in a captured image achieves a composition specified by the user; a time calculation unit configured to calculate a time after start instruction of image pickup operation is issued but before an image is captured; and an image pickup control unit configured to start image pickup processing of the image in response to the start instruction of the image pickup operation.
- The composition detection unit issues the start instruction of the image pickup operation to the image pickup control unit earlier, by the time calculated by the time calculation unit, than the time at which the composition specified by the user is achieved.
- a method of controlling an image pickup device including: calculating a time at which an object that is specified by a user and is included in a captured image achieves a composition specified by the user; calculating a time after start instruction of image pickup operation is issued but before an image is captured; and starting image pickup processing of the image in response to the start instruction of the image pickup operation.
- The start instruction of the image pickup operation is issued earlier, by the calculated time, than the time at which the composition specified by the user is achieved.
- a computer program that causes a computer to: calculate a time at which an object that is specified by a user and is included in a captured image achieves a composition specified by the user; calculate a time after start instruction of image pickup operation is issued but before an image is captured; and start image pickup processing of the image in response to the start instruction of the image pickup operation.
- The start instruction of the image pickup operation is issued earlier, by the calculated time, than the time at which the composition specified by the user is achieved.
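The timing rule shared by the device, method, and program claims above can be sketched in a few lines. This is a minimal illustration, not part of the disclosure; the function name and the use of milliseconds are assumptions:

```python
def schedule_trigger(t_composition_ms: float, t_lag_ms: float) -> float:
    """Time (ms) at which the start instruction should be issued.

    The instruction is issued earlier than the predicted composition
    time by exactly the calculated trigger-to-capture lag, so that the
    image is captured at the instant the composition is achieved.
    """
    return t_composition_ms - t_lag_ms

# Composition predicted at t = 900 ms with a 20 ms lag -> trigger at 880 ms.
trigger_time = schedule_trigger(900.0, 20.0)
```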
- As described above, according to the present disclosure, it is possible to provide an image pickup device, a method of controlling the image pickup device, and a computer program, each of which is novel and improved and is capable of executing detection of a composition and automatic image pickup processing in consideration of the time lag from the detection of the composition to the start of image pickup operation.
- FIG. 1 is an explanatory diagram illustrating an external appearance example of an image pickup device 100 according to an embodiment of the present disclosure, which illustrates a perspective view seen from a back surface side of the image pickup device 100 .
- FIG. 2 is an explanatory diagram illustrating a functional configuration example of the image pickup device 100 according to an embodiment of the present disclosure.
- FIG. 3 is an explanatory diagram illustrating a functional configuration example of a control unit 110 included in the image pickup device 100 according to an embodiment of the present disclosure.
- FIG. 4 is a flow chart illustrating an operation example of the image pickup device 100 according to an embodiment of the present disclosure.
- FIG. 5 is a flow chart illustrating an operation example of the image pickup device 100 according to an embodiment of the present disclosure.
- FIG. 6 is an explanatory diagram illustrating an example of a screen displayed in a display unit 120 .
- FIG. 7 is an explanatory diagram illustrating an example of a screen displayed in the display unit 120 .
- FIG. 8 is an explanatory diagram for describing processing for calculating a moving direction and a moving speed of a feature point.
- FIG. 9 is an explanatory diagram for describing calculation processing of a time.
- FIG. 10 is an explanatory diagram for describing calculation processing of a time lag.
- FIG. 11 is an explanatory diagram for describing a case where a wider range is specified by a user who uses the image pickup device 100 .
- FIG. 1 is an explanatory diagram illustrating an external appearance example of an image pickup device 100 according to the embodiment of the present disclosure, which illustrates a perspective view seen from a back surface side of the image pickup device 100 .
- Referring to FIG. 1 , the external appearance example of the image pickup device 100 according to the embodiment of the present disclosure will be described.
- the image pickup device 100 includes a display unit 120 and an operation unit 130 in a housing 101 .
- the display unit 120 displays an image captured by the image pickup device 100 or displays various kinds of setting screens of the image pickup device 100 .
- a touch panel is provided in the display unit 120 , and a user who uses the image pickup device 100 can operate the image pickup device 100 by touching the touch panel provided in the display unit 120 with an operation member such as a finger.
- the operation unit 130 causes a user to operate the image pickup device 100 and includes a button, a switch, and the like for operating the image pickup device 100 .
- FIG. 1 illustrates, as the operation unit 130 , a zoom button 131 , a shutter button 132 , and a power button 133 .
- the zoom button 131 is a button for changing a magnification at the time of image pickup in the image pickup device 100 .
- the shutter button 132 is a button for capturing an image in the image pickup device 100 .
- the power button 133 is a button for turning on/off a power source of the image pickup device 100 .
- the external appearance of the image pickup device 100 is not limited to the above example. It is also needless to say that the button and the switch constituting the operation unit 130 are not limited to those illustrated in FIG. 1 .
- the image pickup device 100 automatically starts image pickup operation when an object specified by a user reaches a position specified by the user.
- the image pickup device 100 according to the embodiment of the present disclosure considers a time lag from detection of a composition to start of the image pickup operation. By considering the time lag from the detection of the composition to the start of the image pickup operation, the image pickup device 100 according to the embodiment of the present disclosure can achieve image pickup in the composition intended by the user.
- the external appearance of the image pickup device 100 according to the embodiment of the present disclosure illustrated in FIG. 1 is merely an example and, as the external appearance of the image pickup device 100 , not only the external appearance illustrated in FIG. 1 but also various forms can be employed.
- FIG. 2 is an explanatory diagram illustrating the functional configuration example of the image pickup device 100 according to the embodiment of the present disclosure.
- Referring to FIG. 2 , the functional configuration example of the image pickup device 100 according to the embodiment of the present disclosure will be described.
- the image pickup device 100 includes an image pickup unit 102 , a control unit 110 , the display unit 120 , the operation unit 130 , a flash memory 140 , and a RAM 150 .
- The image pickup unit 102 includes, for example, a lens, an imager including a solid-state image pickup element such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor, a timing generator that controls, for example, the exposure timing of the image sensor, a sample and hold circuit, and an interface unit that provides raw data of an image obtained by exposing the imager to the subsequent circuit.
- the control unit 110 controls operation of the image pickup device 100 .
- the control unit 110 may control the operation of the image pickup device 100 by, for example, reading computer programs recorded in the flash memory 140 and sequentially executing the computer programs.
- a specific configuration example of the control unit 110 will be described below.
- the display unit 120 displays an image captured by the image pickup device 100 with the use of the image pickup unit 102 and displays various kinds of setting screens of the image pickup device 100 .
- the display unit 120 includes a display panel 121 and a touch panel 122 .
- the display panel 121 displays an image captured by the image pickup device 100 and displays various kinds of setting screens of the image pickup device 100 and includes, for example, a flat display panel such as a liquid crystal display panel or an organic electroluminescence display panel.
- The touch panel 122 is provided in a display surface of the display panel 121 . A user can operate the image pickup device 100 by touching the touch panel 122 with an operation member such as a finger. Accordingly, the control unit 110 executes various kinds of processing in response to the state of touching of the touch panel 122 by the operation member.
- the operation unit 130 causes a user to operate the image pickup device 100 and includes the button, the switch, and the like for operating the image pickup device 100 .
- the control unit 110 executes various kinds of processing in response to an operation state of the operation unit 130 . Examples of the various kinds of processing executed by the control unit 110 in response to the operation state of the operation unit 130 include processing of turning on/off the power source of the image pickup device 100 , processing of changing magnification at the time of image pickup, processing of changing other image pickup conditions, and processing of capturing a still image or a moving image.
- the flash memory 140 is a nonvolatile memory in which various kinds of computer programs needed to perform the processing of the control unit 110 and various kinds of data are stored.
- the RAM 150 is a working memory used at the time of the processing of the control unit 110 .
- The control unit 110 , the display unit 120 , the operation unit 130 , the flash memory 140 , and the RAM 150 are connected to one another via a bus 160 and can communicate with one another.
- FIG. 3 is an explanatory diagram illustrating the functional configuration example of the control unit 110 included in the image pickup device 100 according to the embodiment of the present disclosure.
- Referring to FIG. 3 , the functional configuration example of the control unit 110 included in the image pickup device 100 according to the embodiment of the present disclosure will be described.
- the control unit 110 includes a composition detection unit 111 , a time lag calculation unit 112 , an image pickup control unit 113 , and a composition specification unit 114 .
- The composition detection unit 111 detects whether or not an image captured by the image pickup unit 102 has a composition intended by a user. In this embodiment, the composition detection unit 111 does so by detecting whether or not a feature point of an object specified by the user is positioned at a location specified by the user on the display unit 120 . In a case where the feature point of the object specified by the user exists in the image captured by the image pickup unit 102 , the composition detection unit 111 calculates the time it takes for the feature point to reach the location specified by the user on the display unit 120 .
- When the composition detection unit 111 detects that the feature point of the object specified by the user exists at the location specified by the user on the display unit 120 , it issues a trigger for automatically starting the image pickup operation to the image pickup control unit 113 (described below).
- After the composition detection unit 111 issues the trigger to the image pickup control unit 113 but before the image pickup control unit 113 starts image pickup processing, there exists a time lag caused by hardware and software factors of the image pickup device 100 and by the environment at the time of image pickup. Therefore, if the composition detection unit 111 issued the trigger only once the feature point of the object specified by the user already existed at the location specified by the user on the display unit 120 , an image having the composition intended by the user could not be obtained for a moving object.
- the time lag calculation unit 112 calculates a time after the composition detection unit 111 issues the trigger to the image pickup control unit 113 but before the image pickup control unit 113 starts the image pickup processing. Then, the composition detection unit 111 issues the trigger for automatically starting the image pickup operation to the image pickup control unit 113 in consideration of the time calculated by the time lag calculation unit 112 .
- the image pickup device 100 can obtain an image having the composition intended by the user by considering the time after the composition detection unit 111 issues the trigger to the image pickup control unit 113 but before the image pickup control unit 113 starts the image pickup processing.
- the time lag calculation unit 112 calculates the time (time lag) after the composition detection unit 111 issues the trigger to the image pickup control unit 113 but before the image pickup control unit 113 starts the image pickup processing.
- the time lag is caused by the hardware factor and the software factor of the image pickup device 100 and the environment at the time of the image pickup.
- the time lag calculation unit 112 calculates the time (time lag) after the composition detection unit 111 issues the trigger to the image pickup control unit 113 but before the image pickup control unit 113 starts the image pickup processing, the time being caused by the above factors causing the time lag, and notifies the calculated time to the composition detection unit 111 .
- the composition detection unit 111 issues the trigger to the image pickup control unit 113 in consideration of the time calculated by the time lag calculation unit 112 .
- The image pickup control unit 113 executes the image pickup processing in response to a press of the shutter button 132 by the user or to reception of the trigger for starting the image pickup operation from the composition detection unit 111 .
- the image pickup control unit 113 instructs the image pickup unit 102 to acquire a captured image.
- the captured image acquired in response to the instruction of the image pickup control unit 113 is stored in the flash memory 140 or is displayed in the display unit 120 .
- The composition specification unit 114 causes a user to specify an arbitrary composition. Information on the composition that the composition specification unit 114 causes the user to specify is stored in the flash memory 140 . In a case where the composition specification unit 114 causes the user to specify the composition, it causes the user to specify a target object (or a feature point of the object) and a position where the user wants the object to appear in an image.
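The information that the composition specification unit 114 stores could be represented, for example, as a small record pairing a feature descriptor with a target position. The field names and the normalized-coordinate convention here are hypothetical, chosen only for illustration:

```python
from dataclasses import dataclass


@dataclass
class CompositionSpec:
    """User-specified composition: which object, and where it should appear."""
    feature_point: str      # descriptor of the target object (hypothetical name)
    target_position: tuple  # (x, y) position on the screen, normalized to 0..1

# e.g. the user wants the shinkansen body slightly right of center
spec = CompositionSpec(feature_point="shinkansen_body",
                       target_position=(0.7, 0.5))
```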
- FIG. 4 is a flow chart illustrating the operation example of the image pickup device 100 according to the embodiment of the present disclosure.
- FIG. 4 illustrates the operation example of the image pickup device 100 performed in a case where content of the image captured by the image pickup unit 102 is detected by the control unit 110 and the image pickup operation is automatically executed.
- Referring to FIG. 4 , the operation example of the image pickup device 100 according to the embodiment of the present disclosure will be described.
- the image pickup device 100 causes a user to specify an arbitrary composition (Step S 101 ).
- This specification of the composition in Step S 101 is executed by, for example, the composition specification unit 114 .
- the image pickup device 100 causes the user to specify the arbitrary composition in Step S 101 and then stores the specified composition (Step S 102 ).
- Step S 102 is executed by, for example, storing, in the flash memory 140 , information on the composition that the composition specification unit 114 has caused the user to specify.
- FIG. 5 is a flow chart illustrating an operation example of the image pickup device 100 according to the embodiment of the present disclosure.
- FIG. 5 illustrates specification processing of the composition in Step S 101 of FIG. 4 in more detail.
- the composition specification unit 114 causes a user to specify a feature point of a target object (Step S 111 ).
- FIG. 6 is an explanatory diagram illustrating an example of a screen displayed in the display unit 120 in a case where the feature point of the target object is specified by the user in Step S 111 .
- FIG. 6 illustrates, from the upper left in a clockwise direction, the body of a shinkansen (high-speed train), a human face, a cat's face, the moon, a bird's face, and the body of an airplane.
- the feature points of the object to be specified by the user are not limited to the above examples.
- the image pickup device 100 may prepare examples of feature points of objects in advance and cause a user to select one of the examples or may cause a user to specify a feature point of an arbitrary substance.
- FIG. 6 illustrates a state in which the user selects the body of the shinkansen as the object, and the body of the shinkansen is surrounded by a frame line 171 .
- the feature point of the object is specified by causing the user to touch the display unit 120 with his/her finger.
- a method of specifying the feature point of the target object is not limited to the above example, and the feature point of the object may be specified by causing the user to operate the operation unit 130 .
- FIG. 7 is an explanatory diagram illustrating an example of a screen displayed in the display unit 120 when the composition is specified by the user in Step S 112 .
- FIG. 7 illustrates the example of the screen that is displayed in the display unit 120 in a case where the user selects the body of the shinkansen as the feature point of the object and specifies a position of the body of the shinkansen.
- the composition is specified by causing the user to touch the display unit 120 with his/her finger.
- FIG. 7 illustrates a frame line 172 for specifying the composition.
- a method of specifying the composition is not limited to the above example, and the composition may be specified by causing the user to operate the operation unit 130 .
- the arbitrary composition is specified by the user as described above, and then, for example, the composition specification unit 114 stores the specified composition in the flash memory 140 (Step S 102 ).
- The composition specified by the user is stored in Step S 102 , and then the image pickup device 100 executes feature point detection processing to determine whether or not the feature point specified by the user is included in the image captured by the image pickup unit 102 (Step S 103 ).
- the feature point detection processing is executed by, for example, the composition detection unit 111 .
- The feature point detection processing is executed in Step S 103 , and then the image pickup device 100 determines whether or not the feature point specified by the user is included in the image captured by the image pickup unit 102 (Step S 104 ).
- This determination processing in Step S 104 is executed by, for example, the composition detection unit 111 .
- As a result of the determination in Step S 104 , in a case where the feature point specified by the user is not included in the image captured by the image pickup unit 102 , processing returns to Step S 103 , and the image pickup device 100 executes the feature point detection processing again. Meanwhile, in a case where the feature point specified by the user is included in the image, the image pickup device 100 calculates a moving direction and a moving speed of the detected feature point (Step S 105 ). The processing for calculating the moving direction and the moving speed of the detected feature point in Step S 105 is executed by, for example, the composition detection unit 111 .
- FIG. 8 is an explanatory diagram for describing the processing for calculating the moving direction and the moving speed of the detected feature point in Step S 105 .
- the composition detection unit 111 can calculate the moving direction of the feature point by detecting that the feature point moves in a direction from upper left to lower right of an image.
- the composition detection unit 111 can calculate the moving speed of the feature point.
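The calculation in Step S 105 can be sketched as estimating a direction vector and a speed from the feature point's positions in two successive frames. Pixel coordinates and a fixed frame interval are assumed here; the patent does not specify a representation:

```python
import math


def motion_from_frames(p0, p1, frame_interval_ms):
    """Estimate the feature point's moving direction (unit vector) and
    speed (pixels per millisecond) from two successive frame positions."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0), 0.0  # the feature point did not move
    return (dx / dist, dy / dist), dist / frame_interval_ms

# Feature point drifting from the upper left toward the lower right.
direction, speed = motion_from_frames((100.0, 80.0), (130.0, 120.0), 33.3)
```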
- The moving direction and the moving speed of the detected feature point are calculated in Step S 105 , and then, in a case where the detected feature point moves in a straight line, the image pickup device 100 calculates the time at which the detected feature point reaches the position specified by the user (Step S 106 ).
- This calculation processing of the time in Step S 106 is executed by, for example, the composition detection unit 111 .
- FIG. 9 is an explanatory diagram for describing the calculation processing of the time in Step S 106 .
- For example, the composition detection unit 111 calculates, on the basis of the moving speed of the feature point, that the feature point will reach the position specified by the user at a time T 1 , 900 milliseconds later.
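Under the straight-line assumption, the reach-time calculation in Step S 106 reduces to distance divided by speed. A sketch, with units assumed to be pixels and milliseconds:

```python
import math


def time_to_target_ms(current_pos, target_pos, speed_px_per_ms):
    """Predicted time (ms) until the feature point reaches the specified
    position, assuming constant speed along a straight line."""
    dist = math.hypot(target_pos[0] - current_pos[0],
                      target_pos[1] - current_pos[1])
    return dist / speed_px_per_ms

# 450 px from the specified position at 0.5 px/ms -> reached 900 ms later.
t1 = time_to_target_ms((0.0, 0.0), (450.0, 0.0), 0.5)
```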
- The time at which the feature point reaches the position specified by the user is calculated in Step S 106 , and then the image pickup device 100 calculates the time lag after the trigger for starting image pickup is issued but before the image pickup processing is started, thereby determining a release correction time (Step S 107 ).
- the release correction time is a time corresponding to a difference between the time at which the feature point reaches the position specified by the user and a time at which the trigger for starting the image pickup is actually issued.
- This calculation processing of the time lag in Step S 107 is executed by, for example, the time lag calculation unit 112 .
- examples of the factors causing the time lag include the inherent processing speed of the image pickup device 100 , the driving speed of the shutter curtain and the aperture of the image pickup device 100 , the inherent driving speed of the shutter curtain and the aperture of the lens, the processing speed that is dynamically changed depending on the control state of the image pickup device 100 , and the processing speed that is dynamically changed depending on the control state of the lens.
- The time lag derived from the state of the image pickup device 100 or the lens is influenced by the processing capability of the image pickup device 100 .
- the time lag calculation unit 112 calculates, in Step S 107 , the time after the composition detection unit 111 issues the trigger to the image pickup control unit 113 but before the image pickup control unit 113 starts the image pickup processing, the time being caused by the above factors, and notifies the calculated time to the composition detection unit 111 .
- FIG. 10 is an explanatory diagram for describing calculation processing of the time lag in Step S 107 .
- the time lag calculation unit 112 determines that the time lag after the trigger for starting the image pickup is issued but before the image pickup operation is started is 20 milliseconds and notifies information on the time lag to the composition detection unit 111 .
- The time lag calculation unit 112 may obtain the time lag by calculation when the feature point of the object is detected; however, in a case where the processing capability of the image pickup device 100 is not high, the time lag calculation unit 112 may instead refer to a fixed value held in advance on the basis of the setting state of the image pickup device 100 .
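This fallback behavior can be sketched as follows: prefer a dynamically computed lag when one is available, otherwise look up a pre-held fixed value keyed by the setting state. The table keys and millisecond values below are invented purely for illustration:

```python
# Invented lag table (ms) keyed by a (shutter mode, drive mode) setting state.
FIXED_LAG_MS = {
    ("mechanical_shutter", "single"): 20.0,
    ("electronic_shutter", "single"): 5.0,
}


def release_lag_ms(shutter_mode, drive_mode, measured_ms=None):
    """Prefer a dynamically calculated lag when available; otherwise fall
    back to the fixed value held in advance for the current settings."""
    if measured_ms is not None:
        return measured_ms
    return FIXED_LAG_MS[(shutter_mode, drive_mode)]
```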
- The release correction time is determined in Step S 107 , and then the image pickup device 100 determines whether or not the release timing has been reached on the basis of the time calculated in Step S 106 and the release correction time determined in Step S 107 (Step S 108 ). This determination in Step S 108 is executed by, for example, the composition detection unit 111 .
- For example, in a case where the time T 1 at which the feature point reaches the position specified by the user is calculated in Step S 106 and the time lag (release correction time) after the trigger for starting the image pickup is issued but before the image pickup operation is started is calculated to be 20 milliseconds in Step S 107 , the image pickup device 100 issues the trigger for starting the image pickup 20 milliseconds before the time T 1 , and it is therefore possible to capture an image in which the feature point exists at the position specified by the user.
- As a result of the determination in Step S 108 , in a case where it is determined that the release timing has not been reached, the image pickup device 100 executes the feature point detection processing in Step S 103 again. Meanwhile, in a case where it is determined that the release timing has been reached, the image pickup device 100 performs release by issuing the trigger for starting the image pickup (Step S 109 ). The trigger for starting the image pickup is issued to the image pickup control unit 113 by the composition detection unit 111 .
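Steps S 103 to S 109 amount to polling until the corrected release timing arrives. A minimal sketch of the check, with illustrative timeline values (the polling loop and timestamps are assumptions, not the disclosed implementation):

```python
def should_release(now_ms, t_composition_ms, lag_ms):
    """True once the corrected release timing (the predicted composition
    time minus the trigger-to-capture lag) has been reached."""
    return now_ms >= t_composition_ms - lag_ms

# Polling as in Steps S 103 - S 109: keep re-detecting the feature point
# until the corrected timing arrives, then issue the trigger.
timeline_ms = [850.0, 870.0, 880.0, 890.0]
fired_at = next(t for t in timeline_ms if should_release(t, 900.0, 20.0))
```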
- the image pickup device 100 executes the automatic image pickup processing in consideration of the time lag after the trigger for starting the image pickup is issued but before the image pickup operation is started.
- By executing the automatic image pickup processing in consideration of the time lag, the image pickup device 100 according to the embodiment of the present disclosure can capture an image having the composition specified by the user, as the user intended.
- The image pickup device 100 may display the image captured by the release in the display unit 120 after the release is executed in Step S 109 and, in a case where the image is displayed, may superimpose on the image the release correction time and the position the object would have occupied if the release timing had not been corrected.
- For example, the image pickup device 100 may superimpose the release correction time, expressed as "20 msec" or the like, on the image.
- The image pickup device 100 may express the position of the object obtained when the release timing has not been corrected by shifting the object through image processing, or may express the position with an arbitrary mark or the like instead of the object.
- When the release correction time and the uncorrected object position are presented by superimposing them on the image, improvement in the user's release-timing skill with the image pickup device 100 can be expected.
- the image pickup device 100 calculates the time T 1 at which the feature point reaches the position specified by the user and the time lag after the trigger for starting the image pickup is issued but before the image pickup operation is started.
- The object to be captured by the user with the image pickup device 100 does not necessarily move in a straight line in the image and may move along a curve.
- Accordingly, a wider range may be specified instead of one position, i.e., instead of the so-called specification of a pinpoint position on the screen.
- The image pickup device 100 may cause the user to specify, in advance, not only the pinpoint position but also a range in which the object is assumed to move in a straight line, and, when it is detected that the object enters the range, the image pickup device 100 may calculate, as described above, the time T 1 at which the feature point reaches the position specified by the user and the time lag after the trigger for starting the image pickup is issued but before the image pickup operation is started.
- FIG. 11 is an explanatory diagram for describing a case where a wider range is specified by a user who uses the image pickup device 100 instead of specification of a pinpoint position.
- reference sign 173 in FIG. 11 denotes a frame line indicating the range specified by the user.
- the composition specification unit 114 may cause the user to specify the wider range, as compared with a case where the pinpoint position is specified.
- the image pickup device 100 may automatically determine whether to specify the pinpoint position or the wider range on the basis of motion of the feature point. For example, in a case where the feature point moves in a straight line for a certain period of time, the image pickup device 100 may execute the automatic image pickup processing with a pinpoint position specified.
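The straight-motion check underlying such an automatic determination could be sketched as below — a simple geometric test, not taken from the disclosure; the name `moves_straightly`, the tolerance value, and the sample tracks are all hypothetical:

```python
def moves_straightly(points, tolerance=2.0):
    """Decide whether recent feature-point samples (x, y) lie close to a
    straight line: take the line through the first and last samples and
    check the perpendicular deviation of every sample in between."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    length = (dx * dx + dy * dy) ** 0.5
    if length == 0:
        return False  # no net motion, nothing to extrapolate
    for (x, y) in points[1:-1]:
        # Perpendicular distance from the sample to the end-to-end line.
        deviation = abs(dy * (x - x0) - dx * (y - y0)) / length
        if deviation > tolerance:
            return False
    return True

# Nearly collinear track: a pinpoint position could be used.
straight = moves_straightly([(0, 0), (10, 10.5), (20, 20)])
# Strongly curved track: a wider range would be specified instead.
curved = moves_straightly([(0, 0), (10, 30), (20, 0)])
```

A device could run such a test over a short sliding window of tracked positions and fall back to the wider-range mode whenever the test fails.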
- the image pickup device 100 executes the automatic image pickup processing of the still image in consideration of the time lag after the trigger for starting the image pickup is issued but before the image pickup operation is started.
- the present disclosure is not limited to the above example.
- the image pickup device 100 may start image pickup operation of a moving image when an object reaches the vicinity of a location specified by a user and terminate the image pickup operation of the moving image when the object departs from the vicinity of the location specified by the user.
- the image pickup device 100 may extract a frame in which the object exists in the location specified by the user from the moving image captured as described above and record the frame as a still image.
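The frame-extraction step described above could be sketched as follows. This is an illustrative reduction under stated assumptions — the tracker already supplies an object position per frame, and the names `extract_best_frame`, `track`, and the sample coordinates are hypothetical:

```python
def extract_best_frame(frames, target):
    """From a recorded sequence of (frame_id, (x, y)) pairs, pick the
    frame whose tracked object lies closest to the user-specified
    target position, as the still-image candidate to record."""
    def squared_distance(entry):
        _, (x, y) = entry
        return (x - target[0]) ** 2 + (y - target[1]) ** 2
    frame_id, _ = min(frames, key=squared_distance)
    return frame_id

# Hypothetical track of the object through a short clip; target at (50, 50).
track = [(0, (10, 10)), (1, (35, 40)), (2, (52, 49)), (3, (80, 60))]
best = extract_best_frame(track, (50, 50))  # frame 2 is nearest the target
```

In practice this selection would run over the frames buffered between the object entering and leaving the vicinity of the specified location, and the chosen frame would be re-encoded as a still image.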
- as described above, the embodiment of the present disclosure provides the image pickup device 100, which executes automatic image pickup processing of a still image in consideration of a time lag after a trigger for starting image pickup is issued but before image pickup operation is started.
- the image pickup device 100 can capture a still image having a composition intended by a user.
- Steps in processes executed by devices in this specification are not necessarily executed chronologically in the order described in a sequence chart or a flow chart.
- Steps in processes executed by devices may be executed in a different order from the order described in a flow chart or may be executed in parallel.
- a computer program can be created which causes hardware such as a CPU, ROM, or RAM, incorporated in each of the devices, to function in a manner similar to that of structures in the above-described devices. Furthermore, it is possible to provide a recording medium having the computer program recorded thereon. Moreover, by configuring respective functional blocks shown in a functional block diagram as hardware, the hardware can achieve a series of processes.
- a device to which the present disclosure is applied is not limited to the above example. It is needless to say that, for example, the technology in the present disclosure can be similarly applied to a mobile phone, a game console, a personal computer, a tablet terminal, and other information processing devices, in each of which a camera is mounted.
- the present technology may also be configured as below.
- An image pickup device including:
- a composition detection unit configured to calculate a time at which an object that is specified by a user and is included in a captured image achieves a composition specified by the user
- a time calculation unit configured to calculate a time after a start instruction of image pickup operation is issued but before an image is captured
- an image pickup control unit configured to start image pickup processing of the image in response to the start instruction of the image pickup operation
- wherein the composition detection unit issues the start instruction of the image pickup operation to the image pickup control unit earlier, by the time calculated by the time calculation unit, than the time at which the composition specified by the user is achieved.
- wherein the time calculation unit calculates the time in consideration of a processing time that dynamically changes in accordance with a control state of the image pickup device.
- the image pickup control unit overlaps information on the time calculated by the time calculation unit on the image obtained by the image pickup operation executed based on the instruction from the composition detection unit.
- the image pickup device according to any one of (1) to (3),
- the image pickup control unit overlaps information on a position of the object obtained when the image pickup operation is executed at the time calculated by the composition detection unit on the image obtained by the image pickup operation executed based on the instruction from the composition detection unit.
- the image pickup control unit extracts the object and uses the object as the information on the position of the object.
- wherein the composition detection unit issues the start instruction of the image pickup operation to the image pickup control unit as long as the object is included in the captured image.
- wherein the composition detection unit changes, in accordance with a specification method of the composition by the user, a timing to start calculation of the time at which the composition specified by the user is achieved.
- a method of controlling an image pickup device including:
- a composition detection step of calculating a time at which an object that is specified by a user and is included in a captured image achieves a composition specified by the user
- wherein the start instruction of the image pickup operation is issued in the image pickup control step earlier, by the time calculated in the time calculation step, than the time at which the composition specified by the user is achieved.
- a computer program that causes a computer to perform:
- a composition detection step of calculating a time at which an object that is specified by a user and is included in a captured image achieves a composition specified by the user
- wherein the start instruction of the image pickup operation is issued in the image pickup control step earlier, by the time calculated in the time calculation step, than the time at which the composition specified by the user is achieved.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
Abstract
Description
- 100 image pickup device
- 102 image pickup unit
- 110 control unit
- 111 composition detection unit
- 112 time lag calculation unit
- 113 image pickup control unit
- 114 composition specification unit
- 120 display unit
- 130 operation unit
- 131 zoom button
- 132 shutter button
- 133 power button
- 140 flash memory
- 150 RAM
- 160 bus
Claims (11)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2013-022247 | 2013-02-07 | ||
| JP2013022247 | 2013-02-07 | ||
| PCT/JP2014/051219 WO2014122990A1 (en) | 2013-02-07 | 2014-01-22 | Imaging device, control method for imaging device, and computer program |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20150373260A1 (en) | 2015-12-24 |
| US9554035B2 (en) | 2017-01-24 |
Family
ID=51299587
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/763,091 (Active, US9554035B2) | Image pickup device, method of controlling image pickup device, and computer program for automatically achieving composition specified by user | 2013-02-07 | 2014-01-22 |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US9554035B2 (en) |
| WO (1) | WO2014122990A1 (en) |
Citations (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH10232418A (en) * | 1997-02-18 | 1998-09-02 | Canon Inc | Camera and anti-vibration control device |
| JP2002335436A (en) | 2001-05-08 | 2002-11-22 | Fuji Photo Film Co Ltd | Camera |
| JP2003222790A (en) * | 2002-01-31 | 2003-08-08 | Minolta Co Ltd | Camera |
| JP2005215373A (en) | 2004-01-30 | 2005-08-11 | Konica Minolta Photo Imaging Inc | Imaging apparatus |
| US20060132623A1 (en) * | 1998-08-21 | 2006-06-22 | Nikon Corporation | Electronic camera |
| US7415201B2 (en) * | 2003-09-30 | 2008-08-19 | Olympus Corporation | Auto focusing device for camera and method used in auto focusing device for camera for determining whether or not to emit auxiliary light |
| US20080199169A1 (en) * | 2007-02-19 | 2008-08-21 | Canon Kabushiki Kaisha | Camera and photographic lens |
| US20110141344A1 (en) * | 2007-04-04 | 2011-06-16 | Nikon Corporation | Digital camera |
| JP2011139498A (en) | 2011-02-14 | 2011-07-14 | Fujifilm Corp | Imaging device and control method thereof |
| US20110228128A1 (en) * | 2008-12-27 | 2011-09-22 | Canon Kabushiki Kaisha | Imaging apparatus and imaging method |
| JP2012099984A (en) | 2010-10-29 | 2012-05-24 | Fujifilm Corp | Imaging device, and display control method |
2014
- 2014-01-22 WO PCT/JP2014/051219 patent/WO2014122990A1/en not_active Ceased
- 2014-01-22 US US14/763,091 patent/US9554035B2/en active Active
Also Published As
| Publication number | Publication date |
|---|---|
| WO2014122990A1 (en) | 2014-08-14 |
| US20150373260A1 (en) | 2015-12-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9106836B2 (en) | Imaging apparatus, control method for the same, and recording medium, where continuous shooting or single shooting is performed based on touch | |
| US9007511B2 (en) | Imaging device, control method of imaging device, and computer program | |
| US20170094189A1 (en) | Electronic apparatus, imaging method, and non-transitory computer readable recording medium | |
| RU2014143020A (en) | DISPLAY MANAGEMENT DEVICE AND DISPLAY MANAGEMENT METHOD | |
| KR20170042491A (en) | Electronic apparatus and control method thereof | |
| JP6751620B2 (en) | Imaging device and its control method, program, and storage medium | |
| JP2015126326A (en) | Electronic apparatus and image processing method | |
| JP2013179536A (en) | Electronic apparatus and control method therefor | |
| CN105592260A (en) | Focusing method and apparatus, and terminal | |
| US8749688B2 (en) | Portable device, operating method, and computer-readable storage medium | |
| US9088762B2 (en) | Image capturing apparatus and control method thereof | |
| US20160057328A1 (en) | Electronic device, method and storage medium | |
| JP2020106770A (en) | Information processing device, information processing method, imaging device, and program | |
| JP5448868B2 (en) | IMAGING DEVICE AND IMAGING DEVICE CONTROL METHOD | |
| CN104580905A (en) | Photographing method and terminal | |
| US9554035B2 (en) | Image pickup device, method of controlling image pickup device, and computer program for automatically achieving composition specified by user | |
| JP6300569B2 (en) | Imaging apparatus and control method thereof | |
| JP5725894B2 (en) | Portable device, program, and driving method | |
| JP2019015752A (en) | Display control apparatus, control method, and program | |
| JP2018054762A5 (en) | ||
| JP6061972B2 (en) | Mobile device and control method | |
| JP2013214882A (en) | Imaging apparatus | |
| JP5957117B2 (en) | Portable device, display method and program | |
| JP6497887B2 (en) | IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, PROGRAM | |
| KR101662560B1 (en) | Apparatus and Method of Controlling Camera Shutter Executing Function-Configuration and Image-Shooting Simultaneously |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SUGITA, YUSUKE;REEL/FRAME:036174/0362 Effective date: 20150515 |
|
| FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |