WO2009053863A1 - Automatic timing of a photographic shot - Google Patents

Automatic timing of a photographic shot

Info

Publication number
WO2009053863A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
sensor
conditions
orientation
frame
Prior art date
Application number
PCT/IB2008/053431
Other languages
French (fr)
Inventor
Henrik Lars Johan Eliasson
Shuji Shimizu
Original Assignee
Sony Ericsson Mobile Communications Ab
Priority date
Filing date
Publication date
Application filed by Sony Ericsson Mobile Communications Ab
Publication of WO2009053863A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

A device may receive one or more signals from one or more sensors of a device, where the one or more signals include at least one of: a signal that indicates when the device is steady, a signal that indicates when the device is not moving, a signal that indicates when the device is in a particular orientation, or a signal that indicates when a subject image in a frame of an image is positioned in accordance with one or more rules of composition. In addition, the device may monitor the one or more signals to determine when a favorable condition exists for capturing the image and capture the image when it is determined that the favorable condition exists.

Description

AUTOMATIC TIMING OF A PHOTOGRAPHIC SHOT

BACKGROUND
Many of today's cameras have the ability to aid a photographer in focusing, white balancing, and/or adjusting shutter speed. For focusing, a camera may use ultrasound or infrared sensors to measure the distance between a subject and the camera. For white balancing, the camera may digitally modify a color component of a picture to improve its quality. For adjusting shutter speed, the camera may determine the optimal exposure of photoelectric sensors to light within the camera.
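As a concrete illustration of the white-balancing step just described, the sketch below applies a simple gray-world correction that digitally scales each color component so the channel averages match. This is a minimal, hypothetical sketch; the pixel layout and function name are assumptions, not anything specified in this document.

    # Hypothetical gray-world white balance: scale each color channel so
    # its mean matches the overall gray level, approximating the kind of
    # digital color modification described above.
    def gray_world_white_balance(pixels):
        """pixels: list of (r, g, b) tuples with values in 0..255."""
        n = len(pixels)
        avg_r = sum(p[0] for p in pixels) / n
        avg_g = sum(p[1] for p in pixels) / n
        avg_b = sum(p[2] for p in pixels) / n
        gray = (avg_r + avg_g + avg_b) / 3.0
        gains = (gray / max(avg_r, 1e-6),
                 gray / max(avg_g, 1e-6),
                 gray / max(avg_b, 1e-6))
        return [tuple(min(255, int(c * g)) for c, g in zip(px, gains))
                for px in pixels]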
SUMMARY

According to one aspect, a device may include a sensor to receive first information relating to a first set of conditions, where the first set of conditions comprises at least one of: steadiness of the device, movement of the device, a location of a subject image in a frame of an image, an orientation of the device, or one or more rules of composition. In addition, the device may include a processor to monitor the first set of conditions and automatically capture an image based on the first set of conditions.
Additionally, the sensor may be further configured to receive second information relating to a second set of conditions and the processor may be further configured to initiate the monitoring of the first set of conditions based on the second set of conditions.
Additionally, the second set of conditions may include at least one of: a sudden motion of the device, a change in an orientation of the device, or a change in an amount of light around the device.
Additionally, the second information may exclude information from an acoustic sensor, a mechanical sensor, a touch screen, a microphone, or a button.
Additionally, the first information may exclude information from an acoustic sensor, a mechanical sensor, a touch screen, a microphone, or a button.
Additionally, the processor may be further configured to terminate the monitoring of the first set of conditions after a particular length of time elapses.
Additionally, the processor may be further configured to crop the captured image in order to place the subject image in a specific location in the frame of the captured image, rotate the captured image to align horizontal features in the captured image to horizontal sides of the frame of the captured image, or resize the captured image.
Additionally, the sensor may include at least one of a motion sensor, an orientation sensor, an acoustic sensor, a mechanical sensor, or a light sensor. Additionally, the sensor may include at least one of an accelerometer, a gyroscope, a touch-screen, a microphone, an ultrasound sensor, an infrared sensor, a button for detecting a user input, a complementary metal-oxide-semiconductor (CMOS) sensor, or a charge-coupled device (CCD) sensor. Additionally, the processor may be further configured to receive user inputs that select the second set of conditions.
Additionally, the processor may be further configured to reduce noise in the image or sharpen the image.
Additionally, the device may further include at least one of a zoom lens or a wide-angle lens.
Additionally, the device may include a camera or a cell phone.
According to another aspect, a method may include receiving one or more signals from one or more sensors of a device, where the one or more signals include at least one of: a signal that indicates when the device is steady, a signal that indicates when the device is not moving, a signal that indicates when the device is in a particular orientation, or a signal that indicates when a subject image in a frame of an image is positioned in accordance with one or more rules of composition. In addition, the method may include monitoring the one or more signals to determine when a favorable condition exists for capturing the image and capturing the image when it is determined that the favorable condition exists. Additionally, the method may further include processing the captured image to improve a quality of the captured image.
Additionally, the method may further include starting the monitoring of the one or more signals based on at least one of: a press on a button of the device, a change in motion of the device, a change in orientation of the device, or a change in amount of light sensed by the device.
Additionally, monitoring the one or more signals may include at least one of determining if the device is oriented horizontally or vertically or determining if the subject image is fully within the frame.
Additionally, the method may further include receiving user inputs to select factors that are used to determine if the favorable condition exists.
According to yet another aspect, a device may include means for continually receiving information related to at least one of: steadiness of the device, movement of the device, a location of a subject image in a frame of an image, an orientation of the device; or one or more rules of composition. In addition, the device may include means for continually monitoring the means for continually receiving information and means for automatically capturing the image when the means for continually monitoring detects a favorable condition for capturing the image.
Additionally, the device may further include means for applying the one or more rules of composition to the captured image.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate one or more embodiments described herein and, together with the description, explain the embodiments. In the drawings:
Figs. 1A and 1B show an exemplary viewfinder/display of an exemplary device in which concepts described herein may be implemented;
Figs. 2A and 2B are front and rear views, respectively, of an exemplary device in which concepts described herein may be implemented;
Fig. 3 is a block diagram of exemplary components of the exemplary device of Figs. 2A and 2B;
Fig. 4 is a functional block diagram of the exemplary device of Figs. 2A and 2B;
Fig. 5 is a flow chart of an exemplary process for automatic timing of a photographic shot;
Figs. 6A and 6B are front and rear views, respectively, of another exemplary device in which concepts described herein may be implemented;
Fig. 7A is an illustration of a user taking a picture with the device of Figs. 2A and 2B; and
Figs. 7B and 7C show the viewfinder/display of the device in Figs. 2A and 2B.
DETAILED DESCRIPTION OF EMBODIMENTS

The following detailed description refers to the accompanying drawings. The same reference numbers in different drawings may identify the same or similar elements.
In implementations described herein, a device (e.g., a camera) may aid a user in taking pictures. When a user performs a certain movement, the device may automatically capture an image based on certain factors, such as steadiness of the device, lighting of a subject of the image, the spatial orientation of the device or the subject, the composition of the image (e.g., whether a face is present in the image or not), etc.
For example, assume that, at night, a user wishes to take a photographic shot of a person. Because the person is about to take a taxi, the user quickly pulls out a camera from a pocket, and directs the camera toward the person. Based on conditions such as steadiness of the camera, orientation of the camera, position of the person (or the person's face) in a frame, or other factors, the device determines an appropriate moment to take a picture.
Figs. 1A and 1B illustrate the above example. Fig. 1A shows a viewfinder/display 102 with a subject image 104, when the subject is under an unfavorable condition (e.g., a bad lighting condition). When the condition improves (e.g., the lighting condition improves), as illustrated in Fig. 1B, the camera may capture subject image 104.
For the user, timing a photographic shot to achieve a particular effect may sometimes be preferable to trying to achieve a similar effect by first capturing an image and then applying image processing techniques to the captured image. For example, suppose a camera times a shot of a moving subject, so that a subject image is centered within a frame. One may attempt to obtain a similar effect by first capturing an image, and then shifting the frame to cover a slightly different area of the captured image. Such a shift may not be possible, however, if the captured image is not large enough to fully cover the shifted frame.
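The geometric constraint in the preceding example can be stated precisely: a crop frame can only be shifted while it remains entirely within the captured image. The following sketch checks that feasibility condition; the coordinate conventions and names are invented for illustration.

    # Hypothetical feasibility check for the frame shift discussed above:
    # the shifted crop frame must stay entirely inside the captured image.
    def can_shift_frame(image_w, image_h, frame_w, frame_h,
                        frame_x, frame_y, dx, dy):
        """(frame_x, frame_y): top-left corner of the crop frame;
        (dx, dy): desired shift in pixels."""
        new_x, new_y = frame_x + dx, frame_y + dy
        return (0 <= new_x and new_x + frame_w <= image_w and
                0 <= new_y and new_y + frame_h <= image_h)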
The term "image," as used herein, may refer to a digital or an analog representation of visual information (e.g., a picture, a video, a photograph, animations, etc).
The term "camera," as used herein, may include a device that may capture images. For example, a digital camera may include an electronic device that may capture and store images electronically instead of using photographic film. A digital camera may be multifunctional, with some devices capable of recording sound and/or images. A "subject," as the term is used herein, is to be broadly interpreted to include any person, place, and/or thing capable of being captured as an image. The term "subject image" may refer to an image of a subject. The term "frame" may refer to a closed, often rectangular, border of lines or edges (physical or logical) that enclose the picture of a subject.
The term "rules of composition," as used herein, may refer to guidelines, techniques, or procedures that may be used to compose a particular image. The rules of composition may not only include well known rules in photography, such as the Rule of Thirds, but may also include ad-hoc rules, techniques, or procedures for adjusting images in a particular manner (e.g., aligning vertical lines in an image with vertical sides of a frame).
As used herein, the Rule of Thirds may refer to a rule of thumb in image composition. According to the rule, a frame of an image may be partitioned into nine equal parts, which may be used to align features of the image.
As used herein, the rules that are related to the Golden Ratio may refer to rules of thumb in image composition. According to these rules, a frame of an image may be partitioned into Golden Sections based on the Golden Ratio, which is approximately 1.618. The partitions or points on the partitions may be used to align features of the image.
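To make the two composition grids concrete, the sketch below computes the alignment points each rule yields for a given frame size. It is purely illustrative; the function names and the choice of returning the four interior intersection points are assumptions.

    # Illustrative composition grids: the Rule of Thirds places guide
    # lines at 1/3 and 2/3 of each dimension; Golden Ratio rules place
    # them at 1/phi (about 0.618) from each edge.
    PHI = (1 + 5 ** 0.5) / 2  # the Golden Ratio, ~1.618

    def rule_of_thirds_points(width, height):
        xs = (width / 3, 2 * width / 3)
        ys = (height / 3, 2 * height / 3)
        return [(x, y) for x in xs for y in ys]  # four "power points"

    def golden_section_points(width, height):
        xs = (width / PHI, width - width / PHI)
        ys = (height / PHI, height - height / PHI)
        return [(x, y) for x in xs for y in ys]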
EXEMPLARY DEVICE
Figs. 2A and 2B are front and rear views, respectively, of an exemplary device 200 in which concepts described herein may be implemented. In this implementation, device 200 may take the form of a camera (e.g., a standard 35 mm or digital camera). As shown in Figs. 2A and 2B, device 200 may include a button 202, a viewfinder/display 204, a lens assembly 206, sensors 208, a flash 210, and housing 212. Button 202 may permit the user to interact with device 200 to cause device 200 to perform one or more operations, such as taking a picture. Viewfinder/display 204 may provide visual information to the user, such as an image of a view, video images, pictures, etc. Lens assembly 206 may include a device for manipulating light rays from a given or a selected range, so that images in the range can be captured in a desired manner. Sensors 208 may collect and provide, to device 200, information (e.g., acoustic, infrared, etc.) that may be used to aid the user in capturing images. For example, sensors 208 may provide acoustic information that may be used for automatically focusing an image. Flash 210 may include any type of flash unit used in cameras and may provide illumination for taking pictures. Housing 212 may provide a casing for components of device 200 and may protect the components from outside elements.
Fig. 3 is a block diagram of exemplary components of device 200. The term "component," as used herein, may refer to a hardware component, a software component, or a combination of the two. As shown, device 200 may include memory 302, processing unit 304, viewfinder/display 306, lens assembly 308, sensors 310, and other input/output components 312. In other implementations, device 200 may include more, fewer, or different components.
Memory 302 may include static memory, such as read only memory (ROM), and/or dynamic memory, such as random access memory (RAM), or onboard cache, for storing data and machine-readable instructions. Memory 302 may also include storage devices, such as a floppy disk, CD ROM, CD read/write (R/W) disc, and/or flash memory, as well as other types of storage devices. Processing unit 304 may include a processor, a microprocessor, an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), and/or other processing logic capable of controlling device 200.
Viewfinder/display 306 may include a component that can display signals generated by device 200 as images on a screen and/or that can accept inputs in the form of taps or touches on the screen. For example, viewfinder/display 306 may provide a window through which the user may view images that are received from lens assembly 308. Examples of viewfinder/display 306 include an optical viewfinder (e.g., a reversed telescope), liquid crystal display (LCD), organic light-emitting diode (OLED) display, surface-conduction electron-emitter display (SED), plasma display, field emission display (FED), bistable display, and/or a touch screen. Lens assembly 308 may include a component for manipulating light rays from a given or a selected range, so that images in the range can be captured in a desired manner (e.g., a zoom lens, a wide-angle lens, etc.). Lens assembly 308 may be controlled manually and/or electromechanically by processing unit 304 to obtain the correct focus, span, and magnification (i.e., zoom) of the subject image and to provide a proper exposure. Sensors 310 may include one or more devices for obtaining information related to image, luminance, focus, zoom, sound, movement of device 200, and/or orientation of device 200. Sensors 310 may provide the information to processing unit 304, so that processing unit 304 may control lens assembly 308 and/or other components. Examples of sensors 310 may include a complementary metal-oxide-semiconductor (CMOS) sensor or a charge-coupled device (CCD) sensor for sensing light, a gyroscope for sensing the orientation of device 200, an accelerometer for sensing movement of device 200, an infrared signal sensor or an ultrasound sensor for measuring a distance from a subject to device 200, a microphone, etc. Other input/output components 312 may include components for converting physical events or phenomena to and/or from digital signals that pertain to device 200. Examples of other input/output components 312 may include a flash, button, mouse, speaker, microphone, Universal Serial Bus (USB) port, etc.
In other implementations, device 200 may include other components, such as a network interface. If included in device 200, the network interface may include any transceiver-like mechanism that enables device 200 to communicate with other devices and/or systems. For example, the network interface may include mechanisms for communicating via a network, such as the Internet, a terrestrial wireless network (e.g., wireless local area network (WLAN)), a satellite-based network, etc. Additionally or alternatively, the network interface may include a modem, an Ethernet interface to a local area network (LAN), and/or an interface/connection for connecting device 200 to other devices (e.g., a Bluetooth interface).
Fig. 4 is a functional block diagram of device 200. As shown, device 200 may include a database 402, image processing logic 404, composition logic 406, and/or shooting logic 408. Depending on the particular implementation, device 200 may include fewer, additional, or different types of functional blocks than those illustrated in Fig. 4.
Database 402 may be included in memory 302 (Fig. 3) and act as an information repository for the components of device 200. For example, in one implementation, database 402 may store or maintain images (e.g., pictures, video clips, etc.) that may be stored and/or accessed by image processing logic 404 and/or composition logic 406. In another example, database 402 may store or maintain audio files (e.g., audio clips, ring tones, etc.).
Image processing logic 404 may include hardware and/or software for processing images before/after the images are captured. For example, image processing logic 404 may apply a noise-reduction filter to improve the quality of images before/after the images are captured by device 200. In another example, image processing logic 404 may apply an antialiasing procedure to captured images.
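As one hypothetical example of such a filter, the sketch below applies a 3x3 mean filter to a grayscale image stored as a list of rows; a real device would more likely use a hardware-accelerated or edge-preserving filter.

    # Hypothetical 3x3 mean filter, a simple form of noise reduction.
    # Border pixels are left unchanged for brevity.
    def mean_filter_3x3(img):
        """img: list of rows of grayscale values (0..255)."""
        h, w = len(img), len(img[0])
        out = [row[:] for row in img]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                total = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
                out[y][x] = total // 9
        return out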
Composition logic 406 may include hardware and/or software for applying rules of composition to images before and/or after the images are captured. The application of the rules of composition may render the images more aesthetically pleasing to viewers. Some of the rules of composition that may be applied by composition logic 406 may include: aligning a vertical or horizontal feature within an image with horizontal/vertical sides of a frame of the image; aligning an image with the direction of gravity at the time the image is captured; centering a subject of an image; applying the Rule of Thirds; zooming in on a subject to increase the size of a subject image relative to the frame of the image; applying rules related to the Golden Ratio; etc. Depending on implementation, composition logic 406 may apply fewer, additional, or different rules than those listed above.
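One of the listed rules, aligning an image with the direction of gravity, can be sketched with a gravity vector such as an accelerometer might report. The axis convention and function name below are assumptions for illustration only.

    import math

    # Hypothetical roll-angle computation: given the gravity components in
    # the image plane (with (0, -g) meaning the device is held level), the
    # captured image should be rotated by this many degrees so that its
    # horizontal features align with the true horizon.
    def roll_correction_degrees(accel_x, accel_y):
        return math.degrees(math.atan2(accel_x, -accel_y))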
Shooting logic 408 may include hardware and/or software for capturing an image at a precise moment based on various sensor inputs and/or parameters that are related to the image. In one implementation, shooting logic 408 may determine whether a user wishes to use device 200 based on a number of factors. The factors may include: a change in motion of device 200; a change in the orientation of device 200; the amount of light that is detected by sensors 310 in device 200; etc. The motion/orientation of device 200 may be relevant, because, for example, the user may pull out device 200 from a pocket to capture an image and, thus, move or rotate device 200. A change in the amount of light that device 200 detects may indicate whether device 200 has been removed from an enclosed space for use (e.g., a pocket, a box, an encasing, etc.). Depending on implementation, device 200 may rely on other factors to determine whether the user intends to use device 200 to capture an image. Provided that shooting logic 408 determines the user wishes to use device 200, shooting logic 408 may automatically capture an image, depending on various factors. The factors that shooting logic 408 may use in determining a correct moment to capture an image may include a decrease in the movement of device 200, the amount of light that is detected by sensors 310, how much device 200 is shaking, the orientation of device 200, detection of a particular subject image (e.g., a face) within a frame associated with the subject image, etc.
A decrease in the movement of device 200 may indicate a user's attempt to bring device 200 to a still position in order to capture an image. The amount of light detected by sensors 310 may inform shooting logic 408 about the exposure that sensors 310 may incur in capturing the image. For example, if sensors 310 detect too much light, shooting logic 408 may delay taking a shot. Excessive shaking in device 200 may introduce blurs in the captured image, and therefore, may cause shooting logic 408 to delay taking the shot. Whether a subject (e.g., a face) is fully within a frame of the subject image may indicate if a user wishes to capture the image. In determining whether the subject image is within the frame, shooting logic 408 may employ a component of image processing logic 404 (e.g., a face recognition component).
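A compact way to picture this decision is a predicate over the current sensor readings. The sketch below is a hypothetical illustration; every threshold and field name in it is invented rather than taken from this document.

    from dataclasses import dataclass

    @dataclass
    class SensorReadings:
        motion: float           # recent movement magnitude
        shake: float            # high-frequency shake estimate
        light: float            # metered scene light level
        tilt_degrees: float     # deviation from a horizontal/vertical hold
        subject_in_frame: bool  # e.g., a face detector reports a full face

    # Hypothetical shooting-logic predicate: shoot only when the device is
    # still and level, light is in an acceptable range, and the subject is
    # fully within the frame. Thresholds are illustrative assumptions.
    def favorable_to_shoot(r: SensorReadings) -> bool:
        return (r.motion < 0.05 and
                r.shake < 0.02 and
                50.0 <= r.light <= 5000.0 and
                abs(r.tilt_degrees) < 5.0 and
                r.subject_in_frame)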
EXEMPLARY PROCESSES FOR AIDING IMAGE COMPOSITION AND/OR FRAMING
Fig. 5 shows an exemplary process 502 for automatic timing of a photographic shot. Process 502 may begin when user inputs are received for selecting the factors that device 200 may use to determine if a user wishes to use device 200 (block 504). As already described above, the factors may include: a change in motion or the orientation of device 200, the amount of light that is detected by sensors 310 in device 200, etc.
User inputs for selecting factors that may be used by device 200 to determine the timing of a shot may be received (block 506). The factors may include: how much device 200 is shaking or moving, the orientation of device 200, the amount of light that is detected by sensors 310, a detection of a subject image within a frame, etc. As described above with reference to shooting logic 408, these factors may be used by device 200 to determine the moment at which a subject image may be captured. In some implementations, device 200 may be configured to account for factors that are related to image composition. For example, device 200 may detect whether a subject image is centered in a frame, whether the subject image satisfies the Rule of Thirds, whether the subject image satisfies the rules related to the Golden Ratio, etc.
A motion and/or a change in the orientation of device 200, a change in lighting, a change in an input image, and/or a user input may be detected (block 508). The motion and/or the change in the orientation of device 200, the change in lighting, the change in the input image, and/or the user input may reveal a user's intent to use device 200 to capture an image.
For example, when a user wishes to take a picture, the user may quickly take device 200 out of a pocket. Consequently, the user may move device 200, or change the orientation of device 200. Furthermore, the user may expose device 200 to light when the user removes device 200 from the pocket. These factors may be detected by sensors 310 and/or processing unit 304 in device 200, and used to determine if the user wants to use device 200 to capture an image. In some implementations, device 200 may simply rely on a user input (e.g., pressing on button 202) to determine that the user wants to use device 200.
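Detection of this kind of intent can be sketched as a comparison of consecutive sensor snapshots; the thresholds and dictionary keys below are hypothetical values chosen only to illustrate the idea.

    # Hypothetical intent detection: a sudden jump in motion, orientation,
    # or ambient light (as when the device leaves a pocket) suggests the
    # user wants to take a picture. All thresholds are invented.
    def user_intends_to_shoot(prev, curr,
                              motion_jump=1.5,    # change in m/s^2
                              orient_jump=30.0,   # change in degrees
                              light_jump=100.0):  # change in lux
        """prev/curr: dicts with 'accel', 'orientation', 'light' keys."""
        return (abs(curr["accel"] - prev["accel"]) > motion_jump or
                abs(curr["orientation"] - prev["orientation"]) > orient_jump or
                curr["light"] - prev["light"] > light_jump)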
It may be determined if the user wishes to use device 200 to capture an image (block 510). If the changes that are detected at block 508 meet thresholds or if the changes meet requirements that may be inputted at block 504, device 200 may determine that the user wishes to use device 200. The motion/orientation, lighting, input image, and/or user input may be monitored (block 512).
If the motion/orientation, lighting, properties associated with the input images, and/or user input indicates a favorable condition for capturing the image, a shot may be taken (block 514). For example, assume that, at block 506, the user selects the motion/orientation, lighting, and properties that are associated with desirable images as factors that device 200 may account for in automatically timing a shot. In such a case, device 200 may take the shot when device 200 determines that device 200 is being held motionless for a moment (e.g., a second) after being removed from a pocket, that device 200 is being held horizontally or vertically, that there is sufficient light for capturing an image, and/or that a subject image is completely/mostly within the frame. If device 200 determines that a favorable condition for capturing the image is not present, process 502 may return to block 512 to continue monitoring the factors. Process 502 may repeat blocks 512-514, until a particular length of time elapses. In one implementation, the particular length of time may be inputted by the user at block 506.
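Blocks 512 and 514 amount to a polling loop with a timeout; a minimal sketch follows. The helper functions here are placeholders standing in for the sensor reads and the capture step, not real device APIs.

    import random
    import time

    def read_sensors():
        """Placeholder: would query sensors 310; returns fake readings."""
        return {"motion": random.random(), "light": random.uniform(0, 6000)}

    def favorable(readings):
        """Placeholder favorability test (compare the shooting-logic sketch)."""
        return readings["motion"] < 0.05 and 50 <= readings["light"] <= 5000

    def capture_image():
        """Placeholder: would trigger the actual capture."""
        return "image-data"

    # Blocks 512-514 of process 502: monitor the selected factors and take
    # the shot when a favorable condition appears, giving up after a
    # user-selected length of time (block 506).
    def auto_timed_shot(timeout_seconds=10.0, poll_interval=0.05):
        deadline = time.monotonic() + timeout_seconds
        while time.monotonic() < deadline:
            if favorable(read_sensors()):
                return capture_image()
            time.sleep(poll_interval)
        return None  # the particular length of time elapsed without a shot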
ALTERNATIVE IMPLEMENTATION

Figs. 6A and 6B are front and rear views, respectively, of another exemplary device 600 in which concepts described herein may be implemented. In the implementation shown, device 600 may include any of the following devices that have the ability to, or are adapted to, capture or process images (e.g., a video clip, a photograph, etc.): a telephone, such as a radio telephone or a mobile telephone; a personal communications system (PCS) terminal that may combine a cellular radiotelephone with data processing, facsimile, and/or data communications capabilities; an electronic notepad; a laptop; a personal computer (PC); a personal digital assistant (PDA) that can include a telephone; or another type of computational or communication device with the ability to process images. As shown, device 600 may include a speaker 602, a display 604, control buttons 606, a keypad 608, a microphone 610, sensors 612, a lens assembly 614, a flash 616, and housing 618. Speaker 602 may provide audible information to a user of device 600. Display 604 may provide visual information to the user, such as video images or pictures. Control buttons 606 may permit the user to interact with device 600 to cause device 600 to perform one or more operations, such as placing or receiving a telephone call. Keypad 608 may include a standard telephone keypad. Microphone 610 may receive audible information from the user. Sensors 612 may collect and provide, to device 600, information (e.g., acoustic, infrared, etc.) that is used to aid the user in capturing images. Lens assembly 614 may include a device for manipulating light rays from a given or a selected range, so that images in the range can be captured in a desired manner. Flash 616 may include any type of flash unit used in cameras and may provide illumination for taking pictures. Housing 618 may provide a casing for components of device 600 and may protect the components from outside elements.
EXAMPLE

The following example illustrates processes involved in automatic timing of a photographic shot, with reference to Figs. 7A through 7C. The example is consistent with exemplary process 502 described above with respect to Fig. 5.
In Fig. 7A, Maria 702 is at a costume party. Assume that Maria 702 has input various factors that device 200 may use in determining whether Maria wishes to use device 200 in timing a shot. Also, assume that Maria 702 wants to take a picture of her friends 704, who are about to leave. Unfortunately, Maria 702 broke her arm during her last ski trip to the Alps, and consequently, Maria 702 cannot always take pictures quickly.
Maria 702 quickly lifts device 200. As described above, device 200 may determine that Maria 702 wishes to use device 200 to take a picture, based on the various factors, such as a lighting condition, changes in its motion and orientation, etc.
Figs. 7B and 7C illustrate viewfinder/display 204 of device 200, as Maria quickly attempts to take a picture of friends 704. Because Maria 702's hand shakes, sometimes viewfinder/display 204 shows her friends as in Fig. 7B, and sometimes as in Fig. 7C.
When an image of her friends 704 appears as in Fig. 7C, Maria 702 manages to hold her hand steady for a moment. Device 200 takes a shot of Maria 702's friends 704, based on various factors that include the steadiness of device 200, etc. The next day, Maria 702 shows the picture to her friends 704, who compliment Maria 702 for her skill in taking pictures.

CONCLUSION
The foregoing description of implementations provides illustration, but is not intended to be exhaustive or to limit the implementations to the precise form disclosed. Modifications and variations are possible in light of the above teachings or may be acquired from practice of the teachings.
For example, while a series of blocks has been described with regard to an exemplary process illustrated in Fig. 5, the order of the blocks may be modified in other implementations. In addition, non-dependent blocks may represent acts that can be performed in parallel to other blocks. It will be apparent that aspects described herein may be implemented in many different forms of software, firmware, and hardware in the implementations illustrated in the figures. The actual software code or specialized control hardware used to implement aspects does not limit the invention. Thus, the operation and behavior of the aspects were described without reference to the specific software code - it being understood that software and control hardware can be designed to implement the aspects based on the description herein.
It should be emphasized that the term "comprises/comprising" when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components, or groups thereof. Further, certain portions of the implementations have been described as "logic" that performs one or more functions. This logic may include hardware, such as a processor, an application specific integrated circuit, or a field programmable gate array, software, or a combination of hardware and software.
Even though particular combinations of features are recited in the claims and/or disclosed in the specification, these combinations are not intended to limit the invention. In fact, many of these features may be combined in ways not specifically recited in the claims and/or disclosed in the specification.
No element, act, or instruction used in the present application should be construed as critical or essential to the implementations described herein unless explicitly described as such. Also, as used herein, the article "a" is intended to include one or more items. Where one item is intended, the term "one" or similar language is used. Further, the phrase "based on" is intended to mean "based, at least in part, on" unless explicitly stated otherwise.

Claims

WHAT IS CLAIMED IS:
1. A device comprising: a sensor to receive first information relating to a first set of conditions, where the first set of conditions comprises at least one of: steadiness of the device, movement of the device, a location of a subject image in a frame of an image, an orientation of the device, or one or more rules of composition; and a processor to monitor the first set of conditions and automatically capture an image based on the first set of conditions.
2. The device of claim 1, where the sensor is further configured to receive second information relating to a second set of conditions and the processor is further configured to initiate the monitoring of the first set of conditions based on the second set of conditions.
3. The device of claim 2, where the second set of conditions includes at least one of: a sudden motion of the device; a change in an orientation of the device; or a change in an amount of light around the device.
4. The device of claim 2, where the second information excludes information from an acoustic sensor, a mechanical sensor, a touch screen, a microphone, or a button.
5. The device of claim 1, where the first information excludes information from an acoustic sensor, a mechanical sensor, a touch screen, a microphone, or a button.
6. The device of claim 1, where the processor is further configured to: terminate the monitoring of the first set of conditions after a particular length of time elapses.
7. The device of claim 1, where the processor is further configured to:
crop the captured image in order to place the subject image in a specific location in the frame of the captured image;
rotate the captured image to align horizontal features in the captured image to horizontal sides of the frame of the captured image; or
resize the captured image.
8. The device of claim 1, where the sensor includes at least one of: a motion sensor; an orientation sensor; an acoustic sensor; a mechanical sensor; or a light sensor.
9. The device of claim 1, where the sensor includes at least one of: an accelerometer; a gyroscope; a touch screen; a microphone; an ultrasound sensor; an infrared sensor; a button for detecting a user input; a complementary metal-oxide-semiconductor (CMOS) sensor; or a charge-coupled device (CCD) sensor.
10. The device of claim 2, where the processor is further configured to: receive user inputs that select the second set of conditions.
11. The device of claim 1, where the processor is further configured to: reduce noise in the image; or sharpen the image.
12. The device of claim 1, further comprising at least one of: a zoom lens; or a wide-angle lens.
13. The device of claim 1, where the device includes: a camera; or a cell phone.
14. A method comprising:
receiving one or more signals from one or more sensors of a device, where the one or more signals include at least one of: a signal that indicates when the device is steady, a signal that indicates when the device is not moving, a signal that indicates when the device is in a particular orientation, or a signal that indicates when a subject image in a frame of an image is positioned in accordance with one or more rules of composition;
monitoring the one or more signals to determine when a favorable condition exists for capturing the image; and
capturing the image when it is determined that the favorable condition exists.
15. The method of claim 14, further comprising: processing the captured image to improve a quality of the captured image.
16. The method of claim 14, further comprising: starting the monitoring of the one or more signals based on at least one of: a press on a button of the device, a change in motion of the device, a change in orientation of the device, or a change in amount of light sensed by the device.
17. The method of claim 14, where monitoring the one or more signals includes at least one of: determining if the device is oriented horizontally or vertically; or determining if the subject image is fully within the frame.
18. The method of claim 14, further comprising: receiving user inputs to select factors that are used to determine if the favorable condition exists.
19. A device comprising:
means for continually receiving information related to at least one of: steadiness of the device, movement of the device, a location of a subject image in a frame of an image, an orientation of the device, or one or more rules of composition;
means for continually monitoring the means for continually receiving information; and
means for automatically capturing the image when the means for continually monitoring detects a favorable condition for capturing the image.
20. The device of claim 19, further comprising: means for applying the one or more rules of composition to the captured image.
PCT/IB2008/053431 2007-10-26 2008-08-26 Automatic timing of a photographic shot WO2009053863A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US98282507P 2007-10-26 2007-10-26
US60/982,825 2007-10-26

Publications (1)

Publication Number Publication Date
WO2009053863A1 (en) 2009-04-30

Family

ID=40329352

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2008/053431 WO2009053863A1 (en) 2007-10-26 2008-08-26 Automatic timing of a photographic shot

Country Status (1)

Country Link
WO (1) WO2009053863A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040170397A1 (en) * 1999-06-03 2004-09-02 Fuji Photo Film Co., Ltd. Camera and method of photographing good image
GB2400667A (en) * 2003-04-15 2004-10-20 Hewlett Packard Development Co Attention detection
GB2403365A (en) * 2003-06-27 2004-12-29 Hewlett Packard Development Co Camera having behaviour memory
JP2006005662A (en) * 2004-06-17 2006-01-05 Nikon Corp Electronic camera and electronic camera system
EP1793580A1 (en) * 2005-12-05 2007-06-06 Microsoft Corporation Camera for automatic image capture having plural capture modes with different capture triggers

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
GEMMELL J ET AL: "Passive Capture and Ensuing Issues for a Personal Lifetime Store", PROCEEDINGS OF THE 1ST ACM WORKSHOP ON CONTINUOUS ARCHIVAL AND RETRIEVAL OF PERSONAL EXPERIENCES (CARPE'04), NEW YORK, NY, 15 October 2004 (2004-10-15), pages 48-55, XP002374525, ISBN: 978-1-58113-932-7 *
HEALEY J ET AL: "StartleCam: a cybernetic wearable camera", DIGEST OF PAPERS, SECOND INTERNATIONAL SYMPOSIUM ON WEARABLE COMPUTERS, PITTSBURGH, PA, USA, 19-20 October 1998, IEEE COMPUT. SOC, LOS ALAMITOS, CA, USA, pages 42-49, XP010312821, ISBN: 978-0-8186-9074-7 *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020030841A (en) * 2009-10-19 2020-02-27 セラノス アイピー カンパニー エルエルシー Integrated health data capture and analysis system
JP7023913B2 (en) 2009-10-19 2022-02-22 ラブラドール ダイアグノスティクス エルエルシー Integrated health data acquisition and analysis system
US10750075B2 (en) 2012-10-12 2020-08-18 Ebay Inc. Guided photography and video on a mobile device
EP3594744A1 (en) * 2012-10-12 2020-01-15 eBay, Inc. Guided photography and video on a mobile device
US9552598B2 (en) 2012-10-12 2017-01-24 Ebay Inc. Mobile trigger web workflow
US11763377B2 (en) 2012-10-12 2023-09-19 Ebay Inc. Guided photography and video on a mobile device
US11430053B2 (en) 2012-10-12 2022-08-30 Ebay Inc. Guided photography and video on a mobile device
EP2786206A4 (en) * 2012-10-12 2015-11-18 Ebay Inc Guided photography and video on a mobile device
US9883090B2 (en) 2012-10-12 2018-01-30 Ebay Inc. Guided photography and video on a mobile device
US9374517B2 (en) 2012-10-12 2016-06-21 Ebay Inc. Guided photography and video on a mobile device
US10341548B2 (en) 2012-10-12 2019-07-02 Ebay Inc. Guided photography and video on a mobile device
WO2015125409A1 (en) * 2014-02-21 2015-08-27 Sony Corporation Wearable device, control apparatus, photographing control method and automatic imaging apparatus
US10356322B2 (en) 2014-02-21 2019-07-16 Sony Corporation Wearable device, control apparatus, photographing control method and automatic imaging apparatus
US10638046B2 (en) 2014-02-21 2020-04-28 Sony Corporation Wearable device, control apparatus, photographing control method and automatic imaging apparatus
CN106303406A (en) * 2015-06-29 2017-01-04 Lg电子株式会社 Mobile terminal
CN106303406B (en) * 2015-06-29 2021-03-02 Lg电子株式会社 Mobile terminal
US9883365B2 (en) 2015-06-29 2018-01-30 Lg Electronics Inc. Mobile terminal
EP3112989A3 (en) * 2015-06-29 2017-03-01 LG Electronics Inc. Mobile terminal
US10015400B2 (en) 2015-12-17 2018-07-03 Lg Electronics Inc. Mobile terminal for capturing an image and associated image capturing method
EP3182695A3 (en) * 2015-12-17 2017-09-20 LG Electronics Inc. Mobile terminal and method for controlling the same

Similar Documents

Publication Publication Date Title
US7991285B2 (en) Using a captured background image for taking a photograph
JP5787907B2 (en) Imaging device for taking self-portrait images
US7920179B2 (en) Shadow and reflection identification in image capturing devices
US20100302393A1 (en) Self-portrait assistance in image capturing devices
KR100925319B1 (en) Image pickup apparatus equipped with function of detecting image shaking, control method of the image pickup apparatus, and recording medium recording control program of the image pickup apparatus
JP2005321806A (en) Image-exposure system and method
WO2019037781A1 (en) Terminal and anti-shaking photographing method therefor, and storage apparatus
WO2009053863A1 (en) Automatic timing of a photographic shot
US20140118606A1 (en) Smart cameras
JP2013229856A (en) Image processing apparatus, imaging apparatus, server apparatus, and computer program
JP2009212802A (en) Imaging apparatus with composition assisting function, and composition assisting method of the same imaging apparatus
JP2013141155A (en) Imaging device
JP2005184246A (en) Imaging unit
JP2007316471A (en) Imaging apparatus and program therefore
JP5877030B2 (en) Imaging apparatus and imaging method
JP2007013586A (en) Image pickup device
JP2011172266A (en) Imaging apparatus, imaging method and imaging program
TWI381243B (en) Portable electrical apparatus and operating method thereof
WO2009053864A1 (en) Aiding image composition and/or framing
JP2006262211A (en) Digital still camera and imaging method thereof
JP5963890B2 (en) IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND IMAGING DEVICE MODE SWITCHING PROGRAM
JP2005156861A (en) Personal digital assistant
JP2014135609A (en) Image-capturing apparatus
JP2006115172A (en) Photographic apparatus and program
JP2017017435A (en) Image effect processing support device, image effect processing support method, and image effect processing support program

Legal Events

Date Code Title Description
DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (PCT application filed from 20040101)
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 08807440

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 08807440

Country of ref document: EP

Kind code of ref document: A1