US20150271381A1 - Methods and systems for determining frames and photo composition within multiple frames - Google Patents


Info

Publication number
US20150271381A1
US20150271381A1
Authority
US
United States
Prior art keywords
frames
frame
moving speed
candidate
area corresponding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/220,149
Other languages
English (en)
Inventor
Bing-Sheng Lin
Yi-Chi Lin
Tai-Ling Lu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HTC Corp
Original Assignee
HTC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HTC Corp filed Critical HTC Corp
Priority to US14/220,149 priority Critical patent/US20150271381A1/en
Priority to TW103137446A priority patent/TWI531911B/zh
Priority to CN201410614516.5A priority patent/CN104933677B/zh
Assigned to HTC CORPORATION reassignment HTC CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LIN, YI-CHI, LU, TAI-LING, LIN, BING-SHENG
Publication of US20150271381A1 publication Critical patent/US20150271381A1/en
Priority to US15/159,825 priority patent/US9898828B2/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/254Analysis of motion involving subtraction of images
    • H04N5/2353
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/174Segmentation; Edge detection involving the use of two or more images
    • G06T7/2006
    • G06T7/2073
    • G06T7/2086
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N5/23245
    • H04N5/23293
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2621Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2625Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of images from a temporal image sequence, e.g. for a stroboscopic effect
    • H04N5/372
    • H04N5/374
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/91Television signal processing therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20224Image subtraction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2209/00Details of colour television systems
    • H04N2209/04Picture signal generators
    • H04N2209/041Picture signal generators using solid-state devices
    • H04N2209/042Picture signal generators using solid-state devices having a single pick-up sensor

Definitions

  • the disclosure relates generally to image frame management, and, more particularly to methods and systems for determining frames and photo composition within multiple frames.
  • a handheld device may have telecommunications capabilities, e-mail message capabilities, image capture capabilities, an advanced address book management system, a media playback system, and various other functions. Due to their increased convenience and functionality, these devices have become necessities of life.
  • the image capture unit such as a camera takes images immediately one after another in a short amount of time. That is, when the continuous shot function is performed, a continuous image capture process is performed to continuously capture a plurality of images in sequence.
  • an inventive function called “dynamic continuous shot composition” may be also provided on the portable devices.
  • the dynamic continuous shot composition is a way for image composition.
  • a camera can be set on a tripod, and several images with the same scene are continuously captured by the camera.
  • the moving object within the images is extracted and overlapped onto the last image, thus presenting the dynamic effect of the track of the moving object.
  • the overlap of the moving object onto the last image can be achieved by using an image composition algorithm. It is understood that various image composition algorithms are known in the art, and related descriptions are omitted here.
  • the respective images are continuously captured with a fixed time interval. If the time interval is too short or the moving speed of the object is too fast, the objects on the composed image may have a large overlapped portion, as shown in FIG. 1A. On the contrary, if the time interval is too long or the moving speed of the object is too slow, the objects may be scattered on the composed image, as shown in FIG. 1B, and the number of the objects on the composed image may not be enough, resulting in difficulties in presenting the dynamic effect.
  • a plurality of frames which are respectively captured with a predefined time interval are obtained. At least one object within at least two of the frames is detected. A moving speed of the object is calculated according to the positions of the object in the respective frames and the predefined time interval, and candidate frames are selected from the frames according to the moving speed of the object.
  • An embodiment of a system for determining frames within multiple frames comprises a storage unit and a processing unit.
  • the storage unit comprises a plurality of frames, which are respectively captured with a predefined time interval.
  • the processing unit detects at least one object within at least two of the frames.
  • the processing unit calculates a moving speed of the object according to the positions of the object in the respective frames and the predefined time interval, and selects candidate frames from the frames according to the moving speed of the object.
  • a plurality of frames which are respectively captured with a predefined time interval are obtained. At least one object within at least two of the frames is detected. A moving speed of the object is calculated according to the positions of the object in the respective frames and the predefined time interval, and candidate frames are selected from the frames according to the moving speed of the object. The candidate frames are composed to generate a composed photo.
  • An embodiment of a system for photo composition within multiple frames comprises a storage unit and a processing unit.
  • the storage unit comprises a plurality of frames, which are respectively captured with a predefined time interval.
  • the processing unit detects at least one object within at least two of the frames.
  • the processing unit calculates a moving speed of the object according to the positions of the object in the respective frames and the predefined time interval, and selects candidate frames from the frames according to the moving speed of the object.
  • the processing unit composes the candidate frames to generate a composed photo.
  • a table is looked up according to the moving speed of the object to obtain a frame gap number, and the candidate frames are selected within the frames at intervals of the frame gap number. In some embodiments, the faster the moving speed is, the larger the frame gap number is.
  • a plurality of frames which are respectively captured with a predefined time interval are obtained. At least one object within at least two of the frames is detected. An overlapped area corresponding to the object within a first frame and a second frame is calculated, and at least one candidate frame is selected according to the overlapped area corresponding to the object.
  • An embodiment of a system for determining frames within multiple frames comprises a storage unit and a processing unit.
  • the storage unit comprises a plurality of frames, which are respectively captured with a predefined time interval.
  • the processing unit detects at least one object within at least two of the frames.
  • the processing unit calculates an overlapped area corresponding to the object within a first frame and a second frame, and selects at least one candidate frame according to the overlapped area corresponding to the object.
  • a plurality of frames which are respectively captured with a predefined time interval are obtained. At least one object within at least two of the frames is detected. An overlapped area corresponding to the object within a first frame and a second frame is calculated, and at least one candidate frame is selected according to the overlapped area corresponding to the object. The at least one candidate frame is composed to generate a composed photo.
  • An embodiment of a system for photo composition within multiple frames comprises a storage unit and a processing unit.
  • the storage unit comprises a plurality of frames, which are respectively captured with a predefined time interval.
  • the processing unit detects at least one object within at least two of the frames.
  • the processing unit calculates an overlapped area corresponding to the object within a first frame and a second frame, and selects at least one candidate frame according to the overlapped area corresponding to the object.
  • the processing unit composes the at least one candidate frame to generate a composed photo.
  • Methods for determining frames and photo composition within multiple frames may take the form of a program code embodied in a tangible media.
  • the program code When the program code is loaded into and executed by a machine, the machine becomes an apparatus for practicing the disclosed method.
  • FIG. 1A is a schematic diagram illustrating an example of a composed image with objects having a large overlapped portion.
  • FIG. 1B is a schematic diagram illustrating an example of a composed image with scattered objects.
  • FIG. 2 is a schematic diagram illustrating an embodiment of a system for determining frames and photo composition within multiple frames of the invention.
  • FIG. 3 is a flowchart of an embodiment of a method for determining frames within multiple frames of the invention.
  • FIG. 4 is a flowchart of another embodiment of a method for determining frames within multiple frames of the invention.
  • FIG. 5 is a flowchart of an embodiment of a method for selecting at least one candidate frame according to the overlapped area corresponding to the object of the invention.
  • FIG. 6 is a flowchart of another embodiment of a method for selecting at least one candidate frame according to the overlapped area corresponding to the object of the invention.
  • FIG. 7 is a flowchart of an embodiment of a method for photo composition within multiple frames of the invention.
  • FIG. 8 is a flowchart of another embodiment of a method for photo composition within multiple frames of the invention.
  • FIG. 2 is a schematic diagram illustrating an embodiment of a system for determining frames and/or photo composition within multiple frames of the invention.
  • the system for determining frames and/or photo composition within multiple frames 100 can be used in an electronic device, such as a computer, or a portable device, such as a digital camera, a handheld device such as a mobile phone, a smart phone, a PDA (Personal Digital Assistant), a GPS (Global Positioning System), or any picture-taking device.
  • the system for determining frames and/or photo composition within multiple frames 100 comprises a storage unit 110 and a processing unit 120.
  • the storage unit 110 comprises a plurality of frames, which are respectively captured with a time interval. It is understood that, in some embodiments, the time interval would be predefined or dynamically defined. It is understood that, in some embodiments, the frames can be obtained from a video. It is understood that, in some embodiments, the system for determining frames and/or photo composition within multiple frames 100 can also comprise an image capture unit (not shown in FIG. 2).
  • the image capture unit may be a CCD (Charge Coupled Device) or a CMOS (Complementary Metal-Oxide Semiconductor), placed at the imaging position for objects inside the electronic device.
  • the image capture unit can continuously capture the frames within a predefined time interval.
  • the system for determining frames and/or photo composition within multiple frames 100 can also comprise a display unit (not shown in FIG. 2 ).
  • the display unit can display related figures and interfaces, and related data, such as the image frames continuously captured by the image capture unit.
  • the display unit may be a screen integrated with a touch-sensitive device (not shown).
  • the touch-sensitive device has a touch-sensitive surface comprising sensors in at least one dimension to detect contact and movement of an input tool, such as a stylus or finger on the touch-sensitive surface. That is, users can directly input related data via the display unit.
  • the processing unit 120 can control related components of the system for determining frames and/or photo composition within multiple frames 100, process the image frames, and perform the methods for determining frames and/or photo composition within multiple frames, which will be discussed further in the following paragraphs. It is noted that, in some embodiments, the system for determining frames and/or photo composition within multiple frames 100 can further comprise a focus unit (not shown in FIG. 2). The processing unit 120 can control the focus unit to perform a focus process for at least one object during the photography process.
  • FIG. 3 is a flowchart of an embodiment of a method for determining frames within multiple frames of the invention.
  • the method for determining frames within multiple frames can be used in an electronic device, such as a computer, or a portable device, such as a digital camera, a handheld device such as a mobile phone, a smart phone, a PDA, a GPS, or any picture-taking device.
  • candidate frames for photo composition can be determined.
  • a plurality of frames are obtained. It is understood that, in some embodiments, the frames are continuously and respectively captured with a predefined time interval. In some embodiments, the frames can be obtained from a video.
  • In step S320, at least one object within at least two of the frames, such as the first two successive frames, is detected. It is understood that, in some embodiments, the object can be obtained by transforming the at least two frames into grayscale frames, and subtracting the grayscale frames from each other to obtain a contour of the object.
  • a moving speed of the object is calculated according to the positions of the object in the respective frames and the predefined time interval.
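  • The detection and speed computation of steps S320 and S330 can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the grayscale conversion, the difference threshold, and the use of mass centers of the motion masks are assumptions.

```python
import numpy as np

def detect_object_mask(frame_a, frame_b, threshold=30):
    """Approximate the moving object's contour by grayscale frame differencing.

    frame_a, frame_b: H x W x 3 uint8 frames of the same scene.
    threshold: minimum grayscale difference counted as motion (assumed value).
    """
    gray_a = frame_a.mean(axis=2)  # naive grayscale transform
    gray_b = frame_b.mean(axis=2)
    return np.abs(gray_a - gray_b) > threshold  # boolean motion mask

def moving_speed(mask_ab, mask_bc, time_interval):
    """Estimate the object's speed (pixels per time unit) from the mass
    centers of two successive motion masks and the predefined time interval."""
    center_ab = np.array(np.nonzero(mask_ab)).mean(axis=1)
    center_bc = np.array(np.nonzero(mask_bc)).mean(axis=1)
    return float(np.linalg.norm(center_bc - center_ab)) / time_interval
```

A production implementation would typically also denoise the difference mask and handle multiple moving objects; the sketch assumes a single object against a static scene.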
  • In step S340, candidate frames are selected from the frames according to the moving speed of the object.
  • a table can be looked up according to the moving speed of the object to obtain a frame gap number, and the candidate frames can be selected within the frames at intervals of the frame gap number. In the table, the faster the moving speed is, the larger the frame gap number is. It is noted that the actual value of the frame gap number can be flexibly designed according to different applications and requirements.
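  • The table lookup can be sketched as below. The speed thresholds and gap values are hypothetical examples, since the disclosure expressly leaves the actual values to the application; the reading of "intervals of the frame gap number" as the count of frames skipped between candidates is likewise an assumption.

```python
# Hypothetical (speed upper bound, frame gap number) pairs; per the
# disclosure, faster moving speeds map to larger frame gap numbers.
SPEED_TO_GAP = [
    (50.0, 1),
    (150.0, 3),
    (float("inf"), 5),
]

def frame_gap_number(moving_speed):
    """Look up the frame gap number for a given moving speed."""
    for upper_bound, gap in SPEED_TO_GAP:
        if moving_speed <= upper_bound:
            return gap

def select_by_gap(frames, moving_speed):
    """Select candidate frames at intervals of the frame gap number,
    here read as the number of frames skipped between candidates."""
    gap = frame_gap_number(moving_speed)
    return frames[::gap + 1]
```

For example, a fast object (speed 200) with a gap of 5 keeps only every sixth frame, while a slow object keeps every second frame.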
  • the selected candidate frames can be used for photo composition.
  • FIG. 4 is a flowchart of another embodiment of a method for determining frames within multiple frames of the invention.
  • the method for determining frames within multiple frames can be used in an electronic device, such as a computer, or a portable device, such as a digital camera, a handheld device such as a mobile phone, a smart phone, a PDA, a GPS, or any picture-taking device.
  • candidate frames for photo composition can be determined.
  • a plurality of frames are obtained. It is understood that, in some embodiments, the frames are continuously and respectively captured with a predefined time interval. In some embodiments, the frames can be obtained from a video.
  • In step S420, at least one object within at least two of the frames, such as the first two successive frames, is detected. It is understood that, in some embodiments, the object can be obtained by transforming the at least two frames into grayscale frames, and subtracting the grayscale frames from each other to obtain a contour of the object.
  • an overlapped area corresponding to the object within two frames, such as a first frame and a second frame, is calculated.
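  • The overlapped area can be computed directly from per-frame object masks, assuming boolean numpy masks produced by the contour detection above (a sketch, not the disclosed implementation):

```python
import numpy as np

def overlapped_area(mask_first, mask_second):
    """Pixel count of the area where the object's position in the first
    frame overlaps its position in the second frame."""
    return int(np.logical_and(mask_first, mask_second).sum())

def contour_area(mask):
    """Pixel count enclosed by the object's contour in a single frame."""
    return int(mask.sum())
```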
  • In step S440, at least one candidate frame is selected according to the overlapped area corresponding to the object.
  • the selected candidate frames can be used for photo composition.
  • FIG. 5 is a flowchart of an embodiment of a method for selecting at least one candidate frame according to the overlapped area corresponding to the object of the invention.
  • In step S510, it is determined whether the overlapped area corresponding to the object within two frames, such as a first frame and a second frame, is less than a specific percentage of a contour area of the object. If the overlapped area corresponding to the object is less than the specific percentage of the contour area of the object (Yes in step S510), in step S520, the rear frame, such as the second frame, is selected as the candidate frame. If the overlapped area corresponding to the object is not less than the specific percentage of the contour area of the object (No in step S510), the procedure is completed.
  • an overlapped area corresponding to the object within the first frame and a subsequent frame such as a third frame (wherein the second frame is not selected as a candidate frame) can be calculated, and accordingly determined.
  • an overlapped area corresponding to the object within the second frame and a subsequent frame such as a third frame (wherein the second frame is selected as a candidate frame) can be calculated, and accordingly determined. The process is repeated until all frames are examined.
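  • The flow of FIG. 5, including the repeated examination until all frames are examined, amounts to a greedy pass over per-frame object masks. The sketch below assumes boolean numpy masks and uses 50% as an example of the "specific percentage", which the disclosure does not fix:

```python
import numpy as np

def select_candidates_by_overlap(masks, percentage=0.5):
    """Greedy candidate selection: a later frame becomes a candidate (and
    the new reference frame) when the object's overlap with the reference
    drops below the given percentage of its contour area.

    masks: list of boolean numpy arrays marking the object in each frame.
    """
    candidates = [0]          # keep the first frame as the initial reference
    reference = masks[0]
    for index in range(1, len(masks)):
        overlap = int((reference & masks[index]).sum())
        if overlap < percentage * int(masks[index].sum()):
            candidates.append(index)      # select the rear frame
            reference = masks[index]      # examine subsequent frames against it
    return candidates
```

Frames whose object still overlaps the reference too much are simply skipped, which matches the description of comparing the first frame against the third when the second is not selected.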
  • FIG. 6 is a flowchart of another embodiment of a method for selecting at least one candidate frame according to the overlapped area corresponding to the object of the invention.
  • In step S610, it is determined whether the overlapped area corresponding to the object within two frames, such as a first frame and a second frame, is equal to zero. If the overlapped area corresponding to the object is equal to zero (Yes in step S610), in step S620, a moving speed of the object is calculated according to the positions of the object in the two frames and the predefined time interval. It is understood that, in some embodiments, once the contour of the object is detected, the area of the contour, the position of the mass center of the area, and the moving speed of the object can also be calculated.
  • In step S630, candidate frames are selected within the frames according to the moving speed of the object.
  • a table can be looked up according to the moving speed of the object to obtain a frame gap number, and the candidate frames can be selected within the frames at intervals of the frame gap number. In the table, the faster the moving speed is, the larger the frame gap number is. It is noted that the actual value of the frame gap number can be flexibly designed according to different applications and requirements.
  • the selected candidate frames can be used for photo composition. If the overlapped area corresponding to the object is not equal to zero (No in step S610), the procedure is completed.
  • an overlapped area corresponding to the object within the first frame and a subsequent frame, such as a third frame (wherein the overlapped area corresponding to the object is not equal to zero), can be calculated, and accordingly determined. The process is repeated until all frames are examined.
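  • The zero-overlap branch of FIG. 6 can be sketched as follows. The masks are assumed boolean numpy arrays from the contour detection, and the inline speed-to-gap mapping is a hypothetical stand-in for the table the disclosure leaves unspecified:

```python
import numpy as np

def zero_overlap_selection(masks, time_interval):
    """If the object's positions in the first two frames do not overlap at
    all, fall back to selecting candidate frames by moving speed; otherwise
    this branch does not apply and None is returned."""
    if int((masks[0] & masks[1]).sum()) != 0:
        return None
    # mass centers of the object in the first two frames
    center_0 = np.array(np.nonzero(masks[0])).mean(axis=1)
    center_1 = np.array(np.nonzero(masks[1])).mean(axis=1)
    speed = float(np.linalg.norm(center_1 - center_0)) / time_interval
    gap = 1 if speed <= 40.0 else 2   # hypothetical speed-to-gap table
    return list(range(len(masks)))[::gap + 1]
```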
  • FIG. 7 is a flowchart of an embodiment of a method for photo composition within multiple frames of the invention.
  • the method for photo composition within multiple frames can be used in an electronic device, such as a computer, or a portable device, such as a digital camera, a handheld device such as a mobile phone, a smart phone, a PDA, a GPS, or any picture-taking device.
  • candidate frames can be selected from frames and used for photo composition.
  • a plurality of frames are obtained. It is understood that, in some embodiments, the frames are continuously and respectively captured with a predefined time interval. In some embodiments, the frames can be obtained from a video.
  • In step S720, at least one object within at least two of the frames, such as the first two successive frames, is detected. It is understood that, in some embodiments, the object can be obtained by transforming the at least two frames into grayscale frames, and subtracting the grayscale frames from each other to obtain a contour of the object.
  • a moving speed of the object is calculated according to the positions of the object in the respective frames and the predefined time interval.
  • In step S740, candidate frames are selected from the frames according to the moving speed of the object. It is understood that, in some embodiments, a table can be looked up according to the moving speed of the object to obtain a frame gap number, and the candidate frames can be selected within the frames at intervals of the frame gap number. In the table, the faster the moving speed is, the larger the frame gap number is. It is noted that the actual value of the frame gap number can be flexibly designed according to different applications and requirements.
  • In step S750, the selected candidate frames are composed to generate a composed photo. It is understood that the composition of candidate frames can be performed by using an image composition algorithm. It is understood that various image composition algorithms are known in the art, and related descriptions are omitted here.
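  • A minimal sketch of the composition step: paste the object's pixels from each earlier candidate frame onto the last candidate frame. Real composition algorithms, to which the disclosure defers, would additionally handle frame alignment and blending; the mask-based pasting here is an assumption.

```python
import numpy as np

def compose_photo(frames, masks):
    """Overlap the moving object from each earlier candidate frame onto the
    last candidate frame to present the track of the moving object.

    frames: H x W x 3 uint8 candidate frames of the same (static) scene.
    masks: matching boolean masks marking the object in each frame.
    """
    composed = frames[-1].copy()
    for frame, mask in zip(frames[:-1], masks[:-1]):
        composed[mask] = frame[mask]  # paste the object's pixels
    return composed
```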
  • FIG. 8 is a flowchart of another embodiment of a method for photo composition within multiple frames of the invention.
  • the method for photo composition can be used in an electronic device, such as a computer, or a portable device, such as a digital camera, a handheld device such as a mobile phone, a smart phone, a PDA, a GPS, or any picture-taking device. In the embodiment, candidate frames can be selected from frames and used for photo composition.
  • In step S810, a plurality of frames are obtained. It is understood that, in some embodiments, the frames are continuously and respectively captured with a predefined time interval. In some embodiments, the frames can be obtained from a video.
  • In step S820, at least one object within at least two of the frames, such as the first two successive frames, is detected. It is understood that, in some embodiments, the object can be obtained by transforming the at least two frames into grayscale frames, and subtracting the grayscale frames from each other to obtain a contour of the object.
  • In step S830, an overlapped area corresponding to the object within two frames, such as a first frame and a second frame, is calculated.
  • In step S840, at least one candidate frame is selected according to the overlapped area corresponding to the object. It is understood that, in some embodiments, it is determined whether the overlapped area corresponding to the object is less than a specific percentage of a contour area of the object. If the overlapped area corresponding to the object is less than the specific percentage of the contour area of the object, the second frame is selected as the candidate frame. In some embodiments, it is determined whether the overlapped area corresponding to the object is equal to zero.
  • a moving speed of the object is calculated according to the positions of the object in the respective frames and the predefined time interval, and candidate frames are selected within the frames according to the moving speed of the object.
  • the selected candidate frames are composed to generate a composed photo.
  • the composition of candidate frames can be performed by using an image composition algorithm. It is understood that, the image composition algorithm may be various and known in the art, and related descriptions are omitted here.
  • the methods and systems for determining frames and photo composition within multiple frames of the present invention can select appropriate frames from continuously captured frames according to the moving speed of the object and/or the overlapped area corresponding to the object within frames, and accordingly generate a composed photo, thus improving the dynamic effect of the track of the moving object.
  • Methods for determining frames and photo composition within multiple frames may take the form of a program code (i.e., executable instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine thereby becomes an apparatus for practicing the methods.
  • The methods may also be embodied in the form of program code transmitted over some transmission medium, such as electrical wiring or cabling, through fiber optics, or via any other form of transmission, wherein, when the program code is received, loaded into, and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the disclosed methods.
  • When implemented on a general-purpose processor, the program code combines with the processor to provide a unique apparatus that operates analogously to application-specific logic circuits.
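The selection steps above can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the function names, the (x, y, w, h) bounding-box representation of the object's contour area, and the 50% overlap threshold are all assumptions made for this example.

```python
# Illustrative sketch of candidate-frame selection for a burst of frames.
# All names and the 0.5 overlap threshold are assumptions for this example.

def overlap_area(box_a, box_b):
    """Overlapped area of two axis-aligned boxes given as (x, y, w, h)."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    dx = min(ax + aw, bx + bw) - max(ax, bx)
    dy = min(ay + ah, by + bh) - max(ay, by)
    return dx * dy if dx > 0 and dy > 0 else 0

def moving_speed(pos_a, pos_b, interval):
    """Average speed of the object between two frames captured a fixed
    time interval apart (distance per unit time)."""
    dx, dy = pos_b[0] - pos_a[0], pos_b[1] - pos_a[1]
    return (dx * dx + dy * dy) ** 0.5 / interval

def select_candidates(object_boxes, max_overlap_ratio=0.5):
    """Keep a frame only if the object's box overlaps the last kept
    frame's box by less than max_overlap_ratio of the object's own area,
    so the composed photo shows distinct copies of the moving object."""
    if not object_boxes:
        return []
    kept = [0]  # the first frame is always a candidate
    for i in range(1, len(object_boxes)):
        box = object_boxes[i]
        limit = max_overlap_ratio * (box[2] * box[3])
        if overlap_area(object_boxes[kept[-1]], box) < limit:
            kept.append(i)
    return kept

# Example: an object moving right; frame 1 overlaps frame 0 too much
# (80% of the object's area) and is skipped.
boxes = [(0, 0, 10, 10), (2, 0, 10, 10), (6, 0, 10, 10), (12, 0, 10, 10)]
print(select_candidates(boxes))  # -> [0, 2, 3]
```

A real implementation would derive the object positions from tracking or background subtraction across the burst, and would then paste the object region from each selected candidate frame onto a shared background to produce the composed photo.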

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)
US14/220,149 2014-03-20 2014-03-20 Methods and systems for determining frames and photo composition within multiple frames Abandoned US20150271381A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US14/220,149 US20150271381A1 (en) 2014-03-20 2014-03-20 Methods and systems for determining frames and photo composition within multiple frames
TW103137446A TWI531911B (zh) 2014-03-20 2014-10-29 Method for determining multiple candidate frames from multiple frames
CN201410614516.5A CN104933677B (zh) 2014-03-20 2014-11-04 Method for determining multiple candidate frames in multiple frames
US15/159,825 US9898828B2 (en) 2014-03-20 2016-05-20 Methods and systems for determining frames and photo composition within multiple frames

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US14/220,149 US20150271381A1 (en) 2014-03-20 2014-03-20 Methods and systems for determining frames and photo composition within multiple frames

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/159,825 Division US9898828B2 (en) 2014-03-20 2016-05-20 Methods and systems for determining frames and photo composition within multiple frames

Publications (1)

Publication Number Publication Date
US20150271381A1 true US20150271381A1 (en) 2015-09-24

Family

ID=54120832

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/220,149 Abandoned US20150271381A1 (en) 2014-03-20 2014-03-20 Methods and systems for determining frames and photo composition within multiple frames
US15/159,825 Expired - Fee Related US9898828B2 (en) 2014-03-20 2016-05-20 Methods and systems for determining frames and photo composition within multiple frames

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/159,825 Expired - Fee Related US9898828B2 (en) 2014-03-20 2016-05-20 Methods and systems for determining frames and photo composition within multiple frames

Country Status (3)

Country Link
US (2) US20150271381A1 (zh)
CN (1) CN104933677B (zh)
TW (1) TWI531911B (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9036943B1 (en) * 2013-03-14 2015-05-19 Amazon Technologies, Inc. Cloud-based image improvement
TWI647674B (zh) * 2018-02-09 2019-01-11 Kwang Yang Motor Co., Ltd. Navigation method and system for presenting different navigation pictures

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110043639A1 (en) * 2009-08-20 2011-02-24 Sanyo Electric Co., Ltd. Image Sensing Apparatus And Image Processing Apparatus
US20110205397A1 (en) * 2010-02-24 2011-08-25 John Christopher Hahn Portable imaging device having display with improved visibility under adverse conditions
US20120242853A1 (en) * 2011-03-25 2012-09-27 David Wayne Jasinski Digital camera for capturing an image sequence

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6659344B2 (en) * 2000-12-06 2003-12-09 Ncr Corporation Automated monitoring of activity of shoppers in a market
CN100477745C (zh) * 2002-08-09 2009-04-08 Sharp Corporation Image composition device and image composition method
EP1694247A2 (en) * 2003-12-15 2006-08-30 Technion Research And Development Foundation, Ltd. Therapeutic drug-eluting endoluminal covering
US7860343B2 (en) * 2006-04-10 2010-12-28 Nokia Corporation Constructing image panorama using frame selection
JP4513869B2 (ja) * 2008-02-13 2010-07-28 Casio Computer Co., Ltd. Imaging device, strobe image generation method, and program
US8103134B2 (en) * 2008-02-20 2012-01-24 Samsung Electronics Co., Ltd. Method and a handheld device for capturing motion
TW201001338A (en) * 2008-06-16 2010-01-01 Huper Lab Co Ltd Method of detecting moving objects
JP4735742B2 (ja) * 2008-12-18 2011-07-27 Casio Computer Co., Ltd. Imaging device, strobe image generation method, and program
JP5321070B2 (ja) * 2009-01-08 2013-10-23 Casio Computer Co., Ltd. Imaging apparatus, imaging method, and program
JP5131257B2 (ja) * 2009-08-27 2013-01-30 Casio Computer Co., Ltd. Display control device and display control program
JP5484097B2 (ja) * 2010-01-27 2014-05-07 Wacom Co., Ltd. Position detection device and method
US20120002112A1 (en) * 2010-07-02 2012-01-05 Sony Corporation Tail the motion method of generating simulated strobe motion videos and pictures using image cloning
US8736716B2 (en) * 2011-04-06 2014-05-27 Apple Inc. Digital camera having variable duration burst mode


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Jacques, J. C. S., et al., "Background Subtraction and Shadow Detection in Grayscale Video Sequences," Proc. 18th Brazilian Symposium on Computer Graphics and Image Processing (SIBGRAPI 2005), Oct. 2005. *

Also Published As

Publication number Publication date
TWI531911B (zh) 2016-05-01
TW201537359A (zh) 2015-10-01
US20160267680A1 (en) 2016-09-15
CN104933677B (zh) 2018-11-09
US9898828B2 (en) 2018-02-20
CN104933677A (zh) 2015-09-23

Similar Documents

Publication Publication Date Title
KR101990073B1 (ko) Method and apparatus for capturing and storing multi-focus images in an electronic device
US20190208125A1 (en) Depth Map Calculation in a Stereo Camera System
US10586308B2 (en) Digital media environment for removal of obstructions in a digital image scene
CN107787463B (zh) Optimizing the capture of a focus stack
US8643760B2 (en) Image processing methods and systems for handheld devices
US20120098981A1 (en) Image capture methods and systems
US20140204263A1 (en) Image capture methods and systems
US9445073B2 (en) Image processing methods and systems in accordance with depth information
US9898828B2 (en) Methods and systems for determining frames and photo composition within multiple frames
CN107003730A (zh) Electronic device, photographing method, and photographing apparatus
CN109889736B (zh) Image acquisition method, apparatus, and device based on dual and multiple cameras
US9071735B2 (en) Name management and group recovery methods and systems for burst shot
US20160373648A1 (en) Methods and systems for capturing frames based on device information
KR102174808B1 (ko) 픽셀 다중화 카메라에 대한 떨림 블러 및 모션 블러의 제어
US9420194B2 (en) Methods and systems for generating long shutter frames
US20150355780A1 (en) Methods and systems for intuitively refocusing images
US9307160B2 (en) Methods and systems for generating HDR images
CN113364985B (zh) Lens tracking method, apparatus, and medium for live streaming
US9202133B2 (en) Methods and systems for scene recognition
CN115278053A (zh) Image capturing method and electronic device
US8233787B2 (en) Focus method and photographic device using the method

Legal Events

Date Code Title Description
AS Assignment

Owner name: HTC CORPORATION, TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIN, BING-SHENG;LIN, YI-CHI;LU, TAI-LING;SIGNING DATES FROM 20070910 TO 20140513;REEL/FRAME:035485/0435

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION