US20130021512A1 - Framing of Images in an Image Capture Device - Google Patents
Framing of Images in an Image Capture Device
- Publication number
- US20130021512A1 (application US 13/232,052)
- Authority
- US
- United States
- Prior art keywords
- image
- framing
- capture device
- subject
- logic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G06T5/73—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/136—Incoming video signal characteristics or properties
- H04N19/137—Motion inside a coding unit, e.g. average field, frame or block difference
- H04N19/139—Analysis of motion vectors, e.g. their magnitude, direction, variance or reliability
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/189—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding
- H04N19/192—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding the adaptation method, adaptation tool or adaptation type being iterative or recursive
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/42—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
- H04N19/436—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation using parallelised computational arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/503—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
- H04N19/51—Motion estimation or motion compensation
- H04N19/537—Motion estimation other than block-based
- H04N19/54—Motion estimation other than block-based using feature points or meshes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/503—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
- H04N19/51—Motion estimation or motion compensation
- H04N19/56—Motion estimation with initialisation of the vector search, e.g. estimating a good candidate to initiate a search
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6811—Motion detection based on the image signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/683—Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20201—Motion blur correction
Definitions
- users of an image capture device may sometimes improperly frame an image that is captured by the device.
- a user may capture an image of a subject without framing the subject ideally.
- the subject of an image may not be centered in the frame, the subject may occupy too little or too much of the frame relative to any background and/or foreground image elements, the subject may have insufficient or excessive lighting, or the image may possess other imperfections or inadequacies related to the user's framing of the image.
- FIGS. 1A and 1B are drawings of a mobile device incorporating an image capture device according to various embodiments of the disclosure.
- FIG. 2 is a drawing of an image capture device that can be incorporated into the mobile device shown in FIGS. 1A and 1B according to various embodiments of the disclosure.
- FIGS. 3-8 are drawings of example user interfaces that can be generated in a mobile device in association with the image capture device shown in FIG. 2 according to various embodiments of the disclosure.
- FIG. 9 is a flowchart depicting one example execution of a user interface application executed in an image capture device according to various embodiments of the disclosure.
- Embodiments of the present disclosure relate to systems and methods that can be executed in an image capture device. More specifically, embodiments of the disclosure relate to systems and methods for framing and/or reframing of images captured by an image capture device to improve the framing characteristics and/or appearance.
- an image capture device can include a camera, video camera, a mobile device with an integrated image capture device, or other devices suitable for capturing imagery and/or video as can be appreciated.
- an image capture device according to an embodiment of the disclosure can include a device such as a smartphone, tablet computing system, laptop computer, desktop computer, or any other computing device that has the capability to receive and/or capture imagery via image capture hardware.
- image capture device hardware can include components such as lenses, image sensors (e.g., charge coupled devices, CMOS image sensor, etc.), processor(s), image signal processor(s), a main processor, memory, mass storage, or any other hardware or software components that can facilitate capture of imagery and/or video.
- an image signal processor can be incorporated as a part of a main processor in an image capture device module that is in turn incorporated into a device having its own processor, memory and other components.
- An image capture device can provide a user interface via a display that is integrated into the image capture device.
- the display can be integrated with a mobile device, such as a smartphone and/or tablet computing device, and can include a touchscreen input device (e.g., a capacitive touchscreen, etc.) with which a user may interact with the user interface that is presented thereon.
- the image capture device hardware can also include one or more buttons, dials, toggles, switches, or other input devices with which the user can interact with software executed in the image capture device.
- FIGS. 1A-1B show a mobile device 102 that can comprise and/or incorporate an image capture device according to various embodiments of the disclosure.
- the mobile device 102 may comprise, for example, a processor-based system, such as a computer system.
- a computer system may be embodied in the form of a desktop computer, a laptop computer, a personal digital assistant, a mobile device (e.g., cellular telephone, smart phone, etc.), tablet computing system, set-top box, music players, or other devices with like capability.
- the mobile device can include, for example, an image capture device 104 , which can further include a lens system as well as other hardware components that can be integrated with the device to facilitate image capture.
- the mobile device 102 can also include a display device 141 upon which various content and other user interfaces may be rendered.
- the mobile device 102 can also include one or more input devices with which a user can interact with a user interface rendered on the display device 141 .
- the mobile device 102 can include or be in communication with a mouse, touch input device (e.g., capacitive and/or resistive touchscreen incorporated with the display device 141 ), keyboard, or other input devices.
- the mobile device 102 may be configured to execute various applications, such as a camera application that can interact with an image capture module that includes various hardware and/or software components that facilitate capture and/or storage of images and/or video.
- the camera application can interact with application programming interfaces (API's) and/or other software libraries and/or drivers that are provided for the purpose of interacting with image capture hardware, such as the lens system and other image capture components.
- the camera application can be a special purpose application, a plug-in or executable library, one or more API's, image control algorithms, image capture device firmware, or other software that can facilitate communication with image capture hardware in communication with the mobile device 102 .
- FIG. 2 illustrates an embodiment of the various image capture components, or one example of an image capture device 104 , that can be incorporated in the mobile device 102 illustrated in FIGS. 1A-1B .
- an image capture device according to an embodiment of the disclosure more generally comprises any device that can provide images in digital form.
- the image capture device 104 includes a lens system 200 that conveys images of viewed scenes to an image sensor 202 .
- the image sensor 202 comprises a charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor that is driven by one or more sensor drivers 204 .
- the analog image signals captured by the sensor 202 are provided to an analog-to-digital (A/D) converter 206 for conversion into binary code that can be processed by a processor 208.
- the processor can also execute an image framing application 151 that can facilitate framing of images captured by a user as well as generating recommendations to the user regarding adjustments to image framing that can be made to produce higher quality images with the image capture device 104 .
- the image framing application 151 can take the form of API's, firmware, or other software accessible to the image capture device 104 and/or a mobile device 102 or other system in which the image capture device 104 is integrated.
- Operation of the sensor driver(s) 204 is controlled through a camera controller 210 that is in bi-directional communication with the processor 208 .
- the controller 210 can control one or more motors 212 that are used to drive the lens system 200 (e.g., to adjust focus, zoom, and/or aperture settings).
- the controller 210 can also communicate with a flash system, user input devices (e.g., buttons, dials, toggles, etc.) or other components associated with the image capture device 104 . Operation of the camera controller 210 may be adjusted through manipulation of a user interface.
- a user interface comprises the various components used to enter selections and commands into the image capture device 104 and therefore can include various buttons as well as a menu system that is displayed to the user, for example, in a camera application executed on a mobile device 102 and/or on a back panel associated with a standalone digital camera.
- the digital image signals are processed in accordance with instructions from an image signal processor 218 that can be implemented as a standalone processor within the image capture device or as a part of the processor 208. Processed (e.g., compressed) images may then be stored in storage memory, such as that contained within a removable solid-state memory card (e.g., Flash memory card).
- the embodiment shown in FIG. 2 further includes a device interface 224 through which the image capture device 104 can communicate with a mobile device or other computing system in which it may be integrated.
- the device interface 224 can allow the image capture device to communicate with a main processor associated with a mobile device as well as memory, mass storage, or other resources associated with the mobile device.
- the device interface 224 can communicate with a mobile device in various communications protocols, and this communication can be facilitated, at a software level, by various device drivers, libraries, API's or other software associated with the image capture device 104 that is executed in the mobile device.
- An image capture device (e.g., camera, mobile device with integrated camera, etc.) and/or its processing system can be configured with automatic framing and/or reframing capabilities that are based at least upon an identification and characterization of various image elements.
- An image capture device 104 as described herein can identify various framing characteristics associated with an image captured by the device and automatically reframe the image and/or suggest adjustments to framing conditions that a user may take to comply with framing guidelines that can be accessible to the image capture device 104 .
- framing guidelines can specify various ranges of parameters regarding various types of image subjects (e.g., people, foreground elements, background elements, other objects, etc.).
- the framing guidelines can also specify ranges of parameters that are related to various other image properties, such as, but not limited to, lighting sources, such as a device flash and/or natural or artificial light sources within the image, brightness, sharpness, tone, color intensity, contrast, gamma, etc., or other aspects of an image.
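- The kinds of parameter ranges a framing guideline might hold can be illustrated with a small data structure. This is a hypothetical sketch, not taken from the disclosure; all field names and range values are illustrative examples only:

```python
from dataclasses import dataclass, field

@dataclass
class FramingGuideline:
    """Hypothetical parameter ranges a framing guideline might specify."""
    # Acceptable fractions of the frame the subject may occupy.
    subject_area_ranges: list = field(
        default_factory=lambda: [(0.15, 0.25), (0.65, 0.75)])
    # Minimum fraction of the subject's body that should appear in frame.
    min_subject_visible: float = 0.9
    # Maximum normalized offset of the subject's head from frame center.
    max_center_offset: float = 0.2
    # Acceptable mean-brightness range (0.0 = black, 1.0 = white).
    brightness_range: tuple = (0.25, 0.85)

guideline = FramingGuideline()
```

An image framing application could be configured with one or more such guideline objects, with different instances for different subject types (people, landscapes, etc.).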
- the image framing application 151 can determine whether the framing characteristics of an image captured by the image capture device comply with ranges of various parameters that are specified by at least one framing guideline that is accessible to the image framing application 151 .
- the analysis of imagery as well as determinations regarding whether framing characteristics of an image comply with framing guidelines can be accomplished via software executed by the processor 208, the image signal processor 218, and/or a processor associated with a device in communication with the image capture device 104. It should be appreciated that the specific implementations and/or embodiments disclosed herein are merely examples.
- FIG. 3 illustrates an example image that can be captured by the image capture device 104 ( FIG. 2 ) according to various embodiments according to the disclosure.
- the image capture device 104 is incorporated into a mobile device 102 , which can execute a camera application that renders a user interface for display on a display device associated with the mobile device 102 .
- FIG. 3 illustrates an example of an image 303 that can be captured by the image capture device.
- the image 303 can be captured via a camera application executed on a mobile device, where the camera application is configured to communicate with API's associated with the image capture device for the purposes of initiating capture of imagery, displaying imagery on a display of the mobile device, and storing captured imagery in the form of still images and/or video in memory or mass storage associated with the mobile device.
- the example image 303 includes various elements, such as a subject 305 , foreground elements, background elements, and other elements or objects in an image as can be appreciated.
- the image framing application 151 executed by the image capture device 104 can analyze the image 303 to identify various framing characteristics of the image. To perform such an analysis, the image framing application 151 can identify the various elements in an image. In other words, the image framing application 151 can identify objects that are depicted in an image 303 captured by the image capture device 104 . The image framing application 151 can also identify a subject of the image. For example, a subject of the image can be one or more people or any other object that is the focus of an image. The image framing application 151 can characterize the objects and/or elements within an image 303 , which can be used to determine the framing characteristics of the image.
- the various regions, objects, elements, etc., within an image 303 can be characterized based upon their content.
- people depicted in an image 303 can be identified as such, background elements (e.g., sky, sun, etc.), foreground elements, and other elements can be characterized.
- the image framing application 151 can identify the framing characteristics of the image 303 .
- the image framing application 151 can calculate a measure of how well-framed the captured image is as well as whether the framing of the image can be improved upon.
- the framing characteristics can then be compared with various framing guidelines, which can specify ranges of various parameters that represent best practices, or a well-framed image. Accordingly, in some embodiments, the image framing application 151 can automatically reframe the image 303 based upon the captured image data, which can result in a more aesthetically pleasing image.
- Framing characteristics associated with the image 303 can include, as one example, a percentage of the image in which a subject appears. As another example, a framing characteristic can comprise a percentage of the subject that appears in the image. Framing characteristics can also include coordinates that describe a horizontal and/or vertical position of the subject within the image 303 . As additional examples, framing characteristics can include: a position of lighting sources, such as a device flash and/or natural or artificial light sources relative to the subject, clarity of the subject, brightness, tone, color intensity, contrast, gamma, or other characteristics associated with the subject in the image 303 .
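- The first two framing characteristics above can be derived from a subject bounding box with simple arithmetic. The sketch below is a hypothetical illustration; it assumes some prior subject-detection step (not shown) has already produced the bounding box:

```python
def framing_characteristics(frame_w, frame_h, subject_box):
    """Compute basic framing characteristics from a subject bounding box.

    subject_box is (x, y, w, h) in pixels, assumed to come from a
    separate subject-detection step.
    """
    x, y, w, h = subject_box
    frame_area = frame_w * frame_h
    return {
        # Fraction of the image the subject occupies.
        "subject_area_fraction": (w * h) / frame_area,
        # Subject center, normalized so (0.5, 0.5) is the frame center.
        "center_x": (x + w / 2) / frame_w,
        "center_y": (y + h / 2) / frame_h,
    }

# A 400x600 subject in a 1920x1080 frame, slightly left of center.
chars = framing_characteristics(1920, 1080, (700, 200, 400, 600))
```

Lighting-related characteristics (brightness, contrast, etc.) would be computed from pixel statistics rather than geometry, but feed into the same comparison against guideline ranges.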
- the image framing application 151 can identify the subject 305 of the image and identify its various framing characteristics. By way of illustration, the image framing application 151 can determine a fraction and/or percentage of the image that the subject 305 occupies as well as the percentage of the subject 305 appearing in the image. The image framing application 151 can estimate such a percentage by identifying the subject 305 as a human body and estimating a percentage of the subject 305 that does not appear in the image 303. The percentage of the image that the subject 305 occupies as well as coordinates describing the position of the subject 305 within the image can also be identified. The image framing application 151 can also determine a percentage of the image that the background (e.g., sky, landscape, etc.), foreground, and/or other image elements consume.
- the image framing application 151 can compare one or more of the framing characteristics against one or more framing guidelines.
- Framing guidelines can represent ideal or best practices as it relates to the framing and/or composition of an image.
- a framing guideline can specify one or more percentage range that a subject of an image should consume.
- a framing guideline can specify one or more percentage range of a subject that should appear in an image.
- the framing guidelines can also specify these parameters as they relate to the other image elements that can be identified in the image 303.
- the framing guidelines with which the image framing application 151 is configured can specify that a subject of an image, if it is a person or human body, should ideally comprise 15-25% or 65-75% of an image.
- the framing guidelines can also specify that a human subject, if represented in the image 303, should appear such that the head of the subject is located within a certain range of the vertical and/or horizontal center position of the image.
- the framing guidelines can also specify that, if a human body is the subject of an image, the body should not be cut off at the knees and/or legs. In other words, the framing guidelines can specify that a certain percentage range of the subject should be represented in the image 303.
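- A compliance check against the subject-percentage guideline above reduces to testing whether a measured value falls inside any acceptable range. A minimal sketch (the default ranges echo the 15-25% / 65-75% example and are otherwise hypothetical):

```python
def complies(subject_area_fraction,
             area_ranges=((0.15, 0.25), (0.65, 0.75))):
    """Return True if the subject's share of the frame falls inside any
    acceptable range from the framing guideline."""
    return any(lo <= subject_area_fraction <= hi
               for lo, hi in area_ranges)
```

A subject occupying 20% or 70% of the frame would comply, while one occupying 40% would not and would be a candidate for reframing.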
- the image framing application 151 can detect the lighting conditions of the image 303 . For example, the intensity and/or position of light sources within the image 303 can be detected. As another example, the distance of the subject 305 from the image capture device 104 , which can be derived from data regarding focusing from the lens system of the image capture device, can also be determined. Additionally, the image framing application 151 can determine an optimum distance from the image capture device 104 based at least upon the characteristics of a flash device incorporated into the image capture device 104 .
- the image framing application 151 can calculate a framing score that expresses the extent to which the framing characteristics of an image comply with the various framing guidelines. In one embodiment, such a framing score can be based at least upon how closely the identified framing characteristics comply with framing guidelines. Continuing the above example of a hypothetical framing guideline that specifies various percentage ranges of an image that a subject should consume, the framing score can include a measure of how closely the identified framing characteristics of an image comply with one or more of the percentage ranges.
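- One way such a closeness-based framing score could be computed is to take the distance from the measured value to the nearest acceptable range and map it to [0, 1]. This is an illustrative sketch only; the disclosure does not prescribe a scoring formula, and the falloff scale here is an arbitrary assumption:

```python
def framing_score(value, ranges, falloff=0.5):
    """Score in [0, 1]: 1.0 if value lies inside some acceptable range,
    decaying linearly toward 0 with distance to the nearest range."""
    def distance(lo, hi):
        if lo <= value <= hi:
            return 0.0
        return min(abs(value - lo), abs(value - hi))

    d = min(distance(lo, hi) for lo, hi in ranges)
    return max(0.0, 1.0 - d / falloff)

ranges = [(0.15, 0.25), (0.65, 0.75)]
```

Scores for several framing characteristics (subject size, centering, lighting) could then be combined, e.g. by a weighted average, into a single overall framing score.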
- the image framing application 151 can identify the framing characteristics of the image 303 and determine the extent to which they comply with framing guidelines with which the image framing application 151 can be configured. As one non-limiting example, the image framing application 151 can identify that the subject 305 in the depicted image 303 can be adjusted to comply with framing guidelines. In other words, the image framing application 151 can identify image adjustments that can raise a framing score associated with the image 303 .
- the image framing application 151 can identify a region 407 of the image that can be extracted and/or cropped to result in an image that more closely complies with one or more framing guidelines.
- in other words, the image framing application 151 can identify the region 407 such that cropping the image to the region 407 causes the image to more closely comply with one or more framing guidelines.
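- Identifying such a crop region can be sketched geometrically: center the crop on the subject and size it so the subject occupies a target fraction of the result. This is a hypothetical illustration (the disclosure does not specify the cropping math), assuming the original aspect ratio is preserved and the crop is clamped to the frame:

```python
def centering_crop(frame_w, frame_h, subject_box, target_fraction=0.2):
    """Compute a crop rectangle (left, top, width, height) that centers
    the subject and sizes the crop so the subject occupies roughly
    target_fraction of it."""
    x, y, w, h = subject_box
    aspect = frame_w / frame_h
    # Crop area needed for the subject to fill target_fraction of it.
    crop_area = (w * h) / target_fraction
    crop_h = min(frame_h, (crop_area / aspect) ** 0.5)
    crop_w = crop_h * aspect
    cx, cy = x + w / 2, y + h / 2          # subject center
    # Center on the subject, then clamp the crop inside the frame.
    left = min(max(0.0, cx - crop_w / 2), frame_w - crop_w)
    top = min(max(0.0, cy - crop_h / 2), frame_h - crop_h)
    return (left, top, crop_w, crop_h)
```

When the subject is near an edge, the clamping step shifts the crop rather than letting it fall outside the frame, so the subject may end up off-center but fully contained.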
- FIG. 4 illustrates an example of a subject 305 whose horizontal and/or vertical coordinates may lie outside a range specified by framing guidelines.
- the percentage of the image 303 that the subject 305 consumes may lie outside a framing guideline, as could a percentage of the subject 305 that appears in the image 303 .
- the image framing application 151 can crop the image 303 of FIG. 4 so that the position and/or size of the subject 505 is reframed and so that the resultant image more closely complies with one or more framing guidelines with which the image framing application 151 can be configured.
- as shown in FIG. 5, the subject 505 in the resultant image 503 is centered, and the percentage of the subject 505 shown in the image 503 has been adjusted, which can cause the image 503 to comply with framing guidelines.
- FIG. 6 shows an alternative image 603 that can be reframed by the image framing application 151 according to various embodiments of the disclosure.
- the image framing application 151 can identify the subject 605 of the image and determine whether the framing characteristics of the image 603 comply with various framing guidelines.
- the image framing application 151 can determine that the vertical coordinates associated with the subject 605 as well as a percentage of the image 603 that the subject 605 consumes can be altered to comply with framing guidelines. Accordingly, the image framing application 151 can identify a region 607 of the image 603 that can be cropped to achieve such a result.
- FIG. 7 continues the example of FIG. 6 and illustrates a resultant image 703 that is cropped from the image 603 captured by the image capture device 104 and shown in FIG. 6 .
- FIG. 8 illustrates an example of how the image framing application 151 can identify recommendations regarding improvements to framing of an image 803 .
- the image framing application 151 can identify that an intense light source is in a background of the image 803 and generate a suggestion regarding how the image 803 can be reframed by the user to yield a resultant image that better complies with framing guidelines.
- the image framing application 151 can generate a suggestion that the user reposition the subject and/or the image capture device 104.
- the image framing application 151 can identify a distance of the subject 805 from the image capture device 104 and generate a recommendation that the user position the image capture device 104 and/or the subject 805 closer to or further from one another depending on an optimum range associated with a flash device associated with the image capture device 104.
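- A flash-range recommendation of this kind can be sketched as a simple range comparison. The function name, message wording, and effective-range values below are hypothetical illustrations, not taken from the disclosure:

```python
def flash_recommendation(subject_distance_m, flash_range_m=(0.5, 3.0)):
    """Suggest repositioning when the subject lies outside the flash
    device's effective range (range values are assumed examples)."""
    lo, hi = flash_range_m
    if subject_distance_m < lo:
        return (f"Move about {lo - subject_distance_m:.1f} m "
                "farther from the subject.")
    if subject_distance_m > hi:
        return (f"Move about {subject_distance_m - hi:.1f} m "
                "closer to the subject.")
    return None  # within range; no recommendation needed
```

The subject distance itself could be derived from lens-system focus data, as noted above for FIG. 8.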
- the image capture device can also analyze color intensity, image quality, or other parameters associated with the subject 805 and generate similar recommendations that are related to framing of the image that can result in a higher quality result.
- Referring next to FIG. 9, shown is a flowchart that provides one example of the operation of a portion of an image framing application 151 executed by an image capture device 104, a mobile device 102, or any other device in which an image capture device 104 is integrated according to various embodiments of the disclosure. It is understood that the flowchart of FIG. 9 provides merely an example of the many different types of functional arrangements that may be employed to implement the operation of the portion of logic employed by the image capture device as described herein. As an alternative, the flowchart of FIG. 9 may be viewed as depicting an example of steps of a method implemented in a computing device, processor, or other circuits according to one or more embodiments.
- the image framing application 151 can initiate capture of one or more images and/or video by the image capture device 104 .
- image capture can be initiated by the user and/or any software application executed by the image capture device 104 or any device in which the image capture device 104 is integrated.
- the image framing application 151 can identify framing characteristics of the image.
- the image framing application 151 can generate a framing score associated with the identified framing characteristics. In other words, the image framing application 151 can determine whether the framing characteristics comply with framing guidelines or whether the image characteristics can be adjusted to more closely comply with framing guidelines. Additionally, the image framing application 151 can reframe an image when the framing characteristics do not comply with framing guidelines.
- the image framing application 151 can determine whether improvement of the framing score is possible. In other words, the image framing application 151 can determine whether the image can be reframed (e.g., a region of the image identified and/or cropped from the image) and/or adjust other image characteristics or parameters associated with the image to improve the framing score. If so, in box 909 , the image framing application 151 can reframe the image such that the framing characteristics more closely comply with one or more framing guidelines.
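- The FIG. 9 flow described above (capture, identify characteristics, score, conditionally reframe) can be sketched as follows. The helper callables are hypothetical placeholders for the steps discussed above, and the score threshold is an assumed example:

```python
def run_framing_pipeline(capture, identify_characteristics, score,
                         reframe, threshold=0.95):
    """Sketch of the FIG. 9 flow: capture an image, identify its framing
    characteristics, score them against framing guidelines, and reframe
    when the score indicates improvement is possible."""
    image = capture()                          # initiate capture (box 901)
    chars = identify_characteristics(image)    # identify framing characteristics
    s = score(chars)                           # framing score vs. guidelines
    if s < threshold:                          # improvement possible?
        image = reframe(image, chars)          # reframe the image (box 909)
    return image
```

Each placeholder could be bound to the concrete routines above (e.g., `framing_characteristics` for identification), or to platform API's on a given device.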
- images may be adjusted and/or reframed without initiating image capture as described in box 901 , and that the example illustrated in the flowchart of FIG. 9 is but one non-limiting example.
- a mobile device 102 and/or image capture device 104 can generate a user interface element providing adjustability of multiple image settings in conjunction with a gallery application that allows for viewing and/or browsing of imagery and/or video stored in a mass storage device.
- Other variations should be appreciated by a person of ordinary skill in the art.
- Embodiments of the present disclosure can be implemented in various devices, for example, having a processor, memory as well as image capture hardware that can be coupled to a local interface.
- the logic described herein can be executable by one or more processors integrated with a device.
- an application executed in a computing device such as a mobile device, can invoke one or more API's that provide the logic described herein as well as facilitate interaction with image capture hardware.
- any component discussed herein is implemented in the form of software, any one of a number of programming languages may be employed such as, for example, processor specific assembler languages, C, C++, C#, Objective C, Java, Javascript, Perl, PHP, Visual Basic, Python, Ruby, Delphi, Flash, or other programming languages.
- executable means a program file that is in a form that can ultimately be run by a processor.
- executable programs may be, for example, a compiled program that can be translated into machine code in a format that can be loaded into a random access portion of memory and run by a processor, source code that may be expressed in proper format such as object code that is capable of being loaded into a random access portion of the memory and executed by the processor, or source code that may be interpreted by another executable program to generate instructions in a random access portion of the memory to be executed by the processor, etc.
- An executable program may be stored in any portion or component of the memory including, for example, random access memory (RAM), read-only memory (ROM), hard drive, solid-state drive, USB flash drive, memory card, optical disc such as compact disc (CD) or digital versatile disc (DVD), floppy disk, magnetic tape, or other memory components.
- RAM random access memory
- ROM read-only memory
- hard drive solid-state drive
- USB flash drive USB flash drive
- memory card such as compact disc (CD) or digital versatile disc (DVD), floppy disk, magnetic tape, or other memory components.
- CD compact disc
- DVD digital versatile disc
- each block may represent a module, segment, or portion of code that comprises program instructions to implement the specified logical function(s).
- the program instructions may be embodied in the form of source code that comprises human-readable statements written in a programming language or machine code that comprises numerical instructions recognizable by a suitable execution system such as a processor in a computer system or other system.
- the machine code may be converted from the source code, etc.
- each block may represent a circuit or a number of interconnected circuits to implement the specified logical function(s).
- any logic or application described herein that comprises software or code can be embodied in any non-transitory computer-readable medium for use by or in connection with an instruction execution system such as, for example, a processor in a computer device or other system.
- the logic may comprise, for example, statements including instructions and declarations that can be fetched from the computer-readable medium and executed by the instruction execution system.
- a “computer-readable medium” can be any medium that can contain, store, or maintain the logic or application described herein for use by or in connection with the instruction execution system.
- the computer-readable medium can comprise any one of many physical media such as, for example, magnetic, optical, or semiconductor media.
- Examples of a suitable computer-readable medium include, but are not limited to, magnetic tapes, magnetic floppy diskettes, magnetic hard drives, memory cards, solid-state drives, USB flash drives, or optical discs.
- the computer-readable medium may be a random access memory (RAM) including, for example, static random access memory (SRAM) and dynamic random access memory (DRAM), or magnetic random access memory (MRAM).
- the computer-readable medium may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or other type of memory device.
Abstract
Description
- This application claims priority to co-pending U.S. provisional application entitled, “Image Capture Device Systems and Methods,” having Ser. No. 61/509,747, filed Jul. 20, 2011, which is entirely incorporated herein by reference.
- Users of image capture devices (e.g., still cameras, video cameras, etc.) may sometimes improperly frame an image that is captured by the device. In other words, a user may capture an image of a subject without framing the subject ideally. In some cases, the subject of an image may not be centered in the frame, the subject may occupy too little or too much of the frame relative to any background and/or foreground image elements, have insufficient or excessive lighting, or possess other imperfections or inadequacies related to the user's framing of the image.
- Many aspects of the invention can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present invention. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
FIGS. 1A and 1B are drawings of a mobile device incorporating an image capture device according to various embodiments of the disclosure. -
FIG. 2 is a drawing of an image capture device that can be incorporated into a mobile device shown in FIG. 1 according to various embodiments of the disclosure. -
FIGS. 3-8 are drawings of example user interfaces that can be generated in a mobile device in association with the image capture device shown in FIG. 2 according to various embodiments of the disclosure. -
FIG. 9 is a flowchart depicting one example execution of a user interface application executed in an image capture device according to various embodiments of the disclosure. - Embodiments of the present disclosure relate to systems and methods that can be executed in an image capture device. More specifically, embodiments of the disclosure relate to systems and methods for framing and/or reframing of images captured by an image capture device to improve the framing characteristics and/or appearance. In the context of this disclosure, an image capture device can include a camera, video camera, a mobile device with an integrated image capture device, or other devices suitable for capturing imagery and/or video as can be appreciated. In some embodiments, an image capture device according to an embodiment of the disclosure can include a device such as a smartphone, tablet computing system, laptop computer, desktop computer, or any other computing device that has the capability to receive and/or capture imagery via image capture hardware.
- Accordingly, image capture device hardware can include components such as lenses, image sensors (e.g., charge-coupled devices, CMOS image sensors, etc.), processor(s), image signal processor(s), a main processor, memory, mass storage, or any other hardware or software components that can facilitate capture of imagery and/or video. In some embodiments, an image signal processor can be incorporated as a part of a main processor in an image capture device module that is in turn incorporated into a device having its own processor, memory and other components.
- An image capture device according to an embodiment of the disclosure can provide a user interface via a display that is integrated into the image capture device. The display can be integrated with a mobile device, such as a smartphone and/or tablet computing device, and can include a touchscreen input device (e.g., a capacitive touchscreen, etc.) with which a user may interact with the user interface that is presented thereon. The image capture device hardware can also include one or more buttons, dials, toggles, switches, or other input devices with which the user can interact with software executed in the image capture device.
- Referring now to the drawings,
FIGS. 1A-1B show a mobile device 102 that can comprise and/or incorporate an image capture device according to various embodiments of the disclosure. The mobile device 102 may comprise, for example, a processor-based system, such as a computer system. Such a computer system may be embodied in the form of a desktop computer, a laptop computer, a personal digital assistant, a mobile device (e.g., cellular telephone, smart phone, etc.), a tablet computing system, a set-top box, a music player, or other devices with like capability. The mobile device can include, for example, an image capture device 104, which can further include a lens system as well as other hardware components that can be integrated with the device to facilitate image capture. The mobile device 102 can also include a display device 141 upon which various content and other user interfaces may be rendered. The mobile device 102 can also include one or more input devices with which a user can interact with a user interface rendered on the display device 141. For example, the mobile device 102 can include or be in communication with a mouse, touch input device (e.g., a capacitive and/or resistive touchscreen incorporated with the display device 141), keyboard, or other input devices. - The
mobile device 102 may be configured to execute various applications, such as a camera application that can interact with an image capture module that includes various hardware and/or software components that facilitate capture and/or storage of images and/or video. In one embodiment, the camera application can interact with application programming interfaces (API's) and/or other software libraries and/or drivers that are provided for the purpose of interacting with image capture hardware, such as the lens system and other image capture hardware. The camera application can be a special purpose application, a plug-in or executable library, one or more API's, image control algorithms, image capture device firmware, or other software that can facilitate communication with image capture hardware in communication with the mobile device 102. -
FIG. 2 illustrates an embodiment of the various image capture components, or one example of an image capture device 104, that can be incorporated in the mobile device 102 illustrated in FIGS. 1A-1B. Although one implementation is shown in FIG. 2 and described herein, an image capture device according to an embodiment of the disclosure more generally comprises any image capture device that can provide images in digital form. - The
image capture device 104 includes a lens system 200 that conveys images of viewed scenes to an image sensor 202. By way of example, the image sensor 202 comprises a charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor that is driven by one or more sensor drivers 204. The analog image signals captured by the sensor 202 are provided to an analog-to-digital (A/D) converter 206 for conversion into binary code that can be processed by a processor 208. The processor can also execute an image framing application 151 that can facilitate framing of images captured by a user as well as generating recommendations to the user regarding adjustments to image framing that can be made to produce higher quality images with the image capture device 104. In some embodiments, the image framing application 151 can take the form of API's, firmware, or other software accessible to the image capture device 104 and/or a mobile device 102 or other system in which the image capture device 104 is integrated. - Operation of the sensor driver(s) 204 is controlled through a camera controller 210 that is in bi-directional communication with the
processor 208. In some embodiments, the controller 210 can control one ormore motors 212 that are used to drive the lens system 200 (e.g., to adjust focus, zoom, and/or aperture settings). The controller 210 can also communicate with a flash system, user input devices (e.g., buttons, dials, toggles, etc.) or other components associated with theimage capture device 104. Operation of the camera controller 210 may be adjusted through manipulation of a user interface. A user interface comprises the various components used to enter selections and commands into theimage capture device 104 and therefore can include various buttons as well as a menu system that, for example, is displayed to the user in, for example, a camera application executed on amobile device 102 and/or on a back panel associated with a standalone digital camera. - The digital image signals are processed in accordance with instructions from an
image signal processor 218 that can be implemented as a standalone processor within the image capture device as well as being a part of the processor 208. Processed (e.g., compressed) images may then be stored in storage memory, such as that contained within a removable solid-state memory card (e.g., Flash memory card). The embodiment shown in FIG. 2 further includes a device interface 224 through which the image capture device 104 can communicate with a mobile device or other computing system in which it may be integrated. For example, the device interface 224 can allow the image capture device to communicate with a main processor associated with a mobile device as well as memory, mass storage, or other resources associated with the mobile device. The device interface 224 can communicate with a mobile device using various communication protocols, and this communication can be facilitated, at a software level, by various device drivers, libraries, API's or other software associated with the image capture device 104 that is executed in the mobile device. - An image capture device (e.g., camera, mobile device with integrated camera, etc.) and/or processing system can be configured with automatic framing and/or reframing capabilities that are based at least upon an identification and characterization of various image elements. An
image capture device 104 as described herein can identify various framing characteristics associated with an image captured by the device and automatically reframe the image and/or suggest adjustments to framing conditions that a user may take to comply with framing guidelines that can be accessible to the image capture device 104. For example, framing guidelines can specify various ranges of parameters regarding various types of image subjects (e.g., people, foreground elements, background elements, other objects, etc.). Additionally, the framing guidelines can also specify ranges of parameters that are related to various other image properties, such as, but not limited to, lighting sources, such as a device flash and/or natural or artificial light sources within the image, brightness, sharpness, tone, color intensity, contrast, gamma, etc., or other aspects of an image. As described herein, the image framing application 151 can determine whether the framing characteristics of an image captured by the image capture device comply with ranges of various parameters that are specified by at least one framing guideline that is accessible to the image framing application 151. - The analysis of imagery as well as determinations regarding whether framing characteristics of an image comply with framing guidelines can be accomplished via software executed by the
processor 208, the ISP 218 as well as a processor associated with a device in communication with theimage capture device 104. It should be appreciated that the specific implementation and/or embodiments disclosed herein are merely examples. - Accordingly, reference is now made to
FIG. 3, which illustrates an example image that can be captured by the image capture device 104 (FIG. 2) according to various embodiments of the disclosure. In the depicted non-limiting examples of FIGS. 3-4, the image capture device 104 is incorporated into a mobile device 102, which can execute a camera application that renders a user interface for display on a display device associated with the mobile device 102. It should be appreciated that this is only one non-limiting illustrative implementation. Therefore, FIG. 3 illustrates an example of an image 303 that can be captured by the image capture device. As one example, the image 303 can be captured via a camera application executed on a mobile device where the camera application is configured to communicate with API's associated with the image capture device for the purposes of initiating capture of imagery, display of imagery on a display of the mobile device as well as storage of captured imagery in the form of still images and/or video in memory or mass storage associated with the mobile device. The example image 303 includes various elements, such as a subject 305, foreground elements, background elements, and other elements or objects in an image as can be appreciated. - According to one embodiment of the disclosure, the
image framing application 151 executed by the image capture device 104 can analyze the image 303 to identify various framing characteristics of the image. To perform such an analysis, the image framing application 151 can identify the various elements in an image. In other words, the image framing application 151 can identify objects that are depicted in an image 303 captured by the image capture device 104. The image framing application 151 can also identify a subject of the image. For example, a subject of the image can be one or more people or any other object that is the focus of an image. The image framing application 151 can characterize the objects and/or elements within an image 303, which can be used to determine the framing characteristics of the image. - The
image 303 can be characterized based upon their content. Example, people depicted in animage 303 can be identified as such, background elements (e.g., sky, sun, etc.), foreground elements, and other elements can be characterized. Subsequently, theimage framing application 151 can identify the framing characteristics of theimage 303. In other words, theimage framing application 151 can calculate a measure of how well-framed the captured image is as well as whether the framing of the image can be improved upon. The framing characteristics can then be compared with various framing guidelines, which can specify ranges of various parameters that represent best practices, or a well-framed image. Accordingly, in some embodiments, theimage framing application 151 can automatically reframe theimage 303 based upon the captured image data, which can result in a more aesthetically pleasing image. - Framing characteristics associated with the
image 303 can include, as one example, a percentage of the image in which a subject appears. As another example, a framing characteristic can comprise a percentage of the subject that appears in the image. Framing characteristics can also include coordinates that describe a horizontal and/or vertical position of the subject within the image 303. As additional examples, framing characteristics can include: a position of lighting sources, such as a device flash and/or natural or artificial light sources relative to the subject, clarity of the subject, brightness, tone, color intensity, contrast, gamma, or other characteristics associated with the subject in the image 303. - In the depicted example, the
image framing application 151 can identify the subject 305 of the image and identify its various framing characteristics. By way of illustration, the image framing application 151 can determine a fraction and/or percentage of the image that the subject 305 occupies as well as the percentage of the subject 305 appearing in the image. The image framing application 151 can estimate such a percentage by identifying the subject 305 as a human body and estimating a percentage of the subject 305 that does not appear in the image 303. The percentage of the image that the subject 305 occupies as well as coordinates describing the position of the subject 305 within the image can also be identified. The image framing application 151 can also determine a percentage of the image that the background (e.g., sky, landscape, etc.), foreground, and/or other image elements consume. - Accordingly, upon identifying the various framing characteristics of the
image 303, theimage framing application 151 can compare one or more of the framing characteristics against one or more framing guidelines. Framing guidelines can represent ideal or best practices as it relates to the framing and/or composition of an image. For example, a framing guideline can specify one or more percentage range that a subject of an image should consume. As another example, a framing guideline can specify one or more percentage range of a subject that should appear in an image. The framing guidelines can also specify these parameters as they relate to the other image elements that can be identified in theimage 305. As a non-limiting example, theimage framing application 151 can specify that a subject of an image, if it is a person or human body, should ideally comprise 15-25% or 65-75% of an image. - Continuing this illustrative example, the framing guidelines also specify that a human subject, if represented in the
image 305, should appear such that the head of the subject is located within a certain range of the vertical and/or horizontal center position of the image. As yet another illustrative example, the framing guidelines can also specify that if a human body is the subject of an image that the body should not be cut off at the knees and/or legs. In other words, the framing guidelines can specify that a certain percentage range of the subject should be represented in theimage 305. - As another example, the
image framing application 151 can detect the lighting conditions of the image 303. For example, the intensity and/or position of light sources within the image 303 can be detected. As another example, the distance of the subject 305 from the image capture device 104, which can be derived from data regarding focusing from the lens system of the image capture device, can also be determined. Additionally, the image framing application 151 can determine an optimum distance from the image capture device 104 based at least upon the characteristics of a flash device incorporated into the image capture device 104. - To determine whether framing of an image captured by the
image capture device 104 can be improved, the image framing application 151 can calculate a framing score that expresses the extent to which the framing characteristics of an image comply with the various framing guidelines. In one embodiment, such a framing score can be based at least upon how closely the identified framing characteristics comply with framing guidelines. Continuing the above example of a hypothetical framing guideline that specifies various percentage ranges of an image that a subject should consume, the framing score can include a measure of how closely the identified framing characteristics of an image comply with one or more of the percentage ranges. - Reference is now made to
FIG. 4, which illustrates an example of how the image framing application 151 can reframe an image captured by the image capture device 104 according to various embodiments of the disclosure. In the depicted example, the image framing application 151 can identify the framing characteristics of the image 303 and determine the extent to which they comply with framing guidelines with which the image framing application 151 can be configured. As one non-limiting example, the image framing application 151 can identify that the subject 305 in the depicted image 303 can be adjusted to comply with framing guidelines. In other words, the image framing application 151 can identify image adjustments that can raise a framing score associated with the image 303. Accordingly, the image framing application 151 can identify a region 407 of the image that can be extracted and/or cropped to result in an image that more closely complies with one or more framing guidelines. FIG. 4 illustrates an example of a subject 305 whose horizontal and/or vertical coordinates may lie outside a range specified by framing guidelines. As another example, the percentage of the image 303 that the subject 305 consumes may lie outside a framing guideline, as could a percentage of the subject 305 that appears in the image 303. - In the depicted example, the
image framing application 151 can crop the image 303 of FIG. 4 so that the position and/or size of the subject 505 is reframed and so that the resultant image more closely complies with one or more framing guidelines with which the image framing application 151 can be configured. For example, the subject 505 in the resultant image 503 of FIG. 5 is centered and the percentage of the subject 505 shown in the image 503 has been adjusted, which can cause the image 503 to comply with framing guidelines. -
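As a rough illustration of this kind of reframing, a crop region can be derived from a subject bounding box so that the subject lands near the center of, and occupies a target fraction of, the resultant image. The following Python sketch is purely illustrative: the function names, the bounding-box representation, and the 20% target fraction are assumptions for the example, not details from the disclosure.

```python
def framing_characteristics(image_w, image_h, subject_box):
    """Derive basic framing characteristics from a subject bounding box.

    subject_box is (left, top, right, bottom) in pixels; the box may extend
    past the frame edges when part of the subject is cut off.
    """
    l, t, r, b = subject_box
    box_area = (r - l) * (b - t)
    # Clip the box to the frame to measure the visible portion of the subject.
    vl, vt = max(l, 0), max(t, 0)
    vr, vb = min(r, image_w), min(b, image_h)
    visible_area = max(vr - vl, 0) * max(vb - vt, 0)
    return {
        # Fraction of the frame the subject occupies.
        "subject_fraction": visible_area / (image_w * image_h),
        # Fraction of the subject that actually appears in the frame.
        "subject_visible_fraction": visible_area / box_area,
        # Normalized (0..1) center coordinates of the visible subject.
        "center_x": (vl + vr) / 2 / image_w,
        "center_y": (vt + vb) / 2 / image_h,
    }

def crop_for_subject(image_w, image_h, subject_box, target_fraction=0.2):
    """Choose a crop rectangle, centered on the subject where possible, in
    which the subject occupies roughly target_fraction of the cropped frame."""
    l, t, r, b = subject_box
    subject_area = (r - l) * (b - t)
    # Scale both crop dimensions equally so the aspect ratio is preserved
    # and subject_area / crop_area is approximately target_fraction.
    scale = (subject_area / (target_fraction * image_w * image_h)) ** 0.5
    cw = min(int(image_w * scale), image_w)
    ch = min(int(image_h * scale), image_h)
    cx, cy = (l + r) // 2, (t + b) // 2
    # Clamp the crop so it stays inside the image bounds.
    cl = min(max(cx - cw // 2, 0), image_w - cw)
    ct = min(max(cy - ch // 2, 0), image_h - ch)
    return (cl, ct, cl + cw, ct + ch)
```

For an off-center subject box of (100, 100, 200, 200) in a 1000 by 1000 image, the crop comes out near the top-left corner, sized so that the subject occupies about 20% of the result.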
FIG. 6 shows an alternative image 603 that can be reframed by the image framing application 151 according to various embodiments of the disclosure. In the depicted example, the image framing application 151 can identify the subject 605 of the image and determine whether the framing characteristics of the image 603 comply with various framing guidelines. Here, the image framing application 151 can determine that the vertical coordinates associated with the subject 605 as well as a percentage of the image 603 that the subject 605 consumes can be altered to comply with framing guidelines. Accordingly, the image framing application 151 can identify a region 607 of the image 603 that can be cropped to achieve such a result. FIG. 7 continues the example of FIG. 6 and illustrates a resultant image 703 that is cropped from the image 603 captured by the image capture device 104 and shown in FIG. 6. - Reference is now made to
FIG. 8, which illustrates an example of how the image framing application 151 can identify recommendations regarding improvements to framing of an image 803. In the depicted example, the image framing application 151 can identify that an intense light source is in a background of the image 803 and generate a suggestion regarding how the image 803 can be reframed by the user to yield a resultant image that better complies with framing guidelines. In the example of FIG. 8, the image framing application 151 can generate a suggestion that the user reposition the subject and/or the image capture device 104. - In some embodiments, the
image framing application 151 can identify a distance of the subject 805 from the image capture device 104 and generate a recommendation that the user position the image capture device 104 and/or the subject 805 closer to or further from one another depending on an optimum range associated with a flash device associated with the image capture device 104. The image capture device can also analyze color intensity, image quality, or other parameters associated with the subject 805 and generate similar recommendations that are related to framing of the image that can result in a higher quality result. - Referring next to
FIG. 9, shown is a flowchart that provides one example of the operation of a portion of an image framing application 151 executed by an image capture device 104, a mobile device 102, or any other device in which an image capture device 104 is integrated according to various embodiments of the disclosure. It is understood that the flowchart of FIG. 9 provides merely an example of the many different types of functional arrangements that may be employed to implement the operation of the portion of logic employed by the image capture device as described herein. As an alternative, the flowchart of FIG. 9 may be viewed as depicting an example of steps of a method implemented in a computing device, processor, or other circuits according to one or more embodiments. - First, in
box 901, theimage framing application 151 can initiate capture of one or more images and/or video by theimage capture device 104. In one embodiment, image capture can be initiated by the user and/or any software application executed by theimage capture device 104 or any device in which theimage capture device 104 is integrated. Inbox 903, theimage framing application 151 can identify framing characteristics of the image. Inbox 905, theimage framing application 151 can generate a framing score associated with the identified framing characteristics. In other words, theimage framing application 151 can determine whether the framing characteristics comply with framing guidelines or whether the image characteristics can be adjusted to more closely comply with framing guidelines. In other words, theimage framing application 151 can reframe an image when the framing characteristics do not comply with framing guidelines. - In
box 907, theimage framing application 151 can determine whether improvement of the framing score is possible. In other words, theimage framing application 151 can determine whether the image can be reframed (e.g., a region of the image identified and/or cropped from the image) and/or adjust other image characteristics or parameters associated with the image to improve the framing score. If so, inbox 909, theimage framing application 151 can reframe the image such that the framing characteristics more closely comply with one or more framing guidelines. - It should be appreciated that in some embodiments, images may be adjusted and/or reframed without initiating image capture as described in
box 901, and that the example illustrated in the flowchart ofFIG. 9 is but one non-limiting example. For example, amobile device 102 and/orimage capture device 104 can generate a user interface element providing adjustability of multiple image settings in conjunction a gallery application that allows for viewing and/or browsing of imagery and/or video stored in a mass storage device. Other variations should be appreciated by a person of ordinary skill in the art. - Embodiments of the present disclosure can be implemented in various devices, for example, having a processor, memory as well as image capture hardware that can be coupled to a local interface. The logic described herein can be executable by one or more processors integrated with a device. In one embodiment, an application executed in a computing device, such as a mobile device, can invoke one or more API's that provide the logic described herein as well as facilitate interaction with image capture hardware. Where any component discussed herein is implemented in the form of software, any one of a number of programming languages may be employed such as, for example, processor specific assembler languages, C, C++, C#, Objective C, Java, Javascript, Perl, PHP, Visual Basic, Python, Ruby, Delphi, Flash, or other programming languages.
- As such, these software components can be executable by one or more processors in various devices. In this respect, the term “executable” means a program file that is in a form that can ultimately be run by a processor. Examples of executable programs include a compiled program that can be translated into machine code in a format that can be loaded into a random access portion of memory and run by a processor; source code expressed in a proper format, such as object code, that is capable of being loaded into a random access portion of the memory and executed by the processor; or source code that may be interpreted by another executable program to generate instructions in a random access portion of the memory to be executed by the processor. An executable program may be stored in any portion or component of the memory including, for example, random access memory (RAM), read-only memory (ROM), hard drive, solid-state drive, USB flash drive, memory card, optical disc such as compact disc (CD) or digital versatile disc (DVD), floppy disk, magnetic tape, or other memory components.
- Although various logic described herein may be embodied in software or code executed by general purpose hardware as discussed above, as an alternative the same may also be embodied in dedicated hardware or a combination of software/general purpose hardware and dedicated hardware. If embodied in dedicated hardware, each can be implemented as a circuit or state machine that employs any one of or a combination of a number of technologies. These technologies may include, but are not limited to, discrete logic circuits having logic gates for implementing various logic functions upon an application of one or more data signals, application specific integrated circuits having appropriate logic gates, or other components, etc. Such technologies are generally well known by those skilled in the art and, consequently, are not described in detail herein.
- The flowchart of
FIG. 9 shows the functionality and operation of an implementation of portions of an image capture device according to embodiments of the disclosure. If embodied in software, each block may represent a module, segment, or portion of code that comprises program instructions to implement the specified logical function(s). The program instructions may be embodied in the form of source code that comprises human-readable statements written in a programming language or machine code that comprises numerical instructions recognizable by a suitable execution system such as a processor in a computer system or other system. The machine code may be converted from the source code, etc. If embodied in hardware, each block may represent a circuit or a number of interconnected circuits to implement the specified logical function(s). - Although the flowchart of
FIG. 9 shows a specific order of execution, it is understood that the order of execution may differ from that which is depicted. For example, the order of execution of two or more blocks may be scrambled relative to the order shown. Also, two or more blocks shown in succession in FIG. 9 may be executed concurrently or with partial concurrence. Further, in some embodiments, one or more of the blocks shown in FIG. 9 may be skipped or omitted. In addition, any number of counters, state variables, warning semaphores, or messages might be added to the logical flow described herein, for purposes of enhanced utility, accounting, performance measurement, or providing troubleshooting aids, etc. It is understood that all such variations are within the scope of the present disclosure. - Also, any logic or application described herein that comprises software or code can be embodied in any non-transitory computer-readable medium for use by or in connection with an instruction execution system such as, for example, a processor in a computer device or other system. In this sense, the logic may comprise, for example, statements including instructions and declarations that can be fetched from the computer-readable medium and executed by the instruction execution system. In the context of the present disclosure, a “computer-readable medium” can be any medium that can contain, store, or maintain the logic or application described herein for use by or in connection with the instruction execution system. The computer-readable medium can comprise any one of many physical media such as, for example, magnetic, optical, or semiconductor media. More specific examples of a suitable computer-readable medium would include, but are not limited to, magnetic tapes, magnetic floppy diskettes, magnetic hard drives, memory cards, solid-state drives, USB flash drives, or optical discs.
Also, the computer-readable medium may be a random access memory (RAM) including, for example, static random access memory (SRAM) and dynamic random access memory (DRAM), or magnetic random access memory (MRAM). In addition, the computer-readable medium may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or other type of memory device.
- It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/232,052 US20130021512A1 (en) | 2011-07-20 | 2011-09-14 | Framing of Images in an Image Capture Device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161509747P | 2011-07-20 | 2011-07-20 | |
US13/232,052 US20130021512A1 (en) | 2011-07-20 | 2011-09-14 | Framing of Images in an Image Capture Device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130021512A1 true US20130021512A1 (en) | 2013-01-24 |
Family
ID=47555520
Family Applications (9)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/232,045 Abandoned US20130021488A1 (en) | 2011-07-20 | 2011-09-14 | Adjusting Image Capture Device Settings |
US13/232,052 Abandoned US20130021512A1 (en) | 2011-07-20 | 2011-09-14 | Framing of Images in an Image Capture Device |
US13/235,975 Abandoned US20130021504A1 (en) | 2011-07-20 | 2011-09-19 | Multiple image processing |
US13/245,941 Abandoned US20130021489A1 (en) | 2011-07-20 | 2011-09-27 | Regional Image Processing in an Image Capture Device |
US13/281,521 Abandoned US20130021490A1 (en) | 2011-07-20 | 2011-10-26 | Facial Image Processing in an Image Capture Device |
US13/313,345 Abandoned US20130022116A1 (en) | 2011-07-20 | 2011-12-07 | Camera tap transcoder architecture with feed forward encode data |
US13/313,352 Active 2032-01-11 US9092861B2 (en) | 2011-07-20 | 2011-12-07 | Using motion information to assist in image processing |
US13/330,047 Abandoned US20130021484A1 (en) | 2011-07-20 | 2011-12-19 | Dynamic computation of lens shading |
US13/413,863 Abandoned US20130021491A1 (en) | 2011-07-20 | 2012-03-07 | Camera Device Systems and Methods |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/232,045 Abandoned US20130021488A1 (en) | 2011-07-20 | 2011-09-14 | Adjusting Image Capture Device Settings |
Family Applications After (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/235,975 Abandoned US20130021504A1 (en) | 2011-07-20 | 2011-09-19 | Multiple image processing |
US13/245,941 Abandoned US20130021489A1 (en) | 2011-07-20 | 2011-09-27 | Regional Image Processing in an Image Capture Device |
US13/281,521 Abandoned US20130021490A1 (en) | 2011-07-20 | 2011-10-26 | Facial Image Processing in an Image Capture Device |
US13/313,345 Abandoned US20130022116A1 (en) | 2011-07-20 | 2011-12-07 | Camera tap transcoder architecture with feed forward encode data |
US13/313,352 Active 2032-01-11 US9092861B2 (en) | 2011-07-20 | 2011-12-07 | Using motion information to assist in image processing |
US13/330,047 Abandoned US20130021484A1 (en) | 2011-07-20 | 2011-12-19 | Dynamic computation of lens shading |
US13/413,863 Abandoned US20130021491A1 (en) | 2011-07-20 | 2012-03-07 | Camera Device Systems and Methods |
Country Status (1)
Country | Link |
---|---|
US (9) | US20130021488A1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150213609A1 (en) * | 2014-01-30 | 2015-07-30 | Adobe Systems Incorporated | Image Cropping Suggestion |
US9251594B2 (en) | 2014-01-30 | 2016-02-02 | Adobe Systems Incorporated | Cropping boundary simplicity |
US20160267357A1 (en) * | 2015-03-12 | 2016-09-15 | Care Zone Inc. | Importing Structured Prescription Records from a Prescription Label on a Medication Package |
US9456195B1 (en) * | 2015-10-08 | 2016-09-27 | Dual Aperture International Co. Ltd. | Application programming interface for multi-aperture imaging systems |
US10791265B1 (en) | 2017-10-13 | 2020-09-29 | State Farm Mutual Automobile Insurance Company | Systems and methods for model-based analysis of damage to a vehicle |
US20220172509A1 (en) * | 2012-10-19 | 2022-06-02 | Google Llc | Image Optimization During Facial Recognition |
US11410413B2 (en) | 2018-09-10 | 2022-08-09 | Samsung Electronics Co., Ltd. | Electronic device for recognizing object and method for controlling electronic device |
US11587046B1 (en) | 2017-10-25 | 2023-02-21 | State Farm Mutual Automobile Insurance Company | Systems and methods for performing repairs to a vehicle |
Families Citing this family (77)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001509910A (en) | 1997-01-27 | 2001-07-24 | ディー. ハーランド,ペーター | Coating, method and apparatus for suppressing reflection from optical substrate |
US10116839B2 (en) | 2014-08-14 | 2018-10-30 | Atheer Labs, Inc. | Methods for camera movement compensation for gesture detection and object recognition |
JP5781351B2 (en) * | 2011-03-30 | 2015-09-24 | 日本アビオニクス株式会社 | Imaging apparatus, pixel output level correction method thereof, infrared camera system, and interchangeable lens system |
JP5778469B2 (en) | 2011-04-28 | 2015-09-16 | 日本アビオニクス株式会社 | Imaging apparatus, image generation method, infrared camera system, and interchangeable lens system |
KR101796481B1 (en) * | 2011-11-28 | 2017-12-04 | 삼성전자주식회사 | Method of eliminating shutter-lags with low power consumption, camera module, and mobile device having the same |
US9118876B2 (en) * | 2012-03-30 | 2015-08-25 | Verizon Patent And Licensing Inc. | Automatic skin tone calibration for camera images |
US9462255B1 (en) | 2012-04-18 | 2016-10-04 | Amazon Technologies, Inc. | Projection and camera system for augmented reality environment |
US9619036B2 (en) | 2012-05-11 | 2017-04-11 | Comcast Cable Communications, Llc | System and methods for controlling a user experience |
US9438805B2 (en) * | 2012-06-08 | 2016-09-06 | Sony Corporation | Terminal device and image capturing method |
US8957973B2 (en) * | 2012-06-11 | 2015-02-17 | Omnivision Technologies, Inc. | Shutter release using secondary camera |
US20130335587A1 (en) * | 2012-06-14 | 2013-12-19 | Sony Mobile Communications, Inc. | Terminal device and image capturing method |
TWI498771B (en) * | 2012-07-06 | 2015-09-01 | Pixart Imaging Inc | Gesture recognition system and glasses with gesture recognition function |
KR101917650B1 (en) * | 2012-08-03 | 2019-01-29 | 삼성전자 주식회사 | Method and apparatus for processing a image in camera device |
US9554042B2 (en) * | 2012-09-24 | 2017-01-24 | Google Technology Holdings LLC | Preventing motion artifacts by intelligently disabling video stabilization |
JP2014086849A (en) * | 2012-10-23 | 2014-05-12 | Sony Corp | Content acquisition device and program |
US9060127B2 (en) * | 2013-01-23 | 2015-06-16 | Orcam Technologies Ltd. | Apparatus for adjusting image capture settings |
JP2014176034A (en) * | 2013-03-12 | 2014-09-22 | Ricoh Co Ltd | Video transmission device |
US9552630B2 (en) * | 2013-04-09 | 2017-01-24 | Honeywell International Inc. | Motion deblurring |
US9595083B1 (en) * | 2013-04-16 | 2017-03-14 | Lockheed Martin Corporation | Method and apparatus for image producing with predictions of future positions |
US9916367B2 (en) | 2013-05-03 | 2018-03-13 | Splunk Inc. | Processing system search requests from multiple data stores with overlapping data |
US8738629B1 (en) | 2013-05-03 | 2014-05-27 | Splunk Inc. | External Result Provided process for retrieving data stored using a different configuration or protocol |
US10003792B2 (en) | 2013-05-27 | 2018-06-19 | Microsoft Technology Licensing, Llc | Video encoder for images |
US10796617B2 (en) * | 2013-06-12 | 2020-10-06 | Infineon Technologies Ag | Device, method and system for processing an image data stream |
US9529513B2 (en) * | 2013-08-05 | 2016-12-27 | Microsoft Technology Licensing, Llc | Two-hand interaction with natural user interface |
US9270959B2 (en) | 2013-08-07 | 2016-02-23 | Qualcomm Incorporated | Dynamic color shading correction |
CN105612083B (en) * | 2013-10-09 | 2018-10-23 | 麦格纳覆盖件有限公司 | To the system and method for the control that vehicle window is shown |
US9973672B2 (en) | 2013-12-06 | 2018-05-15 | Huawei Device (Dongguan) Co., Ltd. | Photographing for dual-lens device using photographing environment determined using depth estimation |
US10931866B2 (en) * | 2014-01-05 | 2021-02-23 | Light Labs Inc. | Methods and apparatus for receiving and storing in a camera a user controllable setting that is used to control composite image generation performed after image capture |
US10121060B2 (en) * | 2014-02-13 | 2018-11-06 | Oath Inc. | Automatic group formation and group detection through media recognition |
KR102128468B1 (en) * | 2014-02-19 | 2020-06-30 | 삼성전자주식회사 | Image Processing Device and Method including a plurality of image signal processors |
CN103841328B (en) * | 2014-02-27 | 2015-03-11 | 深圳市中兴移动通信有限公司 | Low-speed shutter shooting method and device |
US10136140B2 (en) | 2014-03-17 | 2018-11-20 | Microsoft Technology Licensing, Llc | Encoder-side decisions for screen content encoding |
US20150297986A1 (en) * | 2014-04-18 | 2015-10-22 | Aquifi, Inc. | Systems and methods for interactive video games with motion dependent gesture inputs |
JP6565905B2 (en) * | 2014-05-08 | 2019-08-28 | ソニー株式会社 | Information processing apparatus and information processing method |
US10051196B2 (en) * | 2014-05-20 | 2018-08-14 | Lenovo (Singapore) Pte. Ltd. | Projecting light at angle corresponding to the field of view of a camera |
WO2016004278A1 (en) * | 2014-07-03 | 2016-01-07 | Brady Worldwide, Inc. | Lockout/tagout device with non-volatile memory and related system |
US10031400B2 (en) * | 2014-08-06 | 2018-07-24 | Kevin J. WARRIAN | Orientation system for image recording device |
KR102225947B1 (en) * | 2014-10-24 | 2021-03-10 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
CN105549302B (en) | 2014-10-31 | 2018-05-08 | 国际商业机器公司 | The coverage suggestion device of photography and vedio recording equipment |
US10334158B2 (en) * | 2014-11-03 | 2019-06-25 | Robert John Gove | Autonomous media capturing |
US20160148648A1 (en) * | 2014-11-20 | 2016-05-26 | Facebook, Inc. | Systems and methods for improving stabilization in time-lapse media content |
US10924743B2 (en) | 2015-02-06 | 2021-02-16 | Microsoft Technology Licensing, Llc | Skipping evaluation stages during media encoding |
EP3274986A4 (en) | 2015-03-21 | 2019-04-17 | Mine One GmbH | Virtual 3d methods, systems and software |
US10853625B2 (en) | 2015-03-21 | 2020-12-01 | Mine One Gmbh | Facial signature methods, systems and software |
US20160316220A1 (en) * | 2015-04-21 | 2016-10-27 | Microsoft Technology Licensing, Llc | Video encoder management strategies |
EP3295372A4 (en) * | 2015-05-12 | 2019-06-12 | Mine One GmbH | Facial signature methods, systems and software |
US10165186B1 (en) * | 2015-06-19 | 2018-12-25 | Amazon Technologies, Inc. | Motion estimation based video stabilization for panoramic video from multi-camera capture device |
US10447926B1 (en) | 2015-06-19 | 2019-10-15 | Amazon Technologies, Inc. | Motion estimation based video compression and encoding |
US10136132B2 (en) | 2015-07-21 | 2018-11-20 | Microsoft Technology Licensing, Llc | Adaptive skip or zero block detection combined with transform size decision |
EP3136726B1 (en) * | 2015-08-27 | 2018-03-07 | Axis AB | Pre-processing of digital images |
US9648223B2 (en) * | 2015-09-04 | 2017-05-09 | Microvision, Inc. | Laser beam scanning assisted autofocus |
US9578221B1 (en) * | 2016-01-05 | 2017-02-21 | International Business Machines Corporation | Camera field of view visualizer |
JP6514140B2 (en) * | 2016-03-17 | 2019-05-15 | 株式会社東芝 | Imaging support apparatus, method and program |
US9639935B1 (en) * | 2016-05-25 | 2017-05-02 | Gopro, Inc. | Apparatus and methods for camera alignment model calibration |
EP3466051A1 (en) | 2016-05-25 | 2019-04-10 | GoPro, Inc. | Three-dimensional noise reduction |
WO2017205597A1 (en) * | 2016-05-25 | 2017-11-30 | Gopro, Inc. | Image signal processing-based encoding hints for motion estimation |
US10140776B2 (en) * | 2016-06-13 | 2018-11-27 | Microsoft Technology Licensing, Llc | Altering properties of rendered objects via control points |
US9851842B1 (en) * | 2016-08-10 | 2017-12-26 | Rovi Guides, Inc. | Systems and methods for adjusting display characteristics |
US10366122B2 (en) * | 2016-09-14 | 2019-07-30 | Ants Technology (Hk) Limited. | Methods circuits devices systems and functionally associated machine executable code for generating a searchable real-scene database |
US10313552B2 (en) * | 2016-10-26 | 2019-06-04 | Orcam Technologies Ltd. | Systems and methods for providing visual feedback of a field of view |
CN106550227B (en) * | 2016-10-27 | 2019-02-22 | 成都西纬科技有限公司 | A kind of image saturation method of adjustment and device |
US10477064B2 (en) | 2017-08-21 | 2019-11-12 | Gopro, Inc. | Image stitching with electronic rolling shutter correction |
JP7004736B2 (en) * | 2017-10-26 | 2022-01-21 | 京セラ株式会社 | Image processing equipment, imaging equipment, driving support equipment, mobile objects, and image processing methods |
KR20190087977A (en) * | 2017-12-25 | 2019-07-25 | 저텍 테크놀로지 컴퍼니 리미티드 | Laser beam scanning display and augmented reality glasses |
JPWO2020084999A1 (en) * | 2018-10-25 | 2021-09-09 | ソニーグループ株式会社 | Image processing device, image processing method, and program |
US10771696B2 (en) * | 2018-11-26 | 2020-09-08 | Sony Corporation | Physically based camera motion compensation |
WO2020142471A1 (en) * | 2018-12-30 | 2020-07-09 | Sang Chul Kwon | Foldable mobile phone |
US11289078B2 (en) * | 2019-06-28 | 2022-03-29 | Intel Corporation | Voice controlled camera with AI scene detection for precise focusing |
US10861127B1 (en) * | 2019-09-17 | 2020-12-08 | Gopro, Inc. | Image and video processing using multiple pipelines |
US11064118B1 (en) | 2019-12-18 | 2021-07-13 | Gopro, Inc. | Systems and methods for dynamic stabilization adjustment |
US11006044B1 (en) * | 2020-03-03 | 2021-05-11 | Qualcomm Incorporated | Power-efficient dynamic electronic image stabilization |
US11284157B2 (en) | 2020-06-11 | 2022-03-22 | Rovi Guides, Inc. | Methods and systems facilitating adjustment of multiple variables via a content guidance application |
TWI774039B (en) * | 2020-08-12 | 2022-08-11 | 瑞昱半導體股份有限公司 | System for compensating image with fixed pattern noise |
US11563899B2 (en) * | 2020-08-14 | 2023-01-24 | Raytheon Company | Parallelization technique for gain map generation using overlapping sub-images |
CN114079735B (en) * | 2020-08-19 | 2024-02-23 | 瑞昱半导体股份有限公司 | Image compensation system for fixed image noise |
US11902671B2 (en) * | 2021-12-09 | 2024-02-13 | Fotonation Limited | Vehicle occupant monitoring system including an image acquisition device with a rolling shutter image sensor |
WO2023150800A1 (en) * | 2022-02-07 | 2023-08-10 | Gopro, Inc. | Methods and apparatus for real-time guided encoding |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7034848B2 (en) * | 2001-01-05 | 2006-04-25 | Hewlett-Packard Development Company, L.P. | System and method for automatically cropping graphical images |
Family Cites Families (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100325253B1 (en) * | 1998-05-19 | 2002-03-04 | 미야즈 준이치롯 | Motion vector search method and apparatus |
US6486908B1 (en) * | 1998-05-27 | 2002-11-26 | Industrial Technology Research Institute | Image-based method and system for building spherical panoramas |
US20010047517A1 (en) * | 2000-02-10 | 2001-11-29 | Charilaos Christopoulos | Method and apparatus for intelligent transcoding of multimedia data |
JP2001245303A (en) * | 2000-02-29 | 2001-09-07 | Toshiba Corp | Moving picture coder and moving picture coding method |
US6407680B1 (en) * | 2000-12-22 | 2002-06-18 | Generic Media, Inc. | Distributed on-demand media transcoding system and method |
JP4205574B2 (en) * | 2001-05-31 | 2009-01-07 | キヤノン株式会社 | Image processing apparatus and control method thereof |
US7801215B2 (en) * | 2001-07-24 | 2010-09-21 | Sasken Communication Technologies Limited | Motion estimation technique for digital video encoding applications |
US20030126622A1 (en) * | 2001-12-27 | 2003-07-03 | Koninklijke Philips Electronics N.V. | Method for efficiently storing the trajectory of tracked objects in video |
KR100850705B1 (en) * | 2002-03-09 | 2008-08-06 | 삼성전자주식회사 | Method for adaptive encoding motion image based on the temperal and spatial complexity and apparatus thereof |
JP4275358B2 (en) * | 2002-06-11 | 2009-06-10 | 株式会社日立製作所 | Image information conversion apparatus, bit stream converter, and image information conversion transmission method |
US7259784B2 (en) * | 2002-06-21 | 2007-08-21 | Microsoft Corporation | System and method for camera color calibration and image stitching |
US20040131276A1 (en) * | 2002-12-23 | 2004-07-08 | John Hudson | Region-based image processor |
EP1577705B1 (en) * | 2002-12-25 | 2018-08-01 | Nikon Corporation | Blur correction camera system |
US20130107938A9 (en) * | 2003-05-28 | 2013-05-02 | Chad Fogg | Method And Apparatus For Scalable Video Decoder Using An Enhancement Stream |
KR100566290B1 (en) * | 2003-09-18 | 2006-03-30 | 삼성전자주식회사 | Image Scanning Method By Using Scan Table and Discrete Cosine Transform Apparatus adapted it |
JP4123171B2 (en) * | 2004-03-08 | 2008-07-23 | ソニー株式会社 | Method for manufacturing vibration type gyro sensor element, vibration type gyro sensor element, and method for adjusting vibration direction |
WO2005094270A2 (en) * | 2004-03-24 | 2005-10-13 | Sharp Laboratories Of America, Inc. | Methods and systems for a/v input device to diplay networking |
US8315307B2 (en) * | 2004-04-07 | 2012-11-20 | Qualcomm Incorporated | Method and apparatus for frame prediction in hybrid video compression to enable temporal scalability |
US20060109900A1 (en) * | 2004-11-23 | 2006-05-25 | Bo Shen | Image data transcoding |
JP2006203682A (en) * | 2005-01-21 | 2006-08-03 | Nec Corp | Converting device of compression encoding bit stream for moving image at syntax level and moving image communication system |
TW200816798A (en) * | 2006-09-22 | 2008-04-01 | Altek Corp | Method of automatic shooting by using an image recognition technology |
US7843824B2 (en) * | 2007-01-08 | 2010-11-30 | General Instrument Corporation | Method and apparatus for statistically multiplexing services |
US7924316B2 (en) * | 2007-03-14 | 2011-04-12 | Aptina Imaging Corporation | Image feature identification and motion compensation apparatus, systems, and methods |
CN101682738A (en) * | 2007-05-23 | 2010-03-24 | 日本电气株式会社 | Dynamic image distribution system, conversion device, and dynamic image distribution method |
KR20100031755A (en) * | 2007-07-30 | 2010-03-24 | 닛본 덴끼 가부시끼가이샤 | Connection terminal, distribution system, conversion method, and program |
US20090060039A1 (en) * | 2007-09-05 | 2009-03-05 | Yasuharu Tanaka | Method and apparatus for compression-encoding moving image |
US8098732B2 (en) * | 2007-10-10 | 2012-01-17 | Sony Corporation | System for and method of transcoding video sequences from a first format to a second format |
US8063942B2 (en) * | 2007-10-19 | 2011-11-22 | Qualcomm Incorporated | Motion assisted image sensor configuration |
US8170342B2 (en) * | 2007-11-07 | 2012-05-01 | Microsoft Corporation | Image recognition of content |
JP2009152672A (en) * | 2007-12-18 | 2009-07-09 | Samsung Techwin Co Ltd | Recording apparatus, reproducing apparatus, recording method, reproducing method, and program |
JP5242151B2 (en) * | 2007-12-21 | 2013-07-24 | セミコンダクター・コンポーネンツ・インダストリーズ・リミテッド・ライアビリティ・カンパニー | Vibration correction control circuit and imaging apparatus including the same |
JP2009159359A (en) * | 2007-12-27 | 2009-07-16 | Samsung Techwin Co Ltd | Moving image data encoding apparatus, moving image data decoding apparatus, moving image data encoding method, moving image data decoding method and program |
US20090217338A1 (en) * | 2008-02-25 | 2009-08-27 | Broadcom Corporation | Reception verification/non-reception verification of base/enhancement video layers |
US20090323810A1 (en) * | 2008-06-26 | 2009-12-31 | Mediatek Inc. | Video encoding apparatuses and methods with decoupled data dependency |
US7990421B2 (en) * | 2008-07-18 | 2011-08-02 | Sony Ericsson Mobile Communications Ab | Arrangement and method relating to an image recording device |
JP2010039788A (en) * | 2008-08-05 | 2010-02-18 | Toshiba Corp | Image processing apparatus and method thereof, and image processing program |
JP2010147808A (en) * | 2008-12-18 | 2010-07-01 | Olympus Imaging Corp | Imaging apparatus and image processing method in same |
US8311115B2 (en) * | 2009-01-29 | 2012-11-13 | Microsoft Corporation | Video encoding using previously calculated motion information |
US20100194851A1 (en) * | 2009-02-03 | 2010-08-05 | Aricent Inc. | Panorama image stitching |
US20100229206A1 (en) * | 2009-03-03 | 2010-09-09 | Viasat, Inc. | Space shifting over forward satellite communication channels |
US8520083B2 (en) * | 2009-03-27 | 2013-08-27 | Canon Kabushiki Kaisha | Method of removing an artefact from an image |
US20100309987A1 (en) * | 2009-06-05 | 2010-12-09 | Apple Inc. | Image acquisition and encoding system |
JP5473536B2 (en) * | 2009-10-28 | 2014-04-16 | 京セラ株式会社 | Portable imaging device with projector function |
US20110170608A1 (en) * | 2010-01-08 | 2011-07-14 | Xun Shi | Method and device for video transcoding using quad-tree based mode selection |
US8681255B2 (en) * | 2010-09-28 | 2014-03-25 | Microsoft Corporation | Integrated low power depth camera and projection device |
US9007428B2 (en) * | 2011-06-01 | 2015-04-14 | Apple Inc. | Motion-based image stitching |
US8554011B2 (en) * | 2011-06-07 | 2013-10-08 | Microsoft Corporation | Automatic exposure correction of images |
2011
- 2011-09-14 US US13/232,045 patent/US20130021488A1/en not_active Abandoned
- 2011-09-14 US US13/232,052 patent/US20130021512A1/en not_active Abandoned
- 2011-09-19 US US13/235,975 patent/US20130021504A1/en not_active Abandoned
- 2011-09-27 US US13/245,941 patent/US20130021489A1/en not_active Abandoned
- 2011-10-26 US US13/281,521 patent/US20130021490A1/en not_active Abandoned
- 2011-12-07 US US13/313,345 patent/US20130022116A1/en not_active Abandoned
- 2011-12-07 US US13/313,352 patent/US9092861B2/en active Active
- 2011-12-19 US US13/330,047 patent/US20130021484A1/en not_active Abandoned
2012
- 2012-03-07 US US13/413,863 patent/US20130021491A1/en not_active Abandoned
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7034848B2 (en) * | 2001-01-05 | 2006-04-25 | Hewlett-Packard Development Company, L.P. | System and method for automatically cropping graphical images |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220172509A1 (en) * | 2012-10-19 | 2022-06-02 | Google Llc | Image Optimization During Facial Recognition |
US11741749B2 (en) * | 2012-10-19 | 2023-08-29 | Google Llc | Image optimization during facial recognition |
US9406110B2 (en) * | 2014-01-30 | 2016-08-02 | Adobe Systems Incorporated | Cropping boundary simplicity |
US20150213609A1 (en) * | 2014-01-30 | 2015-07-30 | Adobe Systems Incorporated | Image Cropping Suggestion |
US9251594B2 (en) | 2014-01-30 | 2016-02-02 | Adobe Systems Incorporated | Cropping boundary simplicity |
US9245347B2 (en) * | 2014-01-30 | 2016-01-26 | Adobe Systems Incorporated | Image Cropping Suggestion |
US20160267357A1 (en) * | 2015-03-12 | 2016-09-15 | Care Zone Inc. | Importing Structured Prescription Records from a Prescription Label on a Medication Package |
US11694776B2 (en) | 2015-03-12 | 2023-07-04 | Walmart Apollo, Llc | Generating prescription records from a prescription label on a medication package |
US11721414B2 (en) * | 2015-03-12 | 2023-08-08 | Walmart Apollo, Llc | Importing structured prescription records from a prescription label on a medication package |
US9456195B1 (en) * | 2015-10-08 | 2016-09-27 | Dual Aperture International Co. Ltd. | Application programming interface for multi-aperture imaging systems |
US9774880B2 (en) | 2015-10-08 | 2017-09-26 | Dual Aperture International Co. Ltd. | Depth-based video compression |
US10791265B1 (en) | 2017-10-13 | 2020-09-29 | State Farm Mutual Automobile Insurance Company | Systems and methods for model-based analysis of damage to a vehicle |
US11159715B1 (en) | 2017-10-13 | 2021-10-26 | State Farm Mutual Automobile Insurance Company | Systems and methods for model-based analysis of damage to a vehicle |
US11587046B1 (en) | 2017-10-25 | 2023-02-21 | State Farm Mutual Automobile Insurance Company | Systems and methods for performing repairs to a vehicle |
US11410413B2 (en) | 2018-09-10 | 2022-08-09 | Samsung Electronics Co., Ltd. | Electronic device for recognizing object and method for controlling electronic device |
Also Published As
Publication number | Publication date |
---|---|
US9092861B2 (en) | 2015-07-28 |
US20130021490A1 (en) | 2013-01-24 |
US20130021483A1 (en) | 2013-01-24 |
US20130021504A1 (en) | 2013-01-24 |
US20130022116A1 (en) | 2013-01-24 |
US20130021484A1 (en) | 2013-01-24 |
US20130021489A1 (en) | 2013-01-24 |
US20130021488A1 (en) | 2013-01-24 |
US20130021491A1 (en) | 2013-01-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20130021512A1 (en) | Framing of Images in an Image Capture Device | |
US20200227089A1 (en) | Method and device for processing multimedia information | |
KR101772177B1 (en) | Method and apparatus for obtaining photograph | |
EP2878121B1 (en) | Method and apparatus for dual camera shutter | |
WO2017016030A1 (en) | Image processing method and terminal | |
US11741749B2 (en) | Image optimization during facial recognition | |
EP3136391B1 (en) | Method, device and terminal device for video effect processing | |
US10382734B2 (en) | Electronic device and color temperature adjusting method | |
US20150163391A1 (en) | Image capturing apparatus, control method of image capturing apparatus, and non-transitory computer readable storage medium | |
US9973687B2 (en) | Capturing apparatus and method for capturing images without moire pattern | |
JP2012199675A (en) | Image processing apparatus, image processing method, and program | |
JP2012205037A (en) | Image processor and image processing method | |
JP6892524B2 (en) | Slow motion video capture based on target tracking | |
CN106464799A (en) | Automatic zooming method and device | |
KR102209070B1 (en) | Apparatus and method for providing thumbnail image of moving picture | |
KR20150011742A (en) | User terminal device and the control method thereof | |
JP2023500510A (en) | A system for performing ambient light image correction | |
KR20170060411A (en) | Method and photographing device for controlling the photographing device according to proximity of a user | |
US20150015724A1 (en) | Electronic device and method for controlling image capturing | |
JP2013017218A (en) | Image processing device, image processing method, and program | |
KR20150014226A (en) | Electronic Device And Method For Taking Images Of The Same | |
EP3273437A1 (en) | Method and device for enhancing readability of a display | |
KR102372711B1 (en) | Image photographing apparatus and control method thereof | |
CN105812642A (en) | Information processing method and electronic device | |
JP6220276B2 (en) | Imaging apparatus and imaging method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PATUCK, NAUSHIR;SEWELL, BENJAMIN;REEL/FRAME:027152/0417
Effective date: 20110913 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA
Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001
Effective date: 20160201 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001
Effective date: 20170120 |
|
AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA
Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001
Effective date: 20170119 |