US20040252217A1 - System and method for analyzing a digital image - Google Patents
- Publication number
- US20040252217A1 (application US10/461,600)
- Authority
- US
- United States
- Prior art keywords
- image
- advice
- setting
- characteristic
- logic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
Definitions
- the present invention relates generally to digital photography, and, more particularly, to a system and method for analyzing a digital image.
- a system for analyzing a digital image comprises an image sensor including a plurality of image capture elements, each of the image capture elements configured to capture image data.
- the image data is captured according to at least one setting.
- the system also includes a memory for storing the image data, logic for analyzing the image data to determine at least one characteristic of the image data, and a display for communicating a description of the characteristic.
- FIG. 1 is a block diagram illustrating a digital camera constructed in accordance with an embodiment of the invention.
- FIG. 2 is a graphical illustration of an image file.
- FIG. 3 is a flow chart describing the operation of an embodiment of the image analysis and improvement logic of FIG. 1.
- FIGS. 4A and 4B are graphical illustrations showing an instant review screen and a help screen in accordance with an embodiment of the invention.
- the invention described below is applicable to any digital camera that provides an “instant review” function.
- the system and method for analyzing a captured image can be implemented in hardware, software, firmware, or a combination thereof.
- the system and method for analyzing a captured image is implemented using a combination of hardware, software or firmware that is stored in a memory and that is executed by a suitable instruction execution system.
- the hardware portion of the system and method for analyzing a captured image can be implemented with any or a combination of the following technologies, which are all well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.
- the software portion of the system and method for analyzing a captured image can be stored in one or more memory elements and executed by a suitable general purpose or application specific processor.
- the software for analyzing a captured image, which comprises an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
- a “computer-readable medium” can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- FIG. 1 is a block diagram illustrating a digital camera 100 constructed in accordance with an embodiment of the invention.
- the digital camera 100 includes an application specific integrated circuit (ASIC) 102 that executes the image analysis logic 150 of the invention.
- the image analysis logic 150 can be software that is stored in memory and executed by the ASIC 102 .
- the image analysis logic 150 may be implemented in firmware, which can be stored and executed in the ASIC 102 .
- the digital camera 100 may include additional processors, digital signal processors (DSPs) and ASICs.
- the ASIC 102 may also include other elements, which are omitted for simplicity.
- the ASIC 102 controls the function of various aspects of the digital camera 100 .
- the camera 100 includes an image sensor 104 .
- the image sensor 104 may comprise a charge coupled device (CCD) array or an array of complementary metal oxide semiconductor (CMOS) sensors. Regardless of whether the image sensor 104 comprises an array of individual CCD elements or CMOS sensors, each of the elements in the array comprises a pixel (picture element) of the image sensor 104 .
- An exemplary pixel is indicated using reference numeral 204 .
- the pixels in the image sensor 104 are typically arranged in a two-dimensional array. For example, an image array may comprise 2272 pixels in length and 1712 pixels in height.
- the image sensor 104 captures an image of a subject by converting incident light into an analog signal, and sends this representation of the image via connection 109 to an analog front end (AFE) processor 111 .
- the analog front end processor 111 typically includes an analog-to-digital converter for converting the analog signal received from the image sensor 104 into a digital signal.
- the analog front end processor provides this digital signal as image data via connection 112 to the ASIC 102 for image processing.
- the ASIC 102 couples via connection 118 to one or more motor drivers 119 .
- the motor drivers 119 control the operation of various parameters of the lens 122 via connection 121 .
- lens controls such as zoom, focus, aperture and shutter operations can be controlled by the motor drivers 119 .
- the connection 123 between the lens 122 and the image sensor 104 is shown as a dotted line to illustrate the operation of the lens 122 focusing on a subject and communicating light to the image sensor 104 , which captures the image provided by the lens 122 .
- the ASIC 102 also sends display data via connection 124 to a display controller 126 .
- the display controller may be, for example, a National Television System Committee (NTSC)/Phase Alternating Line (PAL) encoder, although, depending on the application, other standards for presenting display data may be used.
- the display controller 126 converts the display data from the ASIC 102 into a signal that can be forwarded via connection 127 to image display 128 .
- the image display 128, which can be, for example, a liquid crystal display (LCD) or other display, displays the captured image to the user of the digital camera 100, and is typically the color display located on the digital camera 100 .
- the image shown to a user on the image display 128 may be shown before the image is captured and processed, in what is referred to as “live view” mode, or after the image is captured and processed, in what is referred to as “instant review” mode, or, if the image was previously captured, in what is referred to as “review” or “playback” mode.
- the playback mode can be invoked via a menu command.
- the instant review mode is typically used to display the captured image to the user immediately after the image is captured and the playback mode is typically used to display the captured image to the user sometime after the image has been captured and stored in memory.
- the instant review mode allows the user of the camera 100 to immediately view the image on the display 128 .
- because the image display 128 is typically small, only gross features, or characteristics, of the image can be visually observed. Further, the image display 128 may not accurately reproduce color, tint, brightness, etc., which may make it even more difficult for a user to determine the quality of the captured image.
- the difficulty in visually determining the quality of the captured image leads to the possibility of saving an image that may include deficiencies that, if visually detected, would likely cause the user to discard the image and attempt to capture another image having better quality.
- the image analysis logic 150 dynamically analyzes one or more characteristics of the captured image and presents to the user, via the image display 128 and a user interface, an analysis of the captured image.
- An exemplary dynamic analysis of the data for each pixel in a captured image is described below in FIG. 2. For example, information associated with each pixel may be analyzed to determine whether a significant number of the pixels forming the image are either black or white. A predominance of white pixels may be indicative of overexposure and a predominance of black pixels may be indicative of underexposure.
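The exposure heuristic described above can be sketched roughly as follows. The function name, the tolerance, and the pixel-fraction threshold are illustrative assumptions, not values taken from the patent:

```python
# Hypothetical sketch of the exposure heuristic: count pixels that are
# near-black or near-white, and flag the image when either count dominates.
# The tolerance and fraction thresholds are illustrative assumptions.

def classify_exposure(pixels, tolerance=16, fraction=0.5):
    """pixels: iterable of (R, G, B) tuples with components in 0..255."""
    pixels = list(pixels)
    near_black = sum(1 for r, g, b in pixels
                     if r <= tolerance and g <= tolerance and b <= tolerance)
    near_white = sum(1 for r, g, b in pixels
                     if r >= 255 - tolerance and g >= 255 - tolerance
                     and b >= 255 - tolerance)
    if near_black > fraction * len(pixels):
        return "underexposed"  # predominance of black pixels
    if near_white > fraction * len(pixels):
        return "overexposed"   # predominance of white pixels
    return "ok"
```

As the patent notes, a subject that really is all black or all white would also trip such a test, so a verdict like this is advisory rather than conclusive.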
- Similar dynamic analyses can be performed to determine whether an image is in focus or whether the white balance of the image is correct.
- pixels in an image are examined to determine whether sharp transitions exist between pixels. For example, a black pixel adjoining a white pixel may indicate that the image is in focus, while a black pixel separated from a white pixel by a number of gray pixels may indicate that the image is out of focus.
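A minimal sketch of the transition test just described, assuming a single row of grayscale values in 0..255: a large difference between adjacent pixels indicates a sharp (in-focus) edge, while only gradual steps suggest a soft edge. The function name and threshold are illustrative assumptions:

```python
# Hedged sketch: an abrupt black-to-white transition between neighboring
# pixels suggests a sharp edge; a gray ramp suggests a blurred one.
# The step threshold is an illustrative assumption.

def row_in_focus(row, threshold=128):
    """row: list of grayscale values in 0..255 for one scan line."""
    max_step = max((abs(a - b) for a, b in zip(row, row[1:])), default=0)
    return max_step >= threshold
```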
- White balance is a characteristic of the image that generally refers to the color balance in the image to ensure that white portions of the image appear white. An image in which each pixel is a different shade of the same color may indicate an image in which the white balance is improperly adjusted.
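One way to sketch the white-balance check above: if the average red, green, and blue levels across the image diverge sharply, every pixel is tinted toward one color and the white balance is likely off. The function name and spread threshold are assumptions for illustration:

```python
# Hypothetical white-balance heuristic: a large spread between the mean R, G
# and B channel levels indicates an overall color cast. The max_spread
# threshold is an illustrative assumption.

def has_color_cast(pixels, max_spread=64):
    """pixels: non-empty list of (R, G, B) tuples with components in 0..255."""
    n = len(pixels)
    mean_r = sum(p[0] for p in pixels) / n
    mean_g = sum(p[1] for p in pixels) / n
    mean_b = sum(p[2] for p in pixels) / n
    spread = max(mean_r, mean_g, mean_b) - min(mean_r, mean_g, mean_b)
    return spread > max_spread
```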
- an image improvement logic 160 may be provided to present to the user a recommendation in the form of instructions presented on the image display 128 on ways in which to possibly improve a subsequent image by, for example, adjusting a condition under which the image was captured or adjusting a setting used to capture the image.
- the image analysis logic 150 analyzes the captured image and, optionally, the camera settings used to capture the image, and determines a value of one or more characteristics of the captured image. For example, to determine whether the exposure of the image is satisfactory, the image analysis logic 150 may check whether the number of white pixels in the image exceeds a predefined threshold; if so, it may indicate that the image is overexposed.
- the image improvement logic 160 determines whether a condition used to capture the image should be adjusted, or whether a camera setting should be adjusted, to improve a subsequent image. For example, if the image analysis logic 150 determines that the image is underexposed, the image improvement logic 160 can determine that a subsequent image may be improved by activating the camera flash for a subsequent image. When the image analysis logic 150 analyzes the data representing the captured image and the settings used to capture the image, the analysis can be used by the image improvement logic 160 to suggest adjustments to the settings to improve a subsequent image. These suggested adjustments to the camera settings can be presented to the user on a help screen via the image display 128 , or, in an alternative configuration, can be automatically changed for a subsequent image.
- the ASIC 102 couples to a microcontroller 161 via connection 154 .
- the microcontroller 161 can be a specific or a general purpose microprocessor that controls the various operating aspects and parameters of the digital camera 100 .
- the microcontroller 161 is coupled to a user interface 164 via connection 162 .
- the user interface 164 may include, for example but not limited to, a keypad, one or more buttons, a mouse or pointing device, a shutter release, and any other buttons or switches that allow the user of the digital camera 100 to input commands.
- the image analysis logic 150 and the image improvement logic 160 communicate with the user via the user interface 164 , through the image display 128 .
- the ASIC 102 also couples to one or more different memory elements, collectively referred to as memory 136 .
- the memory 136 may include memory internal to the digital camera 100 and/or memory external to the digital camera 100 .
- the internal memory may, for example, comprise flash memory and the external memory may comprise, for example, a removable compact flash memory card.
- the various memory elements may comprise volatile and/or non-volatile memory, such as, for example but not limited to, synchronous dynamic random access memory (SDRAM) 141 , illustrated as a portion of the memory 136 , and flash memory.
- the memory elements may comprise memory distributed over various elements within the digital camera 100 .
- the ASIC 102 couples to memory 136 via connection 131 .
- the memory 136 includes the image analysis logic 150 , the image improvement logic 160 , the settings file 155 and the various software and firmware elements and components (not shown) that allow the digital camera 100 to perform its various functions.
- the memory also stores the image file 135 , which represents a captured image.
- the software code, i.e., the image analysis logic 150 , is stored in the memory 136 and executed by the ASIC 102 .
- the settings file 155 comprises the various settings used when capturing an image.
- the exposure time, aperture setting (f-stop), shutter speed, white balance, flash on or off, focus, contrast, saturation, sharpness, ISO speed, exposure compensation, color, resolution and compression, and other camera settings may be stored in the settings file 155 .
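As a rough illustration, the settings file 155 might be modeled as a simple key-value store. The field names and values below are illustrative assumptions; the patent does not specify a storage format:

```python
# Hypothetical representation of the settings file 155 as a key-value store.
# Field names and values are illustrative assumptions only.

settings_file = {
    "exposure_time_s": 1 / 60,
    "f_stop": 2.8,
    "shutter_speed_s": 1 / 125,
    "white_balance": "auto",
    "flash": False,
    "iso_speed": 200,
    "exposure_compensation_ev": 0.0,
    "resolution": (2272, 1712),
}

# Analysis logic could then look up, e.g., whether the flash was active:
flash_was_on = settings_file["flash"]
```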
- the settings file 155 may be accessed by the image analysis logic 150 to analyze a captured image by, in one example, determining the camera settings used to capture the image that is under analysis.
- the ASIC 102 executes the image analysis logic 150 so that after an image is captured by the image sensor 104 , the image analysis logic 150 analyzes various characteristics of the captured image. These characteristics may include characteristics of the captured image, or alternatively, may include the settings used to capture the image. Further, if the image improvement logic 160 determines that the image could be improved by changing one or more of the conditions under which the image was captured, or by changing one or more camera settings, then the image improvement logic 160 can either suggest these changes via the user interface 164 and the image display 128 , or can automatically change the settings and prepare the camera for a subsequent image.
- FIG. 2 is a graphical illustration of an image file 135 .
- the image file 135 includes a header portion 202 and a pixel array 208 .
- the pixel array 208 comprises a plurality of pixels, exemplary ones of which are illustrated using reference numerals 204 , 206 and 212 .
- Each pixel in the pixel array 208 represents a portion of the captured image represented by the image file 135 .
- An array size can be, for example, 2272 pixels wide by 1712 pixels high.
- the image file 135 can also be represented as a table of values for each pixel and can be stored, for example, in the memory 136 of FIG. 1. For example, each pixel has an associated red (R), green (G) and blue (B) value.
- the value for each R, G and B component can be, for example, a value between 0 and 255, where the value of each R, G and B component represents the color that the pixel has captured. For example, if pixel 204 has respective R, G and B values of 0, 0 and 0, respectively, (or close to 0,0,0) the pixel 204 represents the color black, or is close to black. Conversely, for the pixel 212 , a respective value of 255 (or close to 255) for each R, G and B component represents the color white, or close to white. R, G and B values between 0 and 255 represent a range of colors between black and white.
- the data for each pixel in the image file 135 can be analyzed by the image analysis logic 150 to determine characteristics of the image.
- characteristics including, but not limited to, the exposure, focus or the white balance of the captured image can be analyzed.
- a predominance of white pixels may be indicative of overexposure and a predominance of black pixels may be indicative of underexposure.
- pixels in an image are analyzed to determine whether sharp transitions exist between pixels. For example, a black pixel adjoining a white pixel may indicate that the image is in focus, while a black pixel separated from a white pixel by a number of gray pixels may indicate that the image is out of focus.
- An image in which each pixel is a different shade of the same color may indicate a problem with the white balance of the image.
- FIG. 3 is a flow chart 300 describing the operation of an embodiment of the image analysis logic 150 and the image improvement logic 160 of FIG. 1.
- Any process descriptions or blocks in the flow chart to follow should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternative implementations are included within the scope of the preferred embodiment.
- functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
- the image sensor 104 of FIG. 1 captures an image.
- the image is stored in the memory 136 as image file 135 .
- the image is displayed to the user of the digital camera 100 via the image display 128 (FIG. 1) during the “instant review” mode.
- the instant review mode affords the user the opportunity to view the captured image immediately after capture.
- the user determines whether they want to view the settings with which the image was captured. If the user wants to view the settings, the settings are displayed to the user on the image display 128 as indicated in block 308 . If the user does not want to view the settings, then, in block 312 , it is determined whether the user wants the image analysis logic 150 to analyze the image. If the user does not want the image to be analyzed, then, in block 314 the image can be saved or discarded. Alternatively, the image analysis logic 150 can be invoked automatically without user intervention.
- the image analysis logic 150 analyzes the data within the image file 135 .
- the data is analyzed to determine various characteristics of the captured image.
- the following example will use exposure as the characteristic that is analyzed by the image analysis logic 150 .
- other characteristics such as, for example, focus and white balance, can be analyzed.
- the image analysis logic 150 performs a pixel by pixel analysis to determine whether the image includes a predominance of either black or white pixels.
- the data associated with each pixel in the image file 135 is analyzed to determine whether a pixel is a black pixel or a white pixel.
- Each pixel is analyzed to determine its corresponding R, G and B values. For example, if the R, G and B values for the pixel 204 are all zeros, the pixel is considered a black pixel.
- Each pixel in the pixel array 208 is analyzed in this manner to determine the number of black or white pixels in the pixel array 208 for this image file.
- a determination in block 306 that a substantial portion of the pixels in the array 208 are black indicates that the image is likely underexposed. Conversely, a determination that many of the pixels in the array 208 are white indicates that the image is likely overexposed. Of course, the image may be of an all white or an all black subject, in which case the user may choose to disregard the analysis.
- the data in the image file 135 can be analyzed in combination with other data available either in the image file 135 or from the settings file 155 in the camera 100 .
- additional data saved in the header 202 of the image file 135 can be analyzed in conjunction with the information from each pixel in the array 208 .
- This information might include, for example, the ISO setting and the aperture setting (f-stop) used to capture the image.
- These data items can be used in conjunction with the pixel data described above to develop additional information regarding the characteristic of the analyzed image.
- the image analysis logic 150 can also analyze the camera settings used to capture the image and use those settings when analyzing the data in the image file 135 to develop additional data regarding the image file 135 .
- the image analysis logic 150 can access the settings file 155 in the memory 136 of FIG. 1 to determine, for example, whether the flash was enabled, or to determine the position of the lens when the image was captured. In this manner, the image analysis logic 150 can gather a range of information relating to the captured image to perform an analysis on the captured image file 135 to determine whether the captured image meets certain criteria.
- if the image analysis logic 150 determines that the image is underexposed, i.e., the image file contains many black pixels, the image analysis logic 150 can access the settings file 155 to determine whether the flash was active when the image was captured. If the image analysis logic 150 determines that the flash was turned off, the image analysis logic 150 may communicate with the image improvement logic 160 to recommend that the user activate the flash so that a subsequent image may have less likelihood of being underexposed.
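The flash recommendation above can be sketched as a small rule that combines an exposure verdict with the stored flash setting. The function name, verdict strings, and settings keys are illustrative assumptions, not the patent's actual interfaces:

```python
# Hedged sketch of the advice step: given an exposure verdict and a settings
# dictionary with a boolean "flash" entry (both illustrative assumptions),
# recommend a flash change for the next shot.

def advise(exposure_verdict, settings):
    advice = []
    if exposure_verdict == "underexposed" and not settings.get("flash", False):
        advice.append("Turn the flash on to brighten the next image.")
    if exposure_verdict == "overexposed" and settings.get("flash", False):
        advice.append("Turn the flash off or step back from the subject.")
    return advice
```

In the patent's terms, such advice could either be shown on the help screen or, in the alternative configuration, applied automatically for the subsequent image.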
- in block 318 , it is determined whether the image data analyzed in block 316 represents an acceptable image. This can be an objective determination based on criteria that the user enters into the camera 100 via the user interface 164 (FIG. 1) or that are preset in the camera 100 at the time of manufacture. Alternatively, the determination of whether the image data represents an acceptable image can be a subjective determination based on user input. If the image is determined to be acceptable, then no further calculation is performed.
- the image improvement logic 160 evaluates the settings used to capture the data in the image file 135 to determine whether a condition or setting can be changed to improve the image.
- the image improvement logic 160 can also develop recommendations to present to the user of the camera to improve a subsequent image. For example, if the analysis in block 316 suggests that the image was underexposed, the image improvement logic 160 may develop “advice” to be presented to the user. In this example, as will be described below, the image improvement logic 160 may suggest that the user activate the flash to improve a subsequent image. This suggestion may be provided to the user via the image display 128 in conjunction with the user interface 164 .
- the instant review and help screen may include, for example, a thumbnail size display of the image, a display of the settings used to capture the image, an evaluation of the image and, if the user desires, suggestions on ways to improve the image.
- the evaluation of the image may include, for example, a notification that characteristics, such as exposure, focus and color balance are satisfactory.
- Suggestions on ways in which to improve the image may be communicated to the user via the image display 128 and may include, for example, changing a condition under which the image was captured, changing a setting with which the image was captured, or a combination of both changing a condition and a setting.
- in block 326 , the user determines whether they want to capture another image. If the user does not want to capture another image, the process ends. If, however, in block 326 , the user wants to capture another image, then, in block 332 , it is determined whether the user wants to manually change a condition or setting for the subsequent image or, if the setting is one that can be changed by the digital camera 100 , whether the user wants the digital camera 100 to automatically change the setting.
- FIGS. 4A and 4B are graphical illustrations showing an instant review screen and a help screen provided by the image analysis logic 150 and the image improvement logic 160 .
- the captured image is displayed to the user via the instant review screen 400 immediately after an image is captured. If the user desires additional information regarding the image, then, in this example, the user actuates an appropriate control on the user interface 164 to display the instant review help screen shown in FIG. 4B.
- the instant review help screen 410 includes a thumbnail image 402 of the captured image from FIG. 4A, the exposure settings 404 , and any other settings 406 used when the image in FIG. 4A was captured.
- the instant review help screen 410 also includes an improvement message portion 410 , referred to as an “advice” portion.
- the advice portion 410 may include, for example, the settings that were evaluated in block 322 of FIG. 3, and/or advice on ways in which to improve the image.
- the digital camera 100 captures an image, analyzes the image, and provides instructions, via the instant review help screen 410 , on ways to improve a subsequent image.
Abstract
A system and method for analyzing a captured image is disclosed. In one embodiment, a system for analyzing a digital image comprises an image sensor including a plurality of image capture elements, each of the image capture elements configured to capture image data. The image data is captured according to at least one setting. The system also includes a memory for storing the image data and logic for dynamically analyzing the image data to determine at least one characteristic of the image data, and a display for communicating a description of the characteristic.
Description
- With the proliferation of low cost microprocessors, memory and image capture electronics, digital cameras are gaining popularity and are becoming more and more widely available to a larger number of consumers. One of the advantages that a digital camera enjoys over a conventional film camera is that when a digital camera captures an image, the image is stored electronically in a memory element associated with the camera and is available for immediate viewing. For example, it is common to capture an image using a digital camera and then immediately display the captured image to the user on a display screen associated with the digital camera. This ability to immediately view the image is commonly referred to as “instant review.” The ability to immediately review the captured image allows the user to immediately decide if the image is satisfactory, worth keeping and perhaps printing.
- Unfortunately, many characteristics for determining whether the image is satisfactory may not be readily visually noticeable on the small display associated with many digital cameras. For example, while the image may appear to be in focus when viewed on the camera display, the image may appear out of focus when printed. Unfortunately, printing the image is a time consuming and costly way to determine whether the image is satisfactory.
- Therefore, it would be desirable to determine the quality of various characteristics associated with the captured image prior to printing the image. Further, if the image is deemed unacceptable, it would be desirable to provide instructions to the user for improving a subsequent image.
- Related methods of operation and computer readable media are also provided. Other systems, methods, features, and advantages of the invention will be or become apparent to one with skill in the art upon examination of the following figures and detailed description. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the invention, and be protected by the accompanying claims.
- Embodiments of the present invention, as defined in the claims, can be better understood with reference to the following drawings. The components within the drawings are not necessarily to scale relative to each other, emphasis instead being placed upon clearly illustrating the principles of the present invention.
- FIG. 1 is a block diagram illustrating a
digital camera 100 constructed in accordance with an embodiment of the invention. In the implementation to be described below, the digital camera 100 includes an application specific integrated circuit (ASIC) 102 that executes the image analysis logic 150 of the invention. As will be described below, the image analysis logic 150 can be software that is stored in memory and executed by the ASIC 102. In an alternative embodiment, the image analysis logic 150 may be implemented in firmware, which can be stored and executed in the ASIC 102. Further, while illustrated using a single ASIC 102, the digital camera 100 may include additional processors, digital signal processors (DSPs) and ASICs. - The ASIC 102 may also include other elements, which are omitted for simplicity. The ASIC 102 controls the function of various aspects of the
digital camera 100. - The
camera 100 includes an image sensor 104. The image sensor 104 may comprise a charge coupled device (CCD) array or an array of complementary metal oxide semiconductor (CMOS) sensors. Regardless of whether the image sensor 104 comprises an array of individual CCD elements or CMOS sensors, each of the elements in the array comprises a pixel (picture element) of the image sensor 104. An exemplary pixel is indicated using reference numeral 204. The pixels in the image sensor 104 are typically arranged in a two-dimensional array. For example, an image array may comprise 2272 pixels in width and 1712 pixels in height. - The
image sensor 104 captures an image of a subject by converting incident light into an analog signal, and sends this representation of the image via connection 109 to an analog front end (AFE) processor 111. The analog front end processor 111 typically includes an analog-to-digital converter for converting the analog signal received from the image sensor 104 into a digital signal. The analog front end processor provides this digital signal as image data via connection 112 to the ASIC 102 for image processing. - The ASIC 102 couples via
connection 118 to one or more motor drivers 119. The motor drivers 119 control the operation of various parameters of the lens 122 via connection 121. For example, lens controls, such as zoom, focus, aperture and shutter operations, can be controlled by the motor drivers 119. The connection 123 between the lens 122 and the image sensor 104 is shown as a dotted line to illustrate the operation of the lens 122 focusing on a subject and communicating light to the image sensor 104, which captures the image provided by the lens 122. - The ASIC 102 also sends display data via
connection 124 to a display controller 126. The display controller may be, for example, a national television system committee (NTSC)/phase alternating line (PAL) encoder, although, depending on the application, other standards for presenting display data may be used. The display controller 126 converts the display data from the ASIC 102 into a signal that can be forwarded via connection 127 to the image display 128. The image display 128, which can be, for example, a liquid crystal display (LCD) or other display, displays the captured image to the user of the digital camera 100, and is typically the color display located on the digital camera 100. Depending on the configuration of the digital camera 100, the image shown to a user on the image display 128 may be shown before the image is captured and processed, in what is referred to as “live view” mode, or after the image is captured and processed, in what is referred to as “instant review” mode, or, if the image was previously captured, in what is referred to as “review” or “playback” mode. The playback mode can be invoked via a menu command. The instant review mode is typically used to display the captured image to the user immediately after the image is captured, and the playback mode is typically used to display the captured image to the user sometime after the image has been captured and stored in memory. - The instant review mode allows the user of the
camera 100 to immediately view the image on the display 128. Unfortunately, because the image display 128 is typically small, only gross features, or characteristics, of the image can be visually observed. Further, the image display 128 may not accurately reproduce color, tint, brightness, etc., which may further make it difficult for a user to determine the quality of the captured image. The difficulty in visually determining the quality of the captured image leads to the possibility of saving an image that may include deficiencies that, if visually detected, would likely cause the user to discard the image and attempt to capture another image having better quality. To determine whether the image includes deficiencies that may not be apparent to the user when viewing the captured image on the image display 128 in the instant review mode, the image analysis logic 150 dynamically analyzes one or more characteristics of the captured image and presents to the user, via the image display 128 and a user interface, an analysis of the captured image. An exemplary dynamic analysis of the data for each pixel in a captured image is described below in FIG. 2. For example, information associated with each pixel may be analyzed to determine whether a significant number of the pixels forming the image are either black or white. A predominance of white pixels may be indicative of overexposure and a predominance of black pixels may be indicative of underexposure. - Similar dynamic analyses can be performed to determine whether an image is in focus or to determine whether the white balance of the image is correct. To determine whether an image is in focus, pixels in an image are examined to determine whether sharp transitions exist between pixels. For example, a black pixel adjoining a white pixel may indicate that the image is in focus, while a black pixel separated from a white pixel by a number of gray pixels may indicate that the image is out of focus.
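The two pixel-level checks just described can be sketched in a few lines of Python. This is an illustrative reading of the text, not the patent's implementation; the thresholds (what counts as a "black" or "white" pixel, what fraction constitutes a "predominance," and what luminance jump counts as a sharp transition) are assumptions.

```python
def classify_exposure(pixels, dark=16, bright=239, predominance=0.5):
    """Flag likely under- or overexposure from the share of near-black
    and near-white pixels. `pixels` is a list of (R, G, B) tuples with
    0-255 components; the thresholds are illustrative assumptions."""
    black = sum(1 for p in pixels if all(c <= dark for c in p))
    white = sum(1 for p in pixels if all(c >= bright for c in p))
    n = len(pixels)
    if black / n >= predominance:
        return "underexposed"
    if white / n >= predominance:
        return "overexposed"
    return "ok"

def looks_in_focus(gray_row, jump=128):
    """Treat a large luminance jump between adjacent pixels (a black
    pixel adjoining a white one) as an in-focus edge; rows containing
    only gradual ramps suggest a blurred image."""
    return any(abs(a - b) >= jump for a, b in zip(gray_row, gray_row[1:]))
```

For example, `classify_exposure` reports underexposure for a frame that is mostly near-black, and `looks_in_focus` distinguishes a hard 0-to-255 edge from a gradual 0, 64, 128, 192 ramp.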
- White balance is a characteristic of the image that generally refers to the color balance in the image to ensure that white portions of the image appear white. An image in which each pixel is a different shade of the same color may indicate an image in which the white balance is improperly adjusted.
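One common way to detect the "every pixel is a shade of the same color" symptom is a gray-world check: average the R, G, and B channels separately and see whether the means diverge. The patent names no specific algorithm, so the heuristic below and its 15% tolerance are assumptions for illustration only.

```python
def white_balance_ok(pixels, tolerance=0.15):
    """Gray-world check: if the per-channel means of an image diverge,
    every pixel is tinted toward one hue, suggesting an improperly
    adjusted white balance. `pixels` holds (R, G, B) tuples, 0-255;
    the tolerance is an illustrative assumption."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    overall = sum(means) / 3.0
    return all(abs(m - overall) <= tolerance * 255 for m in means)
```

A neutral gray ramp passes the check, while an image whose red channel sits far above green and blue fails it.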
- Further, an
image improvement logic 160 may be provided to present to the user a recommendation, in the form of instructions presented on the image display 128, on ways in which to possibly improve a subsequent image by, for example, adjusting a condition under which the image was captured or adjusting a setting used to capture the image. As will be described below, the image analysis logic 150 analyzes the captured image and, optionally, the camera settings used to capture the image, and determines a value of one or more characteristics of the captured image. For example, to determine whether the exposure of the image is satisfactory, the image analysis logic 150 may indicate that the image is overexposed if the number of white pixels in the image exceeds a predefined threshold. Further, if the image analysis logic 150 determines that one or more characteristics of the captured image are not satisfactory to yield a high quality image, the image improvement logic 160 determines whether a condition used to capture the image should be adjusted, or whether a camera setting should be adjusted, to improve a subsequent image. For example, if the image analysis logic 150 determines that the image is underexposed, the image improvement logic 160 can determine that a subsequent image may be improved by activating the camera flash for a subsequent image. When the image analysis logic 150 analyzes the data representing the captured image and the settings used to capture the image, the analysis can be used by the image improvement logic 160 to suggest adjustments to the settings to improve a subsequent image. These suggested adjustments to the camera settings can be presented to the user on a help screen via the image display 128, or, in an alternative configuration, can be automatically changed for a subsequent image. - The
ASIC 102 couples to a microcontroller 161 via connection 154. The microcontroller 161 can be a specific or a general purpose microprocessor that controls the various operating aspects and parameters of the digital camera 100. For example, the microcontroller 161 is coupled to a user interface 164 via connection 162. The user interface 164 may include, for example but not limited to, a keypad, one or more buttons, a mouse or pointing device, a shutter release, and any other buttons or switches that allow the user of the digital camera 100 to input commands. Further, the image analysis logic 150 and the image improvement logic 160 communicate with the user via the user interface 164, through the image display 128. - The
ASIC 102 also couples to one or more different memory elements, collectively referred to as memory 136. The memory 136 may include memory internal to the digital camera 100 and/or memory external to the digital camera 100. The internal memory may, for example, comprise flash memory and the external memory may comprise, for example, a removable compact flash memory card. The various memory elements may comprise volatile and/or non-volatile memory, such as, for example but not limited to, synchronous dynamic random access memory (SDRAM) 141, illustrated as a portion of the memory 136, and flash memory. Furthermore, the memory elements may comprise memory distributed over various elements within the digital camera 100. - The
ASIC 102 couples to memory 136 via connection 131. The memory 136 includes the image analysis logic 150, the image improvement logic 160, the settings file 155 and the various software and firmware elements and components (not shown) that allow the digital camera 100 to perform its various functions. The memory also stores the image file 135, which represents a captured image. When the system and method for analyzing an image is implemented in software, the software code (i.e., the image analysis logic 150) is typically stored in the memory 136 and transferred to the SDRAM 141 to enable the efficient execution of the software in the ASIC 102. The settings file 155 comprises the various settings used when capturing an image. For example, the exposure time, aperture setting (f-stop), shutter speed, white balance, flash on or off, focus, contrast, saturation, sharpness, ISO speed, exposure compensation, color, resolution and compression, and other camera settings may be stored in the settings file 155. As will be described below, the settings file 155 may be accessed by the image analysis logic 150 to analyze a captured image by, in one example, determining the camera settings used to capture the image that is under analysis. - The
ASIC 102 executes the image analysis logic 150 so that, after an image is captured by the image sensor 104, the image analysis logic 150 analyzes various characteristics of the captured image. These characteristics may include properties of the captured image itself or, alternatively, the settings used to capture the image. Further, if the image improvement logic 160 determines that the image could be improved by changing one or more of the conditions under which the image was captured, or by changing one or more camera settings, then the image improvement logic 160 can either suggest these changes via the user interface 164 and the image display 128, or can automatically change the settings and prepare the camera for a subsequent image. - FIG. 2 is a graphical illustration of an
image file 135. The image file 135 includes a header portion 202 and a pixel array 208. The pixel array 208 comprises a plurality of pixels, exemplary ones of which are illustrated using reference numerals 204 and 212. The pixel array 208 represents a portion of the captured image represented by the image file 135. An array size can be, for example, 2272 pixels wide by 1712 pixels high. When processed, the image file 135 can also be represented as a table of values for each pixel and can be stored, for example, in the memory 136 of FIG. 1. For example, each pixel has an associated red (R), green (G) and blue (B) value. The value for each R, G and B component can be, for example, a value between 0 and 255, where the value of each R, G and B component represents the color that the pixel has captured. For example, if pixel 204 has R, G and B values of 0, 0 and 0, respectively (or close to 0, 0, 0), the pixel 204 represents the color black, or is close to black. Conversely, for the pixel 212, a respective value of 255 (or close to 255) for each R, G and B component represents the color white, or close to white. R, G and B values between 0 and 255 represent a range of colors between black and white. The data for each pixel in the image file 135 can be analyzed by the image analysis logic 150 to determine characteristics of the image. For example, characteristics including, but not limited to, the exposure, focus or the white balance of the captured image can be analyzed. A predominance of white pixels may be indicative of overexposure and a predominance of black pixels may be indicative of underexposure. To determine whether an image is in focus, pixels in an image are analyzed to determine whether sharp transitions exist between pixels. For example, a black pixel adjoining a white pixel may indicate that the image is in focus, while a black pixel separated from a white pixel by a number of gray pixels may indicate that the image is out of focus.
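The image file layout described above (a header portion carrying capture data plus a pixel array of 0-255 R, G, B triples) might be modeled as follows. The class and field names are hypothetical stand-ins that mirror the reference numerals in FIG. 2, not a real file format, and the black/white cutoffs are assumed.

```python
from dataclasses import dataclass, field

@dataclass
class ImageFile:
    """Illustrative stand-in for image file 135: a header portion (202)
    and a pixel array (208) of (R, G, B) tuples with 0-255 components."""
    header: dict = field(default_factory=dict)   # e.g. ISO, f-stop
    pixels: list = field(default_factory=list)

    def is_black(self, i, limit=16):
        """A pixel whose components are all near 0 counts as black."""
        return all(c <= limit for c in self.pixels[i])

    def is_white(self, i, limit=239):
        """A pixel whose components are all near 255 counts as white."""
        return all(c >= limit for c in self.pixels[i])

img = ImageFile(header={"iso": 100, "f_stop": 2.8},
                pixels=[(0, 0, 0), (255, 255, 255), (128, 64, 32)])
```

Analysis code can then walk `img.pixels` to count black and white pixels, and consult `img.header` for the capture data stored alongside them.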
An image in which each pixel is a different shade of the same color may indicate a problem with the white balance of the image. An example of determining the exposure will be described below with respect to FIG. 3. - FIG. 3 is a
flow chart 300 describing the operation of an embodiment of the image analysis logic 150 and the image improvement logic 160 of FIG. 1. Any process descriptions or blocks in the flow chart to follow should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternative implementations are included within the scope of the preferred embodiment. For example, functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention. - In
block 302, the image sensor 104 of FIG. 1 captures an image. The image is stored in the memory 136 as the image file 135. - In
block 304, the image is displayed to the user of the digital camera 100 via the image display 128 (FIG. 1) during the “instant review” mode. The instant review mode affords the user the opportunity to view the captured image immediately after capture. - In
block 306, the user determines whether they want to view the settings with which the image was captured. If the user wants to view the settings, the settings are displayed to the user on the image display 128, as indicated in block 308. If the user does not want to view the settings, then, in block 312, it is determined whether the user wants the image analysis logic 150 to analyze the image. If the user does not want the image to be analyzed, then, in block 314, the image can be saved or discarded. Alternatively, the image analysis logic 150 can be invoked automatically without user intervention. - In
block 316, the image analysis logic 150 analyzes the data within the image file 135. The data is analyzed to determine various characteristics of the captured image. The following example will use exposure as the characteristic that is analyzed by the image analysis logic 150. However, other characteristics, such as, for example, focus and white balance, can be analyzed. - When analyzing exposure, the
image analysis logic 150 performs a pixel-by-pixel analysis to determine whether the image includes a predominance of either black or white pixels. In this example, the data associated with each pixel in the image file 135 is analyzed to determine whether a pixel is a black pixel or a white pixel. Each pixel is analyzed to determine its corresponding R, G and B values. For example, if the R, G and B values for the pixel 204 are all zeros, the pixel is considered a black pixel. Each pixel in the pixel array 208 is analyzed in this manner to determine the number of black or white pixels in the pixel array 208 for this image file. A determination in block 316 that a substantial portion of the pixels in the array 208 are black indicates that the image is likely underexposed. Conversely, a determination that many of the pixels in the array 208 are white indicates that the image is likely overexposed. Of course, the image may be of an all white or an all black subject, in which case the user may choose to disregard the analysis. - In an alternative embodiment, the data in the
image file 135 can be analyzed in combination with other data available either in the image file 135 or from the settings file 155 in the camera 100. For example, additional data saved in the header 202 of the image file 135 can be analyzed in conjunction with the information from each pixel in the array 208. This information might include, for example, the ISO setting and the aperture setting (f-stop) used to capture the image. These data items can be used in conjunction with the pixel data described above to develop additional information regarding the characteristics of the analyzed image. - Furthermore, the
image analysis logic 150 can also analyze the camera settings used to capture the image and use those settings when analyzing the data in the image file 135 to develop additional data regarding the image file 135. For example, the image analysis logic 150 can access the settings file 155 in the memory 136 of FIG. 1 to determine, for example, whether the flash was enabled, or to determine the position of the lens when the image was captured. In this manner, the image analysis logic 150 can gather a range of information relating to the captured image to perform an analysis on the captured image file 135 to determine whether the captured image meets certain criteria. To illustrate an example, if the image analysis logic 150 determines that the image is underexposed, i.e., the image file contains many black pixels, the image analysis logic 150 can access the settings file 155 to determine whether the flash was active when the image was captured. If the image analysis logic 150 determines that the flash was turned off, the image analysis logic 150 may communicate with the image improvement logic 160 to recommend that the user activate the flash so that a subsequent image may have less likelihood of being underexposed. - In
block 318, it is determined whether the image data analyzed in block 316 represents an acceptable image. This can be an objective determination based on criteria that the user enters into the camera 100 via a user interface 164 (FIG. 1) or can be preset in the camera 100 at the time of manufacture. Alternatively, the determination of whether the image data represents an acceptable image can be a subjective determination based on user input. If the image is determined to be acceptable, then no further calculation is performed. - If, however, in
block 318 the image analysis logic 150 determines that certain conditions under which the image was captured or settings used to capture the image can be changed to improve the image, then, in block 322, the image improvement logic 160 evaluates the settings used to capture the data in the image file 135 to determine whether a condition or setting can be changed to improve the image. In addition, the image improvement logic 160 can also develop recommendations to present to the user of the camera to improve a subsequent image. For example, if the analysis in block 316 suggests that the image was underexposed, the image improvement logic 160 may develop “advice” to be presented to the user. In this example, as will be described below, the image improvement logic 160 may suggest that the user activate the flash to improve a subsequent image. This suggestion may be provided to the user via the image display 128 in conjunction with the user interface 164. - In
block 324, an instant review settings and help screen (to be described below with regard to FIG. 4B) is displayed to the user. The instant review settings and help screen may include, for example, a thumbnail size display of the image, a display of the settings used to capture the image, an evaluation of the image and, if the user desires, suggestions on ways to improve the image. The evaluation of the image may include, for example, a notification that characteristics, such as exposure, focus and color balance, are satisfactory. Suggestions on ways in which to improve the image may be communicated to the user via the image display 128 and may include, for example, changing a condition under which the image was captured, changing a setting with which the image was captured, or a combination of both changing a condition and a setting. - In
block 326, the user determines whether they want to capture another image. If the user does not want to capture another image, the process ends. If, however, in block 326, the user wants to capture another image, then, in block 332, it is determined whether the user wants to manually change a condition or setting for the subsequent image or, if the setting is one that can be changed by the digital camera 100, whether the user wants the digital camera 100 to automatically change the setting. - If, in
block 332, the user decides to manually change the setting, then, in block 334, the user changes the setting and the process returns to block 302, where another image is captured and the process repeats. If, however, in block 332, the user wants the digital camera 100 to automatically change the setting, then, in block 336, the settings used to capture the previous image are changed according to the new settings determined in block 324, and the process returns to block 302 to capture a subsequent image. - FIGS. 4A and 4B are graphical illustrations showing an instant review screen and a help screen provided by the
image analysis logic 150 and the image improvement logic 160. In FIG. 4A, the captured image is displayed to the user via the instant review screen 400 immediately after an image is captured. If the user desires additional information regarding the image, then, in this example, the user actuates an appropriate control on the user interface 164 to display the instant review help screen shown in FIG. 4B. The instant review help screen 410 includes a thumbnail image 402 of the captured image from FIG. 4A, the exposure settings 404, and any other settings 406 used when the image in FIG. 4A was captured. The instant review help screen 410 also includes an improvement message portion 410, referred to as an “advice” portion. The advice portion 410 may include, for example, the settings that were evaluated in block 322 of FIG. 3, and/or advice on ways in which to improve the image. In this manner, the digital camera 100 captures an image, analyzes the image, and provides instructions, via the instant review help screen 410, on ways to improve a subsequent image. - While various embodiments of the invention have been described, it will be apparent to those of ordinary skill in the art that many more embodiments and implementations are possible that are within the scope of this invention. All such modifications and variations are intended to be included herein within the scope of this disclosure and the present invention and protected by the following claims.
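Taken together, FIG. 3 describes a capture, analyze, advise loop: examine the pixel data, consult the settings used for the shot, and surface advice on the help screen. A minimal sketch of that loop, with the rule set reduced to the flash example above and every name and threshold an assumption:

```python
def advise(pixels, settings):
    """Combine a pixel-level exposure check with the capture settings
    (cf. settings file 155) to suggest changes for a subsequent image.
    Only the flash rule described in the text is modeled, and the
    near-black threshold and 50% cutoff are illustrative assumptions."""
    n = len(pixels)
    black = sum(1 for p in pixels if all(c <= 16 for c in p))
    advice = []
    if black / n >= 0.5 and not settings.get("flash_on", False):
        advice.append("Image appears underexposed: try activating the flash.")
    return advice

# A mostly dark frame captured with the flash off draws the advice above;
# the same frame with the flash already on draws none.
dark_shot = [(0, 0, 0)] * 3 + [(200, 200, 200)]
```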
Claims (28)
1. A system for analyzing a digital image, comprising:
an image sensor including a plurality of image capture elements, each of the image capture elements configured to capture image data, the image data captured according to at least one setting;
a memory for storing the image data;
logic for dynamically analyzing the image data to determine at least one characteristic of the image; and
a display for communicating a description of the characteristic.
2. The system of claim 1, wherein the at least one characteristic of the image determines whether the image is in focus.
3. The system of claim 1, wherein the at least one characteristic of the image determines exposure of the image.
4. The system of claim 1, wherein the at least one characteristic of the image determines a white balance of the image.
5. The system of claim 1, further comprising logic for revising the at least one setting to alter the at least one characteristic.
6. The system of claim 5, wherein the at least one setting is automatically revised.
7. The system of claim 5, wherein the at least one setting is manually revised by a user of the system.
8. The system of claim 5, wherein the display communicates the at least one revised setting to a user.
9. The system of claim 1, further comprising logic for providing advice on improving a subsequent image, wherein the advice comprises adjusting the at least one setting, and wherein the advice is communicated via the display.
10. The system of claim 1, further comprising logic for providing advice on improving a subsequent image, wherein the advice comprises adjusting a condition under which the image was captured, and wherein the advice is communicated via the display.
11. A method for analyzing a digital image, comprising:
capturing an image;
storing the image as image data;
dynamically analyzing the image data to determine at least one characteristic of the image; and
communicating a description of the characteristic to a user.
12. The method of claim 11, wherein the at least one characteristic of the image determines whether the image is in focus.
13. The method of claim 11, wherein the at least one characteristic of the image determines exposure of the image.
14. The method of claim 11, wherein the at least one characteristic of the image determines a white balance of the image.
15. The method of claim 11, further comprising revising a setting to alter the at least one characteristic.
16. The method of claim 15, further comprising automatically revising the at least one setting.
17. The method of claim 15, further comprising manually revising the at least one setting.
18. The method of claim 15, further comprising displaying the at least one revised setting to a user.
19. The method of claim 11, further comprising providing advice on improving a subsequent image, wherein the advice comprises adjusting the at least one setting, and wherein the advice is communicated via the display.
20. The method of claim 11, further comprising providing advice on improving a subsequent image, wherein the advice comprises adjusting a condition under which the image was captured, and wherein the advice is communicated via the display.
21. A digital camera, comprising:
an image sensor including a plurality of image capture elements, each of the image capture elements configured to capture image data, the image data captured according to at least one setting;
a memory for storing the image data;
logic for dynamically analyzing the image data to determine at least one characteristic of the image;
logic for revising the at least one setting to alter the characteristic; and
a display for communicating the revised setting to a user.
22. The camera of claim 21, further comprising logic for providing advice on improving a subsequent image, wherein the advice comprises adjusting the at least one setting, and wherein the advice is communicated via the display.
23. The camera of claim 21, further comprising logic for providing advice on improving a subsequent image, wherein the advice comprises adjusting a condition under which the image was captured, and wherein the advice is communicated via the display.
24. A computer readable medium having a program for analyzing a digital image, the program including logic for:
capturing an image;
storing the image as image data;
dynamically analyzing the image data to determine at least one characteristic of the image; and
communicating a description of the characteristic to a user.
25. The program of claim 24, further comprising logic for automatically revising a setting to alter the at least one characteristic.
26. The program of claim 24, further comprising logic for manually revising a setting to alter the at least one characteristic.
27. The program of claim 24, further comprising logic for providing advice on improving a subsequent image, wherein the advice comprises adjusting the at least one setting, and wherein the advice is communicated via the display.
28. The program of claim 24, further comprising logic for providing advice on improving a subsequent image, wherein the advice comprises adjusting a condition under which the image was captured, and wherein the advice is communicated via the display.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/461,600 US20040252217A1 (en) | 2003-06-12 | 2003-06-12 | System and method for analyzing a digital image |
DE102004007649A DE102004007649A1 (en) | 2003-06-12 | 2004-02-17 | System and method for analyzing a digital image |
JP2004173461A JP2005006330A (en) | 2003-06-12 | 2004-06-11 | System for analyzing digital image |
US11/054,291 US20050212955A1 (en) | 2003-06-12 | 2005-02-08 | System and method for analyzing a digital image |
US11/412,155 US20060239674A1 (en) | 2003-06-12 | 2006-04-26 | System and method for analyzing a digital image |
US12/684,505 US8780232B2 (en) | 2003-06-12 | 2010-01-08 | System and method for analyzing a digital image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/461,600 US20040252217A1 (en) | 2003-06-12 | 2003-06-12 | System and method for analyzing a digital image |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/054,291 Continuation-In-Part US20050212955A1 (en) | 2003-06-12 | 2005-02-08 | System and method for analyzing a digital image |
US11/054,291 Continuation US20050212955A1 (en) | 2003-06-12 | 2005-02-08 | System and method for analyzing a digital image |
Publications (1)
Publication Number | Publication Date |
---|---|
US20040252217A1 true US20040252217A1 (en) | 2004-12-16 |
Family
ID=33511283
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/461,600 Abandoned US20040252217A1 (en) | 2003-06-12 | 2003-06-12 | System and method for analyzing a digital image |
US11/054,291 Abandoned US20050212955A1 (en) | 2003-06-12 | 2005-02-08 | System and method for analyzing a digital image |
US12/684,505 Active 2026-10-16 US8780232B2 (en) | 2003-06-12 | 2010-01-08 | System and method for analyzing a digital image |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/054,291 Abandoned US20050212955A1 (en) | 2003-06-12 | 2005-02-08 | System and method for analyzing a digital image |
US12/684,505 Active 2026-10-16 US8780232B2 (en) | 2003-06-12 | 2010-01-08 | System and method for analyzing a digital image |
Country Status (3)
Country | Link |
---|---|
US (3) | US20040252217A1 (en) |
JP (1) | JP2005006330A (en) |
DE (1) | DE102004007649A1 (en) |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040190789A1 (en) * | 2003-03-26 | 2004-09-30 | Microsoft Corporation | Automatic analysis and adjustment of digital images with exposure problems |
US20040258308A1 (en) * | 2003-06-19 | 2004-12-23 | Microsoft Corporation | Automatic analysis and adjustment of digital images upon acquisition |
US20060256403A1 (en) * | 2005-05-13 | 2006-11-16 | Chin-Yuan Lin | Optical scanning module of scanning apparatus |
US20070058064A1 (en) * | 2005-09-14 | 2007-03-15 | Sony Corporation | Image processing apparatus and method, and program therefor |
US20070077053A1 (en) * | 2005-09-30 | 2007-04-05 | Casio Computer Co., Ltd. | Imaging device, imaging method and program |
US20070153111A1 (en) * | 2006-01-05 | 2007-07-05 | Fujifilm Corporation | Imaging device and method for displaying shooting mode |
US20080013802A1 (en) * | 2006-07-14 | 2008-01-17 | Asustek Computer Inc. | Method for controlling function of application software and computer readable recording medium |
US20080056706A1 (en) * | 2006-08-29 | 2008-03-06 | Battles Amy E | Photography advice based on captured image attributes and camera settings |
US20100245596A1 (en) * | 2009-03-27 | 2010-09-30 | Motorola, Inc. | System and method for image selection and capture parameter determination |
US20110013039A1 (en) * | 2009-07-17 | 2011-01-20 | Kazuki Aisaka | Image processing apparatus, image processing method, and program |
US20110115944A1 (en) * | 2006-08-04 | 2011-05-19 | Nikon Corporation | Digital camera |
WO2013184571A1 (en) * | 2012-06-06 | 2013-12-12 | Board Of Regents, The University Of Texas System | Maximizing perceptual quality and naturalness of captured images |
CN104038702A (en) * | 2013-03-06 | 2014-09-10 | 佳能株式会社 | Image capture apparatus and control method thereof |
CN104284083A (en) * | 2013-07-02 | 2015-01-14 | 佳能株式会社 | Imaging apparatus and method for controlling same |
US20150189164A1 (en) * | 2013-12-27 | 2015-07-02 | Samsung Electronics Co Ltd | Electronic apparatus having a photographing function and method of controlling the same |
US20160112652A1 (en) * | 2013-07-04 | 2016-04-21 | Sony Corporation | Method, apparatus and system for image processing |
CN107924579A (en) * | 2015-08-14 | 2018-04-17 | 麦特尔有限公司 | Method of generating personalized 3D head models or 3D body models |
CN108574797A (en) * | 2017-03-13 | 2018-09-25 | 奥林巴斯株式会社 | Information terminal device, information processing system, information processing method and recording medium |
US20190020838A1 (en) * | 2017-07-12 | 2019-01-17 | Olympus Corporation | Image pickup device, image pickup apparatus, recording medium and image pickup method |
US10360833B2 (en) * | 2017-03-10 | 2019-07-23 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method for controlling image display and terminal |
EP3554070A4 (en) * | 2016-12-07 | 2020-06-17 | ZTE Corporation | Photograph-capture method, apparatus, terminal, and storage medium |
US11403739B2 (en) * | 2010-04-12 | 2022-08-02 | Adobe Inc. | Methods and apparatus for retargeting and prioritized interpolation of lens profiles |
Families Citing this family (77)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7630006B2 (en) | 1997-10-09 | 2009-12-08 | Fotonation Ireland Limited | Detecting red eye filter and apparatus using meta-data |
US7738015B2 (en) | 1997-10-09 | 2010-06-15 | Fotonation Vision Limited | Red-eye filter method and apparatus |
US7042505B1 (en) | 1997-10-09 | 2006-05-09 | Fotonation Ireland Ltd. | Red-eye filter method and apparatus |
US7920723B2 (en) | 2005-11-18 | 2011-04-05 | Tessera Technologies Ireland Limited | Two stage detection for photographic eye artifacts |
US7565030B2 (en) | 2003-06-26 | 2009-07-21 | Fotonation Vision Limited | Detecting orientation of digital images using face detection information |
US7574016B2 (en) * | 2003-06-26 | 2009-08-11 | Fotonation Vision Limited | Digital image processing using face detection information |
US8254674B2 (en) | 2004-10-28 | 2012-08-28 | DigitalOptics Corporation Europe Limited | Analyzing partial face regions for red-eye detection in acquired digital images |
US8036458B2 (en) | 2007-11-08 | 2011-10-11 | DigitalOptics Corporation Europe Limited | Detecting redeye defects in digital images |
US8170294B2 (en) | 2006-11-10 | 2012-05-01 | DigitalOptics Corporation Europe Limited | Method of detecting redeye in a digital image |
US7689009B2 (en) | 2005-11-18 | 2010-03-30 | Fotonation Vision Ltd. | Two stage detection for photographic eye artifacts |
US7970182B2 (en) | 2005-11-18 | 2011-06-28 | Tessera Technologies Ireland Limited | Two stage detection for photographic eye artifacts |
US7336821B2 (en) | 2006-02-14 | 2008-02-26 | Fotonation Vision Limited | Automatic detection and correction of non-red eye flash defects |
US7792970B2 (en) | 2005-06-17 | 2010-09-07 | Fotonation Vision Limited | Method for establishing a paired connection between media devices |
US9412007B2 (en) | 2003-08-05 | 2016-08-09 | Fotonation Limited | Partial face detector red-eye filter method and apparatus |
US8520093B2 (en) | 2003-08-05 | 2013-08-27 | DigitalOptics Corporation Europe Limited | Face tracker and partial face tracker for red-eye filter method and apparatus |
JP2005309409A (en) * | 2004-03-25 | 2005-11-04 | Fuji Photo Film Co Ltd | Red-eye preventing device, program and recording medium with recorded program |
US7920180B2 (en) * | 2004-04-06 | 2011-04-05 | Hewlett-Packard Development Company, L.P. | Imaging device with burst zoom mode |
JP5055686B2 (en) * | 2004-05-13 | 2012-10-24 | ソニー株式会社 | Imaging system, imaging apparatus, and imaging method |
US20070005490A1 (en) * | 2004-08-31 | 2007-01-04 | Gopalakrishnan Kumar C | Methods and System for Distributed E-commerce |
US20060047704A1 (en) * | 2004-08-31 | 2006-03-02 | Kumar Chitra Gopalakrishnan | Method and system for providing information services relevant to visual imagery |
US7873911B2 (en) * | 2004-08-31 | 2011-01-18 | Gopalakrishnan Kumar C | Methods for providing information services related to visual imagery |
US8370323B2 (en) | 2004-08-31 | 2013-02-05 | Intel Corporation | Providing information services related to multimodal inputs |
JP2006203477A (en) * | 2005-01-19 | 2006-08-03 | Fuji Photo Film Co Ltd | Photographing apparatus |
JP2006332789A (en) * | 2005-05-23 | 2006-12-07 | Nippon Telegr & Teleph Corp <Ntt> | Video photographing method, apparatus, and program, and storage medium for storing the program |
US7599577B2 (en) | 2005-11-18 | 2009-10-06 | Fotonation Vision Limited | Method and apparatus of correcting hybrid flash artifacts in digital images |
US20070236437A1 (en) * | 2006-03-30 | 2007-10-11 | Hannstar Display Corp. | Dynamic gamma control method for LCD |
US20070229850A1 (en) * | 2006-04-04 | 2007-10-04 | Boxternal Logics, Llc | System and method for three-dimensional image capture |
JP2007295479A (en) * | 2006-04-27 | 2007-11-08 | Sony Corp | Photographing apparatus and photographing method, and program |
US7965875B2 (en) | 2006-06-12 | 2011-06-21 | Tessera Technologies Ireland Limited | Advances in extending the AAM techniques from grayscale to color images |
JP2008091999A (en) * | 2006-09-29 | 2008-04-17 | Olympus Corp | Camera |
JP4218723B2 (en) * | 2006-10-19 | 2009-02-04 | ソニー株式会社 | Image processing apparatus, imaging apparatus, image processing method, and program |
US8055067B2 (en) | 2007-01-18 | 2011-11-08 | DigitalOptics Corporation Europe Limited | Color segmentation |
EP2145288A4 (en) | 2007-03-05 | 2013-09-04 | Digitaloptics Corp Europe Ltd | Red eye false positive filtering using face location and orientation |
US20090016565A1 (en) * | 2007-07-11 | 2009-01-15 | Sriram Kulumani | Image analysis |
US8346002B2 (en) * | 2007-07-20 | 2013-01-01 | Microsoft Corporation | High dynamic range image hallucination |
KR101424717B1 (en) * | 2007-09-13 | 2014-08-01 | 삼성전자주식회사 | Apparatus and method for exposure time setting |
US8503818B2 (en) | 2007-09-25 | 2013-08-06 | DigitalOptics Corporation Europe Limited | Eye defect detection in international standards organization images |
JP5381714B2 (en) * | 2007-11-22 | 2014-01-08 | 日本電気株式会社 | Imaging device, information processing terminal, mobile phone, program, and light emission control method |
US20090172756A1 (en) * | 2007-12-31 | 2009-07-02 | Motorola, Inc. | Lighting analysis and recommender system for video telephony |
US8212864B2 (en) | 2008-01-30 | 2012-07-03 | DigitalOptics Corporation Europe Limited | Methods and apparatuses for using image acquisition data to detect and correct image defects |
US8081254B2 (en) | 2008-08-14 | 2011-12-20 | DigitalOptics Corporation Europe Limited | In-camera based method of detecting defect eye with high accuracy |
JP5254822B2 (en) * | 2009-01-26 | 2013-08-07 | キヤノン株式会社 | Imaging apparatus and control method thereof |
US9077905B2 (en) * | 2009-02-06 | 2015-07-07 | Canon Kabushiki Kaisha | Image capturing apparatus and control method thereof |
US8355059B2 (en) * | 2009-02-06 | 2013-01-15 | Canon Kabushiki Kaisha | Image capturing apparatus and control method thereof |
US8125557B2 (en) * | 2009-02-08 | 2012-02-28 | Mediatek Inc. | Image evaluation method, image capturing method and digital camera thereof for evaluating and capturing images according to composition of the images |
US9258458B2 (en) * | 2009-02-24 | 2016-02-09 | Hewlett-Packard Development Company, L.P. | Displaying an image with an available effect applied |
JP2010251817A (en) * | 2009-04-10 | 2010-11-04 | Sony Corp | Imaging apparatus and method, and program |
US20110069179A1 (en) * | 2009-09-24 | 2011-03-24 | Microsoft Corporation | Network coordinated event capture and image storage |
JP5182312B2 (en) * | 2010-03-23 | 2013-04-17 | 株式会社ニコン | Image processing apparatus and image processing program |
US8810715B1 (en) | 2012-04-20 | 2014-08-19 | Seth A. Rudin | Methods and systems for user guided automatic exposure control |
US20130286234A1 (en) * | 2012-04-25 | 2013-10-31 | Atif Hussain | Method and apparatus for remotely managing imaging |
CN104584526B (en) * | 2012-08-16 | 2016-04-13 | 富士胶片株式会社 | Image file generation apparatus and display unit |
US9215433B2 (en) * | 2014-02-11 | 2015-12-15 | Duelight Llc | Systems and methods for digital photography |
CN105103534B (en) * | 2013-03-27 | 2018-06-22 | 富士胶片株式会社 | Photographic device and calibration method |
US20150042843A1 (en) * | 2013-08-09 | 2015-02-12 | Broadcom Corporation | Systems and methods for improving images |
WO2015028587A2 (en) | 2013-08-31 | 2015-03-05 | Dacuda Ag | User feedback for real-time checking and improving quality of scanned image |
EP3540683A1 (en) | 2013-12-03 | 2019-09-18 | ML Netherlands C.V. | User feedback for real-time checking and improving quality of scanned image |
US10410321B2 (en) | 2019-09-10 | ML Netherlands C.V. | Dynamic updating of a composite image |
EP3748953B1 (en) * | 2014-01-07 | 2024-04-17 | ML Netherlands C.V. | Adaptive camera control for reducing motion blur during real-time image capture |
US9661215B2 (en) | 2014-04-22 | 2017-05-23 | Snapaid Ltd. | System and method for controlling a camera based on processing an image captured by other camera |
US10484561B2 (en) | 2014-05-12 | 2019-11-19 | Ml Netherlands C.V. | Method and apparatus for scanning and printing a 3D object |
US10191986B2 (en) | 2014-08-11 | 2019-01-29 | Microsoft Technology Licensing, Llc | Web resource compatibility with web applications |
US9705637B2 (en) | 2014-08-19 | 2017-07-11 | Microsoft Technology Licensing, Llc | Guard band utilization for wireless data communication |
US9524429B2 (en) | 2014-08-21 | 2016-12-20 | Microsoft Technology Licensing, Llc | Enhanced interpretation of character arrangements |
US9805483B2 (en) * | 2014-08-21 | 2017-10-31 | Microsoft Technology Licensing, Llc | Enhanced recognition of charted data |
US9397723B2 (en) | 2014-08-26 | 2016-07-19 | Microsoft Technology Licensing, Llc | Spread spectrum wireless over non-contiguous channels |
US9723200B2 (en) | 2014-10-15 | 2017-08-01 | Microsoft Technology Licensing, Llc | Camera capture recommendation for applications |
US10542204B2 (en) | 2015-08-05 | 2020-01-21 | Microsoft Technology Licensing, Llc | Methods and apparatuses for capturing multiple digital image frames |
US10397469B1 (en) * | 2015-08-31 | 2019-08-27 | Snap Inc. | Dynamic image-based adjustment of image capture parameters |
US9986169B2 (en) * | 2016-02-04 | 2018-05-29 | KARL STORZ, Imaging, Inc. | Exposure control method and system for an image capture device |
US10691201B2 (en) * | 2016-12-19 | 2020-06-23 | Intel Corporation | Image stream switcher |
CN109302570B (en) * | 2018-10-23 | 2021-01-22 | 深圳市宸电电子有限公司 | Night vision environment detection processing method based on ROI (region of interest) sub-image brightness value |
JP7204456B2 (en) * | 2018-12-04 | 2023-01-16 | キヤノン株式会社 | Strobe device and its control method and program |
US11438465B2 (en) * | 2018-12-21 | 2022-09-06 | Xerox Corporation | Ambient lighting indicating machine status conditions |
US11004254B2 (en) * | 2019-07-25 | 2021-05-11 | Nvidia Corporation | Performance of ray-traced shadow creation within a scene |
US11212460B2 (en) | 2020-02-28 | 2021-12-28 | Hand Held Products, Inc. | Apparatuses, methods, and computer program products for flicker reduction in a multi-sensor environment |
US11533439B2 (en) * | 2020-05-29 | 2022-12-20 | Sanjeev Kumar Singh | Multi capture settings of multi light parameters for automatically capturing multiple exposures in digital camera and method |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6608650B1 (en) * | 1998-12-01 | 2003-08-19 | Flashpoint Technology, Inc. | Interactive assistant process for aiding a user in camera setup and operation |
US6930718B2 (en) * | 2001-07-17 | 2005-08-16 | Eastman Kodak Company | Revised recapture camera and method |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11352541A (en) * | 1998-06-09 | 1999-12-24 | Minolta Co Ltd | Camera |
US6714249B2 (en) | 1998-12-31 | 2004-03-30 | Eastman Kodak Company | Producing panoramic digital images by digital camera systems |
US6539177B2 (en) * | 2001-07-17 | 2003-03-25 | Eastman Kodak Company | Warning message camera and method |
JP4095265B2 (en) * | 2001-09-06 | 2008-06-04 | キヤノン株式会社 | Image processing apparatus, image processing method, computer-readable storage medium, and computer program |
US6970199B2 (en) * | 2001-10-05 | 2005-11-29 | Eastman Kodak Company | Digital camera using exposure information acquired from a scene |
JP3868273B2 (en) * | 2001-11-16 | 2007-01-17 | オリンパス株式会社 | Camera shake detection method |
US7573514B2 (en) * | 2005-02-03 | 2009-08-11 | Eastman Kodak Company | Digital imaging system with digital zoom warning |
2003
- 2003-06-12 US US10/461,600 patent/US20040252217A1/en not_active Abandoned
2004
- 2004-02-17 DE DE102004007649A patent/DE102004007649A1/en not_active Withdrawn
- 2004-06-11 JP JP2004173461A patent/JP2005006330A/en active Pending
2005
- 2005-02-08 US US11/054,291 patent/US20050212955A1/en not_active Abandoned
2010
- 2010-01-08 US US12/684,505 patent/US8780232B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6608650B1 (en) * | 1998-12-01 | 2003-08-19 | Flashpoint Technology, Inc. | Interactive assistant process for aiding a user in camera setup and operation |
US6930718B2 (en) * | 2001-07-17 | 2005-08-16 | Eastman Kodak Company | Revised recapture camera and method |
Cited By (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080137986A1 (en) * | 2003-03-26 | 2008-06-12 | Microsoft Corporation | Automatic analysis and adjustment of digital images with exposure problems |
US20040190789A1 (en) * | 2003-03-26 | 2004-09-30 | Microsoft Corporation | Automatic analysis and adjustment of digital images with exposure problems |
US7359572B2 (en) | 2003-03-26 | 2008-04-15 | Microsoft Corporation | Automatic analysis and adjustment of digital images with exposure problems |
US7646931B2 (en) | 2003-03-26 | 2010-01-12 | Microsoft Corporation | Automatic analysis and adjustment of digital images with exposure problems |
US20040258308A1 (en) * | 2003-06-19 | 2004-12-23 | Microsoft Corporation | Automatic analysis and adjustment of digital images upon acquisition |
US7532234B2 (en) * | 2003-06-19 | 2009-05-12 | Microsoft Corporation | Automatic analysis and adjustment of digital images upon acquisition |
US20060256403A1 (en) * | 2005-05-13 | 2006-11-16 | Chin-Yuan Lin | Optical scanning module of scanning apparatus |
US20070058064A1 (en) * | 2005-09-14 | 2007-03-15 | Sony Corporation | Image processing apparatus and method, and program therefor |
US8289433B2 (en) * | 2005-09-14 | 2012-10-16 | Sony Corporation | Image processing apparatus and method, and program therefor |
US20070077053A1 (en) * | 2005-09-30 | 2007-04-05 | Casio Computer Co., Ltd. | Imaging device, imaging method and program |
US20100272425A1 (en) * | 2005-09-30 | 2010-10-28 | Casio Computer Co., Ltd. | Imaging device, imaging method and program |
US7957637B2 (en) | 2005-09-30 | 2011-06-07 | Casio Computer Co., Ltd. | Imaging device, imaging method and program |
US20070153111A1 (en) * | 2006-01-05 | 2007-07-05 | Fujifilm Corporation | Imaging device and method for displaying shooting mode |
US20080013802A1 (en) * | 2006-07-14 | 2008-01-17 | Asustek Computer Inc. | Method for controlling function of application software and computer readable recording medium |
US20110115944A1 (en) * | 2006-08-04 | 2011-05-19 | Nikon Corporation | Digital camera |
US8970736B2 (en) | 2006-08-04 | 2015-03-03 | Nikon Corporation | Digital camera |
US7668454B2 (en) | 2006-08-29 | 2010-02-23 | Hewlett-Packard Development Company, L.P. | Photography advice based on captured image attributes and camera settings |
US20080056706A1 (en) * | 2006-08-29 | 2008-03-06 | Battles Amy E | Photography advice based on captured image attributes and camera settings |
WO2010111203A1 (en) * | 2009-03-27 | 2010-09-30 | Motorola, Inc. | System and method for image selection and capture parameter determination |
US20100245596A1 (en) * | 2009-03-27 | 2010-09-30 | Motorola, Inc. | System and method for image selection and capture parameter determination |
US8726324B2 (en) | 2009-03-27 | 2014-05-13 | Motorola Mobility Llc | Method for identifying image capture opportunities using a selected expert photo agent |
US20110013039A1 (en) * | 2009-07-17 | 2011-01-20 | Kazuki Aisaka | Image processing apparatus, image processing method, and program |
US8698910B2 (en) * | 2009-07-17 | 2014-04-15 | Sony Corporation | Apparatus, camera, method, and computer-readable storage medium for generating advice for capturing an image |
US11403739B2 (en) * | 2010-04-12 | 2022-08-02 | Adobe Inc. | Methods and apparatus for retargeting and prioritized interpolation of lens profiles |
WO2013184571A1 (en) * | 2012-06-06 | 2013-12-12 | Board Of Regents, The University Of Texas System | Maximizing perceptual quality and naturalness of captured images |
US9277148B2 (en) | 2012-06-06 | 2016-03-01 | Board Of Regents, The University Of Texas System | Maximizing perceptual quality and naturalness of captured images |
US9571742B2 (en) * | 2013-03-06 | 2017-02-14 | Canon Kabushiki Kaisha | Image capture apparatus and control method thereof |
US20140253792A1 (en) * | 2013-03-06 | 2014-09-11 | Canon Kabushiki Kaisha | Image capture apparatus and control method thereof |
CN104038702A (en) * | 2013-03-06 | 2014-09-10 | 佳能株式会社 | Image capture apparatus and control method thereof |
CN104284083A (en) * | 2013-07-02 | 2015-01-14 | 佳能株式会社 | Imaging apparatus and method for controlling same |
US10075654B2 (en) * | 2013-07-04 | 2018-09-11 | Sony Corporation | Method, apparatus and system for image processing |
US20160112652A1 (en) * | 2013-07-04 | 2016-04-21 | Sony Corporation | Method, apparatus and system for image processing |
US9538071B2 (en) * | 2013-12-27 | 2017-01-03 | Samsung Electronics Co., Ltd. | Electronic apparatus having a photographing function and method of controlling the same |
US20150189164A1 (en) * | 2013-12-27 | 2015-07-02 | Samsung Electronics Co Ltd | Electronic apparatus having a photographing function and method of controlling the same |
CN107924579A (en) * | 2015-08-14 | 2018-04-17 | 麦特尔有限公司 | Method of generating personalized 3D head models or 3D body models |
US10796480B2 (en) * | 2015-08-14 | 2020-10-06 | Metail Limited | Methods of generating personalized 3D head models or 3D body models |
EP3554070A4 (en) * | 2016-12-07 | 2020-06-17 | ZTE Corporation | Photograph-capture method, apparatus, terminal, and storage medium |
US10939035B2 (en) | 2016-12-07 | 2021-03-02 | Zte Corporation | Photograph-capture method, apparatus, terminal, and storage medium |
US10360833B2 (en) * | 2017-03-10 | 2019-07-23 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method for controlling image display and terminal |
CN108574797A (en) * | 2017-03-13 | 2018-09-25 | 奥林巴斯株式会社 | Information terminal device, information processing system, information processing method and recording medium |
US10404902B2 (en) * | 2017-03-13 | 2019-09-03 | Olympus Corporation | Information terminal apparatus, information processing system, information processing method and recording medium that records information processing program |
US20190020838A1 (en) * | 2017-07-12 | 2019-01-17 | Olympus Corporation | Image pickup device, image pickup apparatus, recording medium and image pickup method |
US10827141B2 (en) * | 2017-07-12 | 2020-11-03 | Olympus Corporation | Image pickup device and image pickup method having capability of adding additional information indicating the characteristic of a pixel data sequence to the pixel data sequence |
Also Published As
Publication number | Publication date |
---|---|
US20100123805A1 (en) | 2010-05-20 |
JP2005006330A (en) | 2005-01-06 |
US8780232B2 (en) | 2014-07-15 |
US20050212955A1 (en) | 2005-09-29 |
DE102004007649A1 (en) | 2005-01-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20040252217A1 (en) | System and method for analyzing a digital image | |
KR101900097B1 (en) | Image capturing method and image capturing apparatus | |
JP4760742B2 (en) | Digital camera, information display method, and information display control program | |
WO2020057199A1 (en) | Imaging method and device, and electronic device | |
JP5169318B2 (en) | Imaging apparatus and imaging method | |
US20060239674A1 (en) | System and method for analyzing a digital image | |
JP2002232906A (en) | White balance controller | |
JPH03204281A (en) | Image pickup device | |
US9674427B2 (en) | Image processing method and apparatus | |
US20210168273A1 (en) | Control Method and Electronic Device | |
JP2003174619A (en) | Electronic device | |
US8502882B2 (en) | Image pick-up apparatus, white balance setting method and recording medium | |
US8570407B2 (en) | Imaging apparatus, image processing program, image processing apparatus, and image processing method | |
JP4081609B2 (en) | Imaging device | |
US11336802B2 (en) | Imaging apparatus | |
US20120154617A1 (en) | Image capturing device | |
JP6875603B2 (en) | Imaging device, imaging method, and program | |
JP2006270631A (en) | Digital camera | |
JP2008035457A (en) | Electronic camera and image processing program | |
JP2006197081A (en) | Digital camera with dynamic range compressing function | |
JP5663294B2 (en) | Imaging device | |
JP4674384B2 (en) | Imaging apparatus and imaging method | |
KR101446941B1 (en) | Apparatus and method for storing image when the image is speedly captured | |
JP2024060345A (en) | Imaging device | |
JP2004205736A (en) | Camera, printer and printing system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BATTLES, AMY E.;DALTON, DAN L.;MOORE, SHELLEY I.;AND OTHERS;REEL/FRAME:013991/0992;SIGNING DATES FROM 20030604 TO 20030611 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |