US20190215453A1 - Image capturing device - Google Patents
- Publication number
- US20190215453A1
- Authority
- US
- United States
- Prior art keywords
- image
- test
- value
- capturing device
- test image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H04N5/23229—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/629—Protecting access to data via a platform, e.g. using keys or access control rules to features or functions of an application
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/70—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
- G06F21/82—Protecting input, output or interconnection devices
- G06F21/83—Protecting input, output or interconnection devices input devices, e.g. keyboards, mice or controllers thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/97—Determining parameters from multiple pictures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/88—Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
-
- H04N5/232—
-
- H04N5/2351—
-
- H04N9/735—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/03—Indexing scheme relating to G06F21/50, monitoring users, programs or devices to maintain the integrity of platforms
- G06F2221/031—Protect user input by software means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20224—Image subtraction
Definitions
- the present invention relates to an image capturing device, and more particularly, to an image capturing device capable of improving the security of image data.
- the present invention primarily provides an image output method and an image capturing device capable of improving the security of image data, so as to solve the above-mentioned problems.
- the present invention discloses an image capturing device, comprising: an image sensing circuit, for capturing a first test image at a first time point and capturing a second test image at a second time point; a determination circuit, for calculating a first image feature value of the first test image and a second image feature value of the second test image, calculating a difference value between the first image feature value and the second image feature value, and comparing the difference value with a threshold value to generate a determination result, wherein the first image feature value of the first test image is associated with a color value of the first test image and the second image feature value of the second test image is associated with a color value of the second test image; and an output circuit, for outputting the images generated by the image capturing device according to the determination result; wherein the determination result indicates outputting the images generated by the image capturing device when the difference value is greater than the threshold value, and the output circuit accordingly outputs the images generated by the image capturing device.
- the present invention further discloses an image capturing device, comprising: an image sensing circuit, for capturing a first test image at a first time point and capturing a second test image at a second time point; a determination circuit, for calculating a first image feature value of the first test image and a second image feature value of the second test image, calculating a difference value between the first image feature value and the second image feature value, and comparing the difference value with a threshold value to generate a determination result, wherein the first image feature value of the first test image is associated with a brightness value of the first test image and the second image feature value of the second test image is associated with a brightness value of the second test image; and an output circuit, for outputting the images generated by the image capturing device according to the determination result; wherein the determination result indicates outputting the images generated by the image capturing device when the difference value is greater than the threshold value, and the output circuit accordingly outputs the images generated by the image capturing device.
- the present invention further discloses an image capturing device, comprising: an image sensing circuit, for capturing a first test image at a first time point and capturing a second test image at a second time point; a determination circuit, for calculating a first image feature value of the first test image and a second image feature value of the second test image, calculating a difference value between the first image feature value and the second image feature value, and comparing the difference value with a threshold value to generate a determination result, wherein the first image feature value of the first test image is associated with a focal distance of the first test image and the second image feature value of the second test image is associated with a focal distance of the second test image; and an output circuit, for outputting the images generated by the image capturing device according to the determination result; wherein the determination result indicates outputting the images generated by the image capturing device when the difference value is greater than the threshold value, and the output circuit accordingly outputs the images generated by the image capturing device.
- FIG. 1 is a schematic diagram illustrating an image capturing device according to an embodiment of the present invention.
- FIG. 2 is a schematic diagram illustrating a procedure according to an embodiment of the present invention.
- FIGS. 3-5 are schematic diagrams illustrating different procedures according to alternative embodiments of the present invention.
- FIG. 1 is a schematic diagram illustrating an image capturing device 10 according to an embodiment of the present invention.
- the image capturing device 10 can be utilized in all kinds of electronic devices, such as notebooks, tablets, desktops, mobile phones, wearable devices and smart TVs, but is not limited thereto.
- the image capturing device 10 includes an image sensing unit 102 , a determination unit 104 and an output unit 106 .
- the image sensing unit 102 is utilized for capturing images.
- the image sensing unit 102 can be a charge-coupled device (CCD) sensor, a complementary metal oxide semiconductor (CMOS) sensor, or other image sensors.
- the images captured by the image sensing unit 102 can be raw images.
- the image captured by the image sensing unit 102 can be outputted directly to other external devices.
- the image captured by the image sensing unit 102 can also be provided to an image signal processor (ISP) of the image capturing device 10, such that the image signal processor performs the corresponding image processing procedures on the captured image and outputs the result for subsequent applications.
- the image processing procedures executed by the image signal processor can include image format transformation, auto white balance (AWB), auto exposure (AE), auto focus (AF), contrast adjustment, saturation adjustment, noise elimination, interpolation, edge enhancement and the like, but are not limited thereto.
- the image sensing unit 102 can capture images consecutively. For example, the image sensing unit 102 captures test images in a test mode and captures regular images in a normal mode. In the test mode, the image sensing unit 102 captures a first test image at a first time point and a second test image at a second time point.
- the determination unit 104 is used for calculating a first image feature value of a first test image and a second image feature value of a second test image. The determination unit 104 then determines whether to output the images generated by the image capturing device 10 according to the first image feature value of the first test image and the second image feature value of the second test image, and accordingly generates a determination result.
- the output unit 106 outputs the images generated by the image capturing device 10 according to the determination result generated by the determination unit 104 .
- at least one of the determination unit 104 and the output unit 106 can be implemented by an image signal processor.
- the image capturing device 10 of the present invention can determine whether to output the images generated by the image capturing device 10 according to the image feature values of images captured at different time points. That is, the images generated by the image capturing device 10 are outputted only after the determination and authentication performed by the determination unit 104 and the output unit 106, which provides a security control procedure for image data access. As such, the invention can effectively prevent hackers from remotely stealing or peeping at the images generated by the image capturing device 10, thereby improving the security of image data.
- FIG. 2 is a schematic diagram illustrating a procedure 20 according to an embodiment of the present invention.
- the procedure 20 includes the following steps:
- Step S200: Start.
- Step S202: Capture the first test image at the first time point and capture the second test image at the second time point.
- Step S204: Calculate the first image feature value of the first test image and the second image feature value of the second test image.
- Step S206: Determine whether to output the image generated by the image capturing device according to the first image feature value of the first test image and the second image feature value of the second test image.
- Step S208: End.
- the present invention can control whether to output the images based on the images captured by the image sensing unit 102 for improving the security of the image data.
- when the image capturing device 10 receives an image access request or before the image capturing device 10 is ready to output the images, the image capturing device 10 enters a test mode, and the image sensing unit 102 captures a first test image at a first time point and captures a second test image at a second time point.
- the second time point is after the first time point.
- a time interval between the first time point and the second time point can be determined according to system requirements.
- the time interval between the first time point and the second time point is 30 milliseconds (ms).
- the time interval between the first time point and the second time point is 5 seconds (s).
- the determination unit 104 calculates a first image feature value of the first test image and a second image feature value of the second test image.
- the first image feature value of the first test image can be a brightness value, a color value, a focal distance (focal length) or any other image feature values associated with the first test image.
- the second image feature value of the second test image can be a brightness value, a color value, a focal distance or any other image feature values associated with the second test image.
- In Step S206, the determination unit 104 determines whether to output the images generated by the image capturing device 10 according to the first image feature value of the first test image and the second image feature value of the second test image, and accordingly generates a determination result.
- the determination unit 104 can calculate an image difference value between the first image feature value and the second image feature value. For instance, the determination unit 104 can subtract the second image feature value from the first image feature value to calculate a difference value between the first image feature value and the second image feature value. As such, the calculated difference value is the image difference value between the first image feature value and the second image feature value.
- the determination unit 104 can calculate an absolute difference between the first image feature value and the second image feature value, and the absolute difference is the image difference value between the first image feature value and the second image feature value.
- the image difference value represents a variation between the first test image at the first time point and the second test image at the second time point. Then, the determination unit 104 compares the image difference value with a threshold and generates the determination result accordingly. For example, when the image difference value is greater than the threshold, the determination result indicates to output the images generated by the image capturing device 10 . When the image difference value is equal to or smaller than the threshold, the determination result indicates not to output the images generated by the image capturing device 10 .
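- The comparison described above can be sketched in Python. This is a minimal illustration under the assumption that the two image feature values are already available as scalars; the function and parameter names are illustrative and not taken from the patent:

```python
# Minimal sketch of the determination logic: compute an image difference
# value from two feature values and compare it with a threshold.
# All names here are illustrative, not taken from the patent.

def image_difference(first_feature: float, second_feature: float,
                     use_absolute: bool = True) -> float:
    """Difference between two image feature values.

    The text describes both a plain subtraction and an absolute
    difference; use_absolute selects between the two variants.
    """
    diff = first_feature - second_feature
    return abs(diff) if use_absolute else diff


def determination_result(difference_value: float, threshold: float) -> bool:
    """True means: output the images generated by the capturing device."""
    # Greater than the threshold -> output; equal or smaller -> do not output.
    return difference_value > threshold
```

For example, `determination_result(image_difference(3.0, 8.0), 2.0)` evaluates to `True`, while a difference value exactly equal to the threshold yields `False`, matching the rule that output is denied when the difference does not exceed the threshold.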
- the output unit 106 outputs the images generated by the image capturing device 10 according to the determination result determined by the determination unit 104 .
- the output unit 106 outputs the images generated by the image capturing device 10 to other external devices, wherein the images generated by the image capturing device 10 include at least one of: the test images captured by the image sensing unit 102 in the test mode, the images captured by the image sensing unit 102 in the normal mode, and the images processed by the image signal processor.
- the output unit 106 can output the test images captured by the image sensing unit 102 of the image capturing device 10 in the test mode and/or output the images captured by the image sensing unit 102 of the image capturing device 10 in the normal mode. Or, the output unit 106 can output the images processed by the image processor of the image capturing device 10 . Therefore, after the determination and authentication procedure of the procedure 20 , if the determination result indicates to output the images generated by the image capturing device 10 , the image capturing device 10 performs image output procedure normally.
- when the determination result indicates not to output the images generated by the image capturing device 10, the output unit 106 will not perform any output operation. In such a situation, the images generated by the image capturing device 10 will not be outputted or provided to other devices. For example, when the determination result indicates not to output the images, the output unit 106 can output (or control a display device to display) a monochrome image, for instance, a black image or a white image, to indicate that access to the images generated by the image capturing device 10 is not allowed. That is, in this situation, the image capturing device 10 is not allowed to perform the image output procedure normally.
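- The fallback behavior described above can be sketched as follows. This is a minimal illustration that represents frames as nested lists of 8-bit gray values; the representation and all names are assumptions for illustration, not taken from the patent:

```python
# Sketch of the denied-output fallback: instead of the real sensor frame,
# emit a monochrome (black or white) frame of the same size.
# The list-of-lists frame representation is an assumption for illustration.

def monochrome_frame(width: int, height: int, white: bool = False):
    """Return a height x width frame of 8-bit gray values (0 = black, 255 = white)."""
    value = 255 if white else 0
    return [[value] * width for _ in range(height)]


def frame_to_output(frame, allowed: bool):
    """Pass the real frame through only when the determination allows output."""
    if allowed:
        return frame
    # Access denied: substitute a black frame of the same dimensions.
    return monochrome_frame(len(frame[0]), len(frame))
```

A caller would feed the determination result into `allowed`, so a denied request yields only a black frame rather than sensor data.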
- the image capturing device 10 can inform a user whether the images generated by the image capturing device 10 can be outputted (or not) by an alarming unit (not illustrated in the figure).
- the alarming unit can generate alarm signals in various ways (e.g., text, sound, light or vibration) to inform the user.
- FIG. 3 is a schematic diagram illustrating a procedure 30 according to an embodiment of the present invention.
- the procedure 30 includes the following steps:
- Step S300: Start.
- Step S302: Capture the first test image and the second test image.
- Step S304: Calculate the brightness value associated with the first test image and the brightness value associated with the second test image.
- Step S306: Calculate the image difference value between the brightness value associated with the first test image and the brightness value associated with the second test image.
- Step S308: Determine whether the image difference value is greater than the threshold; if yes, perform Step S310; if no, perform Step S312.
- Step S310: Output the image.
- Step S312: Do not output the image.
- In Step S302, when the image capturing device 10 receives an image access request or before the image capturing device 10 is ready to output the images, the image capturing device 10 enters the test mode, and the image sensing unit 102 captures a first test image at a first time point and captures a second test image at a second time point.
- In Step S304, the determination unit 104 calculates the brightness value associated with the first test image and the brightness value associated with the second test image.
- the brightness value of the first test image can be an average brightness value of pixels of at least a portion of the first test image.
- the brightness value of the second test image can be an average brightness value of pixels of at least a portion of the second test image.
- the determination unit 104 respectively detects the pixel brightness value of each pixel of at least a portion of the first test image, and calculates an average of brightness values of all pixels of the at least a portion of the first test image, and the calculated result is the brightness value associated with the first test image.
- the determination unit 104 respectively detects the pixel brightness value of each pixel of at least a portion of the second test image and calculates an average of brightness values of all pixels of at least a portion of the second test image, and the calculated result is the brightness value associated with the second test image.
- a first region of the images captured by the image sensing unit 102 usually corresponds to the head region of the user when the user is using the image capturing device 10.
- the determination unit 104 can respectively detect the pixel brightness value of each pixel of the first region of the first test image and of the first region of the second test image. Then, the determination unit 104 calculates the average pixel brightness value of all pixels of the first region of the first test image to obtain the brightness value associated with the first test image, and calculates the average pixel brightness value of all pixels of the first region of the second test image to obtain the brightness value associated with the second test image.
- the determination unit 104 can respectively detect the brightness value of each pixel of the first test image and the second test image. After that, the determination unit 104 calculates an average of brightness values of all pixels of the first test image to obtain the brightness value associated with the first test image, i.e. taking the calculated average value as the brightness value associated with the first test image. The determination unit 104 calculates an average of brightness values of all pixels of the second test image to obtain the brightness value associated with the second test image, i.e. taking the calculated average value as the brightness value associated with the second test image.
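- The averaging described above can be sketched as follows, assuming grayscale frames stored as nested lists of pixel brightness values. The `region` convention (top, left, bottom, right, with exclusive bottom/right bounds) is an assumption for illustration, not specified by the patent:

```python
# Sketch of the brightness value of Step S304: the mean pixel brightness
# over either the whole frame or a rectangular region (e.g. the head region).

def average_brightness(frame, region=None) -> float:
    """Mean brightness over a (top, left, bottom, right) region, or the whole frame."""
    if region is None:
        rows = frame
    else:
        top, left, bottom, right = region
        rows = [row[left:right] for row in frame[top:bottom]]
    pixels = [p for row in rows for p in row]
    return sum(pixels) / len(pixels)
```

The brightness values of the two test images would then be `average_brightness(first_frame, head_region)` and `average_brightness(second_frame, head_region)`, with `head_region` being whatever region the device designates.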
- In Step S306, the determination unit 104 calculates an image difference value between the brightness value associated with the first test image and the brightness value associated with the second test image. For example, the determination unit 104 can subtract the brightness value associated with the second test image from the brightness value associated with the first test image, so as to calculate a difference between the two brightness values. In this situation, the calculated result is the image difference value between the brightness value associated with the first test image and the brightness value associated with the second test image. The calculated image difference value can show a variation between the first test image and the second test image.
- the determination unit 104 can calculate an absolute difference between the brightness value associated with the first test image and the brightness value associated with the second test image, and the calculated result is the image difference value between the brightness value associated with the first test image and the brightness value associated with the second test image.
- In Step S308, the determination unit 104 can compare the image difference value calculated in Step S306 with a threshold to generate a determination result. For example, when the image difference value is greater than the threshold, this means that the user is in the shooting range of the image capturing device 10 and is using the image capturing device 10. As a result, the determination result indicates to output the images generated by the image capturing device 10.
- the output unit 106 accordingly outputs the images generated by the image capturing device 10 (Step S 310 ).
- the determination result indicates not to output the images generated by the image capturing device 10 .
- the output unit 106 will not output any image generated by the image capturing device 10 , or the output unit 106 controls a display device to display a black image (Step S 312 ).
- the present invention can determine whether to output the images generated by the image capturing device 10 according to the variation of brightness of the images, and effectively improve the security of image data access.
- the relative operations performed by the determination unit 104 in steps S 304 , S 306 and S 308 can also be implemented by an auto exposure module of an image signal processor.
- FIG. 4 is a schematic diagram of a procedure 40 according to another embodiment of the present invention.
- the procedure 40 includes the following steps:
- Step S400: Start.
- Step S402: Capture the first test image and the second test image.
- Step S404: Calculate the color value associated with the first test image and the color value associated with the second test image.
- Step S406: Calculate the image difference value between the color value associated with the first test image and the color value associated with the second test image.
- Step S408: Determine whether the image difference value is greater than the threshold; if yes, perform Step S410; if no, perform Step S412.
- Step S410: Output the image.
- Step S412: Do not output the image.
- In Step S402, when the image capturing device 10 receives an image access request or before the image capturing device 10 is ready to output the images, the image capturing device 10 enters the test mode, and the image sensing unit 102 captures a first test image at a first time point and captures a second test image at a second time point.
- the determination unit 104 calculates the color value associated with the first test image and the color value associated with the second test image.
- the color value of the first test image can be an average of color values of pixels of at least a portion of the first test image.
- the color value of the second test image can be an average of color values of pixels of at least a portion of the second test image.
- the color value can be at least one color component value of the three primary colors, or a component value of other colors.
- the determination unit 104 respectively detects the color value of each pixel of at least a portion of the first test image, and calculates the average of color values of all pixels of the at least a portion of the first test image, so that the calculated result is the color value associated with the first test image.
- the determination unit 104 respectively detects the pixel color value of each pixel of at least a portion of the second test image, and calculates the average of color values of all pixels of the at least a portion of the second test image, so that the calculated result is the color value associated with the second test image.
- the determination unit 104 can respectively detect blue component value of each pixel of the first test image and the second test image. After that, the determination unit 104 calculates the average of blue component values of all pixels of the first test image to obtain the color value associated with the first test image, i.e. taking the calculated average value as the color value associated with the first test image. The determination unit 104 calculates the average of blue component values of all pixels of the second test image to obtain the color value associated with the second test image, i.e. taking the calculated average value as the color value associated with the second test image.
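- Taking the blue component as the calculation basis, the computation above can be sketched as follows. Frames are assumed to be nested lists of (R, G, B) tuples, an illustrative representation not specified by the patent:

```python
# Sketch of the color value of Step S404, using the blue component
# of each pixel as the calculation basis.

def average_blue(frame) -> float:
    """Mean blue component over all pixels of an (R, G, B) frame."""
    blues = [pixel[2] for row in frame for pixel in row]
    return sum(blues) / len(blues)


def color_difference(first_frame, second_frame) -> float:
    """Absolute difference of the two frames' average blue components."""
    return abs(average_blue(first_frame) - average_blue(second_frame))
```

Comparing `color_difference(first_frame, second_frame)` against the threshold then reproduces the decision of Step S408 for the blue-component variant.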
- In Step S406, the determination unit 104 calculates an image difference value between the color value associated with the first test image and the color value associated with the second test image.
- the determination unit 104 can calculate a difference or an absolute difference between the color value associated with the first test image and the color value associated with the second test image.
- the calculated result is the image difference value between the color value associated with the first test image and the color value associated with the second test image.
- the calculated image difference value can show a color variation between the first test image and the second test image.
- In Step S408, the determination unit 104 can compare the image difference value calculated in Step S406 with a threshold to generate a determination result.
- For example, assume that the blue component value is taken as the calculation basis in Step S404. When the image difference value is greater than the threshold, this means that a blue or similar-colored object is approaching the shooting range of the image capturing device 10. As a result, the determination result indicates to output the images generated by the image capturing device 10.
- the output unit 106 accordingly outputs the images generated by the image capturing device 10 (Step S 410 ).
- when the image difference value is smaller than or equal to the threshold, this means that no blue or similar-colored object is approaching the shooting range of the image capturing device 10.
- the determination result indicates not to output the images generated by the image capturing device 10 .
- the output unit 106 will not output any image generated by the image capturing device 10, or the output unit 106 controls a display device to display a black image (Step S412). That is, the present invention can use color as a key for image output control. For example, when the blue component value is used as the calculation basis and the user holds a blue card in front of the lens of the image sensing unit 102, the output unit 106 will output the images captured by the image capturing device 10 after performing the steps of the procedure 40. In other words, the present invention can determine whether to output the images generated by the image capturing device 10 according to the variation of the color value of the images, and effectively improve the security of image data access.
- the relative operations performed by the determination unit 104 in Steps S404, S406 and S408 can also be implemented by an auto white balance module of an image signal processor.
- FIG. 5 is a schematic diagram of a procedure 50 according to another embodiment of the present invention.
- the procedure 50 includes the following steps:
- Step S500: Start.
- Step S502: Capture the first test image and the second test image.
- Step S504: Calculate the focal distance associated with the first test image and the focal distance associated with the second test image.
- Step S506: Calculate the image difference value between the focal distance associated with the first test image and the focal distance associated with the second test image.
- Step S508: Determine whether the image difference value is greater than the threshold; if yes, perform Step S510; if no, perform Step S512.
- Step S510: Output the image.
- Step S512: Do not output the image.
- In Step S502, when the image capturing device 10 receives an image access request or before the image capturing device 10 is ready to output the images, the image capturing device 10 enters the test mode, and the image sensing unit 102 captures a first test image at a first time point and captures a second test image at a second time point.
- the determination unit 104 calculates the focal distance associated with the first test image and the focal distance associated with the second test image.
- the focal distance of the first test image can be the focal distance of at least a portion of the first test image.
- the focal distance of the second test image can be the focal distance of at least a portion of the second test image.
- a first region of the images captured by the image sensing unit 102 is taken as a focal distance evaluation region.
- the determination unit 104 respectively calculates a first focal distance corresponding to the first region of the first test image and a second focal distance corresponding to the first region of the second test image.
- the whole region of the images captured by the image sensing unit 102 can be taken as the focal distance evaluation region.
- the determination unit 104 can respectively calculate the first focal distance corresponding to the first test image and the second focal distance corresponding to the second test image.
- In Step S506, the determination unit 104 calculates an image difference value between the first focal distance associated with the first test image and the second focal distance associated with the second test image.
- the determination unit 104 can calculate a difference between the first focal distance associated with the first test image and the second focal distance associated with the second test image.
- the calculated result is the image difference value between the first focal distance associated with the first test image and the second focal distance associated with the second test image.
- the calculated image difference value can show a variation of an object's movement in the first test image and the second test image.
- the determination unit 104 can calculate the absolute difference between the first focal distance associated with the first test image and the second focal distance associated with the second test image, and the calculated result is the image difference value between the focal distance associated with the first test image and the focal distance associated with the second test image.
- In Step S508, the determination unit 104 can compare the image difference value calculated in Step S506 with a threshold to generate a determination result. For example, when the image difference value is greater than the threshold, this means that a user is moving, so the determination result indicates to output the images generated by the image capturing device 10, and the output unit 106 accordingly outputs the images generated by the image capturing device 10 (Step S510).
- On the other hand, when the image difference value is smaller than or equal to the threshold, this means that no moving user is detected, so the determination result indicates not to output the images generated by the image capturing device 10, and the output unit 106 will not output any image generated by the image capturing device 10 (Step S512).
- In this way, the invention can automatically recognize whether an object is moving in the images, and allow image output upon detecting the movement variation of the object. That is, the present invention can determine whether to output the images generated by the image capturing device 10 according to the focal distance variation of the images, and effectively improve the security of image data access.
- In an embodiment, the relative operations performed by the determination unit 104 in Steps S504, S506 and S508 can also be implemented by an auto focus module of an image signal processor.
- Conventionally, the captured images are directly transmitted to the display device or the external device for preview or storage immediately after capture. In contrast, the image capturing device 10 of the present invention can determine whether to output the images captured by the image capturing device 10 according to the image feature values of the captured images at different time points. That is, the images generated by the image capturing device 10 of the present invention can be outputted only after determination and authentication. As such, the present invention can effectively prevent hackers from stealing or peeping at the images generated by the image capturing device 10 via remote control, thus enhancing the security of image data access.
Abstract
An image capturing device is provided. The image capturing device includes an image sensing circuit, a determination circuit and an output circuit. The image sensing circuit is utilized for capturing a first test image at a first time point and capturing a second test image at a second time point. The determination circuit is utilized for calculating a first image feature value associated with a color value of the first test image and a second image feature value associated with a color value of the second test image, calculating a difference value between the first image feature value and the second image feature value and comparing the difference value with a threshold value to generate a determination result. The output circuit is utilized for outputting the images generated by the image capturing device according to the determination result.
Description
- This is a continuation application of U.S. patent application Ser. No. 15/603,430, filed on May 23, 2017 and entitled “IMAGE OUTPUT METHOD AND IMAGE CAPTURING DEVICE”, which is incorporated herein by reference in its entirety.
- The present invention relates to an image capturing device, and more particularly, to an image capturing device capable of improving the security of image data.
- With the development of technology and the progress of industry, digital cameras have been widely utilized in all kinds of electronic devices, such as laptops, tablets, desktops, mobile phones, wearable devices, smart TVs and so on. Since these electronic devices are able to connect to the Internet, users utilize the digital cameras mounted on them to take photos and share the photos with other electronic devices via the Internet. However, electronic devices that can connect to the Internet are vulnerable to invasion by hackers via remote control, who may manipulate the cameras and peep at or steal the images taken by the cameras, thus threatening the privacy and security of the users. Therefore, there is a need for improvement.
- Therefore, the present invention primarily provides an image output method and an image capturing device, which are capable of improving the security of image data, to solve the abovementioned problems.
- The present invention discloses an image capturing device, comprising: an image sensing circuit, for capturing a first test image at a first time point and capturing a second test image at a second time point; a determination circuit, for calculating a first image feature value of the first test image and a second image feature value of the second test image, calculating a difference value between the first image feature value and the second image feature value, and comparing the difference value with a threshold value to generate a determination result, wherein the first image feature value of the first test image is associated with a color value of the first test image and the second image feature value of the second test image is associated with a color value of the second test image; and an output circuit, for outputting the images generated by the image capturing device according to the determination result; wherein the determination result indicates outputting the images generated by the image capturing device when the difference value is greater than the threshold value, and the output circuit accordingly outputs the images generated by the image capturing device.
- The present invention further discloses an image capturing device, comprising: an image sensing circuit, for capturing a first test image at a first time point and capturing a second test image at a second time point; a determination circuit, for calculating a first image feature value of the first test image and a second image feature value of the second test image, calculating a difference value between the first image feature value and the second image feature value, and comparing the difference value with a threshold value to generate a determination result, wherein the first image feature value of the first test image is associated with a brightness value of the first test image and the second image feature value of the second test image is associated with a brightness value of the second test image; and an output circuit, for outputting the images generated by the image capturing device according to the determination result; wherein the determination result indicates outputting the images generated by the image capturing device when the difference value is greater than the threshold value, and the output circuit accordingly outputs the images generated by the image capturing device.
- The present invention further discloses an image capturing device, comprising: an image sensing circuit, for capturing a first test image at a first time point and capturing a second test image at a second time point; a determination circuit, for calculating a first image feature value of the first test image and a second image feature value of the second test image, calculating a difference value between the first image feature value and the second image feature value, and comparing the difference value with a threshold value to generate a determination result, wherein the first image feature value of the first test image is associated with a focal distance of the first test image and the second image feature value of the second test image is associated with a focal distance of the second test image; and an output circuit, for outputting the images generated by the image capturing device according to the determination result; wherein the determination result indicates outputting the images generated by the image capturing device when the difference value is greater than the threshold value, and the output circuit accordingly outputs the images generated by the image capturing device.
- These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
-
FIG. 1 is a schematic diagram illustrating an image capturing device according to an embodiment of the present invention. -
FIG. 2 is a schematic diagram illustrating a procedure according to an embodiment of the present invention. -
FIGS. 3-5 are schematic diagrams illustrating different procedures according to alternative embodiments of the present invention. - Please refer to
FIG. 1, which is a schematic diagram illustrating an image capturing device 10 according to an embodiment of the present invention. The image capturing device 10 can be utilized in all kinds of electronic devices, such as notebooks, tablets, desktops, mobile phones, wearable devices, smart TVs, but not limited thereto. The image capturing device 10 includes an image sensing unit 102, a determination unit 104 and an output unit 106. - The
image sensing unit 102 is utilized for capturing images. The image sensing unit 102 can be a charge-coupled device (CCD) sensor, a complementary metal oxide semiconductor (CMOS) sensor, or another image sensor. The images captured by the image sensing unit 102 can be raw images, and can be outputted directly to other external devices. The images captured by the image sensing unit 102 can also be provided to an image signal processor (ISP) of the image capturing device 10, such that the image signal processor performs the corresponding image processing procedure on the captured images and outputs the results for subsequent applications. For example, the image processing procedures executed by the image signal processor can be image format transformation, auto white balance (AWB), auto exposure (AE), auto focus (AF), contrast adjustment, saturation adjustment, noise elimination, interpolation, edge enhancement and the like, but are not limited thereto. The image sensing unit 102 can capture images consecutively. For example, the image sensing unit 102 captures test images in a test mode and captures relative images in a normal mode. In the test mode, the image sensing unit 102 captures a first test image at a first time point and a second test image at a second time point. - The
determination unit 104 is used for calculating a first image feature value of a first test image and a second image feature value of a second test image. The determination unit 104 then determines whether to output the images generated by the image capturing device 10 according to the first image feature value of the first test image and the second image feature value of the second test image, and accordingly generates a determination result. The output unit 106 outputs the images generated by the image capturing device 10 according to the determination result generated by the determination unit 104. In an embodiment, at least one of the determination unit 104 and the output unit 106 can be implemented by an image signal processor. - In brief, the
image capturing device 10 of the present invention can determine whether to output the images generated by the image capturing device 10 according to image feature values of the captured images at different time points. That is, the images generated by the image capturing device 10 can be outputted only after the determination and authentication of the determination unit 104 and the output unit 106, so as to provide a security control procedure for image data access. As such, the invention can effectively prevent hackers from stealing or peeping at the images generated by the image capturing device 10 via remote control, thereby improving the security of image data. - For an illustration of the operations of the
image capturing device 10, please refer toFIG. 2 , which is a schematic diagram illustrating aprocedure 20 according to an embodiment of the present invention. Theprocedure 20 includes the following steps: - Step S200: Start.
- Step S202: Capture the first test image at the first time point and capture the second test image at the second time point.
- Step S204: Calculate the first image feature value of the first test image and the second image feature value of the second test image.
- Step S206: Determine whether to output the image generated by the image capturing device according to the first image feature value of the first test image and the second image feature value of the second test image.
- Step S208: End.
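The decision at the heart of the procedure 20 can be sketched as follows. This is an illustrative sketch only, not the claimed implementation: plain numbers stand in for whichever image feature value (brightness, color or focal distance) is used, and the threshold is an arbitrary example:

```python
def image_difference(first_feature, second_feature):
    """Steps S204/S206: the image difference value, here taken as the
    absolute difference between the two image feature values."""
    return abs(first_feature - second_feature)

def determination_result(first_feature, second_feature, threshold):
    """Step S206: indicate whether the output unit may release the images."""
    return image_difference(first_feature, second_feature) > threshold

assert image_difference(100.0, 120.0) == 20.0
assert determination_result(100.0, 120.0, threshold=5.0) is True   # output allowed
assert determination_result(100.0, 101.0, threshold=5.0) is False  # output blocked
```

The description also mentions a signed difference (subtracting the second value from the first); the absolute difference is used here because the subsequent comparison only cares about the magnitude of the variation.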
- According to the
procedure 20, the present invention can control whether to output the images based on the images captured by theimage sensing unit 102 for improving the security of the image data. When theimage capturing device 10 receives image access requests or before theimage capturing device 10 is ready to output the images, first, in Step S202, theimage capturing device 10 enters a test mode, and theimage sensing unit 102 captures a first test image at a first time point and captures a second test image at a second time point. For example, the second time point is after the first time point. A time interval between the first time point and the second time point can be determined according to system requirements. For example, the time interval between the first time point and the second time point is 30 microseconds (ms). For another example, the time interval between the first time point and the second time point is 5 seconds(s). - In Step S204, the
determination unit 104 calculates a first image feature value of the first test image and a second image feature value of the second test image. For example, the first image feature value of the first test image can be a brightness value, a color value, a focal distance (focal length) or any other image feature values associated with the first test image. The second image feature value of the second test image can be a brightness value, a color value, a focal distance or any other image feature values associated with the second test image. - In Step S206, the
determination unit 104 determines whether to output the images generated by the image capturing device 10 according to the first image feature value of the first test image and the second image feature value of the second test image, and accordingly generates a determination result. The determination unit 104 can calculate an image difference value between the first image feature value and the second image feature value. For instance, the determination unit 104 can subtract the second image feature value from the first image feature value to calculate a difference value between the first image feature value and the second image feature value. As such, the calculated difference value is the image difference value between the first image feature value and the second image feature value. For another example, the determination unit 104 can calculate an absolute difference between the first image feature value and the second image feature value, and the absolute difference is the image difference value between the first image feature value and the second image feature value. As such, the image difference value represents a variation between the first test image at the first time point and the second test image at the second time point. Then, the determination unit 104 compares the image difference value with a threshold and generates the determination result accordingly. For example, when the image difference value is greater than the threshold, the determination result indicates to output the images generated by the image capturing device 10. When the image difference value is equal to or smaller than the threshold, the determination result indicates not to output the images generated by the image capturing device 10. - Furthermore, the
output unit 106 outputs the images generated by the image capturing device 10 according to the determination result determined by the determination unit 104. In an embodiment, when the determination result indicates to output the images generated by the image capturing device 10, the output unit 106 outputs the images generated by the image capturing device 10 to other external devices, wherein the images generated by the image capturing device 10 include at least one of the test images captured by the image sensing unit 102 in the test mode, the images captured by the image sensing unit 102 in the normal mode, and the images processed by the image signal processor. In other words, the output unit 106 can output the test images captured by the image sensing unit 102 of the image capturing device 10 in the test mode and/or output the images captured by the image sensing unit 102 of the image capturing device 10 in the normal mode. Or, the output unit 106 can output the images processed by the image signal processor of the image capturing device 10. Therefore, after the determination and authentication procedure of the procedure 20, if the determination result indicates to output the images generated by the image capturing device 10, the image capturing device 10 performs the image output procedure normally. - In an embodiment, when the determination result indicates not to output the images generated by the
image capturing device 10, theoutput unit 106 will not perform any operation. In such a situation, the images generated by theimage capturing device 10 will not be outputted and provided to other devices. For example, when the determination result indicates not to output the images generated by theimage capturing device 10, theoutput unit 106 can output (or control a display device to display) a monochrome image, for instance, a black image or a white image, to indicate not allowable to access the images generated by theimage capturing device 10. That is, in this situation, if the determination result indicates not to output the images generated by theimage capturing device 10, theimage capturing device 10 will not be allowed to perform the image output procedure normally. - On the other hand, after the
determination unit 104 generates the corresponding determination result, the image capturing device 10 can inform a user, via an alarming unit (not illustrated in the figure), whether the images generated by the image capturing device 10 can be outputted. For example, the alarming unit can generate alarm signals in various ways, e.g., text, sound, light or vibration, to inform the user. - Please refer to
FIG. 3, which is a schematic diagram illustrating a procedure 30 according to an embodiment of the present invention. The procedure 30 includes the following steps: - Step S300: Start.
- Step S302: Capture the first test image and the second test image.
- Step S304: Calculate the brightness value associated with the first test image and the brightness value associated with the second test image.
- Step S306: Calculate the image difference value between the brightness value associated with the first test image and the brightness value associated with the second test image.
- Step S308: Determine whether the image difference value is greater than the threshold; if yes, perform Step S310; if no, perform Step S312.
- Step S310: Output the image.
- Step S312: Not output the image.
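The steps above can be sketched in a few lines. This is an illustrative sketch only, under simplifying assumptions that are not part of the disclosure: tiny frames given as nested lists of per-pixel brightness values, and a made-up threshold:

```python
def average_brightness(image):
    """Average brightness over a frame (or a cropped evaluation region)
    given as rows of pixel brightness values (Step S304)."""
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)

def brightness_gate(first_test_image, second_test_image, threshold):
    """Steps S306-S308: compare the image difference value between the
    average brightness of the two test images against the threshold."""
    diff = abs(average_brightness(first_test_image)
               - average_brightness(second_test_image))
    return diff > threshold

static_1 = [[100, 100], [100, 100]]
static_2 = [[101, 100], [100, 100]]   # near-identical: no user in range
moved_2  = [[160, 150], [140, 150]]   # brightness shifted: user is moving
assert brightness_gate(static_1, static_2, threshold=5.0) is False  # Step S312
assert brightness_gate(static_1, moved_2, threshold=5.0) is True    # Step S310
```

Restricting `average_brightness` to a cropped region (such as the head region mentioned below) only changes which rows and columns are passed in; the gate itself is unchanged.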
- According to the
procedure 30, in Step S302, when theimage capturing device 10 receives image access requests or before theimage capturing device 10 is ready to output the images, theimage capturing device 10 enters a test mode, and theimage sensing unit 102 captures a first test image at a first time point and captures a second test image at a second time point. - In Step S304, the
determination unit 104 calculates the brightness value associated with the first test image and the brightness value associated with the second test image. The brightness value of the first test image can be an average brightness value of the pixels of at least a portion of the first test image, and the brightness value of the second test image can be an average brightness value of the pixels of at least a portion of the second test image. In detail, the determination unit 104 respectively detects the pixel brightness value of each pixel of at least a portion of the first test image, and calculates an average of the brightness values of all pixels of the at least a portion of the first test image; the calculated result is the brightness value associated with the first test image. Similarly, for the second test image, the determination unit 104 respectively detects the pixel brightness value of each pixel of at least a portion of the second test image and calculates an average of the brightness values of all pixels of the at least a portion of the second test image; the calculated result is the brightness value associated with the second test image. - For example, assume that a first region of the images captured by the
image sensing unit 102 usually corresponds to a head region of the user when using the image capturing device 10. The determination unit 104 can respectively detect the pixel brightness value of each pixel of the first region of the first test image and of the second test image. Then, the determination unit 104 calculates the average pixel brightness value of all pixels of the first region of the first test image to obtain the brightness value associated with the first test image, and calculates the average pixel brightness value of all pixels of the first region of the second test image to obtain the brightness value associated with the second test image. - For example, calculation of the whole region of the image is also available. The
determination unit 104 can respectively detect the brightness value of each pixel of the first test image and of the second test image. After that, the determination unit 104 calculates an average of the brightness values of all pixels of the first test image to obtain the brightness value associated with the first test image, i.e. taking the calculated average value as the brightness value associated with the first test image. The determination unit 104 likewise calculates an average of the brightness values of all pixels of the second test image to obtain the brightness value associated with the second test image, i.e. taking the calculated average value as the brightness value associated with the second test image. - In Step S306, the
determination unit 104 calculates an image difference value between the brightness value associated with the first test image and the brightness value associated with the second test image. For example, the determination unit 104 can subtract the brightness value associated with the second test image from the brightness value associated with the first test image, so as to calculate a difference between the two brightness values; the calculated result is the image difference value between the brightness value associated with the first test image and the brightness value associated with the second test image. The calculated image difference value can show a variation between the first test image and the second test image. For another example, the determination unit 104 can calculate an absolute difference between the brightness value associated with the first test image and the brightness value associated with the second test image, and the calculated result is the image difference value. - When an object remains stationary in the images, neighboring frames usually do not exhibit any significant variation, so that the image difference value is usually small. When the object is moving in the images, the image difference value will be larger. Moreover, the user's body usually moves slightly and is not completely stationary when the user is in the shooting range of the
image capturing device 10. Therefore, in Step S308, thedetermination unit 104 can compare the image difference value calculated in Step S306 with a threshold to generate a determination result. For example, when the image difference value is greater than the threshold, this means that the user is in the shooting range of theimage capturing device 10 and the user is using theimage capturing device 10. As a result, the determination result indicates to output the images generated by theimage capturing device 10. Theoutput unit 106 accordingly outputs the images generated by the image capturing device 10 (Step S310). On the other hand, when the image difference value is smaller than or equal to the threshold, this means that the user is not in the shooting range of theimage capturing device 10. As such, the determination result indicates not to output the images generated by theimage capturing device 10. Theoutput unit 106 will not output any image generated by theimage capturing device 10, or theoutput unit 106 controls a display device to display a black image (Step S312). In other words, the present invention can determine whether to output the images generated by theimage capturing device 10 according to the variation of brightness of the images, and effectively improve the security of image data access. - Besides, in an embodiment, the relative operations performed by the
determination unit 104 in Steps S304, S306 and S308 can also be implemented by an auto exposure module of an image signal processor. - Please refer to
FIG. 4, which is a schematic diagram of a procedure 40 according to another embodiment of the present invention. The procedure 40 includes the following steps: - Step S400: Start.
- Step S402: Capture the first test image and the second test image.
- Step S404: Calculate the color value associated with the first test image and the color value associated with the second test image.
- Step S406: Calculate the image difference value between the color value associated with the first test image and the color value associated with the second test image.
- Step S408: Determine whether the image difference value is greater than the threshold; if yes, perform Step S410; if no, perform Step S412.
- Step S410: Output the image.
- Step S412: Not output the image.
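The steps above can be sketched with the blue component as the calculation basis. This is an illustrative sketch only, under assumptions not stated in the disclosure: frames are nested lists of (R, G, B) pixel tuples, and the threshold is a made-up example value:

```python
def average_blue(image_rgb):
    """Step S404: average blue component over a frame of (R, G, B) tuples."""
    blues = [b for row in image_rgb for (_r, _g, b) in row]
    return sum(blues) / len(blues)

def color_key_gate(first_test_image, second_test_image, threshold):
    """Steps S406-S408: variation of the blue component as an output key."""
    diff = abs(average_blue(first_test_image) - average_blue(second_test_image))
    return diff > threshold

no_key_1  = [[(10, 20, 30), (10, 20, 50)]]
no_key_2  = [[(10, 20, 35), (10, 20, 55)]]    # no blue object approaching
blue_card = [[(10, 20, 200), (10, 20, 240)]]  # a blue card held before the lens
assert color_key_gate(no_key_1, no_key_2, threshold=50.0) is False  # Step S412
assert color_key_gate(no_key_1, blue_card, threshold=50.0) is True  # Step S410
```

Any other color component could serve as the key by changing which element of the pixel tuple is averaged.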
- According to the
procedure 40, in Step S402, when theimage capturing device 10 receives image access requests or before theimage capturing device 10 is ready to output the images. Theimage capturing device 10 enters a test mode, and theimage sensing unit 102 captures a first test image at a first time point and captures a second test image at a second time point. - In Step S404, the
determination unit 104 calculates the color value associated with the first test image and the color value associated with the second test image. The color value of the first test image can be an average of the color values of the pixels of at least a portion of the first test image, and the color value of the second test image can be an average of the color values of the pixels of at least a portion of the second test image. The color value can be at least one color component value of the three primary colors, or a component value of other colors. In detail, the determination unit 104 respectively detects the color value of each pixel of at least a portion of the first test image, and calculates the average of the color values of all pixels of the at least a portion of the first test image; the calculated result is the color value associated with the first test image. Similarly, for the second test image, the determination unit 104 respectively detects the color value of each pixel of at least a portion of the second test image, and calculates the average of the color values of all pixels of the at least a portion of the second test image; the calculated result is the color value associated with the second test image. - For example, calculation of the blue component value of the whole region of the image is also available. The
determination unit 104 can respectively detect the blue component value of each pixel of the first test image and of the second test image. After that, the determination unit 104 calculates the average of the blue component values of all pixels of the first test image to obtain the color value associated with the first test image, i.e. taking the calculated average value as the color value associated with the first test image. The determination unit 104 likewise calculates the average of the blue component values of all pixels of the second test image to obtain the color value associated with the second test image, i.e. taking the calculated average value as the color value associated with the second test image. - In Step S406, the
determination unit 104 calculates an image difference value between the color value associated with the first test image and the color value associated with the second test image. For example, the determination unit 104 can calculate a difference or an absolute difference between the color value associated with the first test image and the color value associated with the second test image; the calculated result is the image difference value between the color value associated with the first test image and the color value associated with the second test image. The calculated image difference value can show a color variation between the first test image and the second test image. - Therefore, in Step S408, the
determination unit 104 can compare the image difference value calculated in Step S406 with a threshold to generate a determination result. For example, assume that the blue component value is taken as the calculation basis in Step S404. When the image difference value is greater than the threshold, this means that a blue or similarly colored object is approaching the shooting range of the image capturing device 10. As such, the determination result indicates to output the images generated by the image capturing device 10, and the output unit 106 accordingly outputs the images generated by the image capturing device 10 (Step S410). On the other hand, when the image difference value is smaller than or equal to the threshold, this means that no blue or similarly colored object is approaching the shooting range of the image capturing device 10. As such, the determination result indicates not to output the images generated by the image capturing device 10; the output unit 106 will not output any image generated by the image capturing device 10, or the output unit 106 controls a display device to display a black image (Step S412). That is, the present invention can use color as a key for image output control. For example, when the blue component value is used as the calculation basis and the user holds a blue card in front of the lens of the image sensing unit 102, the output unit 106 will output the images captured by the image capturing device 10 after performing the steps of the procedure 40. In other words, the present invention can determine whether to output the images generated by the image capturing device 10 according to the variation of the color value of the images, and effectively improve the security of image data access. - In an embodiment, relative operations performed by the
determination unit 104 in Steps S404, S406 and S408 can also be implemented by an auto white balance module of an image signal processor. - Please refer to
FIG. 5, which is a schematic diagram of a procedure 50 according to another embodiment of the present invention. The procedure 50 includes the following steps: - Step S500: Start.
- Step S502: Capture the first test image and the second test image.
- Step S504: Calculate the focal distance associated with the first test image and the focal distance associated with the second test image.
- Step S506: Calculate the image difference value between the focal distance associated with the first test image and the focal distance associated with the second test image.
- Step S508: Determine whether the image difference value is greater than the threshold; if yes, perform Step S510; if no, perform Step S512.
- Step S510: Output the image.
- Step S512: Do not output the image.
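The flow of procedure 50 above can be expressed as a short sketch. This is an illustrative reconstruction, not code from the patent: the callables `capture_test_image`, `get_focal_distance`, and `capture_image` are hypothetical stand-ins for the image sensing unit, the focal-distance calculation, and normal image capture.

```python
def procedure_50(capture_test_image, get_focal_distance, capture_image, threshold):
    """Illustrative sketch of procedure 50: gate image output on a
    focal-distance change between two test images."""
    first = capture_test_image()     # Step S502: first test image (first time point)
    second = capture_test_image()    # Step S502: second test image (second time point)
    d1 = get_focal_distance(first)   # Step S504: focal distance of first test image
    d2 = get_focal_distance(second)  # Step S504: focal distance of second test image
    diff = abs(d1 - d2)              # Step S506: image difference value
    if diff > threshold:             # Step S508: compare with the threshold
        return capture_image()       # Step S510: output the image
    return None                      # Step S512: do not output any image

# Example with stubbed hardware: a user moving toward the lens changes
# the measured focal distance, so image output is unlocked.
frames = iter([{"focus": 120}, {"focus": 80}])
result = procedure_50(
    capture_test_image=lambda: next(frames),
    get_focal_distance=lambda img: img["focus"],
    capture_image=lambda: "image data",
    threshold=10,
)
print(result)  # "image data", since |120 - 80| = 40 > 10
```

The same skeleton covers the color-value and brightness variants of the other embodiments by swapping the feature callable.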
- According to the
procedure 50, in Step S502, when the image capturing device 10 receives an image access request, or before the image capturing device 10 is ready to output images, the image capturing device 10 enters a test mode, and the image sensing unit 102 captures a first test image at a first time point and a second test image at a second time point. - In Step S504, the
determination unit 104 calculates the focal distance associated with the first test image and the focal distance associated with the second test image. Each focal distance can be the focal distance of at least a portion of the corresponding test image. For example, assume that a first region of the images captured by the image sensing unit 102 is taken as a focal distance evaluation region. The determination unit 104 respectively calculates a first focal distance corresponding to the first region of the first test image and a second focal distance corresponding to the first region of the second test image. Alternatively, the whole region of the images captured by the image sensing unit 102 can be taken as the focal distance evaluation region, in which case the determination unit 104 respectively calculates the first focal distance corresponding to the entire first test image and the second focal distance corresponding to the entire second test image. - In Step S506, the
determination unit 104 calculates an image difference value between the first focal distance associated with the first test image and the second focal distance associated with the second test image. For example, the determination unit 104 can calculate a difference, or an absolute difference, between the first focal distance and the second focal distance; the calculated result is the image difference value, which reflects the movement of an object between the first test image and the second test image. - In Step S508, the
determination unit 104 can compare the image difference value calculated in Step S506 with a threshold to generate a determination result. For example, when the image difference value is greater than the threshold, a user is moving within the shooting range, so the determination result indicates to output the images generated by the image capturing device 10, and the output unit 106 accordingly outputs them (Step S510). On the other hand, when the image difference value is smaller than or equal to the threshold, no user is present, so the determination result indicates not to output the images, and the output unit 106 will not output any image generated by the image capturing device 10 (Step S512). In other words, by the above-mentioned method, the invention can automatically recognize whether an object is moving in the image, and allow image output when such movement is detected. That is, the present invention can determine whether to output the images generated by the image capturing device 10 according to the focal distance variation of the images, and thereby effectively improve the security of image data access. - Besides, in an embodiment, relative operations performed by the
determination unit 104 in Steps S504, S506 and S508 can also be implemented by an auto focus module of an image signal processor. - In summary, in the traditional method, captured images are directly transmitted to the display device or an external device for preview or storage. In comparison, the
image capturing device 10 of the present invention can determine whether to output the images it captures according to the image feature values of images captured at different time points. That is, the images generated by the image capturing device 10 of the present invention are outputted only after determination and authentication. As such, the present invention can effectively prevent hackers from stealing or peeping at the images generated by the image capturing device 10 via remote control, and thus enhances the security of image data access. - Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
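As a concrete illustration of the color-value gating described for the procedure 40, the following sketch uses the mean blue component of an RGB image as the color value; the pixel layout and the threshold of 50 are illustrative assumptions, not values specified by the patent.

```python
def mean_blue(image):
    """Illustrative color value: average blue component of an image
    given as rows of (r, g, b) tuples."""
    pixels = [px for row in image for px in row]
    return sum(px[2] for px in pixels) / len(pixels)

def should_output(first_test, second_test, threshold):
    """Analogue of Steps S406/S408: compare the color-value difference
    (the image difference value) against a threshold."""
    diff = abs(mean_blue(first_test) - mean_blue(second_test))
    return diff > threshold  # True -> output images; False -> suppress / black image

# A blue card held in front of the lens raises the mean blue component
# between the two test images, unlocking image output:
no_card = [[(0, 0, 10)] * 4 for _ in range(4)]
blue_card = [[(0, 0, 200)] * 4 for _ in range(4)]
print(should_output(no_card, blue_card, threshold=50))  # True
print(should_output(no_card, no_card, threshold=50))    # False
```

A real implementation would read this statistic from the auto white balance block of an image signal processor rather than iterating over pixels in software.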
Claims (4)
1. An image capturing device, comprising:
an image sensing circuit, for capturing a first test image at a first time point and capturing a second test image at a second time point;
a determination circuit, for calculating a first image feature value of the first test image and a second image feature value of the second test image, calculating a difference value between the first image feature value and the second image feature value, and comparing the difference value with a threshold value to generate a determination result, wherein the first image feature value of the first test image is associated with a color value of the first test image and the second image feature value of the second test image is associated with a color value of the second test image; and
an output circuit, for outputting the images generated by the image capturing device according to the determination result;
wherein the determination result indicates outputting the images generated by the image capturing device when the difference value is greater than the threshold value, and the output circuit accordingly outputs the images generated by the image capturing device.
2. The image capturing device of claim 1, wherein the difference value is an absolute difference between the first image feature value and the second image feature value.
3. An image capturing device, comprising:
an image sensing circuit, for capturing a first test image at a first time point and capturing a second test image at a second time point;
a determination circuit, for calculating a first image feature value of the first test image and a second image feature value of the second test image, calculating a difference value between the first image feature value and the second image feature value, and comparing the difference value with a threshold value to generate a determination result, wherein the first image feature value of the first test image is associated with a brightness value of the first test image and the second image feature value of the second test image is associated with a brightness value of the second test image; and
an output circuit, for outputting the images generated by the image capturing device according to the determination result;
wherein the determination result indicates outputting the images generated by the image capturing device when the difference value is greater than the threshold value, and the output circuit accordingly outputs the images generated by the image capturing device.
4. An image capturing device, comprising:
an image sensing circuit, for capturing a first test image at a first time point and capturing a second test image at a second time point;
a determination circuit, for calculating a first image feature value of the first test image and a second image feature value of the second test image, calculating a difference value between the first image feature value and the second image feature value, and comparing the difference value with a threshold value to generate a determination result, wherein the first image feature value of the first test image is associated with a focal distance of the first test image and the second image feature value of the second test image is associated with a focal distance of the second test image; and
an output circuit, for outputting the images generated by the image capturing device according to the determination result;
wherein the determination result indicates outputting the images generated by the image capturing device when the difference value is greater than the threshold value, and the output circuit accordingly outputs the images generated by the image capturing device.
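Claims 1, 3 and 4 differ only in the image feature compared (a color value, a brightness value, or a focal distance). A hedged sketch of the claim-3 brightness variant follows; the Rec. 601 luma formula is an illustrative choice of brightness measure, not one mandated by the claims.

```python
def brightness(image):
    """Illustrative brightness value: mean Rec. 601 luma of an image
    given as rows of (r, g, b) tuples."""
    pixels = [px for row in image for px in row]
    return sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels) / len(pixels)

def determination_result(first_test, second_test, threshold):
    """Determination circuit of claim 3: True means the output circuit
    should output the images (difference value exceeds the threshold)."""
    return abs(brightness(first_test) - brightness(second_test)) > threshold

dark = [[(0, 0, 0)] * 2 for _ in range(2)]
bright = [[(255, 255, 255)] * 2 for _ in range(2)]
print(determination_result(dark, bright, threshold=50))  # True
print(determination_result(dark, dark, threshold=50))    # False
```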
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/354,193 US20190215453A1 (en) | 2017-03-03 | 2019-03-15 | Image capturing device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW106106926 | 2017-03-03 | ||
TW106106926A TWI653885B (en) | 2017-03-03 | 2017-03-03 | Image output method and image capturing device |
US15/603,430 US10306138B2 (en) | 2017-03-03 | 2017-05-23 | Image output method and image capturing device |
US16/354,193 US20190215453A1 (en) | 2017-03-03 | 2019-03-15 | Image capturing device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/603,430 Continuation US10306138B2 (en) | 2017-03-03 | 2017-05-23 | Image output method and image capturing device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190215453A1 (en) | 2019-07-11 |
Family
ID=59686722
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/603,430 Active US10306138B2 (en) | 2017-03-03 | 2017-05-23 | Image output method and image capturing device |
US16/354,193 Abandoned US20190215453A1 (en) | 2017-03-03 | 2019-03-15 | Image capturing device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/603,430 Active US10306138B2 (en) | 2017-03-03 | 2017-05-23 | Image output method and image capturing device |
Country Status (3)
Country | Link |
---|---|
US (2) | US10306138B2 (en) |
EP (1) | EP3370411B1 (en) |
TW (1) | TWI653885B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI653885B (en) * | 2017-03-03 | 2019-03-11 | Acer Incorporated | Image output method and image capturing device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080278588A1 (en) * | 2007-05-11 | 2008-11-13 | Michael Philip Greenberg | Devices, Systems, and Methods Regarding Camera Imaging |
US20150097993A1 (en) * | 2013-10-09 | 2015-04-09 | Canon Kabushiki Kaisha | Image processing apparatus, image pickup apparatus, image processing method, and non-transitory computer-readable storage medium |
US20160026900A1 (en) * | 2013-04-26 | 2016-01-28 | Olympus Corporation | Image processing device, information storage device, and image processing method |
US20160110629A1 (en) * | 2013-05-28 | 2016-04-21 | Bank Of America Corporation | Image overlay for duplicate image detection |
US20170024619A1 (en) * | 2015-07-22 | 2017-01-26 | Xerox Corporation | Video-based system and method for parking occupancy detection |
US20180255239A1 (en) * | 2017-03-03 | 2018-09-06 | Acer Incorporated | Image output method and image capturing device |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6930703B1 (en) | 2000-04-29 | 2005-08-16 | Hewlett-Packard Development Company, L.P. | Method and apparatus for automatically capturing a plurality of images during a pan |
TWI220480B (en) * | 2001-04-11 | 2004-08-21 | Cyberlink Corp | System and method for generating synchronous playback of the slides and the corresponding audio/video information during the presentation |
JP4394356B2 (en) * | 2003-02-07 | 2010-01-06 | Hoya Corporation | Electronic endoscope device |
US8187174B2 (en) | 2007-01-22 | 2012-05-29 | Capso Vision, Inc. | Detection of when a capsule camera enters into or goes out of a human body and associated operations |
JP2008311775A (en) | 2007-06-12 | 2008-12-25 | Toshiba Corp | Information processor and camera input video image control method |
US20090171148A1 (en) * | 2007-12-27 | 2009-07-02 | Shih-Chieh Lu | Capsule endoscope system having a sensing and data discriminating device and discrimination method thereof |
JP4735693B2 (en) * | 2008-09-22 | 2011-07-27 | Sony Corporation | Image processing apparatus, imaging apparatus, image processing method, and program |
TWI334393B (en) | 2008-10-07 | 2010-12-11 | Ind Tech Res Inst | Image-based vehicle maneuvering assistant method and system |
US8422315B2 (en) * | 2010-07-06 | 2013-04-16 | Winbond Electronics Corp. | Memory chips and memory devices using the same |
KR20120047598A (en) * | 2010-11-04 | 2012-05-14 | 삼성전자주식회사 | Digital photographing apparatus and control method thereof |
TWI445396B (en) | 2010-11-22 | 2014-07-11 | Altek Corp | Electronic apparatus, image capturing apparatus and method thereof |
WO2014082566A1 (en) * | 2012-11-27 | 2014-06-05 | Novozymes A/S | Milling process |
TW201643777A (en) | 2015-06-12 | 2016-12-16 | 宏碁股份有限公司 | Image capture method and image capture apparatus |
SG10202108705SA (en) * | 2015-07-03 | 2021-09-29 | Applied Materials Inc | Process kit having tall deposition ring and deposition ring clamp |
2017
- 2017-03-03 TW TW106106926A patent/TWI653885B/en active
- 2017-05-23 US US15/603,430 patent/US10306138B2/en active Active
- 2017-07-18 EP EP17181840.4A patent/EP3370411B1/en active Active
2019
- 2019-03-15 US US16/354,193 patent/US20190215453A1/en not_active Abandoned
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080278588A1 (en) * | 2007-05-11 | 2008-11-13 | Michael Philip Greenberg | Devices, Systems, and Methods Regarding Camera Imaging |
US20160026900A1 (en) * | 2013-04-26 | 2016-01-28 | Olympus Corporation | Image processing device, information storage device, and image processing method |
US20160110629A1 (en) * | 2013-05-28 | 2016-04-21 | Bank Of America Corporation | Image overlay for duplicate image detection |
US20150097993A1 (en) * | 2013-10-09 | 2015-04-09 | Canon Kabushiki Kaisha | Image processing apparatus, image pickup apparatus, image processing method, and non-transitory computer-readable storage medium |
US20170024619A1 (en) * | 2015-07-22 | 2017-01-26 | Xerox Corporation | Video-based system and method for parking occupancy detection |
US20180255239A1 (en) * | 2017-03-03 | 2018-09-06 | Acer Incorporated | Image output method and image capturing device |
Also Published As
Publication number | Publication date |
---|---|
TW201834441A (en) | 2018-09-16 |
EP3370411A1 (en) | 2018-09-05 |
US20180255239A1 (en) | 2018-09-06 |
US10306138B2 (en) | 2019-05-28 |
TWI653885B (en) | 2019-03-11 |
EP3370411B1 (en) | 2021-09-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8149280B2 (en) | Face detection image processing device, camera device, image processing method, and program | |
US9357127B2 (en) | System for auto-HDR capture decision making | |
US20130230209A1 (en) | Image processing device, image processing method and computer-readable medium | |
US8009202B2 (en) | Device and method for capturing an image of a human face | |
US10013632B2 (en) | Object tracking apparatus, control method therefor and storage medium | |
US10764550B2 (en) | Image processing apparatus, image processing method, and storage medium | |
WO2023040725A1 (en) | White balance processing method and electronic device | |
TWI451184B (en) | Focus adjusting method and image capture device thereof | |
US9215459B2 (en) | Image processing apparatus, image capturing apparatus, and program | |
JP2008017259A (en) | Image recognition camera | |
JP6876068B2 (en) | Saturated pixel detection methods, detection devices, color correction devices, electronic devices and storage media | |
JP6904788B2 (en) | Image processing equipment, image processing methods, and programs | |
US20140068514A1 (en) | Display controlling apparatus and display controlling method | |
US20190215453A1 (en) | Image capturing device | |
US10304171B2 (en) | Projection control device, projection control method, and non-transitory storage medium | |
US7835552B2 (en) | Image capturing apparatus and face area extraction method | |
JP5441669B2 (en) | Image processing apparatus and control method thereof | |
JP6450107B2 (en) | Image processing apparatus, image processing method, program, and storage medium | |
JP2016173777A (en) | Image processing apparatus | |
CN108632503B (en) | Image output method and image acquisition equipment | |
US11012631B2 (en) | Image capturing and processing device, electronic instrument, image capturing and processing method, and recording medium | |
US10742862B2 (en) | Information processing device, information processing method, and information processing system | |
US10306153B2 (en) | Imaging apparatus, image sensor, and image processor | |
US20100110226A1 (en) | Method and apparatus for detecting type of back light of an image | |
JP6469451B2 (en) | Image processing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |