WO2019144262A1 - Method and apparatus for smudge detection, and mobile electronic device


Info

Publication number: WO2019144262A1
Authority: WIPO (PCT)
Application number: PCT/CN2018/073758
Other languages: English (en)
Inventor: Peng Guo
Original assignees: Sony Mobile Communications Inc.; Sony Mobile Communications (China) Co., Ltd
Application filed by Sony Mobile Communications Inc. and Sony Mobile Communications (China) Co., Ltd
Priority to PCT/CN2018/073758
Publication of WO2019144262A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H04N23/811Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation by dust removal, e.g. from surfaces of the image sensor or processing of the image signal output by the electronic image sensor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/18Signals indicating condition of a camera member or suitability of light

Description

  • Embodiments of this disclosure relate to the field of mobile device technologies, and in particular to a smudge detection method and apparatus and a mobile electronic device.
  • In mobile electronic devices such as smart mobile phones and tablet PCs, a camera has become a standard feature.
  • A user may capture a photo or video of interest at any time, and in order to improve users' experiences, dual-camera configurations are more and more applied to mobile electronic devices.
  • When the lens of a camera is dirty, a usual method is to use a soft cloth to clean it. However, cleaning the lens each time the camera is used (especially when cleaning is not necessary) will result in wear on the lens (affecting the lifetime of the lens) and also lower the user experience.
  • In an existing solution, when the user starts up a camera application for the first time, the application pops up a prompt box prompting the user to clean the lens; thereafter, the prompt box may appear randomly with the same prompt.
  • Because the prompt box pops up occasionally, the user may be disturbed when the lens is not actually dirty, and may miss the best time for capturing an image.
  • In view of this, embodiments of this disclosure provide a smudge detection method and apparatus and a mobile electronic device, in which, by detecting whether a lens of a camera is actually dirty, a user may be prompted only when a lens is actually dirty, thereby preventing the user from being bothered by prompts when a lens is not actually dirty, and improving users' experiences.
  • A smudge detection method applicable to a mobile electronic device, the mobile electronic device including a first camera and a second camera; wherein the method includes: acquiring, at the same time, preview frames captured by the first camera and the second camera respectively; calculating a difference between the preview frames; and determining that a lens of at least one of the first camera and the second camera has a smudge when the calculated difference is greater than a preset threshold.
  • the calculating of the difference between the preview frames captured by the first camera and the second camera includes:
  • the method before calculating the difference, further includes: preprocessing the preview frame captured by the first camera and the preview frame captured by the second camera.
  • the preprocessing of the preview frame captured by the first camera and the preview frame captured by the second camera includes:
  • the calculating of the difference between the preview frames captured by the first camera and the second camera includes:
  • the method further includes:
  • the smudge detection method is periodically executed according to a set time interval.
  • the set time interval is preset or set according to a demand of a user.
  • the first camera and the second camera are both rear cameras of the mobile electronic device.
  • a processor included in a mobile electronic device, the mobile electronic device additionally including a first camera and a second camera, wherein the processor is configured to:
  • a mobile electronic device including a first camera, a second camera, a non-transitory computer readable medium (memory), and a processor, wherein the memory stores instructions executable by the processor, and the processor is configured to perform the following steps when executing the instructions:
  • the calculating of the difference between the preview frames captured by the first camera and the second camera includes:
  • the processor is configured to perform the following step when executing the instructions:
  • the preprocessing of the preview frame captured by the first camera and the preview frame captured by the second camera includes:
  • the calculating of the difference between the preview frames captured by the first camera and the second camera includes:
  • the processor is configured to perform the following step when executing the instructions:
  • the processor is configured to periodically execute the instructions according to a set time interval.
  • the set time interval is preset or set according to a demand of a user.
  • the first camera and the second camera are both rear cameras of the mobile electronic device.
  • Advantages of the embodiments of this disclosure exist in that, by detecting whether a lens of a camera is actually dirty, a user may be prompted only when a lens is actually dirty, thereby preventing the user from being bothered by prompts when a lens of a camera is not actually dirty. Furthermore, the detection process in this disclosure is performed periodically in the background, the time interval for detection may be changed according to a setting of the user, or the detection function may be disabled according to a setting of the user; hence, the user may be notified that the lens is dirty before starting up the camera application to capture, thereby avoiding missing the best time for capturing.
  • FIG. 1 is a schematic diagram of one implementation of a smudge detection method of an embodiment of this disclosure.
  • FIG. 2 is a schematic diagram of another implementation of a smudge detection method of the embodiment of this disclosure.
  • FIG. 3 is a schematic diagram of a smudge detection apparatus of an embodiment of this disclosure.
  • FIG. 4 is a schematic diagram of a mobile electronic device of an embodiment of this disclosure.
  • The terms "first" and "second", etc., are used to differentiate different elements with respect to names, and do not indicate spatial arrangement or temporal order of these elements; these elements should not be limited by these terms.
  • The term "and/or" includes any one and all combinations of one or more relevantly listed terms.
  • The terms "contain", "include" and "have" refer to the existence of stated features, elements, components, or assemblies, but do not exclude the existence or addition of one or more other features, elements, components, or assemblies.
  • The singular forms "a" and "the", etc., include plural forms, and should be understood in a broad sense as "a kind of" or "a type of" rather than being limited to the meaning of "one"; and the term "the" should be understood as including both singular and plural forms, unless specified otherwise.
  • The term "according to" should be understood as "at least partially according to", and the term "based on" should be understood as "at least partially based on", unless specified otherwise.
  • The interchangeable terms "electronic equipment" and "electronic device" may include portable radio communication equipment or mobile electronic devices.
  • Portable radio communication equipment, hereinafter referred to as a "mobile radio terminal", "portable electronic device", or "portable communication device", includes apparatuses such as mobile telephones, pagers, communicators, electronic organizers, personal digital assistants (PDAs), smart phones, media players, tablet PCs, portable communication devices, portable game devices, or the like.
  • The embodiments below are described by taking as an example a portable electronic device in the form of a mobile telephone (also referred to as a "mobile phone").
  • the disclosure is not limited to the context of a mobile telephone and may relate to any type of appropriate mobile electronic device, examples of such type of mobile electronic device including a smart mobile phone, a tablet PC, a portable digital camera, a media player, a portable game device, a PDA, a computer, or the like.
  • The term "unit" may have a conventional meaning in the fields of electronics, electrical equipment and/or electronic equipment, such as electrical and/or electronic circuits, equipment, modules, processors, memories, logic, solid and/or discrete devices, and computer programs or instructions for executing respective tasks, processes, operations, output and/or display functions, as described below.
  • a unit may be a processor and a set of instructions stored in a memory, or another processor and one or more instructions executed by equipment responsive to control of the processor.
  • An embodiment of this disclosure provides a smudge detection method performed by a mobile electronic device, the mobile electronic device including at least two cameras, which are referred to as a first camera and a second camera, for the sake of convenience of description.
  • Both the first camera and the second camera may, for example, use Bayer sensors to sense and record images, or may use Mono (monochrome) sensors, or one camera may use a Bayer sensor and the other a Mono sensor; this disclosure is not limited thereto.
  • FIG. 1 is a schematic diagram of one implementation of the smudge detection method of the embodiment of this disclosure. As shown in FIG. 1, the method includes:
  • step 101: acquiring, at the same time, preview frames captured by the first camera and the second camera respectively;
  • step 102: calculating a difference between the preview frames captured by the first camera and the second camera; and
  • step 103: determining that a lens of at least one of the first camera and the second camera has a smudge when the calculated difference is greater than a preset threshold.
  • Here, a threshold value is used to determine when the difference between the two preview frames is due to a dirty lens. That is, if the difference between the preview frames captured by the two cameras is greater than a preset threshold, it indicates that a lens of at least one of the cameras is dirty, which obscures the captured preview frame and causes a larger difference between the preview frames.
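The three steps above can be sketched as follows. This is a minimal illustration only: the function name, the binarization around each frame's mean brightness, and the 10% threshold are assumptions chosen for the example, and the frames are assumed to be already aligned gray-scale arrays; none of these choices is fixed by this disclosure.

```python
import numpy as np

def detect_smudge(frame_a: np.ndarray, frame_b: np.ndarray,
                  threshold: float = 0.10) -> bool:
    """Return True if the two simultaneous preview frames differ by more
    than `threshold` (steps 101-103).

    `frame_a` and `frame_b` are gray-scale preview frames captured at the
    same time by the two cameras, assumed to be already aligned.
    """
    # Binarize each frame around its own mean brightness.
    bin_a = frame_a > frame_a.mean()
    bin_b = frame_b > frame_b.mean()
    # Difference = fraction of pixels whose binary values disagree.
    diff = float(np.mean(bin_a != bin_b))
    return diff > threshold
```

With identical frames the fraction of disagreeing pixels is zero, so no smudge is reported; obscuring a region of one frame (as a dirty lens would) raises the difference above the threshold.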
  • Here, "at the same time" means within a certain number of milliseconds (not precisely at the same nanosecond), for example, within 1 ms or 10 ms.
  • In step 101 of this embodiment, preview frames, rather than real photos, are captured by the cameras. Hence, the user will not be bothered by erroneous notices to clean the lens when the lens is not dirty. Furthermore, the manner of capturing the above preview frames is not limited in this embodiment, and any existing method for capturing preview frames may be applicable to step 101.
  • The difference between preview frames and real photos lies mainly in resolution and image format. For example, the image format of preview frames may be YUV, while the image format of real photos may be JPEG; furthermore, the resolution of preview frames is lower than that of real photos: preview frames may be 720p or up to 1080p, while real photos may be 3840×2160.
  • In one implementation, the preview frame captured by the first camera and the preview frame captured by the second camera may first be converted into binary images respectively, and then the binary image of the preview frame captured by the second camera is subtracted from the binary image of the preview frame captured by the first camera, so as to obtain the difference between the two preview frames.
  • Converting the preview frames into binary images facilitates comparison of the two preview frames; by subtracting one binary image from the other, a difference between the two binary images may be obtained.
  • The subtraction here may be performed pixel by pixel, that is, subtracting the value of a pixel in one binary image from the value of the corresponding pixel in the other binary image, so as to obtain a difference for each pixel. An absolute value is taken of each pixel difference, and from these absolute values the difference between the two binary images is obtained.
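The pixel-by-pixel subtraction with absolute values can be sketched as below; the function name and the use of the mean absolute difference (rather than, say, the sum) are illustrative assumptions.

```python
import numpy as np

def binary_difference(bin_a: np.ndarray, bin_b: np.ndarray) -> float:
    """Pixel-by-pixel difference between two binary images.

    Each image holds 0/1 values; corresponding pixels are subtracted, the
    absolute value of each pixel difference is taken, and the mean of these
    absolute values gives the overall difference between the images.
    """
    a = bin_a.astype(np.int16)  # signed type avoids underflow when subtracting
    b = bin_b.astype(np.int16)
    return float(np.mean(np.abs(a - b)))
```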
  • Alternatively, the difference between the two preview frames may be calculated as the Mean Squared Error (MSE), or by means of the Structural Similarity Index (SSIM).
  • Alternatively, histograms of the two preview frames may be calculated first, and then the difference (also referred to as the distance) between the two histograms may be calculated, which serves as the difference between the two preview frames.
  • In general, any suitable method for measuring a difference between two images may be used.
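Two of the alternative measures mentioned above, MSE and a histogram distance, can be sketched as follows; the bin count and the L1 distance between normalized histograms are illustrative assumptions, not choices made by this disclosure.

```python
import numpy as np

def mse(img_a: np.ndarray, img_b: np.ndarray) -> float:
    """Mean Squared Error between two equally sized gray-scale images."""
    d = img_a.astype(np.float64) - img_b.astype(np.float64)
    return float(np.mean(d * d))

def histogram_distance(img_a: np.ndarray, img_b: np.ndarray,
                       bins: int = 32) -> float:
    """L1 distance between the normalized gray-level histograms of two frames."""
    h_a, _ = np.histogram(img_a, bins=bins, range=(0, 256))
    h_b, _ = np.histogram(img_b, bins=bins, range=(0, 256))
    h_a = h_a / h_a.sum()
    h_b = h_b / h_b.sum()
    return float(np.abs(h_a - h_b).sum())
```

For identical frames both measures are zero; for a fully dark frame versus a fully bright one the histogram distance reaches its maximum of 2.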
  • Before the difference is calculated, the two captured preview frames may be preprocessed.
  • For example, the preprocessing may be adjusting the two preview frames such that they have an identical perspective.
  • An identical perspective does not mean that every pixel of the two preview frames is exactly the same when the lenses are clean; there may still be some differences between the perspectives of the two preview frames, but the differences are small or within a predefined range.
  • Because the two cameras may be located at different positions of the mobile electronic device, for example arranged side by side at the back of the device (referred to as rear cameras), either horizontally or vertically, the capturing perspectives of the two cameras are not completely identical, with the result that the preview frames they respectively capture are also not completely identical.
  • After the adjustment, the preview frames may have an identical perspective, thereby improving the accuracy of the comparison result.
  • a particular manner of adjustment is not limited in this embodiment.
  • After the preprocessing, the difference between the two preview frames may be calculated in a manner identical to that described above, which shall not be described herein any further.
  • For another example, the preprocessing may be performing gray-scale conversion on the two preview frames.
  • As the first camera and/or the second camera may use Bayer sensors, the preview frames captured by them may be color images.
  • In that case, the color preview frames may first be converted into gray-scale images, and then the difference between the two gray-scale images may be calculated in a manner identical to that described above, which shall not be described herein any further.
  • The preprocessing is described above by way of two implementations. However, this embodiment is not limited thereto, and other manners of preprocessing in which images are preprocessed so as to facilitate subsequent comparison, such as light alignment of the two preview frames, histogram alignment, and perspective rectification alignment, etc., are all applicable to this embodiment. Furthermore, various manners of preprocessing may be combined, so as to improve the accuracy of the subsequent comparison of the two images.
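A minimal preprocessing sketch is given below. The gray-scale conversion uses the common ITU-R BT.601 luma weights as an illustrative assumption; the perspective alignment step is only indicated by a comment, since it depends on the specific camera geometry of the device and is not specified here.

```python
import numpy as np

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB preview frame to gray scale
    (ITU-R BT.601 luma weights, an illustrative choice)."""
    weights = np.array([0.299, 0.587, 0.114])
    return np.rint(rgb.astype(np.float64) @ weights).astype(np.uint8)

def preprocess(frame_a: np.ndarray, frame_b: np.ndarray):
    """Bring both preview frames to a comparable form before differencing."""
    if frame_a.ndim == 3:
        frame_a = to_grayscale(frame_a)
    if frame_b.ndim == 3:
        frame_b = to_grayscale(frame_b)
    # Perspective alignment between the two cameras would be applied here;
    # it is device-specific and therefore omitted from this sketch.
    return frame_a, frame_b
```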
  • After the preprocessing, the preprocessed preview frames may be respectively converted into binary images, and then one of them is subtracted from the other, so as to obtain the difference between the two preview frames.
  • The threshold used for comparison with the difference between the preview frames is not limited; it may be determined according to an empirical value, or may be determined by machine learning. For example, in a case where the above preprocessing is not performed, as the error is relatively large, the threshold may be set to a relatively large value; in a case where the preprocessing is performed, as some errors are excluded, the threshold may be set to a relatively small value. The above is illustrative only; an example of the threshold is 10%, and this embodiment is not limited thereto.
  • prompt information may be outputted to prompt the user to clean the lens of the camera.
  • In this way, routine operations of the user are not affected; the user is prompted only when the lens is actually dirty, and a prompt box does not pop up randomly, thereby improving users' experiences.
  • The manner of outputting the prompt information is not limited in this embodiment; it may be a voice prompt, a flash prompt, or a prompt box, etc., and the prompt manner may be a default or may be set by the user. If it is set that the prompt is performed via a prompt box, the contents of the prompt box may be a default or may be set by the user.
  • the detection method may be periodically executed according to a setting, such as being executed according to a set time interval.
  • The time interval may be preset, such as a default of the camera software, or may be set by the user as demanded; the user may modify the time interval, and may further disable the detection function.
  • The cameras may be rear cameras, the number of which may be two or more; this embodiment is not limited thereto.
  • FIG. 2 is a schematic diagram of another implementation of the smudge detection method of the embodiment of this disclosure. As shown in FIG. 2, the method includes:
  • step 201: acquiring preview frames;
  • step 202: correcting and preprocessing the preview frames;
  • step 203: calculating a difference between the preview frames;
  • step 204: judging whether the above difference is greater than a threshold, and executing step 205 if it is judged yes; otherwise, executing step 207;
  • step 205: determining that a lens of at least one of the two cameras is dirty;
  • step 206: outputting prompt information;
  • step 207: judging whether an indication for terminating detection is received, and terminating the processing if it is judged yes; otherwise, executing step 208;
  • step 208: judging whether the preset time interval has passed, and executing step 201 if it is judged yes; otherwise, repeating step 208 to wait.
  • manners of acquiring preview frames (step 201) and calculating a difference between the preview frames (step 203) may be identical to those described above, and shall not be described herein any further.
  • Optionally, the acquired preview frames may be preprocessed (step 202); that is, step 202 is optional.
  • a manner of preprocessing is not limited in this embodiment.
  • As the detection function may possibly be disabled by the user, whether an indication for terminating detection is received may further be judged (step 207) in this embodiment. If the indication is received, it shows that the user has disabled the detection function, and the processing is terminated; otherwise, it is deemed that the user has not disabled the detection function.
  • The detection may be executed periodically; hence, whether the set time interval has passed may further be judged in this embodiment, the manner of setting the time interval having been described above and not being described herein any further. If the set time interval has passed, a next round of detection needs to be performed, and the process turns back to step 201; if not, the process continues to wait.
  • It should be noted that the order of execution of steps 207 and 208 is not limited. For example, whether the set time interval has passed may be judged first (step 208), and then whether an indication for terminating detection is received may be judged (step 207).
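The background loop of FIG. 2 can be sketched as follows; the callables `detect_once` and `stop_requested` are placeholders for the device-specific steps, and the default interval is an illustrative assumption.

```python
import time

def run_periodic_detection(detect_once, stop_requested,
                           interval_s: float = 3600.0) -> None:
    """Background loop of FIG. 2: detect, prompt if dirty, wait, repeat.

    `detect_once` performs steps 201-205 and returns True if a smudge was
    found; `stop_requested` returns True once the user disables detection
    (step 207). Both are placeholders for device-specific code.
    """
    while not stop_requested():                       # step 207
        if detect_once():                             # steps 201-205
            # step 206: output prompt information
            print("Lens may be dirty - please clean the camera lens.")
        time.sleep(interval_s)                        # step 208: wait
```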
  • The method is applicable to any scenario; as long as the mobile electronic device is not in a dark environment, the method of this embodiment may be used to detect whether a lens of a camera is dirty.
  • a step for judging whether the mobile electronic device is in a dark environment may be added, a particular method of judgment being not limited in this embodiment.
  • For example, whether the mobile electronic device is in a dark environment may be determined according to the pixel values of an image captured by at least one of the cameras (e.g., the average pixel value or the sum of all pixel values), but any suitable method for sensing a dark environment may be used.
  • Alternatively, the above judging step may not be added, and whether the mobile electronic device is in a dark environment may be judged only according to the captured preview frames, a particular method of judgment being not limited in this embodiment.
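The average-pixel-value check mentioned above can be sketched as below; the brightness threshold of 30 (on a 0-255 scale) is an illustrative assumption, not a value taken from this disclosure.

```python
import numpy as np

def is_dark_environment(frame: np.ndarray,
                        brightness_threshold: float = 30.0) -> bool:
    """Judge a dark environment from the average pixel value of a frame.

    Detection should be skipped when this returns True, since nearly black
    preview frames carry too little information for a reliable comparison.
    """
    return bool(frame.astype(np.float64).mean() < brightness_threshold)
```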
  • a user may be prompted when a lens is actually dirty, thereby preventing the user from being plagued due to being prompted when a lens of a camera is not actually dirty.
  • Furthermore, the detection process in this disclosure is performed in the background; hence, the user may be notified that the lens is dirty before starting up the camera application to capture photos, thereby avoiding missing the best time for capturing.
  • An embodiment of this disclosure provides a smudge detection apparatus, which may be a mobile electronic device, or may be one or more components or assemblies configured in the mobile electronic device.
  • the mobile electronic device includes at least two cameras, as described in the embodiment of the first aspect. Contents in this embodiment identical to those in the embodiment of the first aspect shall not be described herein any further.
  • FIG. 3 is a schematic diagram of the smudge detection apparatus of the embodiment of this disclosure. As shown in FIG. 3, the apparatus includes:
  • an acquiring unit 301 configured to acquire at the same time preview frames captured by the first camera and the second camera respectively;
  • a calculating unit 302 configured to calculate a difference between the preview frames captured by the first camera and the second camera
  • a determining unit 303 configured to determine that a lens of at least one of the first camera and the second camera has a smudge when the calculated difference is greater than a preset threshold.
  • the calculating unit 302 respectively converts the preview frame captured by the first camera and the preview frame captured by the second camera into binary images, and subtracts the binary image of the preview frame captured by the second camera from the binary image of the preview frame captured by the first camera, so as to obtain the difference between the two preview frames.
  • the apparatus 300 may further include:
  • a preprocessing unit 304 configured to, before the calculating unit 302 calculates the difference, preprocess the preview frame captured by the first camera and the preview frame captured by the second camera acquired by the acquiring unit 301.
  • the preprocessing here may be, for example, adjusting the preview frame captured by the first camera and the preview frame captured by the second camera, such that the preview frame captured by the first camera and the preview frame captured by the second camera have an identical perspective, or may be, for another example, converting the preview frame captured by the first camera and/or the preview frame captured by the second camera into gray-scale image (s) .
  • After the preprocessing, the calculating unit 302 may convert the preprocessed preview frames captured by the first camera and the second camera into binary images respectively, and subtract the binary image of the preview frame captured by the second camera from the binary image of the preview frame captured by the first camera, so as to obtain the difference between the two preview frames.
  • the apparatus 300 may further include:
  • a prompting unit 305 configured to output prompt information when the determining unit 303 determines that a lens of at least one of the first camera and the second camera has a smudge, a particular manner of outputting the prompt information being not limited in this embodiment.
  • the apparatus 300 may further include:
  • a first judging unit 306 configured to judge whether an indication for terminating detection is received, terminate the processing when it is judged yes, and activate the processing of the acquiring unit 301 when it is judged no.
  • the user may start up or close (i.e. enable or disable) the detection function of the smudge detection apparatus 300 as demanded.
  • the apparatus 300 may further include:
  • a second judging unit 307 configured to judge whether the set time interval has passed, activate the processing of the acquiring unit 301 when it is judged yes, and continue to wait when it is judged no. Hence, the user may set the detection period of the smudge detection apparatus 300 as demanded.
  • the apparatus 300 may further include:
  • a controlling unit 308 configured to control the processing of the above units.
  • the controlling unit 308 may be carried out by a processor and a memory, and the acquiring unit 301, the calculating unit 302, the determining unit 303, the preprocessing unit 304, the prompting unit 305, the first judging unit 306 and the second judging unit 307 may be carried out by programs stored in the memory, and execute respective functions under control of the controlling unit 308.
  • the components may repeat the above processing according to the above time interval, so as to obtain detection results of each time of detection.
  • The above components (units) are virtual components only, and may be carried out by the processor and the memory, details of which are described in the embodiment of the third aspect.
  • the components related to this disclosure are only described above. However, this disclosure is not limited thereto, and the smudge detection apparatus 300 may further include other components, and the prior art may be referred to for particulars of these components.
  • whether a lens of a camera is actually dirty may be detected, a user may be prompted when a lens is actually dirty, thereby preventing the user from being plagued due to being prompted when a lens of a camera is not actually dirty.
  • Furthermore, the detection process in this disclosure is performed in the background; hence, the user may be notified that the lens is dirty before starting up the camera application to capture photos, thereby avoiding missing the best time for capturing.
  • An embodiment of this disclosure provides a mobile electronic device, which may be a mobile phone, a tablet PC, a portable digital camera, a media player, a portable game device, a PDA, a computer, or the like, and this embodiment is not limited thereto.
  • the mobile electronic device may be a smart mobile phone; however, this embodiment is not limited thereto.
  • FIG. 4 is a schematic diagram of one implementation of a systematic structure of the mobile electronic device of the embodiment of this disclosure.
  • the mobile electronic device 400 may include a processor 401, a non-transitory computer readable medium (memory) 402, a first camera 403 and a second camera 404, the first camera 403 and the second camera 404 being rear cameras, and the non-transitory computer readable medium (memory) 402 being coupled to the processor 401 and storing instructions executable by the processor 401.
  • this figure is illustrative only, and other types of structures may also be used, so as to supplement or replace this structure and achieve a telecommunications function or other functions.
  • the functions of the smudge detection apparatus 300 may be integrated into the processor 401, wherein, the processor 401 may be configured to perform the following steps when executing the instructions: acquiring at the same time preview frames captured by the first camera and the second camera respectively; calculating a difference between the preview frames captured by the first camera and the second camera; and determining that a lens of at least one of the first camera and the second camera has a smudge when the calculated difference is greater than a preset threshold.
  • the processor 401 may be configured to perform the following steps when executing the instructions: respectively converting a preview frame captured by the first camera and a preview frame captured by the second camera into binary images; and subtracting the binary image of the preview frame captured by the second camera from the binary image of the preview frame captured by the first camera, so as to obtain the difference between the two preview frames.
  • the processor 401 may be configured to perform the following step when executing the instructions: preprocessing the preview frame captured by the first camera and the preview frame captured by the second camera before calculating the difference.
  • the processor 401 may be configured to perform the following control by executing the instructions: adjusting the preview frame captured by the first camera and the preview frame captured by the second camera, such that the preview frame captured by the first camera and the preview frame captured by the second camera have an identical perspective.
  • the processor 401 may be configured to perform the following step when executing the instructions: converting the preview frame captured by the first camera and/or the preview frame captured by the second camera into gray-scale images.
  • the processor 401 may be configured to perform the following steps when executing the instructions: converting the preprocessed preview frame captured by the first camera and the preprocessed preview frame captured by the second camera into binary images respectively; and subtracting the binary image of the preview frame captured by the second camera from the binary image of the preview frame captured by the first camera, so as to obtain the difference between the preview frame captured by the first camera and the preview frame captured by the second camera.
  • the processor 401 may be configured to perform the following step when executing the instructions: outputting prompt information when the lens of at least one of the first camera and the second camera has a smudge.
  • the processor 401 may be configured to execute the instructions according to a set time interval, the set time interval being preset or set according to a demand of a user.
  • the smudge detection apparatus 300 and the processor 401 may be configured separately.
  • the smudge detection apparatus 300 may be configured as a chip connected to the processor 401, with its functions being realized under control of the processor 401.
  • the mobile electronic device 400 may further include a communication module 405, an input unit 406, a display 407 and a power supply 408.
  • the processor 401 (sometimes referred to as a controller or control, which may include a microprocessor or other processor devices and/or logic devices) receives input and controls operations of every component of the mobile electronic device 400.
  • the input unit 406 provides input to the processor 401.
  • the input unit 406 is, for example, a button or a touch input device.
  • the power supply 408 is configured to supply power to the mobile electronic device 400.
  • the display 407 is configured to display objects to be displayed, such as photos and text.
  • the display may be, for example, an LCD display or an LED display; however, this disclosure is not limited thereto.
  • the non-transitory computer readable medium (memory) 402 may be a solid-state memory, such as a read-only memory (ROM), a random access memory (RAM), or a SIM card. It may also be a memory that retains information when power is off, and that may be selectively erased and provided with more data; an example of such a memory is sometimes referred to as an EPROM.
  • the non-transitory computer readable medium (memory) 402 may also be another type of device.
  • the non-transitory computer readable medium (memory) 402 may include a buffer memory (sometimes referred to as a buffer). The non-transitory computer readable medium (memory) 402 may also include an application/function storage portion configured to store applications and function programs or procedures for executing operations of the mobile electronic device 400 via the processor 401.
  • the non-transitory computer readable medium (memory) 402 may further include a data storage portion configured to store data, such as a contact, digital data, a picture, a voice and/or any other data used by the mobile electronic device 400.
  • a driver storage portion of the non-transitory computer readable medium (memory) 402 includes various drivers of the mobile electronic device 400 for communication functions and/or for executing other functions of the mobile electronic device 400 (such as a message transmission application and a directory application).
  • the communication module 405 is a transmitter/receiver transmitting and receiving signals via antennas.
  • the communication module (transmitter/receiver) 405 is coupled to the processor 401 to provide input signals and receive output signals, which may be identical to a case in a conventional communication terminal.
  • multiple communication modules 405, such as a cellular network module, a Bluetooth module and/or a WLAN module, etc., may be provided, so as to achieve general telecommunications functions.
  • FIG. 4 may only illustrate a part of the structure of the mobile electronic device 400.
  • the mobile electronic device 400 does not necessarily include all the components shown in FIG. 4. Furthermore, the mobile electronic device 400 may include components not shown in FIG. 4, for which reference may be made to the prior art.
  • An embodiment of the present disclosure provides a computer readable program code, which, when executed in a mobile electronic device, will cause the mobile electronic device to carry out the smudge detection method as described in the embodiment of the first aspect.
  • An embodiment of the present disclosure provides a computer readable medium, including a computer readable program code, which will cause a mobile electronic device to carry out the smudge detection method as described in the embodiment of the first aspect.
  • each of the parts of the present disclosure may be implemented by hardware, software, firmware, or a combination thereof.
  • multiple steps or methods may be realized by software or firmware that is stored in the memory and executed by an appropriate instruction executing system.
  • if realized by hardware, the steps or methods may be realized by any one of the following technologies known in the art, or a combination thereof: a discrete logic circuit having a logic gate circuit for realizing logic functions of data signals; an application-specific integrated circuit having an appropriate combined logic gate circuit; a programmable gate array (PGA); and a field programmable gate array (FPGA).
  • logic and/or steps shown in the flowcharts or described in other manners here may be, for example, understood as a sequencing list of executable instructions for realizing logic functions, which may be implemented in any computer readable medium, for use by an instruction executing system, device or apparatus (such as a system including a computer, a system including a processor, or other systems capable of extracting instructions from an instruction executing system, device or apparatus and executing the instructions) , or for use in combination with the instruction executing system, device or apparatus.
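The processing steps enumerated in the list above (grayscale conversion, binarization, subtraction of the two binary images, and comparison against a preset threshold) can be sketched as follows. This is an illustrative sketch only, not the patented implementation; the function names, the mean-intensity binarization rule, and the differing-pixel-fraction metric are assumptions chosen for clarity.

```python
import numpy as np

def to_grayscale(frame):
    # Weighted average over the RGB channels (ITU-R BT.601 luma weights).
    return frame @ np.array([0.299, 0.587, 0.114])

def to_binary(gray):
    # Binarize each pixel against the frame's mean intensity
    # (an assumed rule; the disclosure does not fix the threshold).
    return (gray > gray.mean()).astype(np.uint8)

def smudge_detected(frame_a, frame_b, preset_threshold=0.1):
    """Return True when the binary images of the two preview frames
    differ in more than `preset_threshold` of their pixels."""
    bin_a = to_binary(to_grayscale(frame_a))
    bin_b = to_binary(to_grayscale(frame_b))
    # Subtract one binary image from the other; non-zero pixels mark
    # regions where the two camera views disagree.
    diff = np.abs(bin_a.astype(np.int16) - bin_b.astype(np.int16))
    return bool(diff.mean() > preset_threshold)
```

Any perspective adjustment (so that both preview frames share an identical perspective) would happen before `to_grayscale` is called; it is omitted here for brevity.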
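The interval-based triggering and user prompt described in the list above might be scheduled along these lines. The class name `SmudgeMonitor`, the default interval, and the prompt text are hypothetical names introduced here for illustration; a real device would route the prompt through its notification system rather than `print`.

```python
import threading

class SmudgeMonitor:
    """Re-runs a smudge-detection callback at a set interval and emits
    a prompt when a smudge is reported."""

    def __init__(self, detect_fn, interval_s=30.0):
        self.detect_fn = detect_fn    # returns True when a smudge is found
        self.interval_s = interval_s  # preset, or set according to a user's demand
        self._timer = None
        self._running = False

    def _tick(self):
        if self.detect_fn():
            # Stand-in for the device's prompt output (e.g. a notification).
            print("Lens smudge detected - please clean the camera lens.")
        if self._running:
            self._schedule()  # reschedule the next check

    def _schedule(self):
        self._timer = threading.Timer(self.interval_s, self._tick)
        self._timer.daemon = True
        self._timer.start()

    def start(self):
        self._running = True
        self._schedule()

    def stop(self):
        self._running = False
        if self._timer is not None:
            self._timer.cancel()
```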

Abstract

A smudge detection method and apparatus, and a mobile electronic device are provided. The method is executed by a mobile electronic device comprising a first camera and a second camera. The method includes: acquiring at the same time preview frames captured by the first camera and the second camera respectively; calculating a difference between the preview frames captured by the first camera and the second camera; and determining that a lens of at least one of the first camera and the second camera has a smudge when the calculated difference is greater than a preset threshold. In the method of the embodiment of the present disclosure, by detecting whether a camera lens is actually dirty, a user may be prompted when a lens is actually dirty, thereby avoiding disturbing the user with prompts when the camera lens is not actually dirty.
PCT/CN2018/073758 2018-01-23 2018-01-23 Smudge detection method and apparatus, and mobile electronic device WO2019144262A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/073758 WO2019144262A1 (fr) 2018-01-23 2018-01-23 Smudge detection method and apparatus, and mobile electronic device

Publications (1)

Publication Number Publication Date
WO2019144262A1 2019-08-01

Family

ID=67395168

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/073758 WO2019144262A1 (fr) 2018-01-23 2018-01-23 Smudge detection method and apparatus, and mobile electronic device

Country Status (1)

Country Link
WO (1) WO2019144262A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150163400A1 (en) * 2013-12-06 2015-06-11 Google Inc. Camera Selection Based on Occlusion of Field of View
US20160004144A1 (en) * 2014-07-04 2016-01-07 The Lightco Inc. Methods and apparatus relating to detection and/or indicating a dirty lens condition
CN106385579A (zh) * 2016-09-12 2017-02-08 Nubia Technology Co., Ltd. Camera detection apparatus and method, and multi-camera terminal

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11341682B2 (en) 2020-08-13 2022-05-24 Argo AI, LLC Testing and validation of a camera under electromagnetic interference
US11734857B2 (en) 2020-08-13 2023-08-22 Argo AI, LLC Testing and validation of a camera under electromagnetic interference
CN114189671A (zh) * 2020-09-14 2022-03-15 Argo AI, LLC Validation of a camera cleaning system
US11368672B2 (en) 2020-09-14 2022-06-21 Argo AI, LLC Validation of a camera cleaning system
US11758121B2 (en) 2020-09-14 2023-09-12 Argo AI, LLC Validation of a camera cleaning system
CN113523548A (zh) * 2021-07-26 2021-10-22 Tianjin Rongsheng Mengguli New Energy Technology Co., Ltd. Method for detecting the degree of contamination of a laser welding protective lens

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 18902060; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase; Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 18902060; Country of ref document: EP; Kind code of ref document: A1