WO2024020958A1 - Method of generating an image, electronic device, apparatus, and computer readable storage medium - Google Patents

Method of generating an image, electronic device, apparatus, and computer readable storage medium

Info

Publication number
WO2024020958A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
frequency
camera
transformed
point spread
Prior art date
Application number
PCT/CN2022/108737
Other languages
French (fr)
Inventor
Kunihiro Hasegawa
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd. filed Critical Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to PCT/CN2022/108737 priority Critical patent/WO2024020958A1/en
Publication of WO2024020958A1 publication Critical patent/WO2024020958A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/73 Deblurring; Sharpening
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/10 Image enhancement or restoration using non-spatial domain filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20201 Motion blur correction

Definitions

  • the present disclosure relates to a method of generating image data, an electronic device implementing such method, a computer readable medium including program instructions stored thereon for performing such method, and an apparatus.
  • such conventional electronic devices use a well-known method for correcting blurred images by performing deconvolution using a PSF (Point Spread Function), which models the blur and then removes it from the image.
  • PSF Point Spread Function
  • This method may produce a very clean deblurred image if the processing is done correctly.
  • the present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide an electronic device and a method of controlling the electronic device.
  • a method of generating an image includes:
  • PSF point spread function
  • an electronic device includes:
  • a camera assembly including a first camera and a second camera
  • at least one memory including program code
  • the at least one memory and the program code are configured to, with the at least one processor, cause the electronic device to perform:
  • PSF point spread function
  • an apparatus includes:
  • a point spread function estimation unit configured to estimate a point spread function (PSF) based on event data obtained by imaging an object with a first camera;
  • a first image processing unit configured to acquire a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera;
  • a second image processing unit configured to acquire a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data;
  • a transform unit configured to transform the first image, the second image, and the point spread function from an image space to a frequency space;
  • a frequency specifying unit configured to specify a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function;
  • a replacement unit configured to acquire a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and
  • an inverse transform unit configured to acquire a deblur image by inverse transforming the replacement image from the frequency space to the image space.
  • a computer readable medium comprises program instructions stored thereon for performing at least the following:
  • PSF point spread function
  • FIG. 1 illustrates a plan view of a first side of an electronic device according to an embodiment of the present disclosure
  • FIG. 2 illustrates a plan view of a second side of the electronic device according to the embodiment of the present disclosure
  • FIG. 3 illustrates a block diagram of the electronic device according to the embodiment of the present disclosure
  • FIG. 4 illustrates a block diagram of the processor according to the embodiment of the present disclosure
  • FIG. 5 is a diagram showing an example of a processing flow applied to the embodiment of the present disclosure.
  • FIG. 6 is a diagram showing an example of images obtained by the embodiment of the present disclosure.
  • FIG. 1 illustrates a plan view of a first side of an electronic device 10 according to an embodiment of the present disclosure
  • FIG. 2 illustrates a plan view of a second side of the electronic device 10 according to the embodiment of the present disclosure.
  • the first side may be referred to as a back side of the electronic device 10
  • the second side may be referred to as a front side of the electronic device 10.
  • the electronic device 10 may include a display 20 and a camera assembly 30.
  • the camera assembly 30 includes a first main camera (a first camera) 32, a second main camera (a second camera) 34 and a sub camera 36.
  • the first main camera 32 and the second main camera 34 may capture an image in a first side of the electronic device 10, and the sub camera 36 may capture an image in the second side of the electronic device 10. Therefore, the first main camera 32 and the second main camera 34 are so-called out-cameras whereas the sub camera 36 is a so-called in-camera.
  • the electronic device 10 may be a mobile phone, a tablet computer, a personal digital assistant, and so on.
  • the first main camera 32 is an event camera.
  • the second main camera 34 is an image sensor camera (for example, a CMOS type image sensor camera) .
  • the event data is obtained by imaging the object with the first main camera 32 of the camera assembly 30.
  • a reference frame image is obtained by capturing the object with the second main camera 34 of the camera assembly 30.
  • the reference frame image is an RGB image.
  • FIG. 3 illustrates a block diagram of the electronic device 10 according to the present embodiment.
  • the electronic device 10 may include a main processor 40, an image signal processor 42, a memory 44, a power supply circuit 46 and a communication circuit 48.
  • the display 20, the camera assembly 30, the main processor 40, the image signal processor 42, the memory 44, the power supply circuit 46 and the communication circuit 48 are connected with each other via a bus 50.
  • the processor 100 includes the main processor 40 and the image signal processor 42.
  • the processor 100 acquires event data and the reference frame image, by controlling the camera assembly 30 and taking an image of the object. Furthermore, the processor 100 acquires a deblur image based on the event data and the reference frame image.
  • the main processor 40 executes one or more programs stored in the memory 44.
  • the main processor 40 implements various applications and data processing (including image data processing) of the electronic device 10 by executing the programs.
  • the main processor 40 may be one or more computer processors.
  • the main processor 40 is not limited to one CPU core, but it may have a plurality of CPU cores.
  • the main processor 40 may be a main CPU of the electronic device 10, an image process unit (IPU) , or a DSP provided with the camera assembly 30.
  • the image signal processor 42 controls the camera assembly 30 and processes various kinds of image data captured by the camera assembly 30 to generate a target image data.
  • the image signal processor 42 may execute a de-mosaic process, a noise reduction process, an auto exposure process, an auto focus process, an auto white balance process, a high dynamic range process and so on, to the image data captured by the camera assembly 30.
  • the main processor 40 and the image signal processor 42 collaborate with each other to generate a target image data of the object captured by the camera assembly 30. That is, the main processor 40 and the image signal processor 42 are configured to capture the image of the object by the camera assembly 30 and execute various kinds of image processes to the captured image data.
  • the memory 44 stores a program to be executed by the main processor 40, the image signal processor 42, and various kinds of data. For example, data of the captured image are stored in the memory 44.
  • the memory 44 is configured to store a table that defines relationships between the features (for example, the difference between natural and artificial objects, especially the edge of the object, etc. ) of a plurality of images and a plurality of threshold values.
  • the images, such as the captured images and the generated images, exist in the form of image data in the electronic device 10 (for example, the data of the images are stored in the memory 44). Furthermore, the processor 100 processes the images as image data.
  • the memory 44 includes program code.
  • the memory 44 and the program code are configured to, with the processor 100, cause the electronic device 10 to perform: estimating a point spread function (PSF) based on event data obtained by imaging an object with a first camera; acquiring a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera; acquiring a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data; transforming the first image, the second image, and the point spread function from an image space to a frequency space; specifying a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function; acquiring a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and acquiring a deblur image by inverse transforming the replacement image from the frequency space to the image space.
  • the memory 44 may include a high-speed RAM memory, and/or a non-volatile memory such as a flash memory and a magnetic disk memory. That is, the memory 44 may include a non-transitory computer readable medium, in which the program is stored.
  • the power supply circuit 46 may have a battery such as a lithium-ion rechargeable battery and a battery management unit (BMU) for managing the battery.
  • BMU battery management unit
  • the communication circuit 48 is configured to receive and transmit data to communicate with base stations of the telecommunication network system, the Internet or other devices via wireless communication.
  • the wireless communication may adopt any communication standard or protocol, including but not limited to GSM (Global System for Mobile communication), CDMA (Code Division Multiple Access), LTE (Long Term Evolution), LTE-Advanced, and 5th generation (5G) systems.
  • the communication circuit 48 may include an antenna and a RF (radio frequency) circuit.
  • the image generation process is executed by, for example, the main processor 40 (the processor 100) in order to generate the image data.
  • program instructions of the image generation process are stored in the non-transitory computer readable medium of the memory 44.
  • the main processor 40 implements the image generation process.
  • FIG. 4 illustrates a block diagram of the processor 100 according to the embodiment of the present disclosure.
  • the processor 100 shown in FIG. 4 may be substituted with an apparatus.
  • the apparatus comprises: a first image processing unit 100a; a second image processing unit 100b; a point spread function estimation unit 100c; a transform unit 100d; a frequency specifying unit 100e; a replacement unit 100f; an inverse transform unit 100g; and a threshold value adjustment unit 100h.
  • the point spread function estimation unit 100c is configured to estimate a point spread function (PSF) based on event data obtained by imaging an object with a first camera.
  • PSF point spread function
  • the first image processing unit 100a is configured to acquire a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera.
  • the second image processing unit 100b is configured to acquire a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data.
  • the transform unit 100d is configured to transform the first image, the second image, and the point spread function from an image space to a frequency space.
  • the frequency specifying unit 100e is configured to specify a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function.
  • the replacement unit 100f is configured to acquire a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency.
  • the inverse transform unit 100g is configured to acquire a deblur image by inverse transforming the replacement image from the frequency space to the image space.
  • the threshold value adjustment unit 100h is configured to adjust the threshold value before specifying the frequency.
  • the electronic device 10 may be represented as an apparatus including the processor 100.
  • FIG. 5 is a diagram showing an example of a processing flow applied to the embodiment of the present disclosure.
  • FIG. 6 is a diagram showing an example of images obtained by the embodiment of the present disclosure.
  • the processor 100 acquires the event data and the reference frame image, by controlling the camera assembly 30 and taking an image of the object.
  • the event data is obtained by imaging the object with the first main camera (more specifically, the event camera) 32 of the camera assembly 30. Furthermore, the reference frame image is obtained by capturing the object with the second main camera (more specifically, the image sensor camera) 34 of the camera assembly 30.
  • the point spread function estimation unit 100c of the processor 100 estimates a point spread function (PSF) based on event data obtained by imaging an object with the first main camera 32 (the step 1 in Fig. 5) .
  • PSF point spread function
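The disclosure does not fix how the PSF is computed from the event stream. As an illustrative sketch only: a point source moving during the exposure fires events along its motion path, so a normalized 2-D histogram of event positions, centred on their mean, approximates the motion-blur kernel. The function name and the (x, y, t, polarity) event layout below are assumptions for illustration.

```python
import numpy as np

def estimate_psf_from_events(events, kernel_size=15):
    """Estimate a blur kernel (PSF) from event-camera data.

    `events` is assumed to be an (N, 4) array of (x, y, t, polarity)
    rows produced during the exposure of the blurred frame.
    """
    xy = events[:, :2].astype(float)
    xy -= xy.mean(axis=0)                       # centre the trajectory
    half = kernel_size // 2
    edges = np.arange(-half - 0.5, half + 1.5)  # one bin per pixel
    # histogram2d(rows, cols): y selects the row, x the column
    psf, _, _ = np.histogram2d(xy[:, 1], xy[:, 0], bins=(edges, edges))
    s = psf.sum()
    if s > 0:
        psf /= s                                # a PSF must integrate to 1
    return psf
```

A real implementation would also handle trajectories longer than the kernel and weight events by polarity; this sketch only shows the histogram idea.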
  • the first image processing unit 100a acquires a first image (the deconvolution image of Fig. 6) by executing a deconvolution method using the point spread function based on the reference frame image obtained by capturing the object with the second main camera 34 (the step 2 in Fig. 5) .
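The claims leave the concrete deconvolution method open. One common frequency-domain choice is Wiener deconvolution, sketched below under that assumption; it also makes visible why ringing appears, since frequencies where the PSF spectrum is near zero are amplified.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, snr=0.01):
    """Wiener deconvolution of a 2-D grayscale frame with a small PSF.

    One plausible 'deconvolution method'; the disclosure does not
    prescribe this specific filter.
    """
    h, w = blurred.shape
    # Pad the PSF to frame size and centre it at the origin so that
    # convolution becomes an element-wise product in Fourier space.
    psf_pad = np.zeros((h, w))
    kh, kw = psf.shape
    psf_pad[:kh, :kw] = psf
    psf_pad = np.roll(psf_pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))

    H = np.fft.fft2(psf_pad)
    B = np.fft.fft2(blurred)
    # The regularizer `snr` keeps the division bounded where |H| is
    # small -- exactly the frequencies that otherwise cause ringing.
    X = np.conj(H) * B / (np.abs(H) ** 2 + snr)
    return np.real(np.fft.ifft2(X))
```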
  • the second image processing unit 100b acquires a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data (the step 3 in Fig. 5) .
  • the image processing method different from the deconvolution method is an image reconstruction method.
  • the second image is the reconstruct image, for example, as shown in Fig. 6.
  • This image reconstruction method has the advantage that, for example, ringing does not occur in the deblurred image, compared to the deconvolution method.
  • In addition, this image reconstruction method has the advantage that large objects in the deblurred image are reproduced relatively cleanly, compared to the deconvolution method.
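The event-based reconstruction method is likewise not pinned down by the claims. A minimal sketch, loosely following the idea that each event shifts the log-intensity by a contrast threshold so that averaging the exponentiated cumulative event sums relates the blurred frame to a sharp frame, might look like this (the event-frame layout and the `contrast` constant are assumptions):

```python
import numpy as np

def reconstruct_from_events(blurred, event_frames, contrast=0.2):
    """Sketch of an event-based reconstruction of a sharp frame.

    `event_frames` is assumed to be a (T, H, W) stack of signed event
    counts per time slice within the exposure; `contrast` is a
    hypothetical per-event log-intensity step.
    """
    # Cumulative log-intensity change relative to the reference time.
    cum = np.cumsum(event_frames, axis=0) * contrast
    # Average brightness ratio over the exposure window.
    denom = np.exp(cum).mean(axis=0)
    return blurred / np.maximum(denom, 1e-6)
```

With no events the denominator is 1 everywhere, so the reference frame passes through unchanged, which matches the intuition that a static scene needs no deblurring.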
  • the transform unit 100d of the processor 100 transforms the first image on the image space from the image space (the image space coordinates) to the frequency space (the frequency space coordinates) , using, for example, a Fourier transform (the step 4a in Fig. 5) .
  • the transform unit 100d of the processor 100 transforms the second image on the image space from the image space (the image space coordinates) to the frequency space (the frequency space coordinates) , using, for example, the Fourier transform (the step 4b in Fig. 5) .
  • the transform unit 100d of the processor 100 transforms the point spread function on the image space from the image space (the image space coordinates) to the frequency space (the frequency space coordinates) , using, for example, the Fourier transform (the step 4c in Fig. 5) .
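Steps 4a-4c can be sketched directly with NumPy's 2-D FFT; the PSF is zero-padded to the image size so that its spectrum is sampled on the same frequency grid as the two images (function name assumed for illustration):

```python
import numpy as np

def to_frequency_space(first_img, second_img, psf):
    """Transform both candidate images and the PSF into frequency space."""
    h, w = first_img.shape
    # Zero-pad the small kernel up to the image size.
    psf_pad = np.zeros((h, w))
    psf_pad[:psf.shape[0], :psf.shape[1]] = psf
    F1 = np.fft.fft2(first_img)
    F2 = np.fft.fft2(second_img)
    H = np.fft.fft2(psf_pad)
    return F1, F2, H
```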
  • the frequency specifying unit 100e of the processor 100 specifies a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function (the step 5 in Fig. 5) .
  • the frequency specifying unit 100e of the processor 100 specifies a frequency at which the frequency intensity is equal to or lower than a threshold value, by comparing the frequency intensity of the transformed point spread function with a threshold value.
  • the threshold value adjustment unit 100h of the processor 100 may adjust the threshold value before specifying the frequency.
  • the threshold value adjustment unit 100h adjusts the threshold value based on the feature of the image captured by the second main camera (more specifically, the image sensor camera) 34 in Fig. 1.
  • the memory 44 in Fig. 3 is configured to store a table that defines relationships between the features of a plurality of images and a plurality of threshold values.
  • the threshold value adjustment unit 100h adjusts the threshold value to a threshold value corresponding to the feature of the image captured by the second main camera 34 by searching the table stored in the memory 44.
  • the relationship between the features (for example, the difference between natural and artificial objects, especially the edge of the object, etc. ) of the plurality of images and the plurality of threshold values in the table is pre-constructed by, for example, machine learning.
  • the replacement unit 100f of the processor 100 acquires a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency (the step 6 in Fig. 5) .
  • the inverse transform unit 100g of the processor 100 acquires a deblur image, for example, as shown in Fig. 6, by inverse transforming the replacement image from the frequency space to the image space, for example, using an inverse Fourier transform (the step 7 in Fig. 5) .
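Steps 6 and 7 combined amount to a masked spectrum swap followed by an inverse FFT (sketch; function name assumed):

```python
import numpy as np

def merge_and_invert(F1, F2, H, threshold):
    """Replace the first image's spectrum with the second image's at
    the frequencies specified in step 5 (where the PSF spectrum is at
    or below the threshold), then inverse-transform to image space."""
    mask = np.abs(H) <= threshold
    merged = np.where(mask, F2, F1)
    return np.real(np.fft.ifft2(merged))
```

The result keeps the deconvolved detail where the PSF carries signal and falls back to the ringing-free reconstruction where it does not.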
  • the processor 100 causes the display 20 (as shown in Fig. 3) to display the deblur image acquired by the inverse transform unit 100g.
  • the processor 100 acquires a deblur image based on the event data and the reference frame image.
  • the electronic device comprises: a camera assembly includes a first camera and a second camera; at least one processor; and at least one memory including program code.
  • the at least one memory and the program code are configured to, with the at least one processor, cause the electronic device to perform: estimating a point spread function (PSF) based on event data obtained by imaging an object with a first camera; acquiring a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera; acquiring a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data; transforming the first image, the second image, and the point spread function from an image space to a frequency space; specifying a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function; acquiring a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and acquiring a deblur image by inverse transforming the replacement image from the frequency space to the image space.
  • PSF point spread function
  • an image from a restoration (deblur) method (for example, the image reconstruction method) that has different characteristics from the deblur image produced by the deconvolution method is prepared and mixed in, to remove ringing and improve the deblur results.
  • ringing removal is performed using an image with blur removed by the image reconstruction method using event data.
  • the electronic device may suppress ringing of an image in which motion deblur is executed for deconvolution using the point spread function.
  • first and second are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features.
  • a feature defined as “first” and “second” may comprise one or more of this feature.
  • “a plurality of” means “two or more than two”, unless otherwise specified.
  • the terms “mounted” , “connected” , “coupled” and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electronic connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
  • a structure in which a first feature is "on" or “below” a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween.
  • a first feature "on” , “above” or “on top of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “on” , “above” or “on top of” the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature “below” , “under” or “on bottom of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “below” , "under” or “on bottom of” the second feature, or just means that the first feature is at a height lower than that of the second feature.
  • Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
  • the logic and/or step described in other manners herein or shown in the flow chart may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment.
  • the computer readable medium may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment.
  • examples of the computer readable medium comprise, but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device), a random access memory (RAM), a read only memory (ROM), an erasable programmable read-only memory (EPROM or a flash memory), an optical fiber device and a portable compact disk read-only memory (CDROM).
  • the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electronic manner, and then the programs may be stored in the computer memories.
  • each part of the present disclosure may be realized by the hardware, software, firmware or their combination.
  • a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system.
  • the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
  • each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may be separate physical existence, or two or more cells are integrated in a processing module.
  • the integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
  • the storage medium mentioned above may be read-only memories, magnetic disks, CD, etc.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

A method of generating an image includes: estimating a point spread function (PSF) based on event data obtained by imaging an object with a first camera; acquiring a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera; acquiring a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data; transforming the first image, the second image, and the point spread function from an image space to a frequency space; specifying a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function; acquiring a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and acquiring a deblur image by inverse transforming the replacement image from the frequency space to the image space.

Description

METHOD OF GENERATING AN IMAGE, ELECTRONIC DEVICE, APPARATUS, AND COMPUTER READABLE STORAGE MEDIUM TECHNICAL FIELD
The present disclosure relates to a method of generating image data, an electronic device implementing such method, a computer readable medium including program instructions stored thereon for performing such method, and an apparatus.
BACKGROUND
Conventionally, there are electronic devices such as smartphones equipped with a digital camera that captures an object.
In particular, such conventional electronic devices use a well-known method for correcting blurred images by performing deconvolution using a PSF (Point Spread Function), which models the blur and then removes it from the image. This method may produce a very clean deblurred image if the processing is done correctly.
However, the problem with this method is that in many cases, artifacts called "ringing" are generated in the output image.
As described above, it is difficult for the conventional electronic device to suppress the ringing of the image in which the motion deblur is executed for deconvolution using the point spread function.
SUMMARY
The present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide an electronic device and a method of controlling the electronic device.
In accordance with the present disclosure, a method of generating an image includes:
estimating a point spread function (PSF) based on event data obtained by imaging an object with a first camera;
acquiring a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera;
acquiring a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data;
transforming the first image, the second image, and the point spread function from an image space to a frequency space;
specifying a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function;
acquiring a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and
acquiring a deblur image by inverse transforming the replacement image from the frequency space to the image space.
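The method steps above can be sketched end to end as follows. This is an illustrative NumPy implementation under stated assumptions, not the claimed embodiment: Wiener filtering stands in for the unspecified deconvolution method, and `second_image` is assumed to already be an event-based reconstruction with less ringing.

```python
import numpy as np

def deblur(reference, second_image, psf, threshold=0.1, snr=0.01):
    """End-to-end sketch of the claimed pipeline on grayscale arrays."""
    h, w = reference.shape
    # Pad and centre the PSF so FFT products model convolution.
    psf_pad = np.zeros((h, w))
    kh, kw = psf.shape
    psf_pad[:kh, :kw] = psf
    psf_pad = np.roll(psf_pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    H = np.fft.fft2(psf_pad)

    # First image: deconvolution, kept in frequency space (steps 2 and 4).
    F1 = np.conj(H) * np.fft.fft2(reference) / (np.abs(H) ** 2 + snr)
    # Second image: transform the ringing-free reconstruction (step 4).
    F2 = np.fft.fft2(second_image)
    # Steps 5-6: where the PSF spectrum is weak, deconvolution is
    # unreliable, so take those frequencies from the second image.
    mask = np.abs(H) <= threshold
    merged = np.where(mask, F2, F1)
    # Step 7: inverse transform back to image space.
    return np.real(np.fft.ifft2(merged))
```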
In accordance with the present disclosure, an electronic device includes:
a camera assembly including a first camera and a second camera;
at least one processor; and
at least one memory including program code;
wherein the at least one memory and the program code are configured to, with the at least one processor, cause the electronic device to perform:
estimating a point spread function (PSF) based on event data obtained by imaging an object with the first camera;
acquiring a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with the second camera;
acquiring a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data;
transforming the first image, the second image, and the point spread function from an image space to a frequency space;
specifying a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function;
acquiring a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and
acquiring a deblur image by inverse transforming the replacement image from the frequency space to the image space.
In accordance with the present disclosure, an apparatus includes:
a point spread function estimation unit configured to estimate a point spread function (PSF) based on event data obtained by imaging an object with a first camera;
a first image processing unit configured to acquire a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera;
a second image processing unit configured to acquire a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data;
a transform unit configured to transform the first image, the second image, and the point spread function from an image space to a frequency space;
a frequency specifying unit configured to specify a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function;
a replacement unit configured to acquire a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and
an inverse transform unit configured to acquire a deblur image by inverse transforming the replacement image from the frequency space to the image space.
In accordance with the present disclosure, a computer readable medium comprises program instructions stored thereon for performing at least the following:
estimating a point spread function (PSF) based on event data obtained by imaging an object with a first camera;
acquiring a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera;
acquiring a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data;
transforming the first image, the second image, and the point spread function from an image space to a frequency space;
specifying a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function;
acquiring a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and
acquiring a deblur image by inverse transforming the replacement image from the frequency space to the image space.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of embodiments of the present disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings, in which:
FIG. 1 illustrates a plan view of a first side of an electronic device according to an embodiment of the present disclosure;
FIG. 2 illustrates a plan view of a second side of the electronic device according to the embodiment of the present disclosure;
FIG. 3 illustrates a block diagram of the electronic device according to the embodiment of the present disclosure;
FIG. 4 illustrates a block diagram of the processor according to the embodiment of the present disclosure;
FIG. 5 is a diagram showing an example of a processing flow applied to the embodiment of the present disclosure; and
FIG. 6 is a diagram showing an example of images obtained by the embodiment of the present disclosure.
DETAILED DESCRIPTION
Embodiments of the present disclosure will be described in detail and examples of the embodiments will be illustrated in the accompanying drawings. The same or similar elements and the elements having same or similar functions are denoted by like reference numerals throughout the descriptions. The embodiments described herein with reference to the drawings are explanatory, which aim to illustrate the present disclosure, but shall not be construed to limit the present disclosure.
Here, FIG. 1 illustrates a plan view of a first side of an electronic device 10 according to an embodiment of the present disclosure, and FIG. 2 illustrates a plan view of a second side of the electronic device 10 according to the embodiment of the present disclosure. The first side may be referred to as a back side of the electronic device 10, whereas the second side may be referred to as a front side of the electronic device 10.
As shown in FIG. 1 and FIG. 2, the electronic device 10 may include a display 20 and a camera assembly 30. In the present embodiment, the camera assembly 30 includes a first main camera (a first camera) 32, a second main camera (a second camera) 34 and a sub camera 36. The first main camera 32 and the second main camera 34 may capture an image on the first side of the electronic device 10, and the sub camera 36 may capture an image on the second side of the electronic device 10. Therefore, the first main camera 32 and the second main camera 34 are so-called out-cameras whereas the sub camera 36 is a so-called in-camera. As an example, the electronic device 10 may be a mobile phone, a tablet computer, a personal digital assistant, and so on.
Here, for example, the first main camera 32 is an event camera. On the other hand, the second main camera 34 is an image sensor camera (for example, a CMOS type image sensor camera) .
For example, the event data is obtained by imaging the object with the first main camera 32 of the camera assembly 30. On the other hand, a reference frame image is obtained by capturing the object with the second main camera 34 of the camera assembly 30. For example, the reference frame image is an RGB image.
FIG. 3 illustrates a block diagram of the electronic device 10 according to the present embodiment. As shown in FIG. 3, in addition to the display 20 and the camera assembly 30, the electronic device 10 may include a main processor 40, an image signal processor 42, a memory 44, a power supply circuit 46 and a communication circuit 48. The display 20, the camera assembly 30, the main processor 40, the image signal processor 42, the memory 44, the power supply circuit 46 and the communication circuit 48 are connected with each other via a bus 50.
The processor 100 includes the main processor 40 and the image signal processor 42.
As will be described later, the processor 100 acquires event data and the reference frame image, by controlling the camera assembly 30 and taking an image of the object. Furthermore, the processor 100 acquires a deblur image based on the event data and the reference frame image.
The main processor 40 executes one or more programs stored in the memory 44. The main processor 40 implements various applications and data processing (including image data processing) of the electronic device 10 by executing the programs. The main processor 40 may be one or more computer processors. The main processor 40 is not limited to one CPU core, but it may have a plurality of CPU cores. The main processor 40 may be a main CPU of the electronic device 10, an image process unit (IPU) , or a DSP provided with the camera assembly 30.
The image signal processor 42 controls the camera assembly 30 and processes various kinds of image data captured by the camera assembly 30 to generate target image data. For example, the image signal processor 42 may apply a de-mosaic process, a noise reduction process, an auto exposure process, an auto focus process, an auto white balance process, a high dynamic range process and so on, to the image data captured by the camera assembly 30.
In the present embodiment, the main processor 40 and the image signal processor 42 collaborate with each other to generate target image data of the object captured by the camera assembly 30. That is, the main processor 40 and the image signal processor 42 are configured to capture the image of the object with the camera assembly 30 and execute various kinds of image processes on the captured image data.
The memory 44 stores programs to be executed by the main processor 40 and the image signal processor 42, as well as various kinds of data. For example, data of the captured images are stored in the memory 44.
As will be described later, the memory 44 is configured to store a table that defines relationships between the features (for example, the difference between natural and artificial objects, especially the edge of the object, etc. ) of a plurality of images and a plurality of threshold values.
It is noted that the images, such as the captured images and the generated images, exist in the form of image data in the electronic device 10 (for example, the data of the images are stored in the memory 44). Furthermore, the processor 100 processes the images as image data.
In particular, for example, the memory 44 includes program code. The memory 44 and the program code are configured to, with the processor 100, cause the electronic device 10 to perform: estimating a point spread function (PSF) based on event data obtained by imaging an object with a first camera; acquiring a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera; acquiring a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data; transforming the first image, the second image, and the point spread function from an image space to a frequency space; specifying a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function; acquiring a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and acquiring a deblur image by inverse transforming the replacement image from the frequency space to the image space.
The memory 44 may include a high-speed RAM memory, and/or a non-volatile memory such as a flash memory and a magnetic disk memory. That is, the memory 44 may include a non-transitory computer readable medium, in which the program is stored.
The power supply circuit 46 may have a battery such as a lithium-ion rechargeable battery and a battery management unit (BMU) for managing the battery.
The communication circuit 48 is configured to receive and transmit data to communicate with base stations of the telecommunication network system, the Internet, or other devices via wireless communication. The wireless communication may adopt any communication standard or protocol, including but not limited to GSM (Global System for Mobile communication), CDMA (Code Division Multiple Access), LTE (Long Term Evolution), LTE-Advanced, and 5th generation (5G). The communication circuit 48 may include an antenna and an RF (radio frequency) circuit.
In the present embodiment, the image generation process is executed by, for example, the main processor 40 (the processor 100) in order to generate the image data.
In addition, in the present embodiment, program instructions of the image generation process are stored in the non-transitory computer readable medium of the memory 44. When the program instructions are read out from the memory 44 and executed in the main processor 40, the main processor 40 implements the image generation process.
Here, FIG. 4 illustrates a block diagram of the processor 100 according to the embodiment of the present disclosure.
As shown in FIG. 4, the processor 100 may be substituted with an apparatus.
The apparatus comprises: a first image processing unit 100a; a second image processing unit 100b; a point spread function estimation unit 100c; a transform unit 100d; a frequency specifying unit 100e; a replacement unit 100f; an inverse transform unit 100g; and a threshold value adjustment unit 100h.
The point spread function estimation unit 100c is configured to estimate a point spread function (PSF) based on event data obtained by imaging an object with a first camera.
The first image processing unit 100a is configured to acquire a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera.
The second image processing unit 100b is configured to acquire a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data.
The transform unit 100d is configured to transform the first image, the second image, and the point spread function from an image space to a frequency space.
The frequency specifying unit 100e is configured to specify a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function.
The replacement unit 100f is configured to acquire a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency.
The inverse transform unit 100g is configured to acquire a deblur image by inverse transforming the replacement image from the frequency space to the image space.
The threshold value adjustment unit 100h is configured to adjust the threshold value before specifying the frequency.
In this embodiment, the electronic device 10 may be represented as an apparatus including the processor 100.
Next, the method of generating image data by the electronic device 10 according to the present embodiment having the above configuration will be described. In the following, the processing by the processor 100 of the electronic device 10 will be mainly described.
Here, FIG. 5 is a diagram showing an example of a processing flow applied to the embodiment of the present disclosure. FIG. 6 is a diagram showing an example of images obtained by the embodiment of the present disclosure.
As shown in Fig. 5, as a premise, the processor 100 acquires the event data and the reference frame image by controlling the camera assembly 30 and taking an image of the object.
The event data is obtained by imaging the object with the first main camera (more specifically, the event camera) 32 of the camera assembly 30. Furthermore, the reference frame image is obtained by capturing the object with the second main camera (more specifically, the image sensor camera) 34 of the camera assembly 30.
Next, the point spread function estimation unit 100c of the processor 100 estimates a point spread function (PSF) based on event data obtained by imaging an object with the first main camera 32 (the step 1 in Fig. 5) .
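The disclosure does not specify how the point spread function is computed from the event stream. Purely as an illustrative sketch, one simple possibility is to accumulate the event coordinates, which trace the motion during the exposure, into a 2-D histogram and normalize it into a blur kernel; the event format, kernel size, and function name below are all assumptions, not the estimator actually claimed.

```python
import numpy as np

def estimate_psf(events, ksize=15):
    """Illustrative step-1 sketch: bin event coordinates, centered on
    their centroid, into a 2-D histogram and normalize it to sum to
    one, yielding a crude blur-kernel (PSF) estimate."""
    xy = np.array([(e["x"], e["y"]) for e in events], dtype=float)
    xy -= xy.mean(axis=0)                  # center the event trajectory
    half = ksize // 2
    psf = np.zeros((ksize, ksize))
    for dx, dy in xy:
        ix, iy = int(round(dx)) + half, int(round(dy)) + half
        if 0 <= ix < ksize and 0 <= iy < ksize:
            psf[iy, ix] += 1.0
    total = psf.sum()
    return psf / total if total > 0 else psf
```

A purely horizontal motion, for example, would yield a kernel whose mass lies on a single row.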
Next, the first image processing unit 100a acquires a first image (the deconvolution image of Fig. 6) by executing a deconvolution method using the point spread function based on the reference frame image obtained by capturing the object with the second main camera 34 (the step 2 in Fig. 5) .
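Step 2 leaves the concrete deconvolution algorithm open. A common frequency-domain choice is Wiener deconvolution, sketched below under that assumption; the regularization constant `k` is an assumed parameter. The near-zero PSF frequencies this filter has to guard against are exactly the frequencies at which ringing originates, which motivates the replacement performed later in the flow.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=1e-2):
    """Step-2 sketch: invert the blur in the frequency domain with a
    Wiener filter H* / (|H|^2 + k).  The constant k keeps the division
    stable where the PSF spectrum H is close to zero."""
    H = np.fft.fft2(psf, s=blurred.shape)  # zero-pad PSF to image size
    G = np.fft.fft2(blurred)
    F = np.conj(H) * G / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(F))
```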
On the other hand, the second image processing unit 100b acquires a second image having less ringing characteristics than the first image by executing an image processing method  different from the deconvolution method based on the reference frame image and the event data (the step 3 in Fig. 5) .
For example, the image processing method different from the deconvolution method is an image reconstruction method. In this case, the second image is the reconstructed image, for example, as shown in Fig. 6.
This image reconstruction method has the advantage that, for example, ringing does not occur in the deblur image, unlike with the deconvolution method. In addition, this image reconstruction method has the advantage that large objects in the deblur image are restored relatively cleanly, compared to the deconvolution method.
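The disclosure does not commit to a particular reconstruction algorithm. As one hedged illustration, a well-known family of event-based methods integrates event polarities on a logarithmic intensity scale starting from the reference frame; because no inverse filtering is involved, no ringing is introduced. The contrast constant `c` and the per-frame polarity maps below are assumptions.

```python
import numpy as np

def reconstruct_from_events(ref_frame, event_frames, c=0.2):
    """Step-3 sketch: each event frame is a map of summed event
    polarities (+1/-1 counts per pixel); every event changes the log
    intensity by +/- c.  Integrating from the reference frame gives a
    reconstruction with no deconvolution and hence no ringing."""
    log_img = np.log(ref_frame + 1e-6)     # avoid log(0)
    for polarity_map in event_frames:
        log_img = log_img + c * polarity_map
    return np.exp(log_img)
```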
Next, the transform unit 100d of the processor 100 transforms the first image from the image space (the image space coordinates) to the frequency space (the frequency space coordinates), using, for example, a Fourier transform (the step 4a in Fig. 5).
As a result, the first image transformed into the frequency space is generated.
In the same way, the transform unit 100d of the processor 100 transforms the second image from the image space to the frequency space, using, for example, the Fourier transform (the step 4b in Fig. 5).
As a result, the second image transformed into the frequency space is generated.
In the same way, the transform unit 100d of the processor 100 transforms the point spread function from the image space to the frequency space, using, for example, the Fourier transform (the step 4c in Fig. 5).
As a result, the point spread function transformed into the frequency space is generated.
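Steps 4a to 4c can all use the same 2-D Fourier transform. A minimal numpy sketch follows; zero-padding the point spread function to the image size (an implementation assumption) puts all three spectra on the same frequency grid so they can be compared element-wise later.

```python
import numpy as np

def to_frequency_space(first_img, second_img, psf):
    """Steps 4a-4c sketch: Fourier-transform both images and the PSF.
    The PSF is zero-padded to the image size so all three spectra
    share one frequency grid."""
    F1 = np.fft.fft2(first_img)
    F2 = np.fft.fft2(second_img)
    H = np.fft.fft2(psf, s=first_img.shape)
    return F1, F2, H
```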
Next, the frequency specifying unit 100e of the processor 100 specifies a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function (the step 5 in Fig. 5) .
In more detail, the frequency specifying unit 100e of the processor 100 specifies a frequency at which the frequency intensity of the transformed point spread function is equal to or lower than a threshold value, by comparing that frequency intensity with the threshold value.
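Under this comparison, the "part to be replaced" amounts to a boolean mask over the frequency grid. A minimal sketch, assuming the transformed PSF `H` from step 4c:

```python
import numpy as np

def frequencies_to_replace(H, threshold):
    """Step-5 sketch: mark every frequency at which the magnitude of
    the transformed PSF is at or below the threshold.  At these
    frequencies deconvolution divides by a near-zero value, which is
    where ringing originates."""
    return np.abs(H) <= threshold
```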
In particular, the threshold value adjustment unit 100h of the processor 100 may adjust the threshold value before specifying the frequency.
For example, the threshold value adjustment unit 100h adjusts the threshold value based on the feature of the image captured by the second main camera (more specifically, the image sensor camera) 34 in Fig. 1.
As described above, the memory 44 in Fig. 3 is configured to store a table that defines relationships between the features of a plurality of images and a plurality of threshold values.
In this case, the threshold value adjustment unit 100h adjusts the threshold value to a threshold value corresponding to the feature of the image captured by the second main camera 34 by searching the table stored in the memory 44.
The relationship between the features (for example, the difference between natural and artificial objects, especially the edge of the object, etc. ) of the plurality of images and the plurality of threshold values in the table is pre-constructed by, for example, machine learning.
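The table search described above can be sketched as a simple feature-keyed lookup. The feature labels and threshold values below are purely hypothetical; in the disclosure the table is pre-constructed by machine learning and stored in the memory 44.

```python
# Hypothetical feature-to-threshold table; the labels and values here
# are invented for illustration only.
THRESHOLD_TABLE = {
    "natural_object": 0.08,      # softer edges: replace fewer frequencies
    "artificial_object": 0.15,   # hard edges ring more: replace more
}
DEFAULT_THRESHOLD = 0.10

def adjust_threshold(image_feature):
    """Sketch of the threshold adjustment: look up the threshold
    matching the feature of the image captured by the second camera."""
    return THRESHOLD_TABLE.get(image_feature, DEFAULT_THRESHOLD)
```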
Next, the replacement unit 100f of the processor 100 acquires a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency (the step 6 in Fig. 5) .
Next, the inverse transform unit 100g of the processor 100 acquires a deblur image, for example, as shown in Fig. 6, by inverse transforming the replacement image from the frequency space to the image space, for example, using an inverse Fourier transform (the step 7 in Fig. 5) .
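Steps 6 and 7 can be sketched together in numpy, assuming the boolean frequency mask from step 5 and the spectra from steps 4a and 4b (function and variable names are illustrative):

```python
import numpy as np

def replace_and_invert(F1, F2, mask):
    """Steps 6-7 sketch: substitute the second image's spectrum at the
    masked frequencies, then inverse-transform the mixed spectrum back
    to the image space; the tiny imaginary residue is discarded."""
    mixed = np.where(mask, F2, F1)         # step 6: replacement image
    return np.real(np.fft.ifft2(mixed))    # step 7: deblur image
```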
Next, the processor 100 causes the display 20 (as shown in Fig. 3) to display the deblur image acquired by the inverse transform unit 100g.
By the above processing flow, as shown in Fig. 6, the processor 100 acquires a deblur image based on the event data and the reference frame image.
As described above, the electronic device according to the present disclosure comprises: a camera assembly including a first camera and a second camera; at least one processor; and at least one memory including program code. The at least one memory and the program code are configured to, with the at least one processor, cause the electronic device to perform: estimating a point spread function (PSF) based on event data obtained by imaging an object with the first camera; acquiring a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with the second camera; acquiring a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data; transforming the first image, the second image, and the point spread function from an image space to a frequency space; specifying a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function; acquiring a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and acquiring a deblur image by inverse transforming the replacement image from the frequency space to the image space.
In this embodiment, an image from a restoration (deblur) method (for example, the image reconstruction method) that has characteristics different from those of the deblur image from the deconvolution method is prepared and mixed in to remove ringing and improve the deblur results. In this embodiment, ringing removal is performed using an image whose blur has been removed by the image reconstruction method using event data.
That is, the electronic device according to the present disclosure may suppress the ringing of an image in which motion deblurring is executed by deconvolution using the point spread function.
In the description of embodiments of the present disclosure, it is to be understood that terms such as "central" , "longitudinal" , "transverse" , "length" , "width" , "thickness" , "upper" , "lower" , "front" , "rear" , "back" , "left" , "right" , "vertical" , "horizontal" , "top" , "bottom" , "inner" , "outer" , "clockwise" and "counterclockwise" should be construed to refer to the orientation or the position as described or as shown in the drawings in discussion. These relative terms are only used to simplify the description of the present disclosure, and do not indicate or imply that the device or element referred to must have a particular orientation, or must be constructed or operated in a particular orientation. Thus, these terms cannot be constructed to limit the present disclosure.
In addition, terms such as "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features. Thus, a feature defined as "first" and "second" may comprise one or more of this feature. In the description of the present disclosure, "a plurality of" means “two or more than two” , unless otherwise specified.
In the description of embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted" , "connected" , "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electronic connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
In the embodiments of the present disclosure, unless specified or limited otherwise, a structure in which a first feature is "on" or "below" a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each  other, but are in contact via an additional feature formed therebetween. Furthermore, a first feature "on" , "above" or "on top of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "on" , "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature "below" , "under" or "on bottom of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "below" , "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.
Various embodiments and examples are provided in the above description to implement different structures of the present disclosure. In order to simplify the present disclosure, certain elements and settings are described in the above. However, these elements and settings are only by way of example and are not intended to limit the present disclosure. In addition, reference numbers and/or reference letters may be repeated in different examples in the present disclosure. This repetition is for the purpose of simplification and clarity and does not refer to relations between different embodiments and/or settings. Furthermore, examples of different processes and materials are provided in the present disclosure. However, it would be appreciated by those skilled in the art that other processes and/or materials may also be applied.
Reference throughout this specification to "an embodiment" , "some embodiments" , "an exemplary embodiment" , "an example" , "a specific example" or "some examples" means that a particular feature, structure, material, or characteristics described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above phrases throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.
Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
The logic and/or step described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function,  may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment. As to the specification, "the computer readable medium" may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment. More specific examples of the computer readable medium comprise but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device) , a random access memory (RAM) , a read only memory (ROM) , an erasable programmable read-only memory (EPROM or a flash memory) , an optical fiber device and a portable compact disk read-only memory (CDROM) . In addition, the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electronic manner, and then the programs may be stored in the computer memories.
It should be understood that each part of the present disclosure may be realized by the hardware, software, firmware or their combination. In the above embodiments, a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system. For example, if it is realized by the hardware, likewise in another embodiment, the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
Those skilled in the art shall understand that all or parts of the steps in the above exemplifying method of the present disclosure may be achieved by commanding the related hardware with programs. The programs may be stored in a computer readable storage medium, and the programs comprise one or a combination of the steps in the method embodiments of the present disclosure when run on a computer.
In addition, each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may be separate physical existence, or two or more cells are integrated in a processing module. The integrated module may be realized in a  form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
The storage medium mentioned above may be read-only memories, magnetic disks, CD, etc.
Although embodiments of the present disclosure have been shown and described, it would be appreciated by those skilled in the art that the embodiments are explanatory and cannot be construed to limit the present disclosure, and changes, modifications, alternatives and variations can be made in the embodiments without departing from the scope of the present disclosure.

Claims (20)

  1. A method of generating an image, comprising:
    estimating a point spread function (PSF) based on event data obtained by imaging an object with a first camera;
    acquiring a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera;
    acquiring a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data;
    transforming the first image, the second image, and the point spread function from an image space to a frequency space;
    specifying a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function;
    acquiring a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and
    acquiring a deblur image by inverse transforming the replacement image from the frequency space to the image space.
  2. The method according to claim 1, wherein the frequency is specified by specifying a frequency at which the frequency intensity of the transformed point spread function is equal to or lower than a threshold value, by comparing the frequency intensity with the threshold value.
  3. The method according to claim 2, further comprising adjusting the threshold value before specifying the frequency.
  4. The method according to claim 3, wherein the threshold value is adjusted based on a feature of an image captured by the second camera.
  5. The method according to claim 4, wherein the threshold value is adjusted to a threshold value corresponding to the feature of the image captured by the second camera by searching a table that defines relationships between the features of a plurality of images and a plurality of threshold values.
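The threshold adjustment of claims 3 to 5 can be illustrated with a small lookup table. The choice of image feature (mean gradient magnitude), the bucket boundaries, and the threshold values below are all hypothetical; the patent only requires that a table map image features to thresholds.

```python
import numpy as np

# Hypothetical table: (lower bound of feature bucket, threshold value).
# Flat images get a small threshold (replace only very weak frequencies);
# highly textured images get a larger one.
THRESHOLD_TABLE = [
    (0.00, 0.05),
    (0.05, 0.10),
    (0.15, 0.20),
]

def adjust_threshold(image):
    """Pick a threshold from the table using the image's mean gradient magnitude."""
    gy, gx = np.gradient(image.astype(float))
    feature = float(np.mean(np.hypot(gx, gy)))
    # Take the threshold of the last bucket whose lower bound the feature reaches.
    chosen = THRESHOLD_TABLE[0][1]
    for lower, threshold in THRESHOLD_TABLE:
        if feature >= lower:
            chosen = threshold
    return chosen
```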
  6. The method according to any one of claims 1 to 5, wherein the first camera is an event camera, and the second camera is an image sensor camera.
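Claim 6's event camera supplies the event data from which claim 1's PSF is estimated. One simple way to form a blur kernel from events is to accumulate event displacements into a histogram and normalize it; the accumulation scheme below is an assumed illustration, not the patent's estimator.

```python
import numpy as np

def estimate_psf_from_events(events, size=15):
    """Accumulate event (x, y) displacements into a normalized blur kernel.

    `events` is an iterable of (x, y) offsets along the blur trajectory,
    relative to its starting point (a hypothetical input format).
    """
    psf = np.zeros((size, size), dtype=float)
    cx = cy = size // 2  # center the trajectory in the kernel
    for x, y in events:
        ix, iy = cx + int(round(x)), cy + int(round(y))
        if 0 <= ix < size and 0 <= iy < size:
            psf[iy, ix] += 1.0
    total = psf.sum()
    # A PSF must integrate to 1 so that deconvolution preserves brightness.
    return psf / total if total > 0 else psf
```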
  7. The method according to any one of claims 1 to 6, wherein the image processing method different from the deconvolution method is an image reconstruction method.
  8. The method according to any one of claims 1 to 7, wherein the reference frame image is an RGB image.
  9. The method according to any one of claims 1 to 8, wherein the deblur image is displayed on a display.
  10. An electronic device comprising:
    a camera assembly including a first camera and a second camera;
    at least one processor; and
    at least one memory including program code;
    wherein the at least one memory and the program code are configured to, with the at least one processor, cause the electronic device to perform:
    estimating a point spread function (PSF) based on event data obtained by imaging an object with the first camera;
    acquiring a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with the second camera;
    acquiring a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data;
    transforming the first image, the second image, and the point spread function from an image space to a frequency space;
    specifying a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function;
    acquiring a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and
    acquiring a deblur image by inverse transforming the replacement image from the frequency space to the image space.
  11. The electronic device according to claim 10, wherein the frequency is specified by specifying a frequency at which the frequency intensity of the transformed point spread function is equal to or lower than a threshold value, by comparing the frequency intensity with the threshold value.
  12. The electronic device according to claim 11, wherein the threshold value is adjusted before specifying the frequency.
  13. The electronic device according to claim 11 or 12, wherein the threshold value is adjusted based on a feature of an image captured by the second camera.
  14. The electronic device according to claim 13,
    wherein the memory is configured to store a table that defines relationships between features of a plurality of images and a plurality of threshold values, and
    wherein the threshold value is adjusted to a threshold value corresponding to the feature of the image captured by the second camera by searching the table stored in the memory.
  15. The electronic device according to any one of claims 10 to 14, wherein the first camera is an event camera, and the second camera is an image sensor camera.
  16. The electronic device according to any one of claims 10 to 15, wherein the image processing method different from the deconvolution method is an image reconstruction method.
  17. The electronic device according to any one of claims 10 to 16, wherein the reference frame image is an RGB image.
  18. The electronic device according to any one of claims 10 to 17, further comprising a display configured to display the deblur image.
  19. An apparatus comprising:
    a point spread function estimation unit configured to estimate a point spread function (PSF) based on event data obtained by imaging an object with a first camera;
    a first image processing unit configured to acquire a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera;
    a second image processing unit configured to acquire a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data;
    a transform unit configured to transform the first image, the second image, and the point spread function from an image space to a frequency space;
    a frequency specifying unit configured to specify a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function;
    a replacement unit configured to acquire a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and
    an inverse transform unit configured to acquire a deblur image by inverse transforming the replacement image from the frequency space to the image space.
  20. A computer readable medium comprising program instructions stored thereon for performing at least the following:
    estimating a point spread function (PSF) based on event data obtained by imaging an object with a first camera;
    acquiring a first image by executing a deconvolution method using the point spread function based on a reference frame image obtained by capturing the object with a second camera;
    acquiring a second image having less ringing characteristics than the first image by executing an image processing method different from the deconvolution method based on the reference frame image and the event data;
    transforming the first image, the second image, and the point spread function from an image space to a frequency space;
    specifying a frequency corresponding to a part to be replaced in the transformed first image, based on a frequency intensity of the transformed point spread function;
    acquiring a replacement image by replacing the part of the transformed first image with a part of the transformed second image at the specified frequency; and
    acquiring a deblur image by inverse transforming the replacement image from the frequency space to the image space.
PCT/CN2022/108737 2022-07-28 2022-07-28 Method of generating an image, electronic device, apparatus, and computer readable storage medium WO2024020958A1 (en)

Publications (1)

Publication Number Publication Date
WO2024020958A1 true WO2024020958A1 (en) 2024-02-01

Family

ID=89704965


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3117597A1 (en) * 2014-03-12 2017-01-18 Sony Corporation Method, system and computer program product for debluring images
CN107369134A (en) * 2017-06-12 2017-11-21 上海斐讯数据通信技术有限公司 A kind of image recovery method of blurred picture
CN107507135A (en) * 2017-07-11 2017-12-22 天津大学 Image reconstructing method based on coding aperture and target
US20190228506A1 (en) * 2018-01-19 2019-07-25 Bae Systems Information And Electronic Systems Integration Inc. Methods for image denoising and deblurring



Legal Events

121: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 22952416; Country of ref document: EP; Kind code of ref document: A1)