WO2022241728A1 - Image processing method, electronic device and non-transitory computer-readable medium - Google Patents


Info

Publication number
WO2022241728A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera image
luminance
emphasis processing
region
bokeh
Prior art date
Application number
PCT/CN2021/094963
Other languages
English (en)
Inventor
Renma SUGAWARA
Jun Luo
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to CN202180098439.0A priority Critical patent/CN117396917A/zh
Priority to PCT/CN2021/094963 priority patent/WO2022241728A1/fr
Publication of WO2022241728A1 publication Critical patent/WO2022241728A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20004 Adaptive image processing
    • G06T 2207/20012 Locally adaptive

Definitions

  • the present disclosure relates to an image processing method, an electronic device, and a non-transitory computer-readable medium.
  • a technique for generating bokeh on objects located in the foreground and the background of a subject is applied to a camera image obtained by imaging the subject with a camera having a deep depth of field, such as a smartphone camera.
  • bokeh is generated by using image processing.
  • Such a technique of generating bokeh makes use of depth information, including a distance between the camera and the subject, to generate bokeh.
  • the conventional technology does not propose an effective method of generating a large bokeh with a clear outline on an illumination region where the illumination has been imaged.
  • the present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide an image processing method, an electronic device, and a non-transitory computer-readable medium.
  • an image processing method includes:
  • the emphasis processing emphasizing a luminance value of a high-luminance region in the camera image over a luminance value of a low-luminance region in the camera image;
  • the high-luminance region may be a region having a luminance value equal to or greater than a threshold value
  • the low-luminance region may be a region having a luminance value less than the threshold value
  • the emphasis processing may be a processing for increasing an area of a bokeh to be generated on the high-luminance region.
  • the emphasis processing may be a processing for making an outline of a bokeh clear, the bokeh being generated on the high-luminance region.
  • the emphasis processing may be a processing in which a luminance value of the high-luminance region is increased and a luminance value of the low-luminance region is maintained.
  • the emphasis processing may be performed by multiplying a luminance value before emphasis processing by a weight value proportional to a luminance value after emphasis processing.
  • the emphasis processing may be performed using a look-up table in which a correspondence between a luminance value and the weight value is indicated.
  • the weight value may be calculated according to the following conditional expression:
  • x is the luminance value
  • f (x) is the weight value
  • TH is a threshold value of the luminance value
  • W is an adjustment parameter of the x direction
  • S is an adjustment parameter of the f (x) direction.
  • the adjustment parameter of the x direction W and the adjustment parameter of the f (x) direction S may be fixed values.
  • the adjustment parameter of the x direction W and the adjustment parameter of the f (x) direction S may be changed according to a user operation.
  • the camera image may be a moving image.
  • the method may further include performing a smoothing process on a camera image of a current frame on which the emphasis processing was performed, the smoothing process being performed by using a camera image of a past frame on which the emphasis processing was performed, wherein
  • the generating the bokeh may be performed on a camera image on which the smoothing process was performed.
  • the high-luminance region may be an illumination region where illumination was imaged.
  • the generating the bokeh may include changing a shape of a bokeh to be generated on the illumination region with respect to a shape of the illumination region.
  • an electronic device includes:
  • a processor configured to perform emphasis processing on a camera image acquired by imaging a subject,
  • the emphasis processing emphasizing a luminance value of a high-luminance region in the camera image over a luminance value of a low-luminance region in the camera image;
  • a non-transitory computer-readable medium is configured to store a computer program, wherein the computer program, when executed by a computer, implements the method.
  • FIG. 1 is a diagram showing a configuration example of an electronic device capable of implementing an image processing method according to a first embodiment of the present disclosure.
  • FIG. 2 is a flowchart showing the image processing method according to the first embodiment of the present disclosure.
  • FIG. 3 is a diagram showing the image processing method according to the first embodiment of the present disclosure.
  • FIG. 4 is a conceptual diagram showing the image processing method according to the first embodiment of the present disclosure.
  • FIG. 5 is a flowchart showing the image processing method according to a second embodiment of the present disclosure.
  • FIG. 6 is a diagram showing the image processing method according to a third embodiment of the present disclosure.
  • FIG. 1 is a diagram showing a configuration example of an electronic device capable of implementing the image processing method according to the first embodiment of the present disclosure.
  • the electronic device 100 includes a stereo camera module 10, a range sensor module 20, and an image signal processor 30 as an example of a processor.
  • the image signal processor 30 controls the stereo camera module 10 and the range sensor module 20, and processes camera image data acquired from the stereo camera module 10.
  • the stereo camera module 10 includes a master camera module 11 as an example of a camera and a slave camera module 12 to be used for binocular stereo viewing.
  • the master camera module 11 includes a first lens 11a which is capable of focusing on a subject, a first image sensor 11b which detects an image input via the first lens 11a, and a first image sensor driver 11c which drives the first image sensor 11b.
  • the master camera module 11 focuses on the subject among objects within a viewing angle of the master camera module 11 and images the objects including the subject to acquire a master camera image as an example of a camera image.
  • the slave camera module 12 includes a second lens 12a which is capable of focusing on a subject, a second image sensor 12b which detects an image input via the second lens 12a, and a second image sensor driver 12c which drives the second image sensor 12b.
  • the slave camera module 12 focuses on the subject among the objects and images the objects including the subject to acquire a slave camera image.
  • the range sensor module 20 includes a lens 20a, a range sensor 20b, a range sensor driver 20c and a projector 20d, as shown in FIG. 1.
  • the projector 20d emits pulsed light toward the objects including the subject, and the range sensor 20b detects reflection light from the objects through the lens 20a.
  • the range sensor module 20 acquires time-of-flight (ToF) depth information (a ToF depth value) based on the time elapsed from when the pulsed light is emitted until the reflected light is received.
  • the resolution of the ToF depth information detected by the range sensor module 20 is lower than the resolution of stereo depth information of a stereo image that is acquired based on the master camera image and the slave camera image.
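The ToF principle described above can be sketched numerically; a minimal illustration, where the depth follows from halving the round-trip time of the emitted pulse (the constant and input values are illustrative, not taken from this document):

```python
# Sketch of the ToF measurement: depth from the round-trip time of the
# emitted pulse detected by the range sensor. Values are illustrative.

C_MM_PER_NS = 299.792458  # speed of light in millimetres per nanosecond

def tof_depth_mm(elapsed_ns: float) -> float:
    """Light travels to the object and back, so halve the round trip."""
    return 0.5 * C_MM_PER_NS * elapsed_ns

print(tof_depth_mm(10.0))  # ~1499 mm for a 10 ns round trip
```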
  • the image signal processor 30 controls the master camera module 11, the slave camera module 12, and the range sensor module 20.
  • the image signal processor 30 generates a bokeh on the master camera image based on the master camera image, the slave camera image, and the ToF depth information.
  • the image signal processor 30 may, for example, correct the stereo depth information based on the ToF depth information, the stereo depth information being obtained by stereo processing of the master camera image and of the slave camera image.
  • the stereo depth information may indicate a value corresponding to a deviation in a horizontal direction (x direction) between corresponding pixels of the master camera image and the slave camera image.
  • the image signal processor 30 may generate the bokeh based on the corrected stereo depth information.
  • the bokeh may be generated using a Gaussian filter having a standard deviation σ corresponding to the corrected stereo depth information.
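The mapping from corrected depth to blur strength can be sketched as follows. The linear depth-to-σ model, gain, and clamp value are assumptions for illustration; the document only states that the Gaussian filter's standard deviation corresponds to the corrected stereo depth information:

```python
# Sketch: choosing a per-pixel Gaussian blur strength from corrected depth.
# The linear mapping and the numeric values are illustrative assumptions.

def sigma_from_depth(depth_mm: float, focus_mm: float,
                     gain: float = 0.01, sigma_max: float = 8.0) -> float:
    """Blur grows with distance from the focal plane, clamped to sigma_max."""
    return min(gain * abs(depth_mm - focus_mm), sigma_max)

# An object on the focal plane stays sharp; a far background highlight
# receives the largest (clamped) blur, producing the biggest bokeh.
print(sigma_from_depth(1500.0, 1500.0))  # 0.0 (in focus)
print(sigma_from_depth(5000.0, 1500.0))  # 8.0 (clamped background blur)
```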
  • the electronic device 100 includes a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
  • the GNSS module 40 measures a current position of the electronic device 100.
  • the wireless communication module 41 performs wireless communication over the Internet.
  • the CODEC 42 bidirectionally performs encoding and decoding, using a predetermined encoding/decoding method.
  • the speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42.
  • the microphone 44 outputs sound data to the CODEC 42 based on input sound.
  • the display module 45 displays predetermined information.
  • the input module 46 receives a user’s input.
  • the IMU 47 detects an angular velocity and an acceleration of the electronic device 100.
  • the main processor 48 controls the global navigation satellite system (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
  • the memory 49 stores a program and data required for the image signal processor 30 to control the stereo camera module 10 and the range sensor module 20, acquired image data, and a program and data required for the main processor 48 to control the electronic device 100.
  • the memory 49 includes a non-transitory computer-readable media having a computer program stored thereon.
  • the computer program is executed by the image signal processor 30 or the main processor 48 to implement an image processing method of the present disclosure.
  • the image processing method includes performing emphasis processing on a camera image acquired by imaging a subject, the emphasis processing emphasizing a luminance value of a high-luminance region in the camera image over a luminance value of a low- luminance region in the camera image.
  • the method further includes generating a bokeh on the camera image on which the emphasis processing was performed.
  • the electronic device 100 having the above-described configuration is a mobile phone such as a smartphone in this embodiment but may be other types of electronic devices including camera modules 11 and 12.
  • after recording of a moving image of the subject is started, the image signal processor 30 first inputs the master camera image and the slave camera image from the stereo camera module 10 (step S1).
  • the image signal processor 30 acquires depth information, which indicates distances between the master camera module 11 and the objects within the viewing angle of the master camera module 11, based on the input master camera image and the input slave camera image (step S2).
  • the depth information may be obtained by correcting the stereo depth information based on the ToF depth information.
  • the subject, and the objects other than the subject may be classified by matting (segmentation) using AI technology, and the depth information may be acquired for each classified object.
  • the specific mode of the depth information is not particularly limited as long as it indicates distances between the master camera module 11 and the objects located within the viewing angle of the master camera module 11.
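One way the sparse ToF depth could correct the dense stereo depth is a least-squares scale-and-offset fit over matched sample points; this particular model is an assumption for illustration, since the document only states that the stereo depth information is corrected based on the ToF depth information:

```python
# Sketch: calibrating stereo depth against sparse, low-resolution ToF depth
# with a least-squares scale-and-offset fit. The model is an assumption.

def fit_scale_offset(stereo, tof):
    """Fit tof ~ a*stereo + b over matched sample points."""
    n = len(stereo)
    mx = sum(stereo) / n
    my = sum(tof) / n
    sxx = sum((x - mx) ** 2 for x in stereo)
    sxy = sum((x - mx) * (y - my) for x, y in zip(stereo, tof))
    a = sxy / sxx
    b = my - a * mx
    return a, b

def apply_correction(stereo_map, a, b):
    return [a * d + b for d in stereo_map]

# Stereo depth with a systematic 10 % scale error and a +50 mm bias is
# pulled back onto the (trusted) ToF measurements.
tof_samples = [1000.0, 2000.0, 3000.0]
stereo_samples = [950.0, 1850.0, 2750.0]
a, b = fit_scale_offset(stereo_samples, tof_samples)
corrected = apply_correction(stereo_samples, a, b)
```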
  • the image signal processor 30 acquires a luminance value of each pixel of the master camera image from the master camera image (step S3).
  • the image signal processor 30 performs emphasis processing, which emphasizes a luminance value of a high-luminance region in the master camera image over a luminance value of a low-luminance region in the master camera image, on the acquired master camera image based on the acquired luminance value of each pixel (step S4).
  • the high-luminance region is a region having a luminance value equal to or greater than a threshold value.
  • the high-luminance region is, for example, an illumination region where illumination was imaged.
  • the low-luminance region is a region having a luminance value less than the threshold value.
  • the emphasis processing is a processing for increasing an area of a bokeh to be generated on the high-luminance region as compared with the case where the emphasis processing is not performed.
  • the emphasis processing may be a processing for making an increase rate of an area of a bokeh to be generated on the high-luminance region with respect to an area of the high-luminance region larger than an increase rate of an area of a bokeh to be generated on the low-luminance region with respect to an area of the low-luminance region.
  • the emphasis processing is a processing for making an outline of the bokeh clear, the bokeh being generated on the high-luminance region.
  • the emphasis processing may be a processing for making the outline of the bokeh to be generated on the high-luminance region clearer than an outline of the bokeh to be generated on the low-luminance region.
  • the emphasis processing is a processing in which the luminance value of the high-luminance region is increased and the luminance value of the low-luminance region is maintained.
  • the emphasis processing is performed, for example, by multiplying the luminance value of each pixel of the master camera image (i.e., a luminance value before emphasis processing) by a weight value which is proportional to a luminance value after emphasis processing.
  • the emphasis processing is performed using a look-up table in which a correspondence between the luminance value and the weight value is indicated.
  • the weight value may be a value calculated according to the following formula (1):
  • x is the luminance value.
  • f (x) is the weight value.
  • TH is a threshold value of the luminance value.
  • W is an adjustment parameter of the x direction.
  • S is an adjustment parameter of the f (x) direction.
  • the look-up table is a set of f (x) corresponding to each x from 0 to 255, and can be expressed by the following formula (2), for example:
  • LUT is the look-up table.
  • f (0), f (1), ..., f (255) are the weight values corresponding to each luminance value x.
  • the adjustment parameter of the x direction, W, and the adjustment parameter of the f (x) direction, S, are, for example, predetermined fixed values.
  • the adjustment parameter of the x direction, W, and the adjustment parameter of the f (x) direction, S, may be changed by user operations using sliders SL1 and SL2 displayed by the display module 45.
  • the slider SL1 accepts a user operation for changing the adjustment parameter of the x direction, W.
  • the slider SL2 accepts a user operation for changing the adjustment parameter of the f (x) direction, S. It is desirable that the ranges of the parameters W and S that can be changed by the sliders SL1 and SL2 be limited in advance to ranges in which a good bokeh can be generated.
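The weight function and look-up table can be sketched as follows. Formula (1) itself is not reproduced in this document, so the logistic form below is an assumption chosen only to match the described behavior: weights near 1 below the threshold TH (low-luminance values maintained), rising by up to S above it, with W controlling the width of the transition.

```python
import math

# Sketch of the emphasis weight f(x) and the 256-entry look-up table.
# The logistic form is an assumption; TH, W and S are illustrative values.

TH, W, S = 200, 10.0, 4.0

def weight(x: int) -> float:
    """~1 below TH (low luminance maintained); rises toward 1 + S above TH
    (high luminance emphasized). W sets the transition width around TH."""
    return 1.0 + S / (1.0 + math.exp(-(x - TH) / W))

LUT = [weight(x) for x in range(256)]  # formula (2): f(0) ... f(255)

# Dark pixels are left nearly unchanged; bright pixels are boosted.
assert LUT[0] < 1.01
assert LUT[255] > 4.9
```

Multiplying each pixel's luminance by `LUT[luminance]` then reproduces the described behavior: the low-luminance region is maintained while the high-luminance (illumination) region is emphasized before the blur is applied.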
  • Reference numeral A in FIG. 3 indicates an illumination region.
  • the image signal processor 30 generates the bokeh on the master camera image on which the emphasis processing was performed (step S5).
  • the bokeh is generated, for example, by using a convolution operation between the master camera image, on which the emphasis processing was performed using the look-up table LUT, and a kernel (i.e., the Gaussian filter) calculated based on the depth information.
  • Reference numeral B in FIG. 4 indicates a bokeh generated on the illumination region A.
  • the bokeh B generated on the illumination region A has a larger area and a clearer outline than the illumination region A.
  • because the bokeh is generated on the camera image on which the emphasis processing was performed, it is possible to generate a large and clear bokeh B on the high-luminance region such as the illumination region A.
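Step S5 can be sketched as a convolution of the emphasis-processed image with a normalized Gaussian kernel. A 1-D grayscale row keeps the illustration short; a real implementation would convolve in 2-D with a per-pixel, depth-dependent kernel:

```python
import math

# Sketch of step S5: blurring the emphasis-processed image with a
# normalized Gaussian kernel. 1-D for brevity; 2-D in practice.

def gaussian_kernel(sigma, radius):
    k = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve(row, kernel):
    r = len(kernel) // 2
    out = []
    for i in range(len(row)):
        acc = 0.0
        for j, w in enumerate(kernel):
            p = min(max(i + j - r, 0), len(row) - 1)  # clamp at the borders
            acc += row[p] * w
        out.append(acc)
    return out

# An emphasized highlight (value 5.0) spreads into a wide, bright bokeh;
# without emphasis the same pixel would spread into a much dimmer one.
row = [0.0] * 10 + [5.0] + [0.0] * 10
blurred = convolve(row, gaussian_kernel(2.0, 5))
assert max(blurred) < 5.0   # peak energy is spread out...
assert blurred[7] > 0.0     # ...into neighbouring pixels
```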
  • FIG. 5 is a flowchart showing the image processing method according to a second embodiment of the present disclosure.
  • the image signal processor 30 performs a smoothing process on a master camera image of a current frame on which the emphasis processing was performed (step S6).
  • the smoothing process is performed using a master camera image of a past frame on which the emphasis processing was performed.
  • the bokeh is generated on the master camera image on which the smoothing process was performed (step S5).
  • the smoothing process is performed, for example, according to the following formula (3):
  • f (x) (t)_smooth is a weight value of the master camera image of the current frame on which the emphasis processing and smoothing process were performed.
  • f (x) (t) is a weight value of the master camera image of the current frame on which the emphasis processing was performed.
  • f (x) (t-1) is a weight value of the master camera image of a previous frame on which the emphasis processing was performed.
  • α is a value of 0 or more and 1 or less. α is, for example, a predetermined fixed value. α may be changeable by a user operation.
  • the generating the bokeh (step S5) is performed using a look-up table in which f (x) for each pixel shown in formula (2) is replaced with f (x) (t)_smooth for each pixel shown in formula (3).
  • master camera images of two or more previous frames may be used in addition to the master camera image of the immediately previous frame. It is possible to perform a more appropriate smoothing process by using the master camera images of two or more previous frames.
  • according to the second embodiment, it is possible to prevent the luminance value from becoming unstable over time.
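Formula (3) is not reproduced in this document; the description (a blend of the current-frame and previous-frame weights controlled by α between 0 and 1) is consistent with an exponential moving average, sketched here under that assumption:

```python
# Sketch of formula (3) as exponential smoothing of the per-pixel weight:
# f_smooth(t) = a * f(t) + (1 - a) * f(t-1), with 0 <= a <= 1.
# The exact form of formula (3) is an assumption.

def smooth_weight(f_current: float, f_previous: float, a: float = 0.3) -> float:
    assert 0.0 <= a <= 1.0
    return a * f_current + (1.0 - a) * f_previous

# A sudden jump in the emphasis weight is damped toward the previous frame,
# which keeps the brightness of the bokeh stable across frames.
result = smooth_weight(5.0, 1.0)  # ~2.2 instead of jumping straight to 5.0
```

A smaller α damps flicker more strongly at the cost of slower response, which matches the document's note that α may be exposed to the user.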
  • FIG. 6 is a diagram showing the image processing method according to a third embodiment of the present disclosure.
  • the bokeh B is generated on the illumination region A in a shape which is basically the same as a shape of the illumination region A.
  • the bokeh B may be generated in a shape which is different from the shape of the illumination region A.
  • the shape of the bokeh B generated on the illumination region A may be ring-shaped, star-shaped, or heart-shaped.
  • Such a shape of the bokeh can be realized by adjusting kernel parameters.
  • the bokeh can be provided in a variety of shapes.
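The shaped-kernel idea can be sketched with a ring-shaped (annular) kernel; the mask geometry below is an illustrative assumption, and star- or heart-shaped bokeh would use the same mechanism with a different mask:

```python
# Sketch of the third embodiment: a shaped (here ring-shaped) kernel, so a
# highlight blurs into a ring instead of a Gaussian disc. The annulus
# geometry is an illustrative assumption.

def ring_kernel(radius, thickness=1.5):
    size = 2 * radius + 1
    k = [[0.0] * size for _ in range(size)]
    total = 0.0
    for y in range(size):
        for x in range(size):
            d = ((x - radius) ** 2 + (y - radius) ** 2) ** 0.5
            if abs(d - radius * 0.7) <= thickness:  # keep only an annulus
                k[y][x] = 1.0
                total += 1.0
    return [[v / total for v in row] for row in k]  # normalize the kernel

k = ring_kernel(5)
assert k[5][5] == 0.0                      # hollow centre: highlights become rings
assert abs(sum(map(sum, k)) - 1.0) < 1e-9  # energy-preserving kernel
```

Convolving the emphasis-processed image with this kernel in place of the Gaussian one changes the shape of the bokeh B without changing the rest of the pipeline.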
  • the terms “first” and “second” are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features.
  • a feature defined as “first” or “second” may comprise one or more of this feature.
  • “a plurality of” means “two or more”, unless otherwise specified.
  • the terms “mounted”, “connected”, “coupled” and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements, as can be understood by those skilled in the art according to specific situations.
  • a structure in which a first feature is “on” or “below” a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween.
  • a first feature “on”, “above” or “on top of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “on”, “above” or “on top of” the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature “below”, “under” or “on bottom of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “below”, “under” or “on bottom of” the second feature, or just means that the first feature is at a height lower than that of the second feature.
  • Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
  • the logic and/or step described in other manners herein or shown in the flow chart may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment.
  • the computer readable medium may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment.
  • examples of the computer readable medium include, but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or a flash memory), an optical fiber device, and a portable compact disk read-only memory (CD-ROM).
  • the computer readable medium may even be paper or another appropriate medium capable of having the programs printed thereon, because the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electronic manner, after which the programs may be stored in the computer memories.
  • each part of the present disclosure may be realized by the hardware, software, firmware or their combination.
  • a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system.
  • the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
  • each function cell of the embodiments of the present disclosure may be integrated in a processing module, or the cells may exist as separate physical entities, or two or more cells may be integrated in a processing module.
  • the integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
  • the storage medium mentioned above may be a read-only memory, a magnetic disk, a CD, etc.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

An image processing method includes imaging a subject to acquire a camera image, performing emphasis processing on the acquired camera image, the emphasis processing emphasizing a luminance value of a high-luminance region in the camera image over a luminance value of a low-luminance region in the camera image, and generating a bokeh on the camera image on which the emphasis processing was performed.
PCT/CN2021/094963 2021-05-20 2021-05-20 Image processing method, electronic device and non-transitory computer-readable medium WO2022241728A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202180098439.0A CN117396917A (zh) 2021-05-20 2021-05-20 Image processing method, electronic device and non-transitory computer-readable medium
PCT/CN2021/094963 WO2022241728A1 (fr) 2021-05-20 2021-05-20 Image processing method, electronic device and non-transitory computer-readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/094963 WO2022241728A1 (fr) 2021-05-20 2021-05-20 Image processing method, electronic device and non-transitory computer-readable medium

Publications (1)

Publication Number Publication Date
WO2022241728A1 true WO2022241728A1 (fr) 2022-11-24

Family

ID=84140118

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/094963 WO2022241728A1 (fr) 2021-05-20 2021-05-20 Image processing method, electronic device and non-transitory computer-readable medium

Country Status (2)

Country Link
CN (1) CN117396917A (fr)
WO (1) WO2022241728A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140152886A1 (en) * 2012-12-03 2014-06-05 Canon Kabushiki Kaisha Bokeh amplification
CN108234882A (zh) * 2018-02-11 2018-06-29 Vivo Mobile Communication Co., Ltd. Image blurring method and mobile terminal
CN110751593A (zh) * 2019-09-25 2020-02-04 Beijing Megvii Technology Co., Ltd. Image blurring processing method and device
US20200051218A1 (en) * 2018-08-08 2020-02-13 Samsung Electronics Co., Ltd. Method and apparatus for incorporating noise pattern into image on which bokeh processing has been performed


Also Published As

Publication number Publication date
CN117396917A (zh) 2024-01-12

Similar Documents

Publication Publication Date Title
US10997696B2 Image processing method, apparatus and device
KR102385360B1 Electronic device performing image correction and method of operating the same
US11663733B2 Depth determination for images captured with a moving camera and representing moving features
US10740431B2 Apparatus and method of five dimensional (5D) video stabilization with camera and gyroscope fusion
US11055826B2 Method and apparatus for image processing
US10805508B2 Image processing method, and device
CN112602111A Electronic device for blurring an image obtained by combining a plurality of images based on depth information, and method of driving the electronic device
KR102524982B1 Method and apparatus for reflecting a noise pattern in an image on which blur processing has been performed
US11282176B2 Image refocusing
KR20200117562A Electronic device, method, and computer-readable medium for providing a bokeh effect in a video
US12008738B2 Defocus blur removal and depth estimation using dual-pixel image data
US20230153960A1 Merging Split-Pixel Data For Deeper Depth of Field
US10692199B2 Image processing method and device, and non-transitory computer-readable storage medium
EP4093015A1 Photographing method and apparatus, storage medium, and electronic device
CN111161299A Image segmentation method, computer program, storage medium, and electronic device
WO2022000266A1 Method for creating a depth map for a stereo moving image, and electronic device
JP6221333B2 Image processing device, image processing circuit, and image processing method
WO2022241728A1 Image processing method, electronic device and non-transitory computer-readable medium
WO2022188007A1 Image processing method and electronic device
WO2022198525A1 Method for improving the stability of bokeh processing, and electronic device
WO2022213332A1 Bokeh processing method, electronic device, and computer-readable storage medium
WO2022016331A1 Method of compensating a ToF depth map, and electronic device
WO2024055290A1 Method for detecting a flicker region in a captured image, electronic device, and computer-readable storage medium
US12033308B2 Image correction method and apparatus for camera
WO2022178782A1 Electric device, method of controlling electric device, and computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21940182

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180098439.0

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21940182

Country of ref document: EP

Kind code of ref document: A1