WO2022218245A1 - Image processing method and apparatus, electronic device, and readable storage medium - Google Patents

Image processing method and apparatus, electronic device, and readable storage medium

Info

Publication number
WO2022218245A1
WO2022218245A1 (PCT/CN2022/086050; CN2022086050W)
Authority
WO
WIPO (PCT)
Prior art keywords
layer
image
point data
floating point
network
Prior art date
Application number
PCT/CN2022/086050
Other languages
English (en)
Chinese (zh)
Inventor
聂嘉栋
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 filed Critical 维沃移动通信有限公司
Publication of WO2022218245A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/77Retouching; Inpainting; Scratch removal
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Definitions

  • the present application belongs to the field of communication technologies, and in particular relates to an image processing method, an apparatus, an electronic device, and a readable storage medium.
  • the purpose of the embodiments of the present application is to provide an image processing method, apparatus, electronic device, and readable storage medium, which can solve the problem that traditional ISP image processing has difficulty meeting user needs.
  • an embodiment of the present application provides an image processing method, the method is executed by an electronic device, and the method includes:
  • the deep convolutional neural network includes the mapping relationship between the RAW image and the RGB image.
  • an embodiment of the present application provides an image processing apparatus, the apparatus is applied to an electronic device, and the apparatus includes:
  • the acquisition module is used to acquire the corrected RAW image
  • an input module for inputting the RAW image into a deep convolutional neural network
  • a processing module for converting the RAW image into an RGB image through the deep convolutional neural network
  • the deep convolutional neural network includes the mapping relationship between the RAW image and the RGB image.
  • embodiments of the present application provide an electronic device, which includes a processor, a memory, and a program or instruction stored on the memory and executable on the processor; when the program or instruction is executed by the processor, the steps of the image processing method described in the first aspect are implemented.
  • an embodiment of the present application provides a readable storage medium, wherein a program or an instruction is stored on the readable storage medium, and when the program or instruction is executed by a processor, the steps of the image processing method according to the first aspect are implemented.
  • an embodiment of the present application provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement the method described in the first aspect.
  • embodiments of the present application further provide a computer program product, the computer program product is stored in a non-transitory storage medium, and the computer program product is executed by at least one processor to implement the steps of the method described in the first aspect.
  • an embodiment of the present application further provides an execution apparatus, where the execution apparatus is configured to execute the method described in the first aspect.
  • the deep convolutional neural network is integrated into the ISP process, and the RAW image is converted into a red-green-blue (RGB) image through the mapping relationship, trained into the deep convolutional neural network, between RAW images and RGB images.
  • this can effectively improve image quality, has strong scene adaptability, and reduces the complicated and heavy ISP image parameter-tuning work, replacing that tuning work with the training of the deep convolutional neural network model.
  • FIG. 1a is a schematic flowchart of an existing image processing method;
  • FIG. 1b is one of the schematic flowcharts of the image processing method provided by the embodiment of the application.
  • FIG. 2 is the second schematic flowchart of the image processing method provided by the embodiment of the present application.
  • FIG. 3 is a schematic diagram of an application scenario provided by an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of the deconvolution layer;
  • FIG. 6 is a schematic structural diagram of an image processing apparatus provided by an embodiment of the present application.
  • FIG. 7 is one of the schematic structural diagrams of an electronic device provided by an embodiment of the present application.
  • FIG. 8 is a second schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the terms "first", "second", and the like in the description and claims of the present application are used to distinguish similar objects, not to describe a specific order or sequence. It is to be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of the present application can be practiced in sequences other than those illustrated or described herein.
  • the objects distinguished by "first", "second", etc. are usually of one type, and the number of objects is not limited.
  • for example, the first object may be one or more than one.
  • "and/or" in the description and claims indicates at least one of the connected objects, and the character "/" generally indicates that the associated objects are in an "or" relationship.
  • FIG. 1a shows a traditional ISP process
  • FIG. 1b shows the ISP process of the embodiment of the present application.
  • compared with the traditional ISP method, the ISP method in the embodiment of this application keeps only part of the RAW-domain processing, such as black level correction (Black Level Subtract), defective pixel correction (Defect Pixel Correction), and fixed pattern noise (Fix Pattern Noise, FPN) correction; the traditional pipeline additionally includes digital gain (Digital Gain), green imbalance correction (Green Imbalance Correction), white balance (White Balance), color correction (Color Correction), dynamic range (Dynamic Range), gamma (Gamma), shading correction (Shading Correction), distortion correction (Distortion Correction), color space transform (Color Space Transform), and the like.
  • in the embodiment of the present application, the remaining RAW-domain image processing and the red-green-blue (Red-Green-Blue, RGB) domain image processing are all replaced by a deep convolutional neural network.
  • the input to the deep convolutional neural network is a RAW image and the output is an RGB image.
  • the image processing method of the embodiment of the present application integrates the deep convolutional neural network into the traditional ISP process to improve the effect of traditional ISP on image signal processing, such as noise, detail texture, white balance, color, and the like.
  • traditional ISP noise reduction, usually based on texture and edge detection, performs different filtering on different regions, which sacrifices some detailed texture information while reducing noise.
  • the effect of noise reduction and detail texture restoration of deep convolutional neural network is better than traditional image signal processing methods in both objective indicators and subjective feelings.
  • the traditional ISP white balance algorithm usually refers to standard light sources such as simulated sunlight (D50), international standard artificial daylight (Artificial Daylight, D65), cool white fluorescent (Cool White Fluorescent, CWF), and incandescent light (A), and implements special strategies for blue sky, grass, mixed light sources, etc.
  • the deep convolutional neural network can achieve automatic white balance recovery by training the mapping relationship between the RAW image and the RGB image after white balance restoration, and obtain a white balance algorithm that is more adaptable than traditional algorithms.
  • deep convolutional neural networks can also achieve automatic coloring of images.
  • an embodiment of the present application provides an image processing method.
  • the method is executed by an electronic device, and the method includes:
  • Step 201 Acquire a corrected RAW image
  • a RAW image after correction processing refers to a RAW image obtained from an image sensor or other image receiving device on which correction processing has been performed; as shown in FIG. 1b, the specific correction processing may include black level correction, dead pixel correction, FPN correction, green imbalance correction, white balance, and other processing steps.
  • Step 202 Input the RAW image into a deep convolutional neural network
  • the RAW image is used as the input value of the deep convolutional neural network, and the conversion of the RAW image to the RGB image is performed through the deep convolutional neural network.
  • 4×(W/2)×(H/2) pieces of first floating-point data are input to the deep convolutional neural network, where W is the width of the RAW image and H is the height of the RAW image.
  • the proportion of the green component in the RAW image is higher.
  • the RAW image is generally in an RGGB, BGGR, or similar format; for example, in a 1920×1080 RAW image, every four pixels contain one R, two Gs, and one B.
  • the original RAW data is an image with a width of W and a height of H.
  • the number of pixels of each color in the image differs: R pixels account for 1/4 of the full frame, G pixels account for 1/2, and B pixels account for 1/4.
  • each pixel has a bit depth of 10 to 16 bits, depending on the Complementary Metal-Oxide-Semiconductor (CMOS) sensor, and is converted into a floating-point value between 0 and 1; that is, each pixel corresponds to one floating-point value.
  • the RGGB (or BGGR) data in the RAW image is split into 4 channels (Channel), so the corresponding input of the deep convolutional neural network is 4×(H/2)×(W/2) floating-point data.
  • for example, a 1920×1080 image can be split into a 960×540×4 image to reduce the amount of computation, and the split image is used as the input of the deep convolutional neural network.
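The normalization and channel-splitting step described above can be sketched as follows. This is a minimal illustration rather than the patent's implementation: it assumes an RGGB layout, a row-major list-of-lists mosaic, and a hypothetical 10-bit sensor (so samples are scaled by 1/1023):

```python
def split_raw_to_channels(raw, bit_depth=10):
    """Split an H x W RGGB Bayer mosaic into 4 half-resolution channels
    (R, G1, G2, B), normalizing each pixel to a float in [0, 1]."""
    h, w = len(raw), len(raw[0])
    scale = float(2 ** bit_depth - 1)
    # One channel per position in the 2x2 Bayer cell:
    # (0,0)=R, (0,1)=G1, (1,0)=G2, (1,1)=B for an RGGB layout.
    channels = []
    for dy, dx in ((0, 0), (0, 1), (1, 0), (1, 1)):
        channels.append([[raw[y][x] / scale
                          for x in range(dx, w, 2)]
                         for y in range(dy, h, 2)])
    return channels  # 4 x (H/2) x (W/2) floating-point data

# A toy 4x4 mosaic with 10-bit samples (1023 = full scale)
raw = [[1023,    0, 1023,    0],
       [   0,  511,    0,  511],
       [1023,    0, 1023,    0],
       [   0,  511,    0,  511]]
chans = split_raw_to_channels(raw)
print(len(chans), len(chans[0]), len(chans[0][0]))  # 4 2 2
```

A 1920×1080 mosaic processed the same way yields four 960×540 channels, matching the 4×(H/2)×(W/2) input described in the text.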
  • Step 203 Convert the RAW image to an RGB image through a deep convolutional neural network
  • the deep convolutional neural network includes the mapping relationship between the RAW image and the RGB image; that is, by training the mapping relationship between RAW images and white-balance-restored RGB images, the deep convolutional neural network can realize automatic white balance recovery, achieve a white balance algorithm more adaptable than the traditional algorithm, and also achieve automatic coloring of the image.
  • the deep convolutional neural network includes a plurality of first network layers and a second network layer, wherein the first network layer includes at least one of a convolution layer (Convolution, Conv), a normalization layer (also referred to as a batch normalization layer (Batch Normalization, BN)), and an activation unit layer (also known as a rectified linear unit (Rectified Linear Unit, ReLU) layer), and the second network layer includes at least one of a deconvolution layer (Deconvolution, Deconv), the normalization layer, and the activation unit layer;
  • the deconvolution layer can deconvolve a W×H image into a 2W×2H image.
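To make the doubling concrete, here is a minimal single-channel transposed-convolution ("Deconv") sketch, assuming a stride of 2 and a 2×2 kernel; the kernel values are hypothetical and not taken from the patent:

```python
def deconv2x(image, kernel):
    """Upsample an H x W single-channel image to 2H x 2W with a stride-2
    transposed convolution using a 2x2 kernel (a minimal 'Deconv' layer)."""
    h, w = len(image), len(image[0])
    out = [[0.0] * (2 * w) for _ in range(2 * h)]
    for y in range(h):
        for x in range(w):
            # Each input pixel scatters a kernel-weighted copy of itself
            # into a 2x2 patch of the (twice as large) output.
            for ky in range(2):
                for kx in range(2):
                    out[2 * y + ky][2 * x + kx] += image[y][x] * kernel[ky][kx]
    return out

img = [[1.0, 2.0],
       [3.0, 4.0]]
# An all-ones 2x2 kernel makes this stride-2 deconvolution behave like
# nearest-neighbour upsampling, which is easy to verify by eye.
nearest = deconv2x(img, [[1.0, 1.0], [1.0, 1.0]])
```

With stride 2 and a 2×2 kernel the scattered patches do not overlap; larger kernels would overlap and sum, which is how a trained deconvolution layer interpolates smoothly.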
  • a RAW image is converted to an RGB image through a deep convolutional neural network, including:
  • the basic unit combination of the deep convolutional neural network is a structure of convolution, normalization, and/or activation units, such as Conv, BN, and ReLU. After several layers of convolution, normalization, and/or activation units, an image with sufficiently good detail, sufficiently low noise, and restored white balance and color is obtained. The floating-point data output at this stage is Channel×(W/2)×(H/2), where Channel typically takes values such as 32, 64, 128, or 192.
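A single one of these Conv + BN + ReLU units can be sketched in miniature as follows; this toy version works on one channel and normalizes over a single feature map, whereas real layers use many channels with learned scale and shift parameters:

```python
def conv_bn_relu(image, kernel, eps=1e-5):
    """One 'first network layer' in miniature: 3x3 valid convolution,
    then normalization of the feature map, then a ReLU activation."""
    h, w, k = len(image), len(image[0]), len(kernel)
    # Valid 3x3 convolution: output shrinks by k-1 in each dimension.
    conv = [[sum(image[y + i][x + j] * kernel[i][j]
                 for i in range(k) for j in range(k))
             for x in range(w - k + 1)]
            for y in range(h - k + 1)]
    # Normalize the feature map to zero mean / unit variance (BN-like).
    flat = [v for row in conv for v in row]
    mean = sum(flat) / len(flat)
    var = sum((v - mean) ** 2 for v in flat) / len(flat)
    bn = [[(v - mean) / (var + eps) ** 0.5 for v in row] for row in conv]
    # ReLU keeps the positive responses and zeroes the rest.
    return [[max(0.0, v) for v in row] for row in bn]

# 4x4 input and a hypothetical 3x3 vertical-edge kernel -> 2x2 feature map
feat = conv_bn_relu(
    [[0.0, 0.0, 0.0, 1.0],
     [0.0, 0.0, 1.0, 1.0],
     [0.0, 1.0, 1.0, 1.0],
     [1.0, 1.0, 1.0, 1.0]],
    [[-1.0, 0.0, 1.0],
     [-1.0, 0.0, 1.0],
     [-1.0, 0.0, 1.0]])
```

Stacking several such units, with channel counts like 32, 64, 128, or 192 as the text notes, is what gives the network its denoising and restoration capacity.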
  • the second floating point data is converted into an RGB image through the second network layer, including:
  • the width and height of the image are enlarged to obtain a 3×H×W image, and finally the floating-point data is converted into RGB image data.
  • the value of the RGB image data ranges from 0 to 2^n - 1, where n is the bit depth of the RGB data. For example, if the RGB image data is 8-bit data, the corresponding range is 0 to 255, thereby completing the RAW image processing and RGB image processing tasks of the ISP.
  • for example, a final output of 1920×1080×3 is an image with a resolution of 1920×1080, where each pixel consists of three RGB components.
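The float-to-integer conversion amounts to clamping each channel to [0, 1] and scaling by 2^n - 1; a minimal sketch (the rounding mode is an assumption, as the patent does not specify one):

```python
def float_to_rgb(value, n=8):
    """Quantize a floating-point channel value in [0, 1] to an n-bit
    integer in [0, 2**n - 1]; n=8 gives the familiar 0..255 range."""
    levels = 2 ** n - 1
    v = min(1.0, max(0.0, value))  # clamp out-of-range floats first
    return round(v * levels)

# 8-bit: 0.0 -> 0 and 1.0 -> 255; values outside [0, 1] are clamped
```

Applying this per channel to the 3×H×W floating-point output yields the final RGB image described in the text.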
  • the deep convolutional neural network further includes a third network layer, and the third network layer includes at least one of a deconvolution layer, a normalization layer, and an activation unit layer;
  • the method further includes:
  • the third floating point data is converted into the fourth floating point data of 3×(2W)×(2H) through the third network layer;
  • an additional layer of deconvolution, normalization, and/or activation units can be added to enlarge the width and height of the image again, obtaining a 3×(2H)×(2W) image.
  • the deep convolutional neural network can also adopt a residual structure and other combinations to deepen the network and improve its expressive power.
  • the residual structure can be an existing one; for example, FIG. 5 shows an existing residual structure.
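The residual structure referenced here follows the standard pattern output = F(x) + x, where F stands for a stack of layers such as Conv, BN, and ReLU; a minimal sketch with F as a placeholder callable:

```python
def residual_block(x, transform):
    """Residual connection: output = transform(x) + x, so the layers
    inside `transform` only need to learn the residual correction."""
    fx = transform(x)
    return [xi + fi for xi, fi in zip(x, fx)]

# A zero-initialised transform leaves the input unchanged, which is
# part of why deep residual stacks are easy to optimise early on.
out = residual_block([1.0, 2.0, 3.0], lambda v: [0.0 for _ in v])
```

Because the skip path passes the input through unchanged, adding more residual blocks deepens the network without making the identity mapping hard to learn, which is the benefit the text alludes to.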
  • the deep convolutional neural network is integrated into the ISP process, and the RAW image is converted into an RGB image through the mapping relationship between RAW images and RGB images trained into the network, which effectively improves image quality, provides strong scene adaptability, and reduces the complicated and heavy ISP image parameter-tuning work, replacing that tuning work with the training of the deep convolutional neural network model.
  • the execution subject may be an image processing apparatus, or a control module in the image processing apparatus for executing the image processing method.
  • the image processing apparatus provided by the embodiments of the present application is described by taking an image processing apparatus executing an image processing method as an example.
  • an embodiment of the present application provides an image processing apparatus 600, the apparatus is applied to electronic equipment, and the apparatus includes:
  • an acquisition module 601 configured to acquire a RAW image after correction processing
  • an input module 602 configured to input the RAW image into a deep convolutional neural network
  • the deep convolutional neural network includes the mapping relationship between the RAW image and the RGB image.
  • the input module 602 is further configured to:
  • W is the width of the RAW image
  • H is the height of the RAW image
  • the deep convolutional neural network includes a plurality of first network layers and a second network layer
  • the first network layer includes at least one of a convolutional layer, a normalization layer, and an activation unit layer
  • the second network layer includes at least one of a deconvolution layer, a normalization layer and an activation unit layer
  • the processing module 603 is further used for:
  • the second floating point data is converted into the RGB image by the second network layer.
  • processing module 603 is further configured to:
  • the deep convolutional neural network further includes a third network layer, and the third network layer includes at least one of a deconvolution layer, a normalization layer, and an activation unit layer;
  • the processing module 603 is also used for:
  • the deep convolutional neural network is integrated into the ISP process, and the RAW image is converted into an RGB image through the mapping relationship between RAW images and RGB images trained into the network, which effectively improves image quality, provides strong scene adaptability, and reduces the complicated and heavy ISP image parameter-tuning work, replacing that tuning work with the training of the deep convolutional neural network model.
  • the image processing apparatus in this embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal.
  • the apparatus may be a mobile electronic device or a non-mobile electronic device.
  • the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle electronic device, a wearable device, an ultra-mobile personal computer (Ultra-Mobile Personal Computer, UMPC), a netbook, a personal digital assistant (Personal Digital Assistant, PDA), or the like.
  • the non-mobile electronic device may be a server, a network attached storage (Network Attached Storage, NAS), a personal computer (Personal Computer, PC), a television (Television, TV), a teller machine, a self-service machine, or the like; this is not specifically limited in the embodiments of the present application.
  • the image processing apparatus in this embodiment of the present application may be an apparatus having an operating system.
  • the operating system may be an Android (Android) operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
  • the image processing apparatus provided in this embodiment of the present application can implement the various processes implemented by the method embodiments shown in FIG. 1 b and FIG. 2 . In order to avoid repetition, details are not repeated here.
  • an embodiment of the present application further provides an electronic device 700, including a memory 701, a processor 702, and a program or instruction stored in the memory 701 and executable on the processor 702. When the program or instruction is executed by the processor 702, each process of the above image processing method embodiments can be implemented with the same technical effect; to avoid repetition, details are not described here.
  • the electronic devices in the embodiments of the present application include mobile electronic devices and non-mobile electronic devices.
  • FIG. 8 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
  • the electronic device 800 includes, but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810, and other components.
  • the electronic device 800 may also include a power source (such as a battery) for supplying power to the various components; the power source may be logically connected to the processor 810 through a power management system, so as to manage charging, discharging, power consumption, and other functions through the power management system.
  • the structure of the electronic device shown in FIG. 8 does not constitute a limitation on the electronic device.
  • the electronic device may include more or fewer components than shown, or combine some components, or use a different arrangement of components; details are not repeated here.
  • the processor 810 is used for:
  • the deep convolutional neural network includes the mapping relationship between the RAW image and the RGB image.
  • processor 810 is further configured to:
  • W is the width of the RAW image
  • H is the height of the RAW image
  • the deep convolutional neural network includes a plurality of first network layers and a second network layer
  • the first network layer includes at least one of a convolution layer, a normalization layer, and an activation unit layer
  • the second network layer includes at least one of a deconvolution layer, a normalization layer and an activation unit layer
  • the processor 810 is further configured to:
  • the second floating point data is converted into the RGB image by the second network layer.
  • processor 810 is further configured to:
  • the deep convolutional neural network further includes a third network layer, and the third network layer includes at least one of a deconvolution layer, a normalization layer, and an activation unit layer;
  • the processor 810 is further configured to:
  • the deep convolutional neural network is integrated into the ISP process, and the RAW image is converted into an RGB image through the mapping relationship between RAW images and RGB images trained into the network, which effectively improves image quality, provides strong scene adaptability, and reduces the complicated and heavy ISP image parameter-tuning work, replacing that tuning work with the training of the deep convolutional neural network model.
  • the input unit 804 may include a graphics processing unit (Graphics Processing Unit, GPU) 8041 and a microphone 8042; the graphics processing unit 8041 processes image data of still pictures or videos obtained by an image capture device (such as a camera).
  • the display unit 806 may include a display panel 8061, which may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 807 includes a touch panel 8071 and other input devices 8072 .
  • the touch panel 8071 is also called a touch screen.
  • the touch panel 8071 may include two parts, a touch detection device and a touch controller.
  • Other input devices 8072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which will not be described herein again.
  • Memory 809 may be used to store software programs as well as various data, including but not limited to application programs and operating systems.
  • the processor 810 may integrate an application processor and a modem processor, wherein the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 810.
  • embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored on the readable storage medium; when the program or instruction is executed by a processor, each process of the above image processing method embodiments is implemented.
  • the processor is the processor in the electronic device described in the foregoing embodiments.
  • the readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk or an optical disk, and the like.
  • An embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement the above image processing method embodiments.
  • the chip mentioned in the embodiments of the present application may also be referred to as a system-on-chip, a chip system, or a system-on-a-chip.
  • An embodiment of the present application provides a computer program product, the computer program product is stored in a non-transitory storage medium, and the computer program product is executed by at least one processor to implement the steps of the method described above and achieve the same technical effect; to avoid repetition, details are not repeated here.
  • An embodiment of the present application provides an execution apparatus, and the execution apparatus is configured to execute the method described above.
  • the method of the above embodiments can be implemented by means of software plus a necessary general hardware platform, and of course can also be implemented by hardware, but in many cases the former is the better implementation.
  • the technical solution of the present application can be embodied in the form of a software product in essence or in a part that contributes to the prior art, and the computer software product is stored in a storage medium (such as ROM/RAM, magnetic disk, CD-ROM), including several instructions to make a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) execute the methods described in the various embodiments of this application.
  • the disclosed apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of the units is only a logical function division; in actual implementation, there may be other division methods.
  • for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the functions, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium.
  • in essence, the technical solution of the present application, or the part that contributes to the related art, can be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes several instructions.
  • the instructions are used to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disk, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present application, which belongs to the technical field of communications, relates to an image processing method and apparatus, an electronic device, and a readable storage medium. The image processing method comprises: acquiring a corrected RAW image; inputting the RAW image into a deep convolutional neural network; and converting the RAW image into an RGB image by means of the deep convolutional neural network, the deep convolutional neural network comprising a mapping relationship between the RAW image and the RGB image.
PCT/CN2022/086050 2021-04-16 2022-04-11 Image processing method and apparatus, electronic device and readable storage medium WO2022218245A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110412069.5 2021-04-16
CN202110412069.5A CN113112428A (zh) 2021-04-16 2021-04-16 图像处理方法、装置、电子设备及可读存储介质

Publications (1)

Publication Number Publication Date
WO2022218245A1 true WO2022218245A1 (fr) 2022-10-20

Family

ID=76717884

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/086050 WO2022218245A1 (fr) 2021-04-16 2022-04-11 Image processing method and apparatus, electronic device, and readable storage medium

Country Status (2)

Country Link
CN (1) CN113112428A (fr)
WO (1) WO2022218245A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113112428A (zh) * 2021-04-16 2021-07-13 Vivo Mobile Communication Co., Ltd. Image processing method and apparatus, electronic device, and readable storage medium
CN114638348A (zh) * 2022-05-20 2022-06-17 福思(杭州)智能科技有限公司 Network model adjustment method and apparatus, perception device, and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109379572A (zh) * 2018-12-04 2019-02-22 Beijing Dajia Internet Information Technology Co., Ltd. Image conversion method and apparatus, electronic device, and storage medium
CN110557584A (zh) * 2018-05-31 2019-12-10 Hangzhou Hikvision Digital Technology Co., Ltd. Image processing method and apparatus, and computer-readable storage medium
CN110992272A (zh) * 2019-10-18 2020-04-10 Shenzhen University Deep-learning-based low-light image enhancement method, apparatus, device, and medium
CN111127336A (zh) * 2019-11-18 2020-05-08 Fudan University Image signal processing method based on an adaptive selection module
US20200234402A1 (en) * 2019-01-18 2020-07-23 Ramot At Tel-Aviv University Ltd. Method and system for end-to-end image processing
CN111818318A (zh) * 2020-06-12 2020-10-23 北京阅视智能技术有限责任公司 White balance tuning method, apparatus, device, and storage medium for an image processor
CN112529775A (zh) * 2019-09-18 2021-03-19 Huawei Technologies Co., Ltd. Image processing method and apparatus
CN113112428A (zh) * 2021-04-16 2021-07-13 Vivo Mobile Communication Co., Ltd. Image processing method and apparatus, electronic device, and readable storage medium

Also Published As

Publication number Publication date
CN113112428A (zh) 2021-07-13

Similar Documents

Publication Publication Date Title
US11750785B2 (en) Video signal processing method and apparatus
EP4024323A1 (fr) Image processing method and apparatus
WO2022218245A1 (fr) Image processing method and apparatus, electronic device, and readable storage medium
CN109274985B (zh) Video transcoding method and apparatus, computer device, and storage medium
CN111510698A (zh) Image processing method and apparatus, storage medium, and mobile terminal
CN113132696B (zh) Image tone mapping method and apparatus, electronic device, and storage medium
WO2017203942A1 (fr) Image processing device, image processing method, and program
CN111885312B (zh) HDR image imaging method and system, electronic device, and storage medium
JP5810803B2 (ja) Method, apparatus, and system for adjusting a whiteboard image
WO2020215180A1 (fr) Image processing method and apparatus, and electronic device
JP2015512194A (ja) Method and wireless handheld device for determining the hue of an image
CN110349097B (zh) Color enhancement method for image saliency and image processing apparatus
WO2022042754A1 (fr) Image processing method and apparatus, and device
WO2021218924A1 (fr) Dynamic range mapping method and apparatus
WO2021073330A1 (fr) Video signal processing method and apparatus
CN107592517B (zh) Skin color processing method and apparatus
CN110807735A (zh) Image processing method and apparatus, terminal device, and computer-readable storage medium
TWI415480B (zh) Image processing method and image processing system
WO2023241339A1 (fr) Color cast correction method and apparatus, device, storage medium, and program product
TWI523500B (zh) Dynamic range compression method for images and image processing apparatus
CN113132562B (zh) Lens shading correction method and apparatus, and electronic device
CN109600596B (zh) White balance method based on nonlinear achromatic constancy
CN113256631A (zh) Method for improving image reading efficiency and adjusting medical image quality
CN106657945B (zh) Nonlinear piecewise gamma correction implementation method
CN111613168A (zh) Image display processing method and apparatus, and computer-readable storage medium

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22787471

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE