WO2020078102A1 - Image enhancement method and apparatus, and computer-readable storage medium - Google Patents


Info

Publication number
WO2020078102A1
WO2020078102A1 (application PCT/CN2019/102020, also filed as CN2019102020W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
edge
edge image
resolution
enhancement method
Prior art date
Application number
PCT/CN2019/102020
Other languages
English (en)
Chinese (zh)
Inventor
陈宇聪
闻兴
郑云飞
陈敏
王晓楠
蔡砚刚
黄跃
于冰
Original Assignee
北京达佳互联信息技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京达佳互联信息技术有限公司
Publication of WO2020078102A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4053 Scaling of whole images or parts thereof based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection

Definitions

  • This application belongs to the field of computer software applications, and in particular relates to image enhancement methods, devices, and computer-readable storage media.
  • Image enhancement uses a series of techniques to improve the quality and visual effect of an image, highlight the features of interest, and extract valuable information, thereby transforming the image into a form better suited to human or machine analysis and processing, so that the processed image serves specific applications better.
  • Image enhancement theory is widely used in the fields of biomedicine, industrial production, public safety, and aerospace.
  • One image enhancement approach uses super-resolution technology, that is, up-converting the input image into a high-resolution output image.
  • Super-resolution algorithms based on deep learning and machine learning train a super-resolution model on a large number of low-resolution and high-resolution image samples to achieve a super-resolution enhancement effect.
  • Alternatively, an image enhancement method can achieve a subjectively equivalent high-resolution effect through traditional filtering algorithms, without changing the objective resolution of the image.
  • In either case, the noise and blur of the image must be considered at the same time to achieve a clear and natural high-resolution visual effect.
  • The present application discloses an image enhancement method and device that perform adaptive feature processing on a first edge image extracted from the original image to obtain a second edge image, achieving a better anti-noise enhancement effect while also addressing the power consumption problem.
  • an image enhancement method including:
  • an image enhancement device including:
  • Edge extraction unit configured to extract the edge image from the original image to obtain the first edge image
  • Adaptive feature processing unit configured to perform adaptive feature processing on the first edge image to obtain a second edge image
  • Adder unit, configured to linearly superimpose the original image and the second edge image to obtain an edge-enhanced image.
  • an image enhancement device including:
  • Memory for storing processor executable instructions
  • the processor is configured to perform any one of the image enhancement methods described above.
  • a computer-readable storage medium stores computer instructions, and when the computer instructions are executed, the above image enhancement method is implemented.
  • The first edge image is subjected to adaptive feature processing to obtain a second edge image; the pixels and noise of the first edge image are adaptively distinguished so that the noise pixels of the first edge image are not processed in error. This yields a better anti-noise effect and, at the same time, avoids the power consumption caused by super-resolution algorithms based on deep learning and machine learning.
  • Fig. 1 is a flowchart of an image enhancement method according to an exemplary embodiment
  • Fig. 2 is a flowchart of an image enhancement method according to an exemplary embodiment
  • Fig. 3 is a flowchart of an image enhancement method according to an exemplary embodiment
  • Fig. 4 is a schematic diagram of an image enhancement device according to an exemplary embodiment
  • Fig. 5 is a block diagram of a device for performing an image enhancement method according to an exemplary embodiment
  • Fig. 6 is a block diagram of a device for performing an image enhancement method according to an exemplary embodiment.
  • FIG. 1 is a flowchart of an image enhancement method according to an exemplary embodiment, and specifically includes the following steps:
  • step S101 an edge image is extracted from the original image to obtain a first edge image.
  • step S102 adaptive feature processing is performed on the first edge image to obtain a second edge image.
  • step S103 the original image and the second edge image are linearly superimposed to obtain an edge-enhanced image.
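Steps S101 to S103 can be sketched end to end in a few lines of NumPy. This is a minimal illustration, not the application's actual implementation: the Laplacian edge extractor, the threshold-based stand-in for "adaptive feature processing", and the superposition weight `alpha` are all assumptions made for this sketch.

```python
import numpy as np

def laplacian(img):
    """4-neighborhood Laplacian: sum of the four neighbors minus 4x the center."""
    p = np.pad(img, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
            - 4.0 * p[1:-1, 1:-1])

def enhance(original, alpha=0.5, thresh=8.0):
    img = original.astype(np.float64)
    # Step S101: extract a first edge image from the original image.
    first_edge = laplacian(img)
    # Step S102 (placeholder for adaptive feature processing): suppress weak
    # edge responses, which are more likely to be noise than true edges.
    second_edge = np.where(np.abs(first_edge) > thresh, first_edge, 0.0)
    # Step S103: linearly superimpose the original and the second edge image.
    return np.clip(img + alpha * second_edge, 0, 255).astype(np.uint8)
```

On a flat region the Laplacian is zero, so the image passes through unchanged; near a step edge the superposition sharpens the transition.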
  • an edge image is extracted from the original image to obtain a first edge image.
  • adaptive feature processing is performed on the first edge image to obtain a second edge image.
  • the adaptive feature processing is to perform adaptive feature compensation on the time domain feature and the frequency domain feature of the first edge image.
  • the original image and the second edge image are linearly superimposed to obtain an edge-enhanced image.
  • adaptive feature processing is performed on the first edge image to obtain a second edge image.
  • Adaptively distinguishing the pixels and noise of the first edge image avoids erroneous processing of the noise pixels of the first edge image, gives a better anti-noise effect, and at the same time avoids the power consumption caused by super-resolution algorithms based on deep learning and machine learning.
  • Fig. 2 is a flowchart of an image enhancement method according to an exemplary embodiment, and specifically includes the following steps:
  • step S201 an edge image is extracted from the original image to obtain a first edge image.
  • step S202 the neighborhood noise estimate of each pixel of the first edge image is calculated separately.
  • step S203 the neighborhood noise estimates of each pixel of the first edge image are superimposed to obtain a second edge image.
  • step S204 the original image and the second edge image are linearly superimposed to obtain an edge-enhanced image.
  • step S205 the edge-enhanced image is output.
  • First, an edge image is extracted from the original image to obtain a first edge image. Then, the neighborhood noise estimate of each pixel of the first edge image is calculated separately. Next, the neighborhood noise estimates of the pixels of the first edge image are superimposed to obtain the second edge image. The original image and the second edge image are then linearly superimposed to obtain an edge-enhanced image. Finally, the edge-enhanced image is output.
  • the neighborhood noise estimate of each pixel of the first edge image is calculated separately.
  • the neighborhood noise estimates of each pixel of the first edge image are superimposed to obtain the second edge image.
  • the adaptive feature processing of the edge image requires less computation, is suitable for use on mobile platforms such as mobile phones, and has lower power consumption.
  • The formula for superimposing the neighborhood noise estimate of each pixel of the first edge image to obtain the second edge image is given in the application (the formula itself appears only as an image and is not reproduced in this text); its symbols are defined as follows:
  • P is the first edge image;
  • k, a, b are parameter constants;
  • the remaining symbol is the neighborhood noise estimate;
  • x is the pixel set of the first edge image;
  • x_0 is a central pixel in the first edge image;
  • x_ij is a pixel around x_0;
  • i is the abscissa of the pixel x_ij in the first edge image;
  • j is the ordinate of the pixel x_ij in the first edge image.
  • the neighborhood relationship is the position relationship of adjacent pixels in the image.
  • Each pixel of the first edge image can be used as a central pixel.
  • Separately calculating the neighborhood noise estimate of each pixel of the first edge image consists of calculating the sum of the absolute values of the errors between the value of the center pixel of the first edge image and the values of the pixels at its neighborhood positions, i.e., the differences between the center-pixel value and each neighborhood pixel value.
  • The neighborhood used for the neighborhood noise estimate is at least one of the following neighborhood relationships: the 4-neighborhood, the D-neighborhood, and the 8-neighborhood.
  • Let the coordinates of any pixel p in the first edge image be (h, k).
  • The four adjacent pixels at the 4-neighborhood positions of p, usually denoted N_4(p), have coordinates (h+1, k), (h-1, k), (h, k+1) and (h, k-1).
  • The four diagonally adjacent (vertex) pixels at the D-neighborhood positions of p, usually denoted N_D(p), have coordinates (h+1, k+1), (h+1, k-1), (h-1, k+1) and (h-1, k-1).
  • N_8(p) = N_4(p) + N_D(p).
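These neighborhood definitions, together with one plausible reading of the per-pixel noise estimate, can be sketched in NumPy. Since the application's exact formula appears only as an image, the sum-of-absolute-differences form used below is an assumption made for illustration.

```python
import numpy as np

# Neighborhood offsets for a pixel p at (h, k), as defined above.
N4 = [(1, 0), (-1, 0), (0, 1), (0, -1)]      # 4-neighborhood N_4(p)
ND = [(1, 1), (1, -1), (-1, 1), (-1, -1)]    # diagonal neighborhood N_D(p)
N8 = N4 + ND                                 # 8-neighborhood N_8(p)

def neighborhood_noise_estimate(edge, offsets=N8):
    """Hypothetical per-pixel noise estimate: the sum of absolute
    differences between each center pixel and its neighbors."""
    p = np.pad(edge.astype(np.float64), 1, mode="edge")
    c = p[1:-1, 1:-1]  # center pixels
    # Shifted views pick out the neighbor at each offset for every pixel.
    return sum(np.abs(c - p[1 + di:p.shape[0] - 1 + di,
                            1 + dj:p.shape[1] - 1 + dj])
               for di, dj in offsets)
```

On a uniform edge image the estimate is zero everywhere; an isolated bright pixel (typical impulse noise) receives a large estimate, which is how pixels and noise can be distinguished adaptively.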
  • an edge detection operator is used to extract the edge image from the original image.
  • The edge detection operator includes at least one of the following: the Laplacian of Gaussian operator, the Roberts operator, and the Sobel operator.
  • The Laplacian of Gaussian operator first performs Gaussian convolution filtering on the original image to reduce noise, and then applies the Laplacian operator for edge detection.
  • The Roberts operator finds image edges using a local difference operator: the difference between the values of two diagonally adjacent pixels approximates the gradient amplitude and is used to detect the edges of the original image.
  • The Roberts operator detects vertical edges better than diagonal edges and has high positioning accuracy, but it is sensitive to noise and cannot suppress its effects.
  • The Sobel operator weights the differences in gray value between each pixel of the original image and the pixels in its upper, lower, left, and right neighborhoods; the weighted difference reaches an extremum at the edges of the original image, which is used for edge detection.
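The Roberts and Sobel operators mentioned above have standard textbook kernel forms. A small sketch (the kernels and the direct convolution helper are the standard versions, not taken from the application itself):

```python
import numpy as np

# Classic kernel forms of the operators discussed above.
ROBERTS_X = np.array([[1, 0], [0, -1]], dtype=np.float64)
ROBERTS_Y = np.array([[0, 1], [-1, 0]], dtype=np.float64)
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
SOBEL_Y = SOBEL_X.T

def conv2d_valid(img, k):
    """Direct 2-D correlation, 'valid' region only (no padding)."""
    kh, kw = k.shape
    h, w = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((h, w))
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * img[i:i + h, j:j + w]
    return out

def sobel_magnitude(img):
    """Gradient magnitude from the two Sobel directional responses."""
    gx = conv2d_valid(img.astype(np.float64), SOBEL_X)
    gy = conv2d_valid(img.astype(np.float64), SOBEL_Y)
    return np.hypot(gx, gy)
```

Applied to an image with a vertical step edge, `sobel_magnitude` responds strongly along the step and is zero on flat regions.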
  • Fig. 3 is a flowchart of an image enhancement method according to an exemplary embodiment, and specifically includes the following steps:
  • step S301 the low-resolution image is up-sampled to obtain an up-sampled high-resolution image.
  • step S302 extract an edge image from the up-sampled high-resolution image to obtain a third edge image.
  • step S303 the neighborhood noise estimate of each pixel of the third edge image is calculated separately.
  • step S304 the neighborhood noise estimates of each pixel of the third edge image are superimposed to obtain a fourth edge image.
  • step S305 the up-sampled high-resolution image and the fourth edge image are linearly superimposed to obtain an edge-enhanced image.
  • step S306 the edge-enhanced image is output.
  • Image resolution refers to the amount of information stored in an image, i.e., how many pixels per inch of image. The image resolution determines the quality of the image output; together with the image dimensions (height and width), it determines the file size, and the larger these values, the more storage space the image file occupies.
  • A low-resolution image is an image with a relatively small image-resolution value.
  • A low-resolution image contains few pixels, appears rough, and takes up less storage space.
  • A high-resolution image is an image with a relatively large image-resolution value.
  • A high-resolution image contains many pixels, is clear, and occupies relatively large storage space.
  • the low-resolution image is up-sampled to obtain an up-sampled high-resolution image.
  • the neighborhood noise estimates of each pixel of the third edge image are calculated separately.
  • the neighborhood noise estimates of each pixel of the third edge image are superimposed to obtain a fourth edge image.
  • the edge-enhanced image is output.
  • the low-resolution image is up-sampled to obtain an up-sampled high-resolution image.
  • the super-resolution image enhancement effect can be achieved without establishing a super-resolution model, which greatly reduces the algorithm calculation amount of the image enhancement method, so that the image enhancement method has less power consumption.
  • an edge detection operator is used to extract an edge image from the up-sampled high-resolution image.
  • the edge detection operator includes at least one of the following detection operators: a Laplacian Gaussian operator, a Roberts operator, and a Sobel operator.
  • The method for up-sampling the low-resolution image includes at least one of the following sampling methods: bilinear interpolation, deconvolution, and unpooling.
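The first of these methods can be written directly in NumPy. This sketch assumes an align-corners sampling convention; real implementations differ in how they place the sample coordinates.

```python
import numpy as np

def bilinear_upsample(img, scale):
    """Bilinear up-sampling of a 2-D image by an integer factor."""
    h, w = img.shape
    H, W = h * scale, w * scale
    # Sample coordinates in the source image (align-corners convention).
    ys = np.linspace(0.0, h - 1.0, H)
    xs = np.linspace(0.0, w - 1.0, W)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]   # vertical interpolation weights
    wx = (xs - x0)[None, :]   # horizontal interpolation weights
    img = img.astype(np.float64)
    # Interpolate horizontally along the top and bottom rows of each cell,
    # then vertically between the two results.
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy
```

Each output pixel is a weighted average of the four nearest source pixels; the original corner values are reproduced exactly under this convention.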
  • Bilinear interpolation, also called bilinear internal interpolation, performs linear interpolation in two directions, estimating each new pixel value from the four nearest pixels of the original image.
  • Convolution is to convolve a neighborhood of an image to obtain the neighborhood features of the image.
  • Deconvolution is the inverse process of convolution. It is an algorithm-based process used to reverse the effect of convolution on recorded data.
  • One layer in the neural network is the pooling layer, which uses pooling to extract features and reduce image data.
  • Unpooling (also rendered as anti-pooling or de-pooling) is the reverse operation of pooling. Because the pooling process retains only the main information of the original image and discards the rest, unpooling cannot restore all of the original image data from the pooling result.
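A minimal max-pooling/unpooling pair makes this information loss concrete. The 2x2 windows and argmax bookkeeping below are assumptions of this sketch; the application does not specify a pooling scheme.

```python
import numpy as np

def max_pool_2x2(img):
    """2x2 max pooling that also records argmax indices for unpooling."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    # Reshape into (block_row, block_col, 4) so each 2x2 block is one row.
    blocks = img[:h, :w].reshape(h // 2, 2, w // 2, 2).transpose(0, 2, 1, 3)
    flat = blocks.reshape(h // 2, w // 2, 4)
    return flat.max(axis=2), flat.argmax(axis=2)

def max_unpool_2x2(pooled, idx):
    """Unpooling: place each pooled value back at its recorded position.
    The other three positions stay zero; the discarded detail is gone."""
    ph, pw = pooled.shape
    out = np.zeros((ph, pw, 4))
    rows, cols = np.indices((ph, pw))
    out[rows, cols, idx] = pooled
    return out.reshape(ph, pw, 2, 2).transpose(0, 2, 1, 3).reshape(ph * 2, pw * 2)
```

Round-tripping an image through pooling and unpooling restores only one value per 2x2 block, which is exactly why unpooling cannot recover the full original data.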
  • Fig. 4 is a schematic diagram of an image enhancement device according to an exemplary embodiment.
  • the device 40 includes an edge extraction unit 401, an adaptive feature processing unit 402, an adder unit 403, an upsampling unit 404, and an image output unit 405.
  • Edge extraction unit 401 configured to extract an edge image from the original image to obtain a first edge image.
  • Adaptive feature processing unit 402 configured to perform adaptive feature processing on the first edge image to obtain a second edge image.
  • Adder unit 403, configured to linearly superimpose the original image and the second edge image to obtain an edge-enhanced image.
  • Up-sampling unit 404 configured to up-sample the low-resolution image to obtain an up-sampled high-resolution image.
  • Image output unit 405 configured to output the edge-enhanced image.
  • the edge extraction unit 401 is configured to extract an edge image from the up-sampled high-resolution image to obtain a third edge image.
  • The adaptive feature processing unit 402 is configured to perform adaptive feature processing on the third edge image to obtain a fourth edge image.
  • The adder unit 403 is configured to linearly superimpose the up-sampled high-resolution image and the fourth edge image to obtain an edge-enhanced image.
  • Fig. 5 is a block diagram of a device 1200 for performing an image enhancement method according to an exemplary embodiment.
  • The device 1200 may be a mobile phone, a computer, a digital broadcasting terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, and so on.
  • The device 1200 may include one or more of the following components: processing component 1202, memory 1204, power supply component 1206, multimedia component 1208, audio component 1210, input/output (I/O) interface 1212, sensor component 1214, and communication component 1216.
  • the processing component 1202 generally controls the overall operations of the device 1200, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
  • the processing component 1202 may include one or more processors 1220 to execute instructions to complete all or part of the steps in the above method.
  • the processing component 1202 may include one or more modules to facilitate interaction between the processing component 1202 and other components.
  • the processing component 1202 may include a multimedia module to facilitate interaction between the multimedia component 1208 and the processing component 1202.
  • the memory 1204 is configured to store various types of data to support operation at the device 1200. Examples of these data include instructions for any application or method operating on the device 1200, contact data, phonebook data, messages, pictures, videos, and so on.
  • The memory 1204 may be implemented by any type of volatile or nonvolatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
  • the power supply component 1206 provides power to various components of the device 1200.
  • the power supply component 1206 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 1200.
  • The multimedia component 1208 includes a screen that provides an output interface between the device 1200 and the user.
  • the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touch, swipe, and gestures on the touch panel. The touch sensor may not only sense the boundary of the touch or sliding action, but also detect the duration and pressure related to the touch or sliding operation.
  • the multimedia component 1208 includes a front camera and / or a rear camera. When the device 1200 is in an operation mode, such as a shooting mode or a video mode, the front camera and / or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
  • the audio component 1210 is configured to output and / or input audio signals.
  • the audio component 1210 includes a microphone (MIC).
  • the microphone is configured to receive an external audio signal.
  • the received audio signal may be further stored in the memory 1204 or sent via the communication component 1216.
  • the audio component 1210 further includes a speaker for outputting audio signals.
  • the I / O interface 1212 provides an interface between the processing component 1202 and a peripheral interface module.
  • the peripheral interface module may be a keyboard, a click wheel, or a button. These buttons may include, but are not limited to: home button, volume button, start button, and lock button.
  • the sensor assembly 1214 includes one or more sensors for providing the device 1200 with status assessments in various aspects.
  • The sensor component 1214 can detect the on/off state of the device 1200 and the relative positioning of components (for example, the display and keypad of the device 1200). The sensor component 1214 can also detect a change in position of the device 1200 or one of its components, the presence or absence of user contact with the device 1200, the orientation or acceleration/deceleration of the device 1200, and temperature changes of the device 1200.
  • the sensor assembly 1214 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • the sensor assembly 1214 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor assembly 1214 may further include an acceleration sensor, a gyro sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • the communication component 1216 is configured to facilitate wired or wireless communication between the device 1200 and other devices.
  • the device 1200 may access a wireless network based on a communication standard, such as WiFi, an operator network (such as 2G, 3G, 4G, or 5G), or a combination thereof.
  • the communication component 1216 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel.
  • the communication component 1216 further includes a near field communication (NFC) module to facilitate short-range communication.
  • the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
  • The apparatus 1200 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, to perform the above method.
  • a non-transitory computer-readable storage medium including instructions is also provided, for example, a memory 1204 including instructions, which can be executed by the processor 1220 of the device 1200 to complete the above method.
  • the non-transitory computer-readable storage medium may be ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, or the like.
  • Fig. 6 is a block diagram of a device 1300 for performing an image enhancement method according to an exemplary embodiment.
  • The device 1300 may be provided as a server.
  • the device 1300 includes a processing component 1322, which further includes one or more processors, and memory resources represented by the memory 1332, for storing instructions executable by the processing component 1322, such as application programs.
  • the application programs stored in the memory 1332 may include one or more modules each corresponding to a set of instructions.
  • The processing component 1322 is configured to execute instructions to perform the above-mentioned image enhancement method.
  • The device 1300 may also include a power component 1326 configured to perform power management of the device 1300, a wired or wireless network interface 1350 configured to connect the device 1300 to the network, and an input/output (I/O) interface 1358.
  • the device 1300 can operate based on an operating system stored in the memory 1332, such as Windows ServerTM, Mac OS XTM, UnixTM, LinuxTM, FreeBSDTM or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to an image enhancement method and apparatus, and a computer-readable storage medium. The image enhancement method comprises: extracting an edge image from an original image to obtain a first edge image; performing adaptive feature processing on the first edge image to obtain a second edge image; and linearly superimposing the original image and the second edge image to obtain an edge-enhanced image. In the image enhancement method, adaptive feature processing is performed on the first edge image to obtain the second edge image. The pixels and noise of the first edge image are adaptively distinguished, thereby avoiding incorrect processing of the noise pixels of the first edge image and providing a better anti-noise effect; moreover, the problem of power consumption caused by the use of a super-resolution algorithm based on deep learning and machine learning is solved.
PCT/CN2019/102020 2018-10-17 2019-08-22 Image enhancement method and apparatus, and computer-readable storage medium WO2020078102A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811207500.7A CN109544490B (zh) 2018-10-17 2018-10-17 Image enhancement method and apparatus, and computer-readable storage medium
CN201811207500.7 2018-10-17

Publications (1)

Publication Number Publication Date
WO2020078102A1 true WO2020078102A1 (fr) 2020-04-23

Family

ID=65843795

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/102020 WO2020078102A1 (fr) 2018-10-17 2019-08-22 Image enhancement method and apparatus, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN109544490B (fr)
WO (1) WO2020078102A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109544490B (zh) * 2018-10-17 2021-07-13 北京达佳互联信息技术有限公司 Image enhancement method and apparatus, and computer-readable storage medium
CN111227522A (zh) * 2019-04-03 2020-06-05 泰州市康平医疗科技有限公司 Intelligent wardrobe door control method
CN115082438B (zh) * 2022-07-22 2022-11-25 裕钦精密拉深技术(苏州)有限公司 Computer-vision-based quality inspection system for deep-drawn parts

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100014771A1 (en) * 2008-07-18 2010-01-21 Samsung Electro-Mechanics Co., Ltd. Apparatus for improving sharpness of image
US20130044965A1 (en) * 2011-08-16 2013-02-21 Himax Technologies Limited Super resolution system and method with database-free texture synthesis
CN105225209A (zh) * 2015-10-29 2016-01-06 Tcl集团股份有限公司 Sharpening implementation method and system for non-uniformly interpolated images
CN105243647A (zh) * 2015-10-30 2016-01-13 哈尔滨工程大学 Image enhancement method based on linear spatial filtering
CN105957030A (zh) * 2016-04-26 2016-09-21 成都市晶林科技有限公司 Image detail enhancement and noise suppression method for thermal infrared imagers
CN106296637A (zh) * 2015-06-05 2017-01-04 北京中传视讯科技有限公司 Image texture generation method and device
CN106530237A (zh) * 2016-09-19 2017-03-22 中山大学 Image enhancement method
CN106600550A (zh) * 2016-11-29 2017-04-26 深圳开立生物医疗科技股份有限公司 Ultrasound image processing method and system
CN108389170A (zh) * 2018-03-07 2018-08-10 鞍钢集团矿业有限公司 Image enhancement and denoising method and device for overlapping regions of multiple wide-angle cameras
CN109544490A (zh) * 2018-10-17 2019-03-29 北京达佳互联信息技术有限公司 Image enhancement method, apparatus and computer-readable storage medium

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100420269C (zh) * 2005-12-09 2008-09-17 逐点半导体(上海)有限公司 Image enhancement processing system and processing method
WO2011145668A1 (fr) * 2010-05-20 2011-11-24 シャープ株式会社 Image processing device, image processing circuit, image processing method, and program
CN103093428A (zh) * 2013-01-23 2013-05-08 中南大学 Spatio-temporal joint multi-scale geometric transform denoising method for image sequences
CN103700071B (zh) * 2013-12-16 2016-08-31 华中科技大学 Depth map up-sampling edge enhancement method
CN104063848B (zh) * 2014-06-19 2017-09-19 中安消技术有限公司 Low-illumination image enhancement method and device
CN104574435B (zh) * 2014-09-24 2016-03-02 中国人民解放军国防科学技术大学 Foreground segmentation method for a moving camera based on block clustering
CN104240208A (zh) * 2014-09-30 2014-12-24 成都市晶林科技有限公司 Image detail enhancement method for uncooled infrared focal-plane detectors
CN104574304A (zh) * 2014-12-25 2015-04-29 深圳市一体太赫兹科技有限公司 Millimeter-wave image reconstruction method and system
CN105574834B (zh) * 2015-12-23 2018-09-04 小米科技有限责任公司 Image processing method and device
CN106570850B (zh) * 2016-10-12 2019-06-04 成都西纬科技有限公司 Image fusion method
CN107689050B (zh) * 2017-08-15 2020-11-17 武汉科技大学 Depth image up-sampling method guided by color image edges


Also Published As

Publication number Publication date
CN109544490B (zh) 2021-07-13
CN109544490A (zh) 2019-03-29

Similar Documents

Publication Publication Date Title
CN109344832B (zh) Image processing method and device, electronic device and storage medium
CN109658401B (zh) Image processing method and device, electronic device and storage medium
KR102463101B1 (ko) Image processing method and apparatus, electronic device and storage medium
KR101743861B1 (ko) Image fusion method for image stabilization
CN110060215B (zh) Image processing method and device, electronic device and storage medium
WO2020134866A1 (fr) Keypoint detection method and apparatus, electronic device, and storage medium
CN109118430B (zh) Super-resolution image reconstruction method and device, electronic device and storage medium
KR102446687B1 (ko) Image processing method and apparatus, electronic device and storage medium
CN107798654B (zh) Image skin-smoothing method and device, and storage medium
WO2020078102A1 (fr) Image enhancement method and apparatus, and computer-readable storage medium
TW202209254A (zh) Image segmentation method, electronic device and computer-readable storage medium
WO2021169136A1 (fr) Image processing method and apparatus, electronic device and storage medium
CN110933334B (zh) Video noise reduction method, device, terminal and storage medium
JP2017537403A (ja) Method, apparatus and computer program product for generating super-resolution images
WO2022227394A1 (fr) Image processing method and apparatus, device, storage medium and program
CN109784327B (zh) Bounding box determination method and device, electronic device and storage medium
CN112258404A (zh) Image processing method and device, electronic device and storage medium
WO2022032998A1 (fr) Image processing method and apparatus, electronic device, storage medium and program product
CN113706421B (zh) Image processing method and device, electronic device and storage medium
CN111968052B (zh) Image processing method, image processing device and storage medium
CN111583142A (zh) Image noise reduction method and device, electronic device and storage medium
CN110874809A (zh) Image processing method and device, electronic device and storage medium
CN115660945A (zh) Coordinate transformation method, device, electronic device and storage medium
WO2021056770A1 (fr) Image reconstruction method and apparatus, electronic device, and storage medium
CN113660531B (zh) Video processing method and device, electronic device and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19874496

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19874496

Country of ref document: EP

Kind code of ref document: A1