WO2023221119A1 - Image processing method and apparatus, and storage medium - Google Patents

Image processing method and apparatus, and storage medium

Info

Publication number
WO2023221119A1
WO2023221119A1 · PCT/CN2022/094215 · CN2022094215W
Authority
WO
WIPO (PCT)
Prior art keywords
image
target object
images
processing method
focus
Prior art date
Application number
PCT/CN2022/094215
Other languages
English (en)
French (fr)
Inventor
Huang Jingbin (黄敬斌)
Original Assignee
Beijing Xiaomi Mobile Software Co., Ltd. (北京小米移动软件有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co., Ltd.
Priority to PCT/CN2022/094215
Priority to CN202280004386.6A
Publication of WO2023221119A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • G06T5/60 Image enhancement or restoration using machine learning, e.g. neural networks
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]

Definitions

  • the present disclosure relates to the field of image processing technology, and in particular, to an image processing method, device and storage medium.
  • the present disclosure provides an image processing method, device and storage medium.
  • an image processing method is provided, which is applied to a terminal.
  • the processing method includes:
  • an image of the target object is generated.
  • obtaining multiple frames of first images of the target object includes:
  • the shooting parameters include the focus parameter
  • the first images include a focus image acquired under the focus parameters and multiple frames of out-of-focus images acquired under the other shooting parameters.
  • obtaining the second image from the first image includes:
  • the image other than the third image in the first image is used as the second image.
  • the preset distribution state includes that the pixel point set is centrally symmetrically distributed.
  • generating an image of the target object based on the second image includes:
  • An image of the target object is generated based on the second image and the focused image.
  • generating an image of the target object based on the second image and the focus image includes:
  • An image of the target object is generated based on the point spread function, the second image and the focused image.
  • the target object includes a display screen
  • the processing method further includes:
  • the corresponding initial point spread function between the out-of-focus image and the focused image is determined.
  • the processing method further includes:
  • if the brightness change rate is a preset brightness change rate, it is determined that the target object includes a display screen.
  • the processing method further includes: determining the refresh frequency of the display screen according to the brightness change rate.
  • an image processing device is provided, applied to a terminal, and the processing device includes:
  • An acquisition module configured to acquire multiple frames of first images of the target object, where the multiple frames of the first images have different shooting parameters
  • a selection module configured to obtain a second image from the first image, where the second image includes an image whose moiré-related parameters meet a first preset condition
  • a generating module configured to generate an image of the target object according to the second image.
  • an image processing device including:
  • Memory used to store instructions executable by the processor
  • the processor is configured to perform the method according to any one of the first aspects of the embodiments of the present disclosure.
  • a non-transitory computer-readable storage medium, which, when instructions in the storage medium are executed by a processor of a device, enables the device to perform the method according to the first aspect of the embodiments of the present disclosure.
  • Adopting the above method of the present disclosure has the following beneficial effects: without adding additional hardware or processing algorithms, multiple frames of first images of the target object are acquired under different shooting parameters, a second image, which includes an image whose moiré-related parameters meet preset conditions, is selected from the first images, and an image of the target object is generated based on the second image. The generated image is an image in which the moiré pattern is weakened, thereby improving the user's shooting experience when shooting a display screen.
  • Figure 1 is a flow chart of an image processing method according to an exemplary embodiment
  • Figure 2 exemplarily shows the flow chart of the method for obtaining multiple frames of first images of a target object in step S101;
  • Figure 3 exemplarily shows the flow chart of the method for obtaining the second image from the first image in step S102;
  • Figure 4 exemplarily shows the flow chart of the method for generating an image of a target object according to the second image in step S103;
  • Figure 5 exemplarily shows a flow chart of a method for generating an image of a target object according to the second image and the focus image;
  • Figure 6 exemplarily shows a flow chart of a method for determining the corresponding initial point spread function between an out-of-focus image and a focused image according to the refresh frequency of the display screen;
  • Figure 7 is a flow chart of an image processing method according to an exemplary embodiment
  • Figure 8 is a block diagram of an image processing device according to an exemplary embodiment
  • FIG. 9 is a block diagram of an image processing device according to an exemplary embodiment.
  • an image processing method is provided, which is applied to a terminal.
  • the terminal includes an electronic device with an image capturing function, such as a smartphone or an iPad.
  • Figure 1 is a flow chart of an image processing method according to an exemplary embodiment. The processing method includes the following steps:
  • Step S101 Obtain multiple frames of first images of the target object, where the shooting parameters of the multiple frames of first images are different;
  • Step S102 Obtain a second image from the first image, where the second image includes an image whose moiré related parameters meet preset conditions;
  • Step S103 Generate an image of the target object based on the second image.
  • An image processing method is provided that does not require adding additional hardware to the camera and does not affect the image generation time. Based on different shooting parameters, multiple frames of first images of the target object are obtained, and a second image, which includes an image whose moiré-related parameters meet preset conditions, is obtained from the first images; an image of the target object is then generated based on the second image, and the generated image is an image in which moiré is weakened.
  • In step S101, if the target object includes an object that can produce moiré patterns, such as a computer display screen or a projector canvas, which has a spatial frequency similar to that of the image sensor, an obvious moiré phenomenon will appear in the acquired image when the target object is photographed.
  • The target object is photographed based on different shooting parameters to obtain multiple frames of first images of the target object. For example, by changing the shooting parameters, images of the target object under the different shooting parameters are obtained respectively; these are the multiple frames of first images. Shooting multiple frames of first images with different shooting parameters can change the display effect of the target object on the image sensor, making the spatial frequencies of the two different and weakening the moiré pattern.
  • a second image is selected from the first image according to the characteristics of the moiré pattern, such as energy characteristics or distribution characteristics of pixels.
  • the second image includes an image whose moiré pattern related parameters meet the preset conditions.
  • The moiré-related parameters are parameters that can characterize the moiré phenomenon, and the preset conditions are conditions by which it can be determined, based on the moiré-related parameters, that the moiré in the image is weak.
  • the moiré pattern-related parameter is the intensity of the moiré pattern
  • the intensity of the moiré pattern is represented by the energy of the high-frequency pixels in the image.
  • the preset condition is that the moiré pattern intensity is less than or equal to the preset intensity threshold.
  • The preset intensity threshold can be set according to actual needs. An image whose moiré-related parameters meet the preset conditions is obtained from the first images, that is, an image with weak moiré intensity is selected from the first images as the second image.
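  As an illustration, the intensity measure described above (moiré intensity represented by the energy of high-frequency pixels, compared against a preset threshold) can be sketched in Python/NumPy. This is not the disclosure's actual code; the normalized cutoff radius and the threshold value below are hypothetical choices for the sketch.

```python
import numpy as np

def moire_intensity(image: np.ndarray, cutoff: float = 0.25) -> float:
    """Moire intensity proxy: fraction of spectral energy at high
    spatial frequencies (beyond a normalized radial cutoff)."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    energy = np.abs(spectrum) ** 2
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # normalized radial distance of each frequency bin from the center
    r = np.hypot((yy - h / 2) / h, (xx - w / 2) / w)
    return float(energy[r > cutoff].sum() / energy.sum())

def meets_preset_condition(image: np.ndarray, threshold: float = 0.05) -> bool:
    # preset condition: moire intensity <= preset intensity threshold
    return moire_intensity(image) <= threshold
```

  Frames passing `meets_preset_condition` would be kept as second-image candidates.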
  • the acquired second image is an image with weak moiré intensity, so the image of the target object generated based on the second image is also an image with weak moiré intensity.
  • For example, the second image can be deblurred and the image with the highest definition selected as the image of the target object, or the image with the highest definition among the second images can be selected for deblurring to generate the image of the target object.
  • multiple frames of first images of the target object are acquired based on different shooting parameters, and a second image is acquired from the first image.
  • the second image includes an image whose moiré-related parameters meet preset conditions, and an image of the target object is generated based on the second image, without adding additional hardware or algorithms.
  • Figure 2 exemplarily shows a flow chart of a method for obtaining multiple frames of first images of a target object in step S101:
  • Step S201 Obtain the focus parameters of the target object in the focus state
  • Step S202 Obtain multiple shooting parameters according to the focus parameters, where the shooting parameters include focus parameters;
  • Step S203 Acquire multiple first images based on multiple shooting parameters.
  • the first images include focus images acquired under focus parameters and multi-frame out-of-focus images acquired under other shooting parameters.
  • The terminal receives the shooting instruction and obtains the focus parameters of the target object in the focused state based on the focus position at which the user shoots the target object. The focus parameters are then adjusted to obtain multiple shooting parameters, that is, multiple different shooting focal lengths.
  • the focus parameter in the focus state is a1.
  • multiple shooting parameters a1+1, a1+2, a1+3, etc. of different focal lengths are obtained.
  • The first images are multiple images shot at the different shooting focal lengths, including the focus image acquired under the focus parameters and the images acquired under the other shooting parameters; the images captured under the other shooting parameters are captured in an out-of-focus state.
  • the image obtained in the out-of-focus state is the image taken at different shooting focal lengths after adjusting the focus parameters, and the image in the out-of-focus state is a blurred image.
  • Multiple images with different degrees of defocus can also be obtained by acquiring subsequent frames based on the focus position.
  • That is, after the current frame image is obtained for the current focus position when the shooting instruction is triggered, multiple frames of images under different shooting parameters, that is, multiple frames with different degrees of defocus, are randomly selected from the frames following the current frame image.
  • the multiple first images acquired include both clear images acquired under the focus parameters and blurred images acquired in multiple frames in an out-of-focus state.
  • the image acquired in the out-of-focus state changes the display effect of the ambient light source on the camera's image sensor, causing the spatial frequencies of the ambient light source and the camera sensor to be different, thereby weakening the moiré pattern.
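  The bracketing of shooting parameters around the in-focus value (the a1, a1+1, a1+2, a1+3 example above) can be sketched as follows. `camera_capture` is a hypothetical callable standing in for the terminal's camera driver, and the offsets are illustrative.

```python
def bracket_shooting_parameters(focus_param, offsets=(1, 2, 3)):
    """Step S202: the in-focus parameter plus defocused variants,
    mirroring the a1, a1+1, a1+2, a1+3 example."""
    return [focus_param] + [focus_param + d for d in offsets]

def capture_first_images(camera_capture, focus_param):
    """Step S203: one frame per shooting parameter. Index 0 is the
    in-focus frame; the rest are out-of-focus frames."""
    return [camera_capture(p) for p in bracket_shooting_parameters(focus_param)]
```

  With a real driver, each captured frame would be tagged with its shooting parameter so the focus image can later be distinguished from the out-of-focus frames.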
  • Figure 3 exemplarily shows a flow chart of the method of obtaining the second image from the first image in step S102:
  • Step S301 Convert each frame of the first image into a corresponding spectrogram
  • Step S302 Determine the third image in which the distribution state of the pixel point set in the spectrogram conforms to the preset distribution state, and the brightness value of the pixel point set is greater than the first preset threshold;
  • Step S303 Use the image other than the third image in the first image as the second image.
  • the target object includes an object with a spatial frequency similar to that of the image sensor, such as a display screen or a projector canvas
  • Each frame of the first image is converted into a corresponding spectrogram. For example, each frame of the first image undergoes a Fourier transform to obtain the spectral information of the image, thereby converting it into a spectrogram.
  • the preset distribution state can be any distribution state that can represent the moiré phenomenon, for example, a set of pixels with a brightness value greater than the first preset threshold is centrally symmetrically distributed.
  • If the spectrogram corresponding to a first image includes a set of pixel points with brightness values greater than the first preset threshold, and the distribution of that pixel point set conforms to the preset distribution state, this indicates that the moiré intensity is strong in the image corresponding to the spectrogram; an image that conforms to this feature is used as the third image.
  • the third image needs to be eliminated from the first image.
  • the image other than the third image in the first image is used as the second image, that is, the second image is an image with weaker moiré intensity in the first image.
  • the second image may include at least one frame of an out-of-focus image. For example, multiple frames of out-of-focus images may be included.
  • By converting the first images into spectrograms, it is easier to identify, based on the characteristics of the moiré pattern, the third image with obvious moiré among the first images, and then to determine the images other than the third image in the first images as the second image.
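  A minimal sketch of steps S301 through S303: transform each frame to a spectrogram and flag frames whose spectrum shows bright, centrally symmetric pixel sets away from the DC term. The brightness threshold (mean plus k standard deviations) and the radial exclusion `r_min` are hypothetical choices, and the sketch assumes 2-D grayscale images with even dimensions.

```python
import numpy as np

def is_third_image(image: np.ndarray, k: float = 6.0, r_min: float = 0.1) -> bool:
    """Return True when the frame's spectrogram contains bright,
    centrally symmetric pixel sets (strong moire), i.e. a 'third image'."""
    mag = np.log1p(np.abs(np.fft.fftshift(np.fft.fft2(image))))
    h, w = mag.shape
    mag[h // 2, w // 2] = 0.0  # suppress the DC term
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot((yy - h / 2) / h, (xx - w / 2) / w)
    peaks = (mag > mag.mean() + k * mag.std()) & (r > r_min)
    # a real image's spectrum is point-symmetric about the center, so
    # genuine moire peaks appear in centrally symmetric pairs
    symmetric = peaks & np.roll(peaks[::-1, ::-1], (1, 1), axis=(0, 1))
    return bool(symmetric.any())

def remove_third_images(first_images):
    """Step S303: the images other than the third images are the second images."""
    return [img for img in first_images if not is_third_image(img)]
```

  In this sketch a striped (moiré-like) frame is flagged as a third image, while a smooth frame survives as a second-image candidate.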
  • Figure 4 schematically shows a flowchart of a method for generating an image of a target object based on the second image in step S103, as shown in Figure 4:
  • Step S401 Select, from the images other than the third image in the first images, the image whose sharpness is greater than the second preset threshold as the second image;
  • Step S402 Generate an image of the target object based on the second image and the focus image.
  • the remaining images may include multiple out-of-focus images
  • the image whose sharpness is greater than the second preset threshold is used as the second image.
  • Clarity can be represented by sharpness, that is, an image whose sharpness is greater than the second preset threshold is selected from the remaining images as the second image, and the second preset threshold can be set according to actual needs.
  • the second image may be a clearer image of multiple frames, or the image with the greatest sharpness, that is, the clearest image, among the second images.
  • the focused image is a clear image in a focused state
  • the second image is an image taken in an out-of-focus state.
  • the clarity of the second image needs to be improved. Therefore, according to the second image and the focused image, that is, the clear image obtained in the focused state, the clarity of the second image is improved through related algorithms to generate the image of the target object.
  • Generating an image of the target object based on the image whose clarity is greater than the second preset threshold can weaken the moiré of the generated image while ensuring the clarity of the generated image.
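  The sharpness screening in step S401 can be sketched with a variance-of-Laplacian score, a common sharpness proxy; the disclosure does not specify the metric, and the threshold below is hypothetical.

```python
import numpy as np

def sharpness(image: np.ndarray) -> float:
    """Variance-of-Laplacian sharpness score (higher = sharper),
    using a circular 4-neighbor Laplacian."""
    lap = (-4.0 * image
           + np.roll(image, 1, axis=0) + np.roll(image, -1, axis=0)
           + np.roll(image, 1, axis=1) + np.roll(image, -1, axis=1))
    return float(lap.var())

def select_second_images(candidates, second_preset_threshold):
    """Step S401: keep the remaining frames whose sharpness exceeds
    the second preset threshold."""
    return [img for img in candidates if sharpness(img) > second_preset_threshold]
```

  A flat (blurred) frame scores near zero, while a frame with strong edges scores high, so only sufficiently sharp out-of-focus frames survive as the second image.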
  • Figure 5 exemplarily shows a flow chart of a method for generating an image of a target object based on the second image and the focus image:
  • Step S501 Obtain the initial point spread function corresponding to the second image and the focused image
  • Step S502 Determine the point spread function used to generate the image of the target object based on the initial point spread function
  • Step S503 Generate an image of the target object based on the point spread function, the second image and the focus image.
  • the initial point spread function (PSF, Point Spread Function) is a preset value, which is obtained according to the mapping relationship table between blurred images and clear images pre-stored in the terminal.
  • For example, the blurred image closest to the second image is determined according to the shooting parameters, and the corresponding initial point spread function is obtained.
  • When the mapping relationship table contains a blurred image whose shooting parameters are the same as those of the second image, the point spread function corresponding to that blurred image is determined as the point spread function used to generate the image of the target object. When no blurred image in the mapping relationship table has shooting parameters identical to those of the second image, all point spread functions in the mapping relationship table are applied to the second image, and the point spread function corresponding to the clearest of the resulting images is determined as the point spread function used to generate the image of the target object.
  • For example, the mapping relationship table includes the following two sets of correspondences: shooting parameter m1 corresponds to point spread function PSF1, and shooting parameter m2 corresponds to point spread function PSF2, where m1 < m2, and it has been determined which of the above two sets of correspondences has shooting parameters closest to the shooting parameters of the second image.
  • If the shooting parameter of the second image is m1, the point spread function used to generate the image of the target object is PSF1. If the shooting parameter of the second image is m3, where m1 < m3 < m2 and m3 is closer to m1, the point spread function PSF1 is used to generate the image of the target object.
  • Alternatively, PSF1 and PSF2 are both applied to the second image to obtain image 1 and image 2 respectively. If the sharpness of image 2 is greater than that of image 1, the point spread function used to generate the image of the target object is PSF2.
  • the point spread function is a convolution matrix that can simulate the image from a clear state to an out-of-focus state.
  • For example, Blur_Image = Deblur_Image ∗ PSF + Noise, where Blur_Image and Deblur_Image are respectively the blurred image in the out-of-focus state and the clear image in the focused state of the same target object, and Noise is the noise term. That is, the blurred image is obtained by convolving the clear image with the PSF at each pixel in the spatial domain, so a clear image can be obtained by deconvolving the out-of-focus blurred image with the PSF. Therefore, based on the point spread function used to generate the image of the target object, the second image and the focused image, the image of the target object can be generated.
  • Deblurring the second image through a point spread function can make the generated image of the target object clearer.
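  The relation Blur_Image = Deblur_Image ∗ PSF + Noise and its inversion can be sketched with FFT-based circular convolution and a Wiener-style regularized deconvolution. The Wiener filter here is a stand-in for the "related algorithms" the disclosure leaves unspecified, and `eps` is a hypothetical regularization constant.

```python
import numpy as np

def blur_with_psf(clear: np.ndarray, psf: np.ndarray) -> np.ndarray:
    """Blur_Image = Deblur_Image convolved with the PSF
    (circular convolution, noise-free for the sketch)."""
    return np.real(np.fft.ifft2(np.fft.fft2(clear) * np.fft.fft2(psf, clear.shape)))

def deblur_with_psf(blurred: np.ndarray, psf: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Recover the clear image by regularized (Wiener-style)
    deconvolution of the blurred image with the PSF."""
    H = np.fft.fft2(psf, blurred.shape)
    wiener = np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * wiener))
```

  With a known PSF and negligible noise, the deconvolution recovers the clear image almost exactly; with real sensor noise, `eps` trades restored sharpness against noise amplification.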
  • the image processing method when the target object includes a display screen, the image processing method further includes: determining the corresponding initial point spread function between the out-of-focus image and the in-focus image according to the refresh frequency of the display screen. As shown in Figure 6, it includes the following steps:
  • Step S601 Shoot a clear image at a set refresh frequency, and record the focus parameters when shooting the clear image;
  • a clear image is the image with the highest definition, regardless of the presence of moiré in the image.
  • Step S602 Set different focus parameters, actively adjust the focus parameters, and obtain a blurred image in an out-of-focus state;
  • Step S603 Restore each out-of-focus blurred image into a clear image through different point spread functions, and record the point spread function corresponding to the image with the highest definition among the restored images as the point spread function corresponding to the blurred image;
  • Step S604 Repeat step S603, record the point spread function of the blurred image corresponding to different focus parameters and restore it to a clear image, which is the corresponding initial point spread function between the out-of-focus image and the focused image;
  • Step S605 Replace a display screen with a different refresh frequency and repeat steps S601 to S604.
  • the image processing method further includes the following steps:
  • Step S701 Receive an instruction to capture an image
  • Step S702 Determine whether the preview image of the target object includes an area where the brightness value changes
  • Step S703 When the area where the brightness value changes is included, within a preset time period, obtain multiple brightness values at different time points in the area where the brightness value changes;
  • Step S704 Determine the brightness change rate according to multiple brightness values
  • Step S705 If the brightness change rate is the preset brightness change rate, determine that the target object includes a display screen.
  • After the terminal receives the instruction to capture an image, it needs to determine whether the target object includes a display screen. When the target object does not include a display screen, there is no moiré problem in the captured image, so the normal shooting mode can be used. The moiré problem in captured images occurs only when the target object includes a display screen, so in that case a shooting mode that weakens moiré needs to be used.
  • The preset time period is, for example, 3 seconds.
  • Based on the multiple brightness values, the brightness change rate, that is, the refresh frequency of the target object's light source, is determined. It is then determined whether the brightness change rate is the preset brightness change rate, that is, whether the refresh rate is the refresh frequency of a display screen. If it is the refresh frequency of a display screen, the target object includes a display screen, and an image capture mode that weakens moiré needs to be used.
  • the ambient light is sampled through the anti-flicker sensor (Flicker Sensor) in the terminal to obtain the refresh frequency of the target object's light source.
  • the measurement frequency range of the anti-flicker sensor is 1 Hz to 1 kHz.
  • the refresh frequency of the display screen includes 60Hz, 75Hz, 120Hz, 144Hz, 240Hz. When the refresh frequency of the light source collected by the sensor is the refresh frequency of the display screen, for example, 75Hz, it is determined that the target object includes the display screen.
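  Steps S703 through S705 can be sketched by estimating the dominant frequency of the sampled brightness values and matching it against the known display refresh frequencies. The sampling rate, window length, and matching tolerance below are hypothetical; in practice the anti-flicker sensor supplies the brightness samples.

```python
import numpy as np

SCREEN_REFRESH_RATES_HZ = (60.0, 75.0, 120.0, 144.0, 240.0)

def dominant_frequency(samples: np.ndarray, sample_rate_hz: float) -> float:
    """Brightness change rate: the strongest frequency in the brightness
    samples collected over the preset time period."""
    spectrum = np.abs(np.fft.rfft(samples - samples.mean()))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate_hz)
    return float(freqs[int(np.argmax(spectrum))])

def includes_display_screen(samples, sample_rate_hz, tol_hz=2.0):
    """True when the brightness change rate matches a known
    display-screen refresh frequency."""
    f = dominant_frequency(np.asarray(samples, dtype=float), sample_rate_hz)
    return any(abs(f - r) <= tol_hz for r in SCREEN_REFRESH_RATES_HZ)
```

  A 75 Hz flicker is classified as a display screen, while a 50 Hz flicker (for example, mains-powered lighting) is not, so the terminal can fall back to the normal shooting mode.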
  • the terminal can select a shooting mode according to the actual situation of the target object, which can save power consumption to a certain extent.
  • Multiple frames of first images of the target object are acquired based on different shooting parameters, images whose moiré-related parameters meet preset conditions are selected from the first images, and an image with higher definition is then selected from the selected images as the second image. An image of the target object can be generated based on the second image, the clear image in the focused state, and the point spread function used to generate the image of the target object. This indirectly removes moiré patterns in images generated by shooting a screen, through the blur effect of active defocusing, without adding additional hardware structures or algorithms.
  • an image processing device which is applied to a terminal. As shown in Figure 8, the image processing device includes:
  • the acquisition module 801 is configured to acquire multiple frames of first images of the target object, where the multiple frames of the first images have different shooting parameters;
  • the selection module 802 is configured to obtain a second image from the first image, where the second image includes an image whose moiré-related parameters meet the first preset condition;
  • the generation module 803 is configured to generate an image of the target object according to the second image.
  • FIG. 9 is a block diagram of an image processing device 900 according to an exemplary embodiment.
  • the device 900 may include one or more of the following components: a processing component 902, a memory 904, a power supply component 906, a multimedia component 908, an audio component 910, an input/output (I/O) interface 912, a sensor component 914, and communications component 916.
  • Processing component 902 generally controls the overall operations of device 900, such as operations associated with display, phone calls, data communications, camera operations, and recording operations.
  • the processing component 902 may include one or more processors 920 to execute instructions to complete all or part of the steps of the above method.
  • processing component 902 may include one or more modules that facilitate interaction between processing component 902 and other components.
  • processing component 902 may include a multimedia module to facilitate interaction between multimedia component 908 and processing component 902.
  • Memory 904 is configured to store various types of data to support operations at device 900 . Examples of such data include instructions for any application or method operating on device 900, contact data, phonebook data, messages, pictures, videos, etc.
  • Memory 904 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
  • Power supply component 906 provides power to various components of device 900.
  • Power supply components 906 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power to device 900 .
  • Multimedia component 908 includes a screen that provides an output interface between the device 900 and the user.
  • the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide action.
  • multimedia component 908 includes a front-facing camera and/or a rear-facing camera.
  • the front camera and/or the rear camera may receive external multimedia data.
  • Each front-facing camera and rear-facing camera can be a fixed optical lens system or have a focal length and optical zoom capabilities.
  • Audio component 910 is configured to output and/or input audio signals.
  • audio component 910 includes a microphone (MIC) configured to receive external audio signals when device 900 is in operating modes, such as call mode, recording mode, and speech recognition mode. The received audio signals may be further stored in memory 904 or sent via communications component 916 .
  • audio component 910 also includes a speaker for outputting audio signals.
  • the I/O interface 912 provides an interface between the processing component 902 and a peripheral interface module, which may be a keyboard, a click wheel, a button, etc. These buttons may include, but are not limited to: Home button, Volume buttons, Start button, and Lock button.
  • Sensor component 914 includes one or more sensors for providing various aspects of status assessment for device 900 .
  • For example, the sensor component 914 can detect the open/closed state of the device 900 and the relative positioning of components, such as the display and keypad of the device 900. The sensor component 914 can also detect a change in position of the device 900 or a component of the device 900, the presence or absence of user contact with the device 900, the orientation or acceleration/deceleration of the device 900, and temperature changes of the device 900.
  • Sensor assembly 914 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • Sensor assembly 914 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor component 914 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 916 is configured to facilitate wired or wireless communication between apparatus 900 and other devices.
  • Device 900 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof.
  • the communication component 916 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel.
  • the communications component 916 also includes a near field communications (NFC) module to facilitate short-range communications.
  • the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
  • The apparatus 900 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, for executing the above method.
  • A non-transitory computer-readable storage medium including instructions, such as the memory 904 including instructions, is also provided; the instructions are executable by the processor 920 of the apparatus 900 to perform the above method.
  • the non-transitory computer-readable storage medium may be ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
  • A non-transitory computer-readable storage medium is provided; when the instructions in the storage medium are executed by a processor of a device, the device is enabled to perform an image processing method that includes any of the methods above.
  • The image processing method in this disclosure requires no additional hardware or processing algorithms, can weaken the moiré phenomenon when photographing a display screen, and improves the user experience.


Abstract

This disclosure relates to an image processing method, apparatus, and storage medium. The image processing method includes: acquiring multiple frames of first images of a target object, the multiple frames of first images having different shooting parameters; acquiring a second image from the first images, the second image including an image whose moiré-related parameter meets a preset condition; and generating an image of the target object according to the second image.

Description

Image processing method, apparatus, and storage medium — Technical Field
The present disclosure relates to the field of image processing technology, and in particular to an image processing method, apparatus, and storage medium.
Background
With the popularization of mobile terminals, capturing image and text information displayed on a screen (for example, a computer monitor or a projector canvas) with a terminal camera has become a habit for most users. However, because a screen, like the image sensor of a terminal camera, is physically composed of individual pixels, and the spatial frequencies of the two are close, a large amount of moiré appears in the captured image. This severely degrades image quality, makes the key information in the captured image unrecognizable, and seriously harms the user experience.
Summary
To overcome the problems in the related art, the present disclosure provides an image processing method, an image processing apparatus, and a storage medium.
According to a first aspect of the embodiments of the present disclosure, an image processing method applied to a terminal is provided. The processing method includes:
acquiring multiple frames of first images of a target object, the multiple frames of first images having different shooting parameters;
acquiring a second image from the first images, the second image including an image whose moiré-related parameter meets a preset condition; and
generating an image of the target object according to the second image.
In an exemplary embodiment, the acquiring multiple frames of first images of a target object includes:
acquiring a focus parameter of the target object in a focused state;
obtaining multiple shooting parameters according to the focus parameter, the shooting parameters including the focus parameter; and
acquiring multiple first images based on the multiple shooting parameters, the first images including a focused image acquired under the focus parameter and multiple frames of defocused images acquired under the other shooting parameters.
In an exemplary embodiment, the acquiring a second image from the first images includes:
converting each frame of the first images into a corresponding spectrum image;
determining a third image in which the distribution of a pixel point set in the spectrum image conforms to a preset distribution, the brightness values of the pixel point set being greater than a first preset threshold; and
taking the images other than the third image among the first images as the second image.
In an exemplary embodiment, the preset distribution includes the pixel point set being centrally symmetric.
In an exemplary embodiment, the generating an image of the target object according to the second image includes:
selecting, from the images other than the third image among the first images, an image whose sharpness is greater than a second preset threshold as the second image; and
generating the image of the target object according to the second image and the focused image.
In an exemplary embodiment, the generating the image of the target object according to the second image and the focused image includes:
acquiring initial point spread functions corresponding to the second image and the focused image;
determining, according to the initial point spread functions, a point spread function for generating the image of the target object; and
generating the image of the target object based on the point spread function, the second image, and the focused image.
In an exemplary embodiment, the target object includes a display screen, and the processing method further includes:
determining, according to the refresh rate of the display screen, the initial point spread function corresponding between the defocused images and the focused image.
In an exemplary embodiment, the processing method further includes:
receiving an instruction to capture an image;
determining whether a preview image of the target object includes a region whose brightness value changes;
when a region whose brightness value changes is included, acquiring, within a preset duration, multiple brightness values of the region at different time points;
determining a brightness change rate according to the multiple brightness values; and
if the brightness change rate is a preset brightness change rate, determining that the target object includes a display screen.
In some exemplary embodiments, the processing method further includes: determining the refresh rate of the display screen according to the brightness change rate.
According to a second aspect of the embodiments of the present disclosure, an image processing apparatus applied to a terminal is provided. The processing apparatus includes:
an acquisition module configured to acquire multiple frames of first images of a target object, the multiple frames of first images having different shooting parameters;
a selection module configured to acquire a second image from the first images, the second image including an image whose moiré-related parameter meets a first preset condition; and
a generation module configured to generate an image of the target object according to the second image.
According to a third aspect of the embodiments of the present disclosure, an image processing apparatus is provided, including:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the method according to any one of the first aspect of the embodiments of the present disclosure.
According to a fourth aspect of the embodiments of the present disclosure, a non-transitory computer-readable storage medium is provided; when the instructions in the storage medium are executed by a processor of an apparatus, the apparatus is enabled to perform the method according to any one of the first aspect of the embodiments of the present disclosure.
The above method of the present disclosure has the following beneficial effects: without adding extra hardware or processing algorithms, multiple frames of first images of a target object are acquired under different shooting parameters; a second image, including an image whose moiré-related parameter meets a preset condition, is selected from the first images; and an image of the target object is generated according to the second image. The generated image is an image in which the moiré is weakened, which improves the user's experience when photographing a display screen.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief Description of the Drawings
The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
Fig. 1 is a flowchart of an image processing method according to an exemplary embodiment;
Fig. 2 exemplarily shows a flowchart of the method for acquiring multiple frames of first images of the target object in step S101;
Fig. 3 exemplarily shows a flowchart of the method for acquiring the second image from the first images in step S102;
Fig. 4 exemplarily shows a flowchart of the method for generating the image of the target object according to the second image in step S103;
Fig. 5 exemplarily shows a flowchart of the method for generating the image of the target object according to the second image and the focused image;
Fig. 6 exemplarily shows a flowchart of the method for determining, according to the refresh rate of the display screen, the initial point spread function corresponding between the defocused images and the focused image;
Fig. 7 is a flowchart of an image processing method according to an exemplary embodiment;
Fig. 8 is a block diagram of an image processing apparatus according to an exemplary embodiment;
Fig. 9 is a block diagram of an image processing apparatus according to an exemplary embodiment.
Detailed Description
Exemplary embodiments will be described in detail here, examples of which are shown in the accompanying drawings. When the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
In the related art, the following three methods are usually used to weaken the moiré that appears when a terminal photographs a screen:
1. adding a low-pass filter behind the camera lens to reduce the high-frequency information of the environment imaged on the image sensor, thereby reducing the risk that the spatial frequency of the image sensor is close to that of the environment;
2. building a neural network to identify whether moiré exists in an image, extracting the moiré regions, and interpolating over the moiré pixels using the normal pixels around them;
3. acquiring images of the same scene with and without moiré, building a neural network, and obtaining a trained model for processing.
However, the above moiré-weakening methods respectively have the following problems:
1. increased hardware cost;
2. an extra algorithm is added, which affects the image generation time, and building the neural network model requires a large amount of image data;
3. a large number of comparisons between images with moiré and images without moiré are required.
In an exemplary embodiment of the present disclosure, an image processing method applied to a terminal is provided, where the terminal includes an electronic device with an image capture function, such as a smartphone or an iPad. Fig. 1 is a flowchart of an image processing method according to an exemplary embodiment. The processing method includes the following steps:
Step S101: acquiring multiple frames of first images of a target object, the multiple frames of first images having different shooting parameters;
Step S102: acquiring a second image from the first images, the second image including an image whose moiré-related parameter meets a preset condition;
Step S103: generating an image of the target object according to the second image.
In an exemplary embodiment of the present disclosure, to overcome the problems in the related art, an image processing method is provided that requires no extra camera hardware and does not affect the image generation time. Multiple frames of first images of the target object are acquired based on different shooting parameters; a second image is acquired from the first images, the second image including an image whose moiré-related parameter meets a preset condition; and an image of the target object, in which the moiré is weakened, is generated according to the second image.
In step S101, if the target object includes an object that can produce moiré, for example a computer monitor, a projector canvas, or another object whose spatial frequency is close to that of the image sensor, an obvious moiré pattern will appear in the image acquired when photographing the target object. To weaken the moiré, the target object is photographed under different shooting parameters to acquire multiple frames of first images; for example, by changing the shooting parameters, images of the target object are acquired under each shooting parameter in turn, yielding the multiple frames of first images. Capturing multiple frames of first images under different shooting parameters changes the way the target object is rendered on the image sensor, so that the spatial frequencies of the two differ and the moiré is weakened.
In step S102, a second image is selected from the first images according to the characteristics of moiré, for example its energy characteristics or the distribution characteristics of its pixels. The second image includes an image whose moiré-related parameter meets a preset condition; the moiré-related parameter is a parameter capable of characterizing the moiré phenomenon, and the preset condition is a condition under which, according to the moiré-related parameter, the moiré in the image can be judged to be weak. For example, when the moiré-related parameter is the moiré intensity, the moiré intensity is characterized by the energy of high-frequency pixels in the image, and the preset condition is that the moiré intensity is less than or equal to a preset intensity threshold, which can be set according to actual needs. Acquiring, from the first images, an image whose moiré-related parameter meets the preset condition means selecting, from the first images, an image with weak moiré intensity as the second image.
In step S103, the acquired second image is an image with weak moiré intensity; therefore, the image of the target object generated from the second image also has weak moiré intensity. When generating the image of the target object from the second image, the second images may be deblurred and the sharpest result selected as the image of the target object, or the sharpest of the second images may first be selected and then deblurred to generate the image of the target object.
In this exemplary embodiment of the present disclosure, multiple frames of first images of the target object are acquired under different shooting parameters, a second image including an image whose moiré-related parameter meets a preset condition is acquired from the first images, and an image of the target object is generated according to the second image. Without adding extra hardware or algorithms, changing the shooting parameters changes how the target object is rendered, thereby weakening the moiré; from the frames with weaker moiré among the multiple first images, an image of the target object with weak moiré intensity can be obtained.
In an exemplary embodiment, as shown in Fig. 2, Fig. 2 exemplarily shows a flowchart of the method for acquiring multiple frames of first images of the target object in step S101:
Step S201: acquiring a focus parameter of the target object in a focused state;
Step S202: obtaining multiple shooting parameters according to the focus parameter, the shooting parameters including the focus parameter;
Step S203: acquiring multiple first images based on the multiple shooting parameters, the first images including a focused image acquired under the focus parameter and multiple frames of defocused images acquired under the other shooting parameters.
When the target object includes a display screen, a projector canvas, or another object whose spatial frequency is close to that of the image sensor, the terminal receives a shooting instruction and, according to the focus position at which the user photographs the target object, acquires the focus parameter for photographing the target object in the focused state. The focus parameter is then adjusted to obtain multiple shooting parameters, i.e., multiple different shooting focal lengths. For example, if the focus parameter in the focused state is a1, multiple shooting parameters with different focal lengths, such as a1+1, a1+2, and a1+3, are obtained by adjusting a1. When multiple first images are acquired under these shooting parameters, the first images are images captured at different focal lengths, including the focused image acquired under the focus parameter and multiple frames of defocused images acquired under the other shooting parameters. Since the image acquired under the focus parameter is captured at the focal length corresponding to the focus position, it is a sharp image. Since the other shooting parameters are obtained by adjusting the focus parameter, the images captured under them are captured in a defocused state at different focal lengths, and such defocused images are blurred images.
In one example, when adjusting the focus parameter, multiple images with different degrees of defocus may be acquired based on the focus position by taking subsequent frames: with the frame acquired at the current focus position when the shooting instruction is triggered as the current frame, multiple frames captured under different shooting parameters after the current frame, i.e., multiple frames with different degrees of defocus, are selected at random.
By adjusting the focus parameter, the multiple acquired first images include both the sharp image acquired under the focus parameter and multiple frames of blurred images acquired in the defocused state. For the images acquired in the defocused state, the way the ambient light source is rendered on the camera's image sensor is changed, so that the spatial frequencies of the ambient light source and the camera sensor differ, thereby weakening the moiré.
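The bracketing described above, deriving shooting parameters a1+1, a1+2, … from the in-focus parameter a1, can be sketched in a few lines. This is an illustrative sketch, not the patented implementation; the function name and the fixed offsets are assumptions.

```python
def bracket_focus(focus, offsets=(0, 1, 2, 3)):
    """Derive a list of shooting parameters from the in-focus value:
    offset 0 yields the focused frame, the rest yield defocused frames."""
    return [focus + d for d in offsets]

print(bracket_focus(10))  # [10, 11, 12, 13]
```

In practice the offsets would be chosen so that the defocus blur is strong enough to break the spatial-frequency match with the screen but still recoverable by deblurring.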
In an exemplary embodiment, as shown in Fig. 3, Fig. 3 exemplarily shows a flowchart of the method for acquiring the second image from the first images in step S102:
Step S301: converting each frame of the first images into a corresponding spectrum image;
Step S302: determining a third image in which the distribution of a pixel point set in the spectrum image conforms to a preset distribution, the brightness values of the pixel point set being greater than a first preset threshold;
Step S303: taking the images other than the third image among the first images as the second image.
When the target object includes a display screen, a projector canvas, or another object whose spatial frequency is close to that of the image sensor, after the multiple frames of defocused first images are acquired, each frame of the first images is converted into a corresponding spectrum image; for example, a Fourier transform is applied to each frame to obtain its spectral information. It is then determined whether the spectrum image includes a set of pixels whose brightness values are greater than the first preset threshold, and whether the distribution of that pixel set conforms to a preset distribution. The brightness value reflects the energy of each pixel in the spectrum image, and the energy and distribution of the pixels whose brightness values exceed the first preset threshold reflect the moiré intensity, which makes it possible to find the frames with strong moiré among the first images. The preset distribution may be any distribution capable of characterizing the moiré phenomenon, for example the set of pixels with brightness values greater than the first preset threshold being centrally symmetric.
When the spectrum image corresponding to a first image includes a set of pixels whose brightness values are greater than the first preset threshold, and the distribution of that pixel set conforms to the preset distribution, the moiré in the corresponding image is strong, and an image matching this characteristic is taken as a third image. The third images need to be removed from the first images; the images other than the third images among the first images then serve as the second image, i.e., the second image consists of the frames with weaker moiré among the first images. The second image may include at least one frame of a defocused image, for example multiple frames of defocused images.
Converting the first images into spectrum images makes it easier to identify, according to the characteristics of moiré, the third images with obvious moiré among the first images, and thus to determine the images other than the third images as the second image.
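The spectrum-based screening of steps S301–S303 can be sketched as follows. This is a minimal illustration that assumes the 2D spectrum magnitudes have already been computed (e.g. by an FFT); the function names, the toy 5×5 grid, and the exact-mirror test for central symmetry are all assumptions, since the document leaves the symmetry test unspecified.

```python
def bright_points(spectrum, threshold):
    """Collect coordinates of spectrum pixels brighter than the threshold."""
    return {(r, c)
            for r, row in enumerate(spectrum)
            for c, v in enumerate(row)
            if v > threshold}

def is_centrally_symmetric(points, rows, cols):
    """True if every bright point has a mirror partner about the spectrum centre."""
    return all((rows - 1 - r, cols - 1 - c) in points for r, c in points)

def looks_like_moire(spectrum, threshold):
    """Flag a frame whose bright spectral peaks form a centre-symmetric pattern."""
    pts = bright_points(spectrum, threshold)
    rows, cols = len(spectrum), len(spectrum[0])
    # Require at least one off-centre peak: an empty set is trivially symmetric,
    # and the DC component alone does not indicate moiré.
    centre = ((rows - 1) / 2, (cols - 1) / 2)
    off_centre = {p for p in pts if p != centre}
    return bool(off_centre) and is_centrally_symmetric(pts, rows, cols)

# Toy 5x5 "spectrum": two mirrored peaks -> flagged as a third (moiré) image.
spec = [[0] * 5 for _ in range(5)]
spec[1][1] = spec[3][3] = 9
print(looks_like_moire(spec, threshold=5))  # True
```

Frames for which `looks_like_moire` returns True would play the role of the third images and be excluded; the remaining frames become the second image.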
In an exemplary embodiment, Fig. 4 exemplarily shows a flowchart of the method for generating the image of the target object according to the second image in step S103, as shown in Fig. 4:
Step S401: selecting, from the images other than the third image among the first images, an image whose sharpness is greater than a second preset threshold as the second image;
Step S402: generating the image of the target object according to the second image and the focused image.
After the images with obvious moiré are removed from the first images, since the remaining images may include multiple frames of defocused images, to further determine the second image, an image whose sharpness is greater than the second preset threshold is selected from the images other than the third image among the first images as the second image. Sharpness can be expressed by acutance, i.e., an image whose acutance is greater than the second preset threshold is selected from the remaining images as the second image, and the second preset threshold can be set according to actual needs. The second image may be multiple relatively sharp frames, or may be the frame with the highest acutance, i.e., the sharpest frame, among them.
The focused image is a sharp image captured in the focused state, while the second image is captured in the defocused state. When generating the image of the target object, the sharpness of the second image needs to be improved; therefore, according to the second image and the sharp focused image acquired when the first images were acquired, the sharpness of the second image is improved through a relevant algorithm to generate the image of the target object.
Generating the image of the target object from images whose sharpness is greater than the second preset threshold ensures the sharpness of the generated image while the moiré in it is weakened.
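A common proxy for the "sharpness greater than a second preset threshold" test in step S401 is the variance of a Laplacian response. The document does not prescribe a particular sharpness metric, so the metric, the function names, and the threshold below are assumptions.

```python
def laplacian_variance(img):
    """Variance of a 4-neighbour Laplacian response; higher means sharper edges."""
    rows, cols = len(img), len(img[0])
    resp = [4 * img[r][c] - img[r-1][c] - img[r+1][c] - img[r][c-1] - img[r][c+1]
            for r in range(1, rows - 1) for c in range(1, cols - 1)]
    mean = sum(resp) / len(resp)
    return sum((v - mean) ** 2 for v in resp) / len(resp)

def select_sharp_frames(frames, threshold):
    """Keep only frames whose sharpness score exceeds the threshold."""
    return [f for f in frames if laplacian_variance(f) > threshold]

flat = [[5] * 4 for _ in range(4)]          # featureless (very blurred) frame
edge = [[0, 0, 9, 9] for _ in range(4)]     # frame with a strong vertical edge
print(select_sharp_frames([flat, edge], threshold=10) == [edge])  # True
```

The threshold plays the role of the second preset threshold; in a real pipeline it would be tuned on representative captures rather than fixed.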
In an exemplary embodiment, as shown in Fig. 5, Fig. 5 exemplarily shows a flowchart of the method for generating the image of the target object according to the second image and the focused image:
Step S501: acquiring initial point spread functions corresponding to the second image and the focused image;
Step S502: determining, according to the initial point spread functions, a point spread function for generating the image of the target object;
Step S503: generating the image of the target object based on the point spread function, the second image, and the focused image.
The initial point spread function (PSF, Point Spread Function) is a preset value, obtained from a mapping table between blurred images and sharp images stored in advance in the terminal; the blurred image closest to the second image is determined according to the shooting parameters, and the corresponding initial point spread function is obtained.
The shooting parameters of the second image are determined. When the mapping table contains a blurred image whose shooting parameters at acquisition are the same as those of the second image, the point spread function corresponding to that blurred image is determined as the point spread function for generating the image of the target object. When the mapping table contains no blurred image whose shooting parameters are the same as those of the second image, all the point spread functions in the mapping table are fed into a corresponding machine-learning super-resolution algorithm, for example a deblurring algorithm, and the point spread function corresponding to the sharpest of the resulting images is determined as the point spread function for generating the image of the target object.
In one example, the mapping table includes the following two correspondences: shooting parameter m1 corresponds to point spread function PSF1, and shooting parameter m2 corresponds to point spread function PSF2, where m1 < m2, and the shooting parameters in these two correspondences have been determined to be the closest to those of the second image. When the shooting parameter of the second image is m1, the point spread function for generating the image of the target object is PSF1. When the shooting parameter of the second image is m3, with m1 < m3 < m2, point spread functions PSF1 and PSF2 are applied to the second image respectively, yielding image 1 and image 2; if image 2 is sharper than image 1, the point spread function for generating the image of the target object is PSF2.
The point spread function is a convolution kernel that can model the transition of an image from a sharp state to a defocused state. The formation of a blurred image acquired in the defocused state can be expressed in the spatial domain as: Blur_Image = Deblur_Image * PSF + Noise, where Blur_Image and Deblur_Image are, respectively, the blurred image of the same target object in the defocused state and the sharp image in the focused state, and Noise is noise. That is, the blurred image is obtained by convolving every pixel of the sharp image with the PSF in the spatial domain; conversely, the sharp image can be recovered by deconvolving the defocused blurred image with the PSF. Thus, based on the point spread function for generating the image of the target object, the second image, and the focused image, the image of the target object can be generated.
Deblurring the second image with the point spread function makes the generated image of the target object sharper.
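The table-lookup logic above can be sketched as follows. The names, the stand-in PSF identifiers, and the scoring callback are hypothetical; a real implementation would score each candidate PSF by actually deblurring the second image with it and measuring the sharpness of the result.

```python
def choose_psf(psf_table, shot_param, score):
    """Select a PSF: use the exact shooting-parameter entry when it exists,
    otherwise try every stored PSF and keep the one whose result scores sharpest."""
    if shot_param in psf_table:
        return psf_table[shot_param]
    return max(psf_table.values(), key=score)

# Hypothetical mapping table: shooting parameter -> PSF identifier
# (identifiers stand in for actual blur kernels).
table = {1: "PSF1", 2: "PSF2"}
scores = {"PSF1": 0.4, "PSF2": 0.9}  # pretend sharpness of each deblur result

print(choose_psf(table, 1, scores.get))  # exact match -> PSF1
print(choose_psf(table, 3, scores.get))  # no match -> sharpest result, PSF2
```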
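The relation Blur_Image = Deblur_Image * PSF + Noise and its inversion by deconvolution can be demonstrated in one dimension. This is a didactic sketch under the assumptions of a circular, noise-free blur and a PSF whose spectrum has no zeros; it uses a naive O(n²) DFT rather than an FFT, and real pipelines would use a regularised method such as Wiener filtering instead of the plain inverse filter.

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)).real / n
            for k in range(n)]

def circular_convolve(signal, psf):
    """Spatial-domain forward model: Blur = Sharp (circularly) convolved with PSF."""
    n = len(signal)
    return [sum(signal[(k - i) % n] * psf[i] for i in range(len(psf)))
            for k in range(n)]

def deconvolve(blurred, psf):
    """Invert Blur = Sharp * PSF in the frequency domain (noise-free inverse filter)."""
    n = len(blurred)
    psf_padded = psf + [0.0] * (n - len(psf))
    B, H = dft(blurred), dft(psf_padded)
    # Division fails where H is near zero; Wiener filtering handles that case.
    return idft([b / h for b, h in zip(B, H)])

sharp = [0.0, 1.0, 4.0, 1.0, 0.0, 0.0, 2.0, 0.0]
psf = [0.5, 0.25, 0.25]                      # normalised blur kernel
blurred = circular_convolve(sharp, psf)      # defocused observation
restored = deconvolve(blurred, psf)          # recovered sharp signal
print([abs(round(v, 3)) for v in restored])  # [0.0, 1.0, 4.0, 1.0, 0.0, 0.0, 2.0, 0.0]
```

The same identity in 2D is what lets the method recover a sharp, moiré-weakened image from a deliberately defocused frame once the right PSF has been chosen.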
In an exemplary embodiment, when the target object includes a display screen, the image processing method further includes: determining, according to the refresh rate of the display screen, the initial point spread function corresponding between the defocused images and the focused image. As shown in Fig. 6, this includes the following steps:
Step S601: at a set refresh rate, capturing a sharp image and recording the focus parameter at which the sharp image was captured;
The sharp image is the image with the highest sharpness, regardless of whether moiré exists in it.
Step S602: setting different focus parameters, actively adjusting the focus parameter, and acquiring blurred images in the defocused state;
Step S603: restoring each defocused blurred image into a sharp image with different point spread functions, and recording the point spread function corresponding to the sharpest of the restored images as the point spread function corresponding to that blurred image;
Step S604: repeating step S603 and recording, for the blurred images under the different focus parameters, the point spread functions that restore them into sharp images, namely the initial point spread functions corresponding between the defocused images and the focused image;
Step S605: switching to display screens with different refresh rates and repeating steps S601 to S604.
For display screens with different refresh rates, for example a 60 Hz display and a 90 Hz display, the initial point spread functions between the focused image and the defocused images under different shooting parameters are acquired respectively, as shown in Table 1, where the rows represent displays with different refresh rates and the columns represent different shooting parameters.
Table 1
PSF             Shooting parameter 1   Shooting parameter 2   ……   Shooting parameter N
Refresh rate 1  PSF11                  PSF12                  ……   PSF1n
Refresh rate 2  PSF21                  PSF22                  ……   PSF2n
……              ……                     ……                     ……   ……
In an exemplary embodiment of the present disclosure, as shown in Fig. 7, the image processing method further includes the following steps:
Step S701: receiving an instruction to capture an image;
Step S702: determining whether a preview image of the target object includes a region whose brightness value changes;
Step S703: when such a region is included, acquiring, within a preset duration, multiple brightness values of the region at different time points;
Step S704: determining a brightness change rate according to the multiple brightness values;
Step S705: if the brightness change rate is a preset brightness change rate, determining that the target object includes a display screen.
After receiving the instruction to capture an image, the terminal needs to determine whether the target object includes a display screen. When the target object does not include a display screen, there is no moiré problem in the captured image, so the normal shooting mode can be used; only when the target object includes a display screen does the moiré problem exist, in which case the moiré-weakening shooting mode needs to be used.
It is determined whether the preview image of the target object includes a region whose brightness value changes. When such a region exists, multiple brightness values of the region at different time points are acquired within a preset duration, for example 3 seconds. The brightness change rate, i.e., the refresh frequency of the target object's light source, is determined from the multiple brightness values, and it is then determined whether the brightness change rate is a preset brightness change rate, i.e., whether the refresh frequency is that of a display screen. If it is the refresh rate of a display screen, the target object includes a display screen, and the moiré-weakening capture mode needs to be used.
In one example, the ambient light is sampled by a flicker sensor in the terminal to obtain the refresh frequency of the target object's light source. The measurement frequency range of the flicker sensor is 1 Hz to 1 kHz, and display refresh rates include 60 Hz, 75 Hz, 120 Hz, 144 Hz, and 240 Hz. When the light-source refresh frequency measured by the sensor is a display refresh rate, for example 75 Hz, it is determined that the target object includes a display screen.
By determining whether the target object includes a display screen, the terminal can select the shooting mode according to the actual situation of the target object, which saves power to a certain extent.
In an exemplary embodiment of the present disclosure, multiple frames of first images of the target object are acquired under different shooting parameters; the images whose moiré-related parameters meet the preset condition are selected from the first images; from the selected images, an image with higher sharpness is then selected as the second image; and the image of the target object is generated according to the second image, the sharp image in the focused state, and the point spread function for generating the image of the target object. Without adding extra hardware or algorithms, the moiré in images generated when photographing a screen can be removed indirectly through the blur effect of active refocusing.
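The brightness-change-rate check of steps S701–S705 can be sketched by estimating the flicker frequency from sampled brightness values and comparing it against known display refresh rates. The sampling rate, the crossing-counting estimator, and the 2 Hz tolerance are assumptions for illustration; a flicker sensor would provide the samples in practice.

```python
import math

def estimate_flicker_hz(samples, sample_rate_hz):
    """Estimate the dominant flicker frequency by counting upward mean-level crossings."""
    mean = sum(samples) / len(samples)
    rising = sum(1 for a, b in zip(samples, samples[1:]) if a < mean <= b)
    duration_s = (len(samples) - 1) / sample_rate_hz
    return rising / duration_s

def matches_display_rate(freq_hz, known_rates=(60, 75, 120, 144, 240), tol_hz=2.0):
    """True if the measured flicker frequency is close to a known display refresh rate."""
    return any(abs(freq_hz - r) <= tol_hz for r in known_rates)

sample_rate = 1000  # 1 kHz sampling, within the stated 1 Hz..1 kHz sensor range
samples = [5 + math.sin(2 * math.pi * 75 * t / sample_rate)
           for t in range(sample_rate)]          # 1 s of 75 Hz flicker
f = estimate_flicker_hz(samples, sample_rate)
print(matches_display_rate(f))  # True -> switch to the moiré-weakening mode
```

A frequency matching a known refresh rate triggers the moiré-weakening capture mode; otherwise the normal mode is used.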
In an exemplary embodiment of the present disclosure, an image processing apparatus applied to a terminal is provided. As shown in Fig. 8, the image processing apparatus includes:
an acquisition module 801 configured to acquire multiple frames of first images of a target object, the multiple frames of first images having different shooting parameters;
a selection module 802 configured to acquire a second image from the first images, the second image including an image whose moiré-related parameter meets a first preset condition; and
a generation module 803 configured to generate an image of the target object according to the second image.
Regarding the apparatus in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method and will not be elaborated here.
When the image processing apparatus is a terminal, Fig. 9 is a block diagram of an image processing apparatus 900 according to an exemplary embodiment.
Referring to Fig. 9, the apparatus 900 may include one or more of the following components: a processing component 902, a memory 904, a power component 906, a multimedia component 908, an audio component 910, an input/output (I/O) interface 912, a sensor component 914, and a communication component 916.
The processing component 902 generally controls the overall operations of the apparatus 900, such as operations associated with display, telephone calls, data communication, camera operations, and recording operations. The processing component 902 may include one or more processors 920 to execute instructions to perform all or part of the steps of the above method. In addition, the processing component 902 may include one or more modules to facilitate interaction between the processing component 902 and other components. For example, the processing component 902 may include a multimedia module to facilitate interaction between the multimedia component 908 and the processing component 902.
The memory 904 is configured to store various types of data to support operation at the apparatus 900. Examples of such data include instructions for any application or method operating on the apparatus 900, contact data, phonebook data, messages, pictures, videos, and so on. The memory 904 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk.
The power component 906 provides power to the various components of the apparatus 900. The power component 906 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 900.
The multimedia component 908 includes a screen providing an output interface between the apparatus 900 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touchscreen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 908 includes a front camera and/or a rear camera. When the apparatus 900 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front and rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 910 is configured to output and/or input audio signals. For example, the audio component 910 includes a microphone (MIC) configured to receive external audio signals when the apparatus 900 is in an operating mode, such as a call mode, a recording mode, or a speech recognition mode. The received audio signals may be further stored in the memory 904 or transmitted via the communication component 916. In some embodiments, the audio component 910 also includes a speaker for outputting audio signals.
The I/O interface 912 provides an interface between the processing component 902 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to, a home button, volume buttons, a start button, and a lock button.
The sensor component 914 includes one or more sensors for providing status assessments of various aspects of the apparatus 900. For example, the sensor component 914 can detect the open/closed state of the apparatus 900 and the relative positioning of components, for example the display and keypad of the apparatus 900; the sensor component 914 can also detect a change in position of the apparatus 900 or of a component of the apparatus 900, the presence or absence of user contact with the apparatus 900, the orientation or acceleration/deceleration of the apparatus 900, and temperature changes of the apparatus 900. The sensor component 914 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 914 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 914 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 916 is configured to facilitate wired or wireless communication between the apparatus 900 and other devices. The apparatus 900 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 916 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 916 also includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 900 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for executing the above method.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions, such as the memory 904 including instructions, is also provided; the instructions are executable by the processor 920 of the apparatus 900 to perform the above method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer-readable storage medium is provided; when the instructions in the storage medium are executed by a processor of an apparatus, the apparatus is enabled to perform an image processing method that includes any of the methods described above.
Those skilled in the art will easily conceive of other embodiments of the present invention after considering the specification and practicing the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the present invention that follow its general principles and include common knowledge or customary technical means in the art not disclosed in the present disclosure. The specification and examples are to be regarded as exemplary only, with the true scope and spirit of the present invention indicated by the following claims.
It should be understood that the present invention is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present invention is limited only by the appended claims.
Industrial Applicability
The image processing method in the present disclosure requires no additional hardware or processing algorithms, can weaken the moiré phenomenon when photographing a display screen, and improves the user experience.

Claims (12)

  1. An image processing method, applied to a terminal, wherein the processing method comprises:
    acquiring multiple frames of first images of a target object, the multiple frames of first images having different shooting parameters;
    acquiring a second image from the first images, the second image comprising an image whose moiré-related parameter meets a preset condition; and
    generating an image of the target object according to the second image.
  2. The image processing method according to claim 1, wherein said acquiring multiple frames of first images of a target object comprises:
    acquiring a focus parameter of the target object in a focused state;
    obtaining multiple shooting parameters according to the focus parameter, the shooting parameters comprising the focus parameter; and
    acquiring multiple first images based on the multiple shooting parameters, the first images comprising a focused image acquired under the focus parameter and multiple frames of defocused images acquired under the other shooting parameters.
  3. The image processing method according to claim 2, wherein said acquiring a second image from the first images comprises:
    converting each frame of the first images into a corresponding spectrum image;
    determining a third image in which the distribution of a pixel point set in the spectrum image conforms to a preset distribution, the brightness values of the pixel point set being greater than a first preset threshold; and
    taking the images other than the third image among the first images as the second image.
  4. The image processing method according to claim 3, wherein the preset distribution comprises the pixel point set being centrally symmetric.
  5. The image processing method according to claim 3 or 4, wherein said generating an image of the target object according to the second image comprises:
    selecting, from the images other than the third image among the first images, an image whose sharpness is greater than a second preset threshold as the second image; and
    generating the image of the target object according to the second image and the focused image.
  6. The image processing method according to claim 5, wherein said generating the image of the target object according to the second image and the focused image comprises:
    acquiring initial point spread functions corresponding to the second image and the focused image;
    determining, according to the initial point spread functions, a point spread function for generating the image of the target object; and
    generating the image of the target object based on the point spread function, the second image, and the focused image.
  7. The image processing method according to claim 6, wherein the target object comprises a display screen, and the processing method further comprises:
    determining, according to the refresh rate of the display screen, the initial point spread function corresponding between the defocused images and the focused image.
  8. The image processing method according to claim 1, wherein the processing method further comprises:
    receiving an instruction to capture an image;
    determining whether a preview image of the target object includes a region whose brightness value changes;
    when a region whose brightness value changes is included, acquiring, within a preset duration, multiple brightness values of the region at different time points;
    determining a brightness change rate according to the multiple brightness values; and
    if the brightness change rate is a preset brightness change rate, determining that the target object comprises a display screen.
  9. The image processing method according to claim 8, wherein the processing method further comprises: determining the refresh rate of the display screen according to the brightness change rate.
  10. An image processing apparatus, applied to a terminal, wherein the processing apparatus comprises:
    an acquisition module configured to acquire multiple frames of first images of a target object, the multiple frames of first images having different shooting parameters;
    a selection module configured to acquire a second image from the first images, the second image comprising an image whose moiré-related parameter meets a first preset condition; and
    a generation module configured to generate an image of the target object according to the second image.
  11. An image processing apparatus, comprising:
    a processor; and
    a memory for storing instructions executable by the processor;
    wherein the processor is configured to execute the method according to any one of claims 1 to 9.
  12. A non-transitory computer-readable storage medium, wherein, when instructions in the storage medium are executed by a processor of an apparatus, the apparatus is enabled to perform the method according to any one of claims 1 to 9.
PCT/CN2022/094215 2022-05-20 2022-05-20 一种图像的处理方法、装置及存储介质 WO2023221119A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2022/094215 WO2023221119A1 (zh) 2022-05-20 2022-05-20 一种图像的处理方法、装置及存储介质
CN202280004386.6A CN117461050A (zh) 2022-05-20 2022-05-20 一种图像的处理方法、装置及存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/094215 WO2023221119A1 (zh) 2022-05-20 2022-05-20 一种图像的处理方法、装置及存储介质

Publications (1)

Publication Number Publication Date
WO2023221119A1 true WO2023221119A1 (zh) 2023-11-23

Family

ID=88834408

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/094215 WO2023221119A1 (zh) 2022-05-20 2022-05-20 一种图像的处理方法、装置及存储介质

Country Status (2)

Country Link
CN (1) CN117461050A (zh)
WO (1) WO2023221119A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009118294A (ja) * 2007-11-08 2009-05-28 Fujifilm Corp カメラ装置
CN104486534A (zh) * 2014-12-16 2015-04-01 西安诺瓦电子科技有限公司 摩尔纹检测抑制方法及装置
CN110738609A (zh) * 2019-09-11 2020-01-31 北京大学 一种去除图像摩尔纹的方法及装置
CN111756989A (zh) * 2019-03-29 2020-10-09 北京小米移动软件有限公司 控制镜头对焦的方法及装置
CN113962869A (zh) * 2020-07-20 2022-01-21 Tcl科技集团股份有限公司 一种摩尔纹去除方法、装置及设备
WO2022088882A1 (zh) * 2020-10-30 2022-05-05 中兴通讯股份有限公司 拍摄方法、装置、终端及计算机可读存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009118294A (ja) * 2007-11-08 2009-05-28 Fujifilm Corp カメラ装置
CN104486534A (zh) * 2014-12-16 2015-04-01 西安诺瓦电子科技有限公司 摩尔纹检测抑制方法及装置
CN111756989A (zh) * 2019-03-29 2020-10-09 北京小米移动软件有限公司 控制镜头对焦的方法及装置
CN110738609A (zh) * 2019-09-11 2020-01-31 北京大学 一种去除图像摩尔纹的方法及装置
CN113962869A (zh) * 2020-07-20 2022-01-21 Tcl科技集团股份有限公司 一种摩尔纹去除方法、装置及设备
WO2022088882A1 (zh) * 2020-10-30 2022-05-05 中兴通讯股份有限公司 拍摄方法、装置、终端及计算机可读存储介质

Also Published As

Publication number Publication date
CN117461050A (zh) 2024-01-26

Similar Documents

Publication Publication Date Title
KR102310430B1 (ko) 촬영 방법, 장치 및 디바이스
CN110493526B (zh) 基于多摄像模块的图像处理方法、装置、设备及介质
JP6027287B2 (ja) 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
CN109345485B (zh) 一种图像增强方法、装置、电子设备及存储介质
CN111709891B (zh) 图像去噪模型的训练方法、图像去噪方法、装置及介质
CN110958401B (zh) 一种超级夜景图像颜色校正方法、装置和电子设备
CN108154465B (zh) 图像处理方法及装置
WO2016029637A1 (zh) 拍摄方法和装置
JP5864037B2 (ja) 画像処理装置、撮像装置、画像処理方法及びプログラム
CN104869308A (zh) 拍摄照片的方法及装置
CN110769147B (zh) 拍摄方法及电子设备
JP7136956B2 (ja) 画像処理方法及び装置、端末並びに記憶媒体
CN112614064B (zh) 图像处理方法、装置、电子设备及存储介质
CN111756989A (zh) 控制镜头对焦的方法及装置
WO2015015943A1 (ja) 画像処理装置、撮像装置、画像処理方法、及びプログラム
CN111741187B (zh) 图像处理方法、装置及存储介质
CN114500821B (zh) 拍照方法及装置、终端及存储介质
US11222235B2 (en) Method and apparatus for training image processing model, and storage medium
CN106803920B (zh) 一种图像处理的方法、装置及智能会议终端
CN110876014B (zh) 图像处理方法及装置、电子设备及存储介质
CN111835941B (zh) 图像生成方法及装置、电子设备、计算机可读存储介质
KR101491963B1 (ko) 모바일 단말의 아웃 포커싱 영상 통화 방법 및 장치
WO2023221119A1 (zh) 一种图像的处理方法、装置及存储介质
CN112188095B (zh) 拍照方法、拍照装置及存储介质
WO2023220868A1 (zh) 一种图像处理方法、装置、终端及存储介质

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 202280004386.6

Country of ref document: CN