CN111626931B - Image processing method, image processing device, storage medium and electronic apparatus - Google Patents

Image processing method, image processing device, storage medium and electronic apparatus

Info

Publication number: CN111626931B
Application number: CN202010373472.7A
Authority: CN (China)
Legal status: Active (granted; status as listed, not a legal conclusion)
Other versions: CN111626931A (Chinese)
Prior art keywords: image, displayed, target, processing, reconstruction
Inventor: 张弓
Current and original assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd; published as application CN111626931A, then granted and published as CN111626931B

Classifications

    All classifications fall under G (Physics) > G06 (Computing; calculating or counting) > G06T (Image data processing or generation, in general):
    • G06T 3/4053: Scaling of whole images or parts thereof, e.g. expanding or contracting, based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
    • G06T 1/20: Processor architectures; processor configuration, e.g. pipelining
    • G06T 5/70: Denoising; smoothing
    • G06T 5/73: Deblurring; sharpening
    • G06T 5/77: Retouching; inpainting; scratch removal
    • G06T 2207/10004: Still image; photographic image (indexing scheme for image analysis or enhancement; image acquisition modality)

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The disclosure provides an image processing method, an image processing apparatus, a computer-readable storage medium and an electronic device, and relates to the technical field of image processing. The image processing method includes: acquiring an image to be displayed; and performing super-resolution reconstruction processing on the image to be displayed at least twice, through at least two image reconstruction units, to obtain at least two target images, where the target images are to be displayed on an image display terminal and the at least two target images have different resolutions. In this way, the image to be displayed can undergo super-resolution reconstruction by different image reconstruction units according to the actual requirements of different application scenarios, yielding target images at a variety of resolutions.

Description

Image processing method, image processing device, storage medium and electronic apparatus
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a computer readable storage medium, and an electronic device.
Background
As demand for image quality grows, it is often necessary to improve the definition of an image or to recover more of its detail, and super-resolution reconstruction processing is commonly applied for this purpose. Super-resolution reconstruction is the process of obtaining a high-resolution image from one or more low-resolution images.
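To make the low-resolution-to-high-resolution mapping concrete, here is a minimal sketch using classical nearest-neighbor upscaling. This is only an illustrative baseline, not the patent's method; real super-resolution reconstruction recovers detail with learned or multi-frame models, whereas interpolation merely enlarges.

```python
# Illustrative baseline only: nearest-neighbor upscaling of a 2D image,
# represented as a list of rows of pixel values. Real super-resolution
# would replace this with a reconstruction model.

def upscale_nearest(image, scale):
    """Upscale a 2D list of pixel values by an integer factor."""
    return [
        [image[y // scale][x // scale]
         for x in range(len(image[0]) * scale)]
        for y in range(len(image) * scale)
    ]

low_res = [[10, 20],
           [30, 40]]
high_res = upscale_nearest(low_res, 2)
# high_res is a 4x4 image; each source pixel now covers a 2x2 block.
```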
In the prior art, super-resolution reconstruction of an image typically proceeds as follows: after a display terminal acquires an image at its original resolution, a single, uniform resolution adjustment is applied and the image is displayed directly on that terminal. When display requirements at multiple resolutions exist simultaneously, the resolution cannot be adjusted for each requirement at once, so the image outputs of different channels cannot all be optimized at the same time. How to improve the flexibility of image processing and meet diversified image requirements is therefore a problem to be solved.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The disclosure provides an image processing method, an image processing apparatus, a computer-readable storage medium and an electronic device, so as to process images effectively, at least to some extent, and to improve the flexibility of image processing.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an image processing method, the method comprising: acquiring an image to be displayed; and performing super-resolution reconstruction processing on the image to be displayed at least twice, through at least two image reconstruction units, to obtain at least two target images; where the target images are to be displayed on an image display terminal, and the at least two target images have different resolutions.
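The first aspect can be sketched as follows: the same image to be displayed is fed through two "image reconstruction units" with different output resolutions. In this hedged sketch the units are stand-in nearest-neighbor upscalers with different scale factors; the patent's actual reconstruction units, and all names below, are not specified by this code and are purely illustrative.

```python
# Sketch of the first-aspect method: at least two reconstruction units,
# each producing a target image at a different resolution from the same
# image to be displayed. The upscalers here are placeholders for the
# patent's super-resolution reconstruction units.

def make_reconstruction_unit(scale):
    """Build a stand-in reconstruction unit that upscales by `scale`."""
    def reconstruct(image):
        return [
            [image[y // scale][x // scale]
             for x in range(len(image[0]) * scale)]
            for y in range(len(image) * scale)
        ]
    return reconstruct

units = [make_reconstruction_unit(2), make_reconstruction_unit(4)]
image_to_display = [[1, 2], [3, 4]]

# One pass per unit: at least two reconstructions, two target images.
target_images = [unit(image_to_display) for unit in units]
# The two target images differ in resolution (4x4 and 8x8 here), so each
# can serve a different display channel or requirement.
```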
According to a second aspect of the present disclosure, there is provided an image processing apparatus, the apparatus comprising: an image acquisition module configured to acquire an image to be displayed; and an image processing module configured to perform super-resolution reconstruction processing on the image to be displayed at least twice, through at least two image reconstruction units, to obtain at least two target images; where the target images are to be displayed on an image display terminal, and the at least two target images have different resolutions.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described image processing method.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the above-described image processing method via execution of the executable instructions.
The technical scheme of the present disclosure has the following beneficial effects:
According to the image processing method, the image processing apparatus, the computer-readable storage medium and the electronic device described above: an image to be displayed is acquired; super-resolution reconstruction processing is performed on it at least twice, through at least two image reconstruction units, to obtain at least two target images; and the target images, which differ in resolution, are displayed on an image display terminal. On the one hand, the present exemplary embodiment performs super-resolution reconstruction at least twice with different image reconstruction units, obtaining target images at different resolutions for display; this overcomes the uniformity of prior-art image processing and increases the diversity of the target images. On the other hand, because target images at different resolutions are available, diversified target images can be output on the image display terminal according to different display requirements, giving the scheme a wider range of application.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
Fig. 1 shows a schematic diagram of a system architecture of the present exemplary embodiment;
fig. 2 shows a schematic diagram of an electronic device of the present exemplary embodiment;
fig. 3 shows a flowchart of an image processing method of the present exemplary embodiment;
fig. 4 is a schematic diagram showing an image processing procedure of the present exemplary embodiment;
fig. 5 shows a schematic diagram of another image processing procedure of the present exemplary embodiment;
fig. 6 shows a schematic diagram of still another image processing procedure of the present exemplary embodiment;
fig. 7 shows a block diagram of the structure of an image processing apparatus of the present exemplary embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. The example embodiments may, however, be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of the embodiments of the present disclosure. However, those skilled in the art will recognize that aspects of the present disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so on. In other instances, well-known technical solutions are not shown or described in detail, to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 shows a schematic diagram of a system architecture of an exemplary embodiment of the present disclosure. As shown in fig. 1, the system architecture 100 may include an image processing terminal 110 and an image display terminal 120. The image processing terminal 110 may be any of various electronic devices, including but not limited to a mobile phone, a tablet computer, a digital camera or a personal computer, and is configured to obtain an image to be displayed and process it so that a corresponding image can be displayed on the image display terminal 120. The image to be displayed may be obtained from another device, or captured in real time by an image sensor or camera provided at the image processing terminal. The image display terminal 120 may be any of various electronic devices for displaying images, such as a television, a personal computer or another device supporting screen casting. It should be understood that the numbers of image processing terminals 110 and image display terminals 120 in fig. 1 are merely illustrative; there may be any number of each as actually needed. For example, the image display terminal 120 may be a cluster formed by a plurality of image display terminals.
The image processing method provided by the embodiments of the present disclosure may be performed by the image processing terminal 110: for example, after the image processing terminal 110 acquires the image to be displayed, it processes the image and then has it displayed on the image display terminal 120. Alternatively, the method may be performed by the image display terminal 120: for example, after acquiring the image to be displayed, the image processing terminal 110 transmits it to the image display terminal 120, which processes the image and then displays it. A further terminal dedicated to image processing may also be provided in addition to the image processing terminal 110 and the image display terminal 120. The present disclosure is not limited in this regard.
Exemplary embodiments of the present disclosure provide an electronic device for implementing an image processing method, which may be the image processing terminal 110 or the image display terminal 120 in fig. 1. The electronic device comprises at least a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the image processing method via execution of the executable instructions.
The electronic device may be implemented in various forms, including mobile devices such as a mobile phone, a tablet computer, a notebook computer, a personal digital assistant (Personal Digital Assistant, PDA), a navigation device, a wearable device or a drone, and fixed devices such as a desktop computer or a smart television. The configuration of the electronic device is described below by way of example, using the mobile terminal 200 of fig. 2. It will be appreciated by those skilled in the art that, apart from components intended specifically for mobile use, the configuration of fig. 2 can also be applied to fixed devices. In other embodiments, the mobile terminal 200 may include more or fewer components than illustrated, may combine certain components, may split certain components, or may arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of the two. The interfacing relationships between the components are shown only schematically and do not constitute a structural limitation of the mobile terminal 200. In other embodiments, the mobile terminal 200 may also adopt interfaces different from those of fig. 2, or a combination of interfaces.
As shown in fig. 2, the mobile terminal 200 may specifically include: processor 210, internal memory 221, external memory interface 222, universal serial bus (Universal Serial Bus, USB) interface 230, charge management module 240, power management module 241, battery 242, antenna 1, antenna 2, mobile communication module 250, wireless communication module 260, audio module 270, speaker 271, receiver 272, microphone 273, headset interface 274, sensor module 280, display screen 290, camera module 291, indicator 292, motor 293, keys 294, and subscriber identity module (Subscriber Identification Module, SIM) card interface 295, and the like. Wherein the sensor module 280 may include a depth sensor 2801, a pressure sensor 2802, a gyroscope sensor 2803, a barometric pressure sensor 2804, and the like.
Processor 210 may include one or more processing units such as, for example: the processor 210 may include an application processor (Application Processor, AP), a modem processor, a graphics processor (Graphics Processing Unit, GPU), an image signal processor (Image Signal Processor, ISP), a controller, a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor, and/or a Neural network processor (Neural-Network Processing Unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to complete the control of reading instructions and executing instructions.
A memory may also be provided in the processor 210 for storing instructions and data. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transfer instructions and notification instructions, whose execution is controlled by the processor 210. In some implementations, the memory in the processor 210 is a cache. It may hold instructions or data that the processor 210 has just used or is about to use again; if the processor 210 needs them again, it can fetch them directly from this memory. This avoids repeated accesses and reduces the processor's waiting time, improving the efficiency of the system.
In some implementations, the processor 210 may include one or more interfaces. The interfaces may include an integrated circuit (Inter-Integrated Circuit, I2C) interface, an integrated circuit built-in audio (Inter-Integrated Circuit Sound, I2S) interface, a pulse code modulation (Pulse Code Modulation, PCM) interface, a universal asynchronous receiver Transmitter (Universal Asynchronous Receiver/Transmitter, UART) interface, a mobile industry processor interface (Mobile Industry Processor Interface, MIPI), a General-Purpose Input/Output (GPIO) interface, a subscriber identity module (Subscriber Identity Module, SIM) interface, and/or a universal serial bus (Universal Serial Bus, USB) interface, among others. Connections are made through different interfaces with other components of mobile terminal 200.
The USB interface 230 is an interface conforming to the USB standard specification, and may specifically be a MiniUSB interface, a micro USB interface, a USB type c interface, or the like. The USB interface 230 may be used to connect to a charger to charge the mobile terminal 200, may also be connected to a headset to play audio, and may also be used to connect to other electronic devices, such as a computer, a peripheral device, etc. with the mobile terminal 200.
The charge management module 240 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 240 may receive a charging input of a wired charger through the USB interface 230. In some wireless charging embodiments, the charge management module 240 may receive wireless charging input through a wireless charging coil of the mobile terminal 200. The charging management module 240 may also provide power to the electronic device through the power management module 241 while charging the battery 242.
The power management module 241 is used for connecting the battery 242, the charge management module 240 and the processor 210. The power management module 241 receives input from the battery 242 and/or the charge management module 240 and provides power to the processor 210, the internal memory 221, the display 290, the camera module 291, the wireless communication module 260, and the like. The power management module 241 may also be configured to monitor battery capacity, battery cycle times, battery health (leakage, impedance), and other parameters. In other embodiments, the power management module 241 may also be disposed in the processor 210. In other embodiments, the power management module 241 and the charge management module 240 may be disposed in the same device.
The wireless communication function of the mobile terminal 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in mobile terminal 200 may be configured to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 250 may provide a solution including 2G/3G/4G/5G wireless communication applied on the mobile terminal 200. The mobile communication module 250 may include at least one filter, switch, power amplifier, low noise amplifier (Low Noise Amplifier, LNA), etc. The mobile communication module 250 may receive electromagnetic waves from the antenna 1, perform processes such as filtering and amplifying the received electromagnetic waves, and transmit the electromagnetic waves to the modem processor for demodulation. The mobile communication module 250 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be disposed in the processor 210. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be provided in the same device as at least some of the modules of the processor 210.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 271, the receiver 272, etc.), or displays images or videos through the display screen 290. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 250 or other functional module, independent of the processor 210.
The wireless communication module 260 may provide solutions for wireless communication including wireless local area network (Wireless Local Area Networks, WLAN) (e.g., wireless fidelity (Wireless Fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (Global Navigation Satellite System, GNSS), frequency modulation (Frequency Modulation, FM), near field wireless communication technology (Near Field Communication, NFC), infrared technology (IR), etc., applied on the mobile terminal 200. The wireless communication module 260 may be one or more devices that integrate at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 260 may also receive a signal to be transmitted from the processor 210, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the mobile terminal 200 is coupled to the mobile communication module 250 and the antenna 2 is coupled to the wireless communication module 260, so that the mobile terminal 200 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include the Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Time-Division Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), New Radio (NR), BT, GNSS, WLAN, NFC, FM and/or IR techniques, among others. The GNSS may include the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), the BeiDou Navigation Satellite System (BDS), the Quasi-Zenith Satellite System (QZSS) and/or Satellite-Based Augmentation Systems (SBAS).
The mobile terminal 200 implements display functions through a GPU, a display screen 290, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 290 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 290 is used for displaying images, videos and the like. The display screen 290 includes a display panel. The display panel may employ a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, an Active-Matrix Organic Light-Emitting Diode (AMOLED) display, a Flexible Light-Emitting Diode (FLED) display, a Mini-LED, a Micro-LED, a Micro-OLED, Quantum-dot Light-Emitting Diodes (QLED), or the like. In some embodiments, the mobile terminal 200 may include 1 or N display screens 290, where N is a positive integer greater than 1.
The mobile terminal 200 may implement a photographing function through an ISP, a camera module 291, a video codec, a GPU, a display screen 290, an application processor, and the like.
The ISP is used to process the data fed back by the camera module 291. For example, when a photograph is taken, the shutter opens and light is transmitted through the lens to the camera's photosensitive element, which converts the optical signal into an electrical signal and transmits it to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also optimize the noise, brightness and skin tone of the image, as well as parameters such as the exposure and color temperature of the shooting scene. In some implementations, an ISP may be provided in the camera module 291.
The camera module 291 is used for capturing still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (Charge Coupled Device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the mobile terminal 200 may include 1 or N camera modules 291, where N is a positive integer greater than 1, and if the mobile terminal 200 includes N cameras, one of the N cameras is a master camera.
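The format conversion the DSP performs (digital image signal to RGB, YUV, or similar) amounts to a per-pixel linear transform. As an illustration, here is a minimal RGB-to-YUV conversion; the BT.601 full-range coefficients used below are a common choice, assumed for this sketch rather than taken from the patent.

```python
# Illustrative RGB -> YUV conversion of the kind a DSP performs when
# producing a YUV-format image signal. Coefficients follow the common
# BT.601 full-range convention (an assumption, not from the patent).

def rgb_to_yuv(r, g, b):
    """Convert one RGB pixel (0-255 per channel) to Y, U, V components."""
    y = 0.299 * r + 0.587 * g + 0.114 * b          # luma
    u = -0.14713 * r - 0.28886 * g + 0.436 * b     # blue-difference chroma
    v = 0.615 * r - 0.51499 * g - 0.10001 * b      # red-difference chroma
    return y, u, v

y, u, v = rgb_to_yuv(255, 255, 255)
# Pure white: luma at maximum, both chroma components near zero.
```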
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the mobile terminal 200 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency-bin energy, among other things.
Video codecs are used to compress or decompress digital video. The mobile terminal 200 may support one or more video codecs, so that it can play or record video in a variety of encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3 and MPEG-4.
The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the mobile terminal 200. The external memory card communicates with the processor 210 via an external memory interface 222 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 221 may be used to store computer executable program code that includes instructions. The internal memory 221 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data (e.g., audio data, phonebook, etc.) created during use of the mobile terminal 200, and the like. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (Universal Flash Storage, UFS), and the like. The processor 210 performs various functional applications of the mobile terminal 200 and data processing by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
The mobile terminal 200 may implement audio functions through an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, an application processor, and the like. Such as music playing, recording, etc.
The audio module 270 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 270 may also be used to encode and decode audio signals. In some implementations, the audio module 270 may be disposed in the processor 210, or some functional modules of the audio module 270 may be disposed in the processor 210.
A speaker 271, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal. The mobile terminal 200 can play music or conduct hands-free calls through the speaker 271.
A receiver 272, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the mobile terminal 200 receives a telephone call or a voice message, the voice can be heard by placing the receiver 272 close to the ear.
A microphone 273, also called a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 273 to input a sound signal. The mobile terminal 200 may be provided with at least one microphone 273. In some embodiments, the mobile terminal 200 may be provided with two microphones 273, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the mobile terminal 200 may further be provided with three, four, or more microphones 273 to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and the like.
The earphone interface 274 is used to connect a wired earphone. The earphone interface 274 may be a USB interface 230, a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The depth sensor 2801 is used to acquire depth information of a scene. In some embodiments, a depth sensor may be provided to the camera module 291.
The pressure sensor 2802 is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, pressure sensor 2802 may be disposed on display 290. The pressure sensor 2802 is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 2803 may be used to determine the motion posture of the mobile terminal 200. In some embodiments, the angular velocity of the mobile terminal 200 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 2803. The gyro sensor 2803 can be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 2803 detects the shake angle of the mobile terminal 200, calculates the distance the lens module needs to compensate according to that angle, and lets the lens counteract the shake of the mobile terminal 200 through reverse motion, thereby realizing anti-shake. The gyro sensor 2803 can also be used for navigation and for motion sensing in game scenes.
The air pressure sensor 2804 is used to measure air pressure. In some embodiments, the mobile terminal 200 calculates altitude from barometric pressure values measured by the barometric pressure sensor 2804, aiding in positioning and navigation.
In addition, sensors for other functions, such as magnetic sensors, acceleration sensors, distance sensors, proximity sensors, fingerprint sensors, temperature sensors, touch sensors, ambient light sensors, bone conduction sensors, etc., may be provided in the sensor module 280 according to actual needs.
The keys 294 include a power-on key, a volume key, and the like. The keys 294 may be mechanical keys or touch keys. The mobile terminal 200 may receive key inputs and generate key signal inputs related to user settings and function controls of the mobile terminal 200.
The motor 293 may generate vibration cues, such as vibration cues of a call, an alarm clock, a received message, etc., and may also be used for touch vibration feedback, such as touch operations on different applications (e.g., photographing, gaming, audio playing, etc.), or touch operations on different areas of the display screen 290, which may correspond to different vibration feedback effects. The touch vibration feedback effect may support customization.
The indicator 292 may be an indicator light, which may be used to indicate a state of charge, a change in power, a message indicating a missed call, a notification, etc.
The SIM card interface 295 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 295 or removed from it to make contact with or separate from the mobile terminal 200. The mobile terminal 200 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 295 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 295 simultaneously; the types of the cards may be the same or different. The SIM card interface 295 may also be compatible with different types of SIM cards, as well as with external memory cards. The mobile terminal 200 interacts with the network through the SIM card to realize functions such as calls and data communication. In some implementations, the mobile terminal 200 employs an eSIM, i.e., an embedded SIM card. The eSIM card may be embedded in the mobile terminal 200 and cannot be separated from it.
An image processing apparatus and an image processing method according to exemplary embodiments of the present disclosure are specifically described below.
Fig. 3 shows a flowchart of an image processing method in the present exemplary embodiment, which may specifically include the following steps S310 to S320:
Step S310, an image to be displayed is acquired.
The image to be displayed refers to an original image to be shown on the image display terminal. The original image may be acquired locally by the terminal performing the image processing, for example captured in real time through a camera or image sensor, or taken from internally stored images; it may also be obtained from another image source device, such as a network download or a real-time feed.
In an exemplary embodiment, the step S310 may include:
acquiring a video stream to be displayed;
and taking each frame of image in the video stream to be displayed as the image to be displayed in sequence.
In practical applications, in addition to processing single images, it is often necessary to process a video stream so that the corresponding video stream can be displayed. Therefore, in this exemplary embodiment, when acquiring the image to be displayed, a video stream to be displayed may be acquired, and each of its frame images may be taken in sequence as the image to be displayed and processed.
The video stream to be displayed may be a video stream collected in real time by the terminal performing the image processing, a video stream inherent to that terminal, or a video stream received in real time from another terminal. An inherent video stream is one already stored in, or readable from, a hardware entity of the terminal, such as a video stream stored on an SD (Secure Digital) card, an optical disc, or an external hard disk. A video stream received in real time from another terminal is not available on the original device and needs to be obtained through other communication means, for example video acquired in real time from the network (such as a video or live-broadcast platform), or a real-time video stream generated from the screen content of a screen-casting device. By processing each frame image of the video stream to be displayed, this exemplary embodiment realizes processing of the whole video stream and displays it on the image display terminal, and thus has wider application scenarios.
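The frame-by-frame treatment described above can be sketched as follows. This is an illustrative outline only, not code from the patent: the frame representation (a 2-D list of grayscale pixels) and the function names are assumptions chosen for the example.

```python
from typing import Callable, Iterable, Iterator, List

Frame = List[List[int]]  # a grayscale frame as a 2-D list of pixel values


def frames_as_images(video_stream: Iterable[Frame],
                     process: Callable[[Frame], Frame]) -> Iterator[Frame]:
    """Take each frame of the stream in sequence as the 'image to be
    displayed' and run it through the processing step."""
    for frame in video_stream:
        yield process(frame)


# Usage: a stream of two 1x2 frames, with an identity "processing" step.
stream = [[[10, 20]], [[30, 40]]]
out = list(frames_as_images(stream, lambda f: f))
```

Because the sketch is a generator, frames can be processed as they arrive, which matches the real-time acquisition scenarios (camera capture, network video, screen casting) mentioned above.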
Step S320, performing super-resolution reconstruction processing on an image to be displayed at least twice through at least two image reconstruction units to obtain at least two target images;
the target images are used for being displayed on the image display terminal, and the resolutions of at least two target images are different.
The image reconstruction unit performs super-resolution reconstruction processing on the image to be displayed. By reconstructing the image to be displayed at super resolution, a target image with higher resolution can be obtained, so that the target image appears clearer and richer in detail when displayed on the image display terminal. In this exemplary embodiment, at least two image reconstruction units may be provided to perform super-resolution reconstruction on the image to be displayed, generating target images of different resolutions for display on the image display terminal. When displaying the target images, the target images of different resolutions may be output at different times on the same image display terminal according to actual requirements, or output simultaneously on different image display terminals, so as to meet the diversified needs of users or scenarios.
Each image reconstruction unit may perform the super-resolution reconstruction processing by interpolation or by a neural network. For example, a pixel-value gradient may be calculated for the image to be displayed, and new pixels inserted between the original pixels according to that gradient, yielding a target image with more pixels (i.e., higher resolution). Alternatively, an SRCNN (Super-Resolution Convolutional Neural Network) or an improved version thereof may be adopted and trained on a large number of sample image pairs, each pair comprising a low-resolution sample image and a corresponding high-definition image (ground truth); the network parameters are adjusted until a certain accuracy is reached. At inference time, the image to be displayed is input into the trained network, which outputs the corresponding target image.
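As a minimal sketch of the interpolation route, the following uses nearest-neighbor repetition in place of the gradient-based interpolation or SRCNN described above; it only illustrates that reconstruction inserts new pixels between the original ones to raise the pixel count. The function name and image representation are assumptions for the example.

```python
from typing import List

Image = List[List[int]]  # grayscale image as a 2-D list of pixel values


def upscale_nearest(img: Image, factor: int) -> Image:
    """Insert new pixels between the original ones by repeating the nearest
    source pixel, a crude stand-in for gradient-based interpolation."""
    out: Image = []
    for row in img:
        expanded = [px for px in row for _ in range(factor)]
        out.extend(list(expanded) for _ in range(factor))
    return out


small = [[0, 255]]               # a 1x2 image to be displayed
big = upscale_nearest(small, 2)  # each source pixel becomes a 2x2 block
```

A real reconstruction unit would weight neighboring pixels (bilinear, bicubic, or a trained network) instead of repeating them, but the input/output contract, low-resolution image in, higher-resolution target image out, is the same.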
In an exemplary embodiment, at least two image reconstruction units may be arranged in any one of series, parallel, a combination of series and parallel.
Each image reconstruction unit outputs a target image when performing super-resolution reconstruction processing once.
In addition, any one of the image reconstruction units takes the target image output by the previous image reconstruction unit connected with the image reconstruction unit as an input, and outputs another target image through super-resolution reconstruction processing.
Next, the image reconstruction units in the three arrangements (series, parallel, and a combination of series and parallel) will be specifically described.
In the first way, they are connected in series.
The serial connection is a structure in which at least two image reconstruction units are connected in sequence. As shown in fig. 4, it may include an image acquisition terminal 410, at least two image reconstruction units 420 (image reconstruction units A1, A2, ..., An shown in fig. 4), and an image display terminal 430. The at least two image reconstruction units 420 may be disposed inside the image acquisition terminal 410 and collectively serve as the terminal performing image processing. After the image acquisition terminal 410 acquires an image a0 to be displayed, it is input to image reconstruction unit A1 for super-resolution reconstruction processing, which outputs a target image a1; the target image a1 may then be taken as the input of image reconstruction unit A2 and subjected to super-resolution reconstruction again to obtain another target image a2. In this way, the target image output by each image reconstruction unit serves as the input of the next adjacent image reconstruction unit in the serial structure for further super-resolution processing, and finally the target images a1, a2, ..., an are obtained.
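The serial arrangement can be sketched as a simple chain in which each unit's output is both a target image and the next unit's input. This is an illustrative outline; the stand-in 2x upscaler and all names are assumptions, not the patent's implementation.

```python
from typing import Callable, List

Image = List[List[int]]
ReconstructionUnit = Callable[[Image], Image]


def upscale2(img: Image) -> Image:
    """Stand-in reconstruction unit: 2x nearest-neighbour upscaling."""
    return [[px for px in row for _ in (0, 1)]
            for row in img for _ in (0, 1)]


def serial_pipeline(a0: Image, units: List[ReconstructionUnit]) -> List[Image]:
    """Feed a0 through units A1..An in series; each unit's output a_i is
    collected as a target image and passed on as the next unit's input."""
    targets = []
    current = a0
    for unit in units:
        current = unit(current)
        targets.append(current)
    return targets


a0 = [[7]]                                   # 1x1 image to be displayed
targets = serial_pipeline(a0, [upscale2, upscale2])
# targets[0] is 2x2, targets[1] is 4x4: the resolutions differ, as required
```

The chain structure means later targets always have higher resolution than earlier ones, since each unit reconstructs on top of the previous unit's output.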
In the second mode, the units are connected in parallel.
When the image reconstruction units are arranged in parallel, as shown in fig. 5, the structure includes an image acquisition terminal 510, at least two image reconstruction units 520 (image reconstruction units B1, B2, ..., Bn shown in fig. 5), and an image display terminal 530. Specifically, after the image acquisition terminal 510 acquires the image to be displayed, it may be synchronously input to the different image reconstruction units B1, B2, ..., Bn, so that each image reconstruction unit performs super-resolution processing on it independently. Each image reconstruction unit outputs one target image; corresponding to image reconstruction units B1, B2, ..., Bn, the outputs are target images b1, b2, ..., bn respectively.
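The parallel arrangement fans the same input out to every unit. In this illustrative sketch (names and the stand-in upscalers are assumptions), units with different scale factors yield target images of different resolutions from one source image.

```python
from typing import Callable, List

Image = List[List[int]]


def parallel_pipeline(img: Image,
                      units: List[Callable[[Image], Image]]) -> List[Image]:
    """Feed the same image to units B1..Bn in parallel; unit i outputs
    target image b_i."""
    return [unit(img) for unit in units]


def scale(k: int) -> Callable[[Image], Image]:
    """Build a stand-in reconstruction unit that upscales by factor k."""
    return lambda im: [[px for px in row for _ in range(k)]
                       for row in im for _ in range(k)]


b = parallel_pipeline([[1]], [scale(2), scale(3)])
# b[0] is 2x2, b[1] is 3x3: each branch yields its own resolution
```

Unlike the serial chain, every unit here reads the original image, so the branches are independent and could run concurrently on real hardware.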
In a third way, series and parallel are combined.
In the present exemplary embodiment, super-resolution reconstruction processing may also be performed on the image to be displayed by a combination of series and parallel connections. As shown in fig. 6, the structure includes an image acquisition terminal 610, at least two image reconstruction units 620 (fig. 6 schematically shows five image reconstruction units C1, C2, C3, C4 and C5, but the number of image reconstruction units is not limited thereto), and an image display terminal 630. Specifically, after the image acquisition terminal 610 acquires the image to be displayed, the image is fed as input to image reconstruction units C1 and C2, which each perform super-resolution reconstruction on it to obtain target images c1 and c2; target image c2 is then taken as the input of image reconstruction units C3 and C4, which perform super-resolution processing on c2 to obtain target images c3 and c4; finally, target image c4 is taken as the input of image reconstruction unit C5 and subjected to super-resolution reconstruction to obtain target image c5. On this basis, five target images c1, c2, c3, c4 and c5 are determined. Combining series and parallel connections draws on the advantages of both: it reduces the processing computation, outputs target images of different resolutions for different application scenarios, and improves the efficiency of image super-resolution reconstruction.
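The combined arrangement is a small dataflow graph. The sketch below evaluates such a graph, with the wiring of fig. 6 (C1 and C2 read the source; C3 and C4 read c2; C5 reads c4) encoded as an edge list. The graph representation and all names are assumptions made for illustration.

```python
from typing import Callable, List, Union

Image = List[List[int]]


def run_dag(img: Image,
            units: List[Callable[[Image], Image]],
            edges: List[Union[str, int]]) -> List[Image]:
    """Evaluate a series-parallel arrangement. edges[i] names the input of
    unit i: 'src' for the acquired image, or the index of an earlier unit
    whose target image it consumes."""
    outputs: List[Image] = []
    for i, unit in enumerate(units):
        inp = img if edges[i] == 'src' else outputs[edges[i]]
        outputs.append(unit(inp))
    return outputs


def double(im: Image) -> Image:
    """Stand-in reconstruction unit: 2x nearest-neighbour upscaling."""
    return [[p for p in r for _ in (0, 1)] for r in im for _ in (0, 1)]


# Topology of the example: C1, C2 read the source; C3, C4 read c2; C5 reads c4.
c = run_dag([[5]], [double] * 5, ['src', 'src', 1, 1, 3])
# c[4] has been doubled three times along the path C2 -> C4 -> C5
```

Sharing c2 between C3 and C4 is exactly where the combined arrangement saves computation: the intermediate reconstruction is done once and reused by both consumers.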
In this exemplary embodiment, the image acquisition terminal and the image display terminal may each also be internally provided with an image processing unit for processing the acquired image to be displayed and the received target image, respectively; the specific processing may include cropping, scaling, or image enhancement. By processing the image multiple times at multiple stages, image quality can be ensured so that a better display effect is achieved on the image display terminal.
In an exemplary embodiment, the image processing method may further include the steps of:
performing display effect optimization processing on the image to be displayed and/or the target image through a first optimization unit;
wherein the display effect optimization processing includes any one or more of the following modes:
image enhancement, image sharpening, image smoothing, image denoising, image deblurring, image defogging, and image restoration.
Image enhancement mainly refers to enhancing the region of interest (Region Of Interest, ROI) in an image while suppressing the regions of no interest.

Image sharpening enhances the edges and gray-level transition parts of an image by compensating its contours, making the image clearer.

Image denoising reduces the noise in a digital image by means such as filtering and image smoothing.

Image deblurring repairs the blurred parts of an image.

Image defogging reduces or removes the fog that commonly appears in images due to dust, haze, and the like in the atmosphere, through algorithms such as a haze imaging model.

Image restoration refers to reconstructing the lost and damaged portions of an image.
In this exemplary embodiment, the first optimizing unit may be arranged in any one of series, parallel, series and parallel combination with at least two image reconstructing units, so that the image to be displayed may also perform the first optimizing process when performing the super-resolution reconstruction process.
In an exemplary embodiment, the image processing method may further include the steps of:
performing morphological optimization processing on the image to be displayed and/or the target image through a second optimization unit;
wherein the morphology optimization process includes any one or more of the following:
image cropping, image stitching, image compositing, image rotation.
Image cropping refers to modifying the size of an image, which may include extracting a local region of the image.

Image stitching and image synthesis mean that the original image is combined with other images through specific processing to form a new image.

Image rotation means rotating the image by a certain angle about a certain point, typically its center, to form a new image.
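Two of these morphological operations can be sketched directly. This is an illustrative outline with assumed names; rotation is restricted to a right angle so it stays a pure index permutation.

```python
from typing import List

Image = List[List[int]]


def crop(img: Image, top: int, left: int, height: int, width: int) -> Image:
    """Extract a local region of the image (image cropping)."""
    return [row[left:left + width] for row in img[top:top + height]]


def rotate90(img: Image) -> Image:
    """Rotate the image 90 degrees clockwise (image rotation restricted to
    a right angle for simplicity): reverse the rows, then transpose."""
    return [list(col) for col in zip(*img[::-1])]


img = [[1, 2],
       [3, 4]]
# crop(img, 0, 1, 2, 1) keeps the right column; rotate90 turns rows into columns
```

Arbitrary-angle rotation would additionally need resampling (e.g., bilinear interpolation) because rotated pixel centres no longer fall on the grid.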
In this exemplary embodiment, the second optimizing unit may be arranged in any one of series, parallel, series and parallel combination with at least two image reconstructing units, so that the image to be displayed may also perform the second optimizing process when performing the super-resolution reconstruction process.
The first optimizing unit and the second optimizing unit can further optimize the image to be displayed, and the quality of the target image is improved, so that the viewing experience of a user is improved.
In addition, the first optimizing unit, the second optimizing unit and the at least two image reconstructing units can be arranged in any mode of series connection, parallel connection, combination of series connection and parallel connection, so that the image to be displayed can be subjected to the first optimizing process, the second optimizing process and the super-resolution reconstructing process.
In an exemplary embodiment, the image processing method may further include:
The target image is transmitted to the image display terminal to display the target image at the image display terminal.
In this exemplary embodiment, super-resolution reconstruction is performed on the image to be displayed to obtain at least two target images of higher resolution, which appear clearer when displayed on the image display terminal. The image display terminal may display the target images in several modes: the image to be displayed is acquired from a single terminal, processed by the image processing terminal into at least two target images, and the target images of different resolutions are displayed on the same image display terminal at different times, i.e., a "one-to-one" mode; or images to be displayed are acquired from different terminals, each is processed, and the target images of different resolutions are displayed on the same image display terminal at different times, i.e., a "many-to-one" mode; or the image to be displayed is acquired from a single terminal and processed into at least two target images of different resolutions, which are displayed on different image display terminals at different times, i.e., a "one-to-many" mode; or images to be displayed are acquired from different terminals, each is processed, and the resulting target images are displayed on different image display terminals, i.e., a "many-to-many" mode.
In an exemplary embodiment, the transmitting the target image to the image display terminal may include:
each target image is sent to a different image display terminal.
That is, in a preferred embodiment, super-resolution reconstruction may be performed on the image to be displayed according to the resolutions required by the image display terminals, so as to obtain at least two target images, and each obtained target image is then sent to the corresponding image display terminal. The resolution required by an image display terminal may be the resolution of its screen, a resolution set by its system, and the like. The image display terminal may communicate directly with the image processing terminal to report its required resolution, or the required resolution may be determined from the model parameters of the display device.
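Sending each target image to the terminal that needs it can be sketched as a per-terminal dispatch. This is an illustration only; the terminal names, the scale factors standing in for required resolutions, and the dict-based "sending" are all assumptions.

```python
from typing import Dict, List

Image = List[List[int]]


def dispatch_targets(img: Image, terminals: Dict[str, int]) -> Dict[str, Image]:
    """For each display terminal, reconstruct at its required scale factor
    and 'send' the matching target image (here: collect into a dict keyed
    by terminal name)."""
    def upscale(im: Image, k: int) -> Image:
        return [[p for p in r for _ in range(k)]
                for r in im for _ in range(k)]

    return {name: upscale(img, k) for name, k in terminals.items()}


sent = dispatch_targets([[9]], {'phone': 2, 'tv': 4})
# 'phone' receives a 2x2 target image, 'tv' a 4x4 one
```

In a real deployment the scale factor would be derived from the screen or system resolution reported by each terminal, and the dict write would be replaced by an actual transmission to that terminal.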
In an alternative exemplary embodiment, the method may be applied to a smartphone. During shooting, the smartphone outputs the low-resolution video acquired by the ISP through two processes: first, each frame image of the video is sent as an image to be displayed to the image processing terminal, subjected to super-resolution reconstruction processing, and output as a target image; second, the image processing unit provided in the smartphone may crop the region of interest of the image to be displayed, the cropped image is input as the image to be displayed into the image processing terminal for super-resolution reconstruction processing, the target image is sent to the image display terminal, and the image processing unit provided in the image display terminal then adapts the target image to the screen resolution and displays it.
In another alternative exemplary embodiment, the method may be applied to a television. When the television performs picture-in-picture playback, a high-resolution video, which may be acquired from the network in real time, is output through two processes: first, the image to be displayed is processed by the image processing terminal and displayed as a small picture on the image display terminal; second, the image to be displayed undergoes optimization processing such as image enhancement together with super-resolution reconstruction processing, its resolution is adjusted to fit the maximum resolution of the display terminal, and it is displayed full-screen except for the small-picture region.
In summary, in this exemplary embodiment, an image to be displayed is acquired, and super-resolution reconstruction processing is performed on it at least twice through at least two image reconstruction units to obtain at least two target images; the target images are to be displayed on the image display terminal, and the resolutions of the at least two target images are different. On the one hand, this exemplary embodiment can perform super-resolution reconstruction at least twice through different image reconstruction units, obtaining target images of different resolutions for display, thereby overcoming the uniformity of the image processing procedure in the prior art and increasing the diversity of the target images; on the other hand, since target images of different resolutions are obtained, diversified target images can be output on image display terminals according to different display requirements, giving the method a wider range of application.
Exemplary embodiments of the present disclosure also provide an image processing apparatus. As shown in fig. 7, the image processing apparatus 700 may include: an image acquisition module 710, configured to acquire an image to be displayed; the image processing module 720 is configured to perform super-resolution reconstruction processing on an image to be displayed at least twice through at least two image reconstruction units, so as to obtain at least two target images; the target images are used for being displayed on the image display terminal, and the resolutions of at least two target images are different.
In an exemplary embodiment, at least two image reconstruction units are arranged in any one of series, parallel, a combination of series and parallel.
In an exemplary embodiment, each image reconstruction unit performs a super-resolution reconstruction process once, outputting one target image.
In an exemplary embodiment, any one of the image reconstruction units takes as input a target image output from a previous image reconstruction unit connected thereto, and outputs another target image through super-resolution reconstruction processing.
In an exemplary embodiment, the image processing apparatus further includes: and the display module is used for sending the target image to the image display terminal so as to display the target image on the image display terminal.
In an exemplary embodiment, a display module includes: and the display unit is used for respectively sending each target image to different image display terminals.
In an exemplary embodiment, the image acquisition module includes: the video stream acquisition unit is used for acquiring a video stream to be displayed; and the image confirmation unit is used for taking each frame of image in the video stream to be displayed as the image to be displayed in sequence.
In an exemplary embodiment, an image processing apparatus includes: the first optimizing module is used for optimizing the display effect of the image to be displayed and/or the target image through the first optimizing unit; wherein the display effect optimization processing includes any one or more of the following modes: image enhancement, image sharpening, image smoothing, image denoising, image deblurring, image defogging, and image restoration.
In an exemplary embodiment, the first optimizing unit and the at least two image reconstructing units are arranged in any one of series, parallel, a combination of series and parallel.
In an exemplary embodiment, an image processing apparatus includes: the second optimization module is used for carrying out morphological optimization processing on the image to be displayed and/or the target image through the second optimization unit; wherein the morphology optimization process includes any one or more of the following: image cropping, image stitching, image compositing, image rotation.
In an exemplary embodiment, the second optimizing unit and the at least two image reconstructing units are arranged in any one of series, parallel, a combination of series and parallel.
The specific details of each module in the above apparatus have already been described in the corresponding method embodiments; for details not disclosed here, reference may be made to the method embodiments, and they are therefore not repeated.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system."
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification. In some possible implementations, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification, e.g. any one or more of the steps of fig. 3, when the program product is run on the terminal device.
The present disclosure describes a program product for implementing the above method, which may employ a portable compact disc read-only memory (CD-ROM) and comprise program code and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (5)

1. An image processing method, the method comprising:
acquiring an image to be displayed;
performing super-resolution reconstruction processing on the image to be displayed at least twice through at least two image reconstruction units to obtain at least two target images;
wherein the target images are intended for display on an image display terminal, and the at least two target images have different resolutions;
the at least two image reconstruction units are arranged in any one of series connection, parallel connection, or a combination of series and parallel connection; each image reconstruction unit performs super-resolution reconstruction processing once and outputs a target image; any one of the image reconstruction units takes, as input, the target image output by the previous image reconstruction unit connected to it, and outputs another target image through super-resolution reconstruction processing;
the method further comprises the steps of:
sending each target image to a different image display terminal, so that target images with different resolutions are displayed on different image display terminals; or sending the target images with different resolutions to the same image display terminal, so that the target images with different resolutions are respectively displayed in a thumbnail form and in a full-screen form other than the thumbnail form on the same image display terminal;
the method further comprises the steps of:
performing display effect optimization processing on the image to be displayed and/or the target image through a first optimization unit; the first optimization unit and the at least two image reconstruction units are arranged in any one of series connection, parallel connection, or a combination of series and parallel connection;
wherein the display effect optimization processing comprises any one or more of the following modes:
image enhancement, image sharpening, image smoothing, image denoising, image deblurring, image defogging, and image restoration;
performing morphological optimization processing on the image to be displayed and/or the target image through a second optimization unit; the second optimization unit and the at least two image reconstruction units are arranged in any one of series connection, parallel connection, or a combination of series and parallel connection;
wherein the morphological optimization process includes any one or more of the following:
image cropping, image stitching, image compositing, and image rotation.
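The serial arrangement recited in claim 1 can be illustrated with a minimal sketch: each reconstruction unit takes the previous unit's output as input and emits a target image at a higher resolution. This is an assumption-laden illustration, not the claimed implementation — a real reconstruction unit would use a learned super-resolution model, whereas `upscale_2x` here is a stand-in nearest-neighbor upscaler, and the 2x factor and function names are invented for the sketch.

```python
def upscale_2x(image):
    """Stand-in 'super-resolution reconstruction': 2x nearest-neighbor upscale.

    The image is a list of rows of pixel values; each pixel is duplicated
    horizontally and each row vertically.
    """
    out = []
    for row in image:
        wide = [px for px in row for _ in (0, 1)]  # duplicate each pixel horizontally
        out.append(wide)
        out.append(list(wide))                     # duplicate the row vertically
    return out


def serial_reconstruction(image, num_units=2):
    """Chain num_units reconstruction units in series.

    Each unit consumes the previous unit's output and contributes one target
    image, so the returned target images all have different resolutions.
    """
    targets = []
    current = image
    for _ in range(num_units):
        current = upscale_2x(current)  # next unit takes the previous output as input
        targets.append(current)
    return targets


low_res = [[0, 1], [2, 3]]                         # a 2x2 image to be displayed
targets = serial_reconstruction(low_res, num_units=2)
# targets[0] is 4x4 and targets[1] is 8x8: two target images, two resolutions
```

In a parallel arrangement, by contrast, each unit would take the original image to be displayed as its input, which the sketch could model by calling `upscale_2x` on `low_res` independently per unit.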
2. The method of claim 1, wherein the acquiring the image to be displayed comprises:
acquiring a video stream to be displayed;
and taking each frame of image in the video stream to be displayed as the image to be displayed in sequence.
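Claim 2 can be sketched under the assumption that the video stream is an iterable of frames: each frame is taken in sequence as the image to be displayed and handed to downstream processing. The stream representation and the `process_frame` hook are hypothetical names for illustration, not part of the claimed method.

```python
def frames_as_images_to_display(video_stream):
    """Yield each frame of the stream, in order, as an image to be displayed."""
    for frame in video_stream:
        yield frame


def process_stream(video_stream, process_frame):
    """Apply per-image processing (e.g. reconstruction) to every frame in order."""
    return [process_frame(image) for image in frames_as_images_to_display(video_stream)]


# A toy 3-frame 'video stream'; the identity function stands in for real processing.
stream = [[[0]], [[1]], [[2]]]
results = process_stream(stream, lambda img: img)
```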
3. An image processing apparatus, characterized in that the apparatus comprises:
the image acquisition module is used for acquiring an image to be displayed;
the image processing module is used for carrying out super-resolution reconstruction processing on the image to be displayed at least twice through at least two image reconstruction units to obtain at least two target images;
wherein the target images are intended for display on an image display terminal, and the at least two target images have different resolutions;
the at least two image reconstruction units are arranged in any one of series connection, parallel connection, or a combination of series and parallel connection; each image reconstruction unit performs super-resolution reconstruction processing once and outputs a target image; any one of the image reconstruction units takes, as input, the target image output by the previous image reconstruction unit connected to it, and outputs another target image through super-resolution reconstruction processing;
the image processing apparatus is further configured to:
sending each target image to a different image display terminal, so that target images with different resolutions are displayed on different image display terminals; or sending the target images with different resolutions to the same image display terminal, so that the target images with different resolutions are respectively displayed in a thumbnail form and in a full-screen form other than the thumbnail form on the same image display terminal;
the apparatus is further configured to:
performing display effect optimization processing on the image to be displayed and/or the target image through a first optimization unit; the first optimization unit and the at least two image reconstruction units are arranged in any one of series connection, parallel connection, or a combination of series and parallel connection;
wherein the display effect optimization processing comprises any one or more of the following modes:
image enhancement, image sharpening, image smoothing, image denoising, image deblurring, image defogging, and image restoration;
performing morphological optimization processing on the image to be displayed and/or the target image through a second optimization unit; the second optimization unit and the at least two image reconstruction units are arranged in any one of series connection, parallel connection, or a combination of series and parallel connection;
wherein the morphological optimization process includes any one or more of the following:
image cropping, image stitching, image compositing, and image rotation.
4. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the method of any of claims 1-2.
5. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1-2 via execution of the executable instructions.
CN202010373472.7A 2020-05-06 2020-05-06 Image processing method, image processing device, storage medium and electronic apparatus Active CN111626931B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010373472.7A CN111626931B (en) 2020-05-06 2020-05-06 Image processing method, image processing device, storage medium and electronic apparatus


Publications (2)

Publication Number Publication Date
CN111626931A CN111626931A (en) 2020-09-04
CN111626931B true CN111626931B (en) 2023-08-08

Family

ID=72273010

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010373472.7A Active CN111626931B (en) 2020-05-06 2020-05-06 Image processing method, image processing device, storage medium and electronic apparatus

Country Status (1)

Country Link
CN (1) CN111626931B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114071197B (en) * 2020-07-30 2024-04-12 华为技术有限公司 Screen projection data processing method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105678728A (en) * 2016-01-19 2016-06-15 西安电子科技大学 High-efficiency super-resolution imaging device and method with regional management
CN106600536A (en) * 2016-12-14 2017-04-26 同观科技(深圳)有限公司 Video imager super-resolution reconstruction method and apparatus
CN108460723A (en) * 2018-02-05 2018-08-28 西安电子科技大学 Bilateral full variation image super-resolution rebuilding method based on neighborhood similarity
CN110490807A (en) * 2019-08-27 2019-11-22 中国人民公安大学 Image rebuilding method, device and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Shen Huanfeng; Li Pingxiang; Zhang Liangpei; Wang Yi. A survey of image super-resolution reconstruction techniques and methods. Optical Technique (光学技术). 2009, (02), full text. *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant