WO2023108443A1 - Image processing method, intelligent terminal and storage medium - Google Patents

Image processing method, intelligent terminal and storage medium

Info

Publication number
WO2023108443A1
WO2023108443A1 · PCT/CN2021/138110 · CN2021138110W
Authority
WO
WIPO (PCT)
Prior art keywords
parameter value
target
preset
shooting
spectrum
Prior art date
Application number
PCT/CN2021/138110
Other languages
English (en)
Chinese (zh)
Inventor
孙文君 (Sun Wenjun)
Original Assignee
深圳传音控股股份有限公司 (Shenzhen Transsion Holdings Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Transsion Holdings Co., Ltd. (深圳传音控股股份有限公司)
Priority to PCT/CN2021/138110
Publication of WO2023108443A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/64: Circuits for processing colour signals
    • H04N 9/73: Colour balance circuits, e.g. white balance circuits or colour temperature control

Definitions

  • the present application relates to the technical field of image processing, and in particular to an image processing method, an intelligent terminal and a storage medium.
  • processing the captured picture can bring its image colors closer to the true colors of the photographed scene.
  • the present application provides an image processing method, a smart terminal and a storage medium, which improve the processing effect of the photographed picture.
  • the present application provides an image processing method, which is applied to a smart terminal, and the image processing method may include:
  • S1 Determine or generate target parameter values corresponding to the shooting environment according to preset parameter values.
  • the S1 step includes:
  • a target spectrum corresponding to the shooting environment is determined or generated according to the preset parameter value.
  • the target parameter value is determined or generated.
  • the determining or generating the target parameter value based on the target spectrum includes:
  • At least one parameter value corresponding to the target spectrum is determined or generated.
  • the target parameter value is determined or generated from the at least one parameter value.
  • the determining or generating the target parameter value from the at least one parameter value includes:
  • the target parameter value is determined or generated from the at least one parameter value according to the target shooting brightness.
  • the determining or generating the target parameter value from the at least one parameter value according to the target shooting brightness includes:
  • a parameter value corresponding to the target shooting brightness is determined or generated from the at least one parameter value.
  • the parameter value corresponding to the target shooting brightness is determined as the target parameter value.
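The brightness-based selection above can be sketched as follows. This is a hedged illustration only: the mapping shape, brightness levels, and parameter values are invented for illustration and are not taken from the application.

```python
def select_target_parameter(candidates, brightness_map, target_brightness):
    """From the candidate parameter values for the target spectrum, pick the
    one that the preset brightness->parameter mapping associates with the
    brightness level closest to the target shooting brightness."""
    # Preset brightness level nearest the measured target shooting brightness.
    nearest = min(brightness_map, key=lambda b: abs(b - target_brightness))
    preferred = brightness_map[nearest]
    # Use the preferred value if it is a candidate; otherwise fall back to
    # the candidate closest to it.
    if preferred in candidates:
        return preferred
    return min(candidates, key=lambda c: abs(c - preferred))

candidates = [0.8, 1.0, 1.2]                    # values for the target spectrum
brightness_map = {50: 0.8, 100: 1.0, 200: 1.2}  # preset brightness -> parameter
print(select_target_parameter(candidates, brightness_map, 120))  # 1.0
```

The nearest-level rule is one plausible reading of "a parameter value corresponding to the target shooting brightness"; the application does not specify the matching criterion.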
  • the determining or generating the target spectrum corresponding to the shooting environment according to the preset parameter value includes:
  • the spectrum corresponding to the preset parameter value is determined as the target spectrum.
  • the method also includes:
  • a mapping relationship between the preset spectra and parameter values is determined or generated.
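The collection-and-mapping step above can be sketched as grouping collected samples into a preset spectrum-to-parameter-value mapping. The spectrum identifiers and values below are hypothetical, chosen only for illustration.

```python
def build_spectrum_mapping(samples):
    """Group collected (spectrum_id, parameter_value) samples into a mapping
    from each preset spectrum to its candidate parameter values."""
    mapping = {}
    for spectrum_id, value in samples:
        mapping.setdefault(spectrum_id, []).append(value)
    return mapping

# Parameter values collected in a full-spectrum scenario (invented data).
samples = [("D65", 1.00), ("D65", 1.05), ("A", 0.80)]
mapping = build_spectrum_mapping(samples)
print(mapping["D65"])  # [1.0, 1.05]
```

A mapping of this shape supports the earlier step of retrieving "at least one parameter value corresponding to the target spectrum" with a single dictionary lookup.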
  • the present application also provides an image processing method, which may include:
  • S20: Based on the preset data, determine or generate a target parameter value corresponding to the shooting environment of the captured picture.
  • step S20 includes:
  • the target parameter value is determined or generated based on the target spectrum of the shooting environment.
  • the preset data may be raw data, a compressed image, or the like; the present application takes raw data and a compressed image as examples of the preset data.
  • the determining or generating the target spectrum of the shooting environment based on the preset data includes:
  • the determining or generating the preset parameter value corresponding to the target gray pixel in the original data in the preset data based on the gray pixels in the compressed image in the preset data includes:
  • the position of the target gray pixel in the original data is determined or generated.
  • a preset parameter value corresponding to the target gray pixel is determined.
  • the determining or generating the position of the target gray pixel in the original data based on the gray pixels in the compressed image includes:
  • a size ratio between the original data and the compressed image is determined according to the size of the original data and the size of the compressed image.
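A minimal sketch of the position-scaling step above, assuming simple proportional scaling between the compressed (thumbnail) image and the raw frame; the sizes and coordinates are invented for illustration.

```python
def map_gray_pixel(pos, original_size, compressed_size):
    """Scale a gray-pixel position found in the compressed image by the size
    ratio between the original (raw) data and the compressed image."""
    x, y = pos
    ratio_x = original_size[0] / compressed_size[0]
    ratio_y = original_size[1] / compressed_size[1]
    return (round(x * ratio_x), round(y * ratio_y))

# A 4000x3000 raw frame downscaled to a 400x300 thumbnail (ratio 10).
print(map_gray_pixel((120, 80), (4000, 3000), (400, 300)))  # (1200, 800)
```

This lets gray-pixel detection run on the small compressed image while the preset parameter value is read from the corresponding location in the full-resolution raw data.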
  • the determining or generating the target parameter value based on the target spectrum of the shooting environment includes:
  • At least one parameter value corresponding to the target spectrum is determined or generated.
  • the target parameter value is determined or generated from the at least one parameter value.
  • the determining or generating the target parameter value from the at least one parameter value according to the target shooting brightness includes:
  • a parameter value corresponding to the target shooting brightness is determined or generated from the at least one parameter value.
  • the parameter value corresponding to the target shooting brightness is determined as the target parameter value.
  • the present application also provides an image processing device, and the image processing device may include:
  • the determining unit is configured to determine or generate target parameter values corresponding to the shooting environment according to preset parameter values.
  • a processing unit configured to perform preset processing on the shooting picture corresponding to the shooting environment based on the target parameter value.
  • the determining unit includes a first determining module and a second determining module.
  • the first determining module is configured to determine or generate a target spectrum corresponding to the shooting environment according to the preset parameter value.
  • the second determination module is configured to determine or generate the target parameter value based on the target spectrum.
  • the second determination module is specifically configured to determine or generate at least one parameter value corresponding to the target spectrum based on a preset mapping relationship between spectra and parameter values, and to determine or generate the target parameter value from the at least one parameter value.
  • the second determining module is specifically configured to acquire target shooting brightness; determine or generate the target parameter value from the at least one parameter value according to the target shooting brightness.
  • the second determining module is specifically configured to determine or generate a parameter value corresponding to the target shooting brightness from the at least one parameter value based on a preset mapping relationship between shooting brightness and parameter values, and to determine the parameter value corresponding to the target shooting brightness as the target parameter value.
  • the first determining module is specifically configured to determine or generate a spectrum corresponding to the preset parameter value based on a preset mapping relationship between preset parameter values and spectra, and to determine the corresponding spectrum as the target spectrum.
  • the device further includes a collection unit and a generation unit.
  • the collection unit is configured to collect target parameter values corresponding to each spectrum in a full-spectrum scenario.
  • the generating unit is configured to determine or generate a mapping relationship between the preset spectra and parameter values according to the target parameter values corresponding to the respective spectra.
  • the present application also provides an image processing device, which may include:
  • the obtaining unit is used to obtain preset data of the shooting picture.
  • the determining unit is configured to determine or generate a target parameter value corresponding to the shooting environment corresponding to the shooting picture based on the preset data.
  • a processing unit configured to perform preset processing on the shooting picture corresponding to the shooting environment based on the target parameter value.
  • the determining unit includes a first determining module and a second determining module.
  • the first determining module is configured to determine or generate the target spectrum of the shooting environment based on the preset data.
  • the second determination module is configured to determine or generate the target parameter value based on the target spectrum of the shooting environment.
  • the first determining module is specifically configured to determine or generate the preset parameter value corresponding to the target gray pixel in the raw data of the preset data based on the gray pixels in the compressed image of the preset data, and to determine or generate the target spectrum of the shooting environment based on the preset parameter value corresponding to the target gray pixel.
  • the first determination module is specifically configured to determine or generate the position of the target gray pixel in the original data based on the gray pixels in the compressed image, and to determine the preset parameter value corresponding to the target gray pixel based on its position in the original data.
  • the first determination module is specifically configured to determine the size ratio between the original data and the compressed image according to the size of the original data and the size of the compressed image, and to determine or generate the position of the target gray pixel in the original data from the position of the gray pixel and the size ratio.
  • the second determination module is specifically configured to determine or generate at least one parameter value corresponding to the target spectrum based on a preset mapping relationship between spectra and parameter values, and to determine or generate the target parameter value from the at least one parameter value.
  • the second determining module is specifically configured to determine or generate a parameter value corresponding to the target shooting brightness from the at least one parameter value based on a preset mapping relationship between shooting brightness and parameter values, and to determine the parameter value corresponding to the target shooting brightness as the target parameter value.
  • the present application also provides an intelligent terminal, including: a memory and a processor.
  • an image processing method program is stored in the memory, and when the image processing method program is executed by the processor, the steps of the above-mentioned image processing method are implemented.
  • the present application also provides a computer storage medium, where the computer storage medium stores a computer program, and when the computer program is executed by a processor, the steps of the above-mentioned image processing method are realized.
  • since the collected preset parameter value of the shooting environment is accurate, the target parameter value corresponding to the shooting environment can be accurately determined or generated from it; performing preset processing on the captured picture based on the target parameter value thus effectively improves the processing effect.
  • FIG. 1 is a schematic diagram of a hardware structure of an intelligent terminal implementing various embodiments of the present application
  • FIG. 2 is a system architecture diagram of a communication network provided by the present application.
  • FIG. 3 is a schematic diagram of a hardware structure of a controller provided by the present application.
  • FIG. 4 is a schematic diagram of a hardware structure of a network node provided by the present application.
  • FIG. 5 is a schematic diagram of a hardware structure of a network node provided by the present application.
  • FIG. 6 is a schematic diagram of a hardware structure of a controller provided by the present application.
  • FIG. 7 is a schematic diagram of a hardware structure of a network node provided by the present application.
  • FIG. 8 is a schematic flowchart of an image processing method provided by the present application.
  • FIG. 9 is a schematic flowchart of another image processing method provided by the present application.
  • FIG. 10 is a schematic structural diagram of an image processing device provided by the present application.
  • FIG. 11 is a schematic structural diagram of another image processing device provided by the present application.
  • first, second, third, etc. may be used herein to describe various information, the information should not be limited to these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of this document, first information may also be called second information, and similarly, second information may also be called first information.
  • the word "if" as used herein may be interpreted as "upon", "when", or "in response to a determination".
  • the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise.
  • "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C". Exceptions to this definition will only arise when combinations of elements, functions, steps or operations are inherently mutually exclusive in some way.
  • the word "if" as used herein may be interpreted as "upon", "when", "in response to determining" or "in response to detecting".
  • the phrases "if determined" or "if (the stated condition or event) is detected" may be interpreted as "when determined", "in response to the determination", "when (the stated condition or event) is detected" or "in response to detection of (the stated condition or event)".
  • step codes such as S101 and S102 are used herein to express the corresponding content clearly and concisely; they do not substantively limit the execution order. In specific implementations, S102 may be executed before S101, and such variations remain within the protection scope of this application.
  • Smart terminals can be implemented in various forms.
  • the intelligent terminal described in this application may include mobile terminals such as mobile phones, tablet computers, notebook computers, palmtop computers, PDAs (Personal Digital Assistants), PMPs (Portable Media Players), navigation devices, wearable devices, smart bracelets, and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
  • a smart terminal will be taken as an example, and those skilled in the art will understand that, in addition to elements specially used for mobile purposes, the configurations according to the embodiments of the present application can also be applied to fixed-type terminals.
  • FIG. 1 is a schematic diagram of the hardware structure of a smart terminal implementing various embodiments of the present application.
  • the smart terminal 100 may include components such as an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111.
  • the radio frequency unit 101 can be used to send and receive information, or to receive and send signals during a call. Specifically, after receiving downlink information from the base station, it passes the information to the processor 110 for processing; in addition, it sends uplink data to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through wireless communication.
  • the above wireless communication can use any communication standard or protocol, including but not limited to GSM (Global System for Mobile Communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution), TDD-LTE (Time Division Duplexing-Long Term Evolution), 5G, and the like.
  • WiFi is a short-distance wireless transmission technology.
  • the smart terminal can help users send and receive emails, browse web pages, and access streaming media, etc., and it provides users with wireless broadband Internet access.
  • although Fig. 1 shows the WiFi module 102, it can be understood that it is not an essential component of the smart terminal and can be omitted as required without changing the essence of the invention.
  • the audio output unit 103 can convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound when the smart terminal 100 is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, or the like.
  • the audio output unit 103 may also provide audio output related to specific functions performed by the smart terminal 100 (for example, call signal receiving sound, message receiving sound, etc.).
  • the audio output unit 103 may include a speaker, a buzzer, and the like.
  • the A/V input unit 104 is used to receive audio or video signals.
  • the A/V input unit 104 may include a GPU (Graphics Processing Unit) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode.
  • the processed image frames may be displayed on the display unit 106 .
  • the image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage media) or sent via the radio frequency unit 101 or the WiFi module 102 .
  • the microphone 1042 can receive sound (audio data) in phone call, recording, voice recognition, and similar operating modes, and can process such sound into audio data.
  • the processed audio (voice) data can be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 for output in the case of a phone call mode.
  • the microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the process of receiving and transmitting audio signals.
  • the smart terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the smart terminal 100 moves to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), and the magnitude and direction of gravity when stationary; it can be used for applications that recognize phone posture (such as portrait/landscape switching, related games, magnetometer attitude calibration) and for vibration-recognition functions (such as pedometer, tap detection). Other sensors that may also be configured on the phone, such as fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, and infrared sensors, will not be described in detail here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
  • the user input unit 107 can be used to receive input numbers or character information, and generate key signal input related to user settings and function control of the smart terminal.
  • the user input unit 107 may include a touch panel 1071 and other input devices 1072 .
  • the touch panel 1071, also referred to as a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program.
  • the touch panel 1071 may include two parts, a touch detection device and a touch controller.
  • the touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and can receive and execute commands sent by the processor 110.
  • the touch panel 1071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072 .
  • other input devices 1072 may include, but are not limited to, one or more of physical keyboards, function keys (such as volume control buttons and switch buttons), trackballs, mice, and joysticks, which are not specifically limited here.
  • the touch panel 1071 may cover the display panel 1061.
  • when the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event; the processor 110 then provides the corresponding visual output on the display panel 1061 according to the type of the touch event.
  • although in Fig. 1 the touch panel 1071 and the display panel 1061 are shown as two independent components to realize the input and output functions of the smart terminal, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize those functions; the specific implementation is not limited here.
  • the interface unit 108 is used as an interface through which at least one external device can be connected with the smart terminal 100 .
  • an external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) ports, video I/O ports, headphone ports, and more.
  • the interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and to transfer data between the smart terminal 100 and external devices.
  • the memory 109 can be used to store software programs as well as various data.
  • the memory 109 can mainly include a storage program area and a storage data area. The storage program area can store an operating system and application programs required by at least one function (such as a sound playback function or an image playback function); the storage data area can store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 110 is the control center of the smart terminal; it connects the various parts of the whole smart terminal using various interfaces and lines, and executes the smart terminal's functions and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the smart terminal as a whole.
  • the processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor.
  • the application processor mainly processes operating systems, user interfaces, and application programs, etc.
  • the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 110.
  • the smart terminal 100 can also include a power supply 111 (such as a battery) for supplying power to the various components.
  • the power supply 111 can be logically connected to the processor 110 through a power management system, so as to manage functions such as charging, discharging, and power consumption through the power management system.
  • the smart terminal 100 may also include a Bluetooth module, etc., which will not be repeated here.
  • Fig. 2 is a communication network system architecture diagram provided by the present application. This communication network system is an LTE system of universal mobile communication technology, comprising, communicatively connected in sequence, a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator's IP service 204.
  • the UE 201 may be the above-mentioned terminal 100, which will not be repeated here.
  • E-UTRAN 202 includes eNodeB 2021 and other eNodeB 2022 and so on.
  • the eNodeB 2021 can be connected to other eNodeBs 2022 through a backhaul (for example, the X2 interface); the eNodeB 2021 is connected to the EPC 203 and can provide access from the UE 201 to the EPC 203.
  • the EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like.
  • the MME 2031 is a control node that processes signaling between the UE 201 and the EPC 203, and provides bearer and connection management.
  • the HSS 2032 is used to provide registers such as a home location register (not shown in the figure) and to save user-specific information about service features and data rates.
  • the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources; it selects and provides available policy and charging control decisions for the policy and charging enforcement function unit (not shown).
  • the IP service 204 may include the Internet, an intranet, an IMS (IP Multimedia Subsystem), or other IP services.
  • although the LTE system is used as an example above, those skilled in the art should know that this application is applicable not only to the LTE system but also to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems (such as 5G), which are not limited here.
  • FIG. 3 is a schematic diagram of a hardware structure of a controller 140 provided in the present application.
  • the controller 140 includes: a memory 1401 and a processor 1402, the memory 1401 is used to store program instructions, and the processor 1402 is used to call the program instructions in the memory 1401 to execute the steps performed by the controller in the first method embodiment above, and its implementation principle and beneficial effects are similar, and will not be repeated here.
  • the foregoing controller further includes a communication interface 1403 , and the communication interface 1403 may be connected to the processor 1402 through a bus 1404 .
  • the processor 1402 can control the communication interface 1403 to implement the receiving and sending functions of the controller 140 .
  • FIG. 4 is a schematic diagram of a hardware structure of a network node 150 provided in the present application.
  • the network node 150 includes: a memory 1501 and a processor 1502, the memory 1501 is used to store program instructions, and the processor 1502 is used to call the program instructions in the memory 1501 to execute the steps performed by the first node in the first method embodiment above, and its implementation principle and beneficial effects are similar, and will not be repeated here.
  • the foregoing network node further includes a communication interface 1503, and the communication interface 1503 may be connected to the processor 1502 through a bus 1504.
  • the processor 1502 can control the communication interface 1503 to realize the functions of receiving and sending of the network node 150 .
  • FIG. 5 is a schematic diagram of a hardware structure of a network node 160 provided in the present application.
  • The network node 160 includes a memory 1601 and a processor 1602. The memory 1601 is used to store program instructions, and the processor 1602 is used to call the program instructions in the memory 1601 to execute the steps performed by the intermediate node and the tail node in the first method embodiment above; the implementation principles and beneficial effects are similar and will not be repeated here.
  • The foregoing network node further includes a communication interface 1603, and the communication interface 1603 may be connected to the processor 1602 through a bus 1604.
  • the processor 1602 can control the communication interface 1603 to realize the functions of receiving and sending of the network node 160 .
  • FIG. 6 is a schematic diagram of a hardware structure of a controller 170 provided in the present application.
  • the controller 170 includes: a memory 1701 and a processor 1702, the memory 1701 is used to store program instructions, and the processor 1702 is used to call the program instructions in the memory 1701 to execute the steps performed by the controller in the second method embodiment above, and its implementation principle and beneficial effects are similar, and will not be repeated here.
  • FIG. 7 is a schematic diagram of a hardware structure of a network node 180 provided in the present application.
  • the network node 180 includes: a memory 1801 and a processor 1802, the memory 1801 is used to store program instructions, and the processor 1802 is used to invoke the program instructions in the memory 1801 to execute the steps performed by the head node in the second method embodiment above, and its implementation principle and beneficial effects are similar, and will not be repeated here.
  • the above-mentioned integrated modules implemented in the form of software function modules can be stored in a computer-readable storage medium.
  • The above-mentioned software function modules are stored in a storage medium and include several instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute part of the steps of the methods of the various embodiments of the present application.
  • The image processing method provided in the present application can be applied to image shooting scenarios, for example, computational photography scenarios.
  • Taking the computational photography scenario as an example, when the image of the shooting picture is obtained through computational photography, in order to make the image color of the shooting picture closer to the real color of the scene, preset processing needs to be performed on the shooting picture during the image shooting process, so that the captured image is as close as possible to the real appearance of the scene.
  • the present application provides an image processing method, and proposes various embodiments of the present application based on the above-mentioned intelligent terminal hardware structure and communication network system.
  • the image processing method provided by the present application will be described in detail through specific embodiments. It can be understood that the following specific embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments.
  • FIG. 8 is a schematic flowchart of an image processing method provided by the present application.
  • the image processing method may be executed by software and/or hardware devices.
  • the hardware device may be an image processing device, and the image processing device may be an intelligent terminal.
  • the image processing method may include:
  • S1 Determine or generate target parameter values corresponding to the shooting environment according to preset parameter values.
  • the preset parameter value may be a preset parameter value of the shooting environment under the collected grayscale pixels.
  • A light-uniformizing sheet can be attached to the camera module. Due to the function of the uniformizing sheet, the preset parameter values are collected under grayscale pixels, that is, they are the preset parameter values in a neutral color environment.
  • The camera may include a module with a uniformizing sheet attached, so that the preset parameter values of the shooting environment under grayscale pixels can be collected, and subsequent processing can be performed based on those preset parameter values.
  • Alternatively, the camera module can include two modules, one with a uniformizing sheet attached and one without. In this way, the preset parameter values of the shooting environment under grayscale pixels are collected by the module with the uniformizing sheet, and preset processing is subsequently performed on pictures of the shooting environment based on those preset parameter values; the module without the uniformizing sheet can then determine or generate the target image of the shooting picture based on the preset processing results. This can be set according to actual needs.
  • the target parameter value corresponding to the shooting environment can be accurately determined or generated according to the accurate preset parameter value.
  • The spectrum corresponding to the preset parameter value can be determined or generated based on the preset mapping relationship between parameter values and spectra, and the spectrum corresponding to the preset parameter value is determined as the target spectrum.
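As an illustrative sketch of this lookup (the calibration table, spectrum names, and the nearest-neighbor matching rule below are all assumptions, not details from the application), a preset parameter value measured under grayscale pixels could be matched against a calibrated table to obtain the target spectrum:

```python
# Illustrative sketch: match a measured preset parameter value (e.g. per-channel
# R/G/B responses under grayscale pixels) to the closest calibrated spectrum.
# The table below is a hypothetical example, not data from the application.
calibration = {
    "daylight_d65":   (0.95, 1.00, 1.05),
    "tungsten_a":     (1.40, 1.00, 0.60),
    "fluorescent_f2": (1.10, 1.00, 0.80),
}

def closest_spectrum(measured, table):
    """Return the spectrum key whose stored parameter value is nearest
    (by squared Euclidean distance) to the measured one."""
    def dist(entry):
        return sum((m - e) ** 2 for m, e in zip(measured, entry[1]))
    return min(table.items(), key=dist)[0]

target_spectrum = closest_spectrum((1.35, 1.0, 0.65), calibration)
print(target_spectrum)  # the tungsten entry is nearest here
```

Any distance measure or interpolation scheme could replace the nearest-neighbor rule; the point is only that the mapping turns a measured parameter value into a target spectrum.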
  • The preset mapping relationship between parameter values and spectra can be received from other electronic devices, obtained from local storage, or obtained in other ways, which can be specifically set according to actual needs.
  • the target parameter value can be determined or generated based on the target spectrum.
  • At least one parameter value corresponding to the target spectrum can be determined or generated based on the preset mapping relationship between spectra and parameter values, and the target parameter value is determined or generated from the at least one parameter value.
  • The preset mapping relationship between spectra and parameter values can be received from other electronic devices, retrieved from local storage, or obtained in other ways, which can be set according to actual needs.
  • To construct the preset mapping relationship between spectra and parameter values, the target parameter values corresponding to each spectrum can first be collected in a full-spectrum scenario; after the target parameter values corresponding to each spectrum have been collected, the preset mapping relationship between spectra and parameter values can be determined or generated according to these target parameter values.
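The calibration pass described above can be sketched as follows; the sample format (spectrum identifier, shooting brightness, parameter value) and the nested-dictionary mapping are assumed data structures, since the application does not specify how the mapping is stored:

```python
# Illustrative sketch of building the preset spectrum -> parameter-value
# mapping from a full-spectrum calibration pass. The sample values are
# hypothetical examples.
from collections import defaultdict

# Each calibration sample: (spectrum_id, shooting_brightness, parameter_value)
samples = [
    ("d65", 100, (0.95, 1.00, 1.05)),
    ("d65", 400, (0.96, 1.00, 1.04)),
    ("a",   100, (1.40, 1.00, 0.60)),
    ("a",   400, (1.38, 1.00, 0.62)),
]

def build_mapping(calibration_samples):
    """Group the collected target parameter values by spectrum, keyed by the
    shooting brightness they were collected under."""
    mapping = defaultdict(dict)
    for spectrum_id, brightness, param in calibration_samples:
        mapping[spectrum_id][brightness] = param
    return dict(mapping)

spectrum_to_params = build_mapping(samples)
print(sorted(spectrum_to_params["a"]))  # [100, 400]
```

Keying each spectrum's entries by shooting brightness matches the later step of selecting a parameter value according to the target shooting brightness.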
  • When determining or generating the target parameter value, the target shooting brightness can be obtained first.
  • the target shooting brightness can be understood as an expected shooting brightness when shooting an image, and a target parameter value is determined or generated from at least one parameter value according to the target shooting brightness.
  • The grayscale image is captured at different shooting brightnesses, and different shooting brightnesses correspond to different target parameter values. Therefore, the target parameter value can be determined or generated from the at least one parameter value according to the target shooting brightness.
  • Specifically, based on the preset mapping relationship between shooting brightness and parameter values, the parameter value corresponding to the target shooting brightness can be determined or generated from the at least one parameter value, and that parameter value is determined as the target parameter value, thereby obtaining the target parameter value corresponding to the shooting environment.
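The brightness-based selection can be sketched as below; picking the entry whose stored brightness is nearest to the target shooting brightness is an assumed rule (the application only says the mapping yields the corresponding parameter value), and the table contents are hypothetical:

```python
# Illustrative sketch: from the parameter values associated with the target
# spectrum, choose the one whose shooting brightness is closest to the target
# shooting brightness. Brightness keys and parameter values are assumed.
params_for_spectrum = {100: (1.40, 1.00, 0.60),
                       400: (1.38, 1.00, 0.62),
                       800: (1.36, 1.00, 0.63)}

def target_parameter_value(brightness_to_param, target_brightness):
    """Return the parameter value stored under the brightness nearest to the
    requested target shooting brightness."""
    nearest = min(brightness_to_param, key=lambda b: abs(b - target_brightness))
    return brightness_to_param[nearest]

print(target_parameter_value(params_for_spectrum, 450))  # entry under 400
```

Interpolating between the two nearest brightness entries would be a natural refinement of the same idea.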
  • the color temperature value corresponding to the shooting environment may also be determined according to the target parameter value corresponding to the shooting environment.
  • preset processing can be performed on the shooting picture corresponding to the shooting environment based on the target parameter value, that is, the following S3 is executed:
  • When preset processing is performed on the shooting picture, since the collected preset parameter value of the shooting environment is accurate, the target parameter value corresponding to the shooting environment can be accurately determined or generated according to that preset parameter value; then, based on the target parameter value, preset processing is performed on the shooting picture corresponding to the shooting environment, which can effectively improve the processing effect.
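The application does not specify the concrete form of the preset processing; one common possibility, shown purely as a hypothetical sketch, is a per-channel white-balance correction in which the target parameter value is interpreted as (R, G, B) gains:

```python
# Hypothetical sketch: apply a target parameter value, interpreted here as
# per-channel (R, G, B) white-balance gains, to each pixel of a frame.
# This interpretation is an assumption, not a detail from the application.
def apply_gains(pixels, gains, max_value=255):
    """Scale each channel of each (R, G, B) pixel by the matching gain,
    clamping the result to the valid range."""
    corrected = []
    for pixel in pixels:
        corrected.append(tuple(
            min(max_value, round(channel * gain))
            for channel, gain in zip(pixel, gains)
        ))
    return corrected

frame = [(200, 180, 120), (90, 90, 90)]
print(apply_gains(frame, (0.9, 1.0, 1.5)))
```

With these example gains, the first (warm-tinted) pixel comes out neutral, which is the intended effect of balancing against grayscale measurements.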
  • Fig. 9 is a schematic flow chart of another image processing method provided by the present application.
  • the image processing method can also be executed by software and/or hardware devices.
  • the hardware device can be an image processing device, and the image processing device can be an intelligent terminal.
  • the image processing method may include:
  • the photographing picture may be a photographing picture including gray pixels
  • The preset data may be raw data or a compressed image; the present application takes raw data and a compressed image as examples of the preset data.
  • the original data may be raw data
  • The compressed image may be a JPEG image.
  • The preset data of the shooting picture may be received from other electronic devices, retrieved from local storage, or obtained in other ways, which can be set according to actual needs.
  • the photographed picture including gray pixels may be a picture in which a standard object representing gray pixels such as a color card or a gray card is placed.
  • In this way, the shooting picture can include enough neutral colors, so that the target parameter value corresponding to the shooting environment of the shooting picture can be accurately determined, that is, the following S20 is executed:
  • S20 Based on the preset data, determine or generate a target parameter value corresponding to the shooting environment corresponding to the shooting frame.
  • The target spectrum of the shooting environment can be determined or generated based on the preset data, and then the target parameter value can be determined or generated based on the target spectrum of the shooting environment.
  • Based on the gray pixels in the compressed image in the preset data, the preset parameter values corresponding to the target gray pixels in the raw data in the preset data can first be determined or generated; the target spectrum of the shooting environment is then determined or generated based on the preset parameter values corresponding to the target gray pixels.
  • When determining or generating the preset parameter value corresponding to the target gray pixel in the raw data based on the gray pixels in the compressed image, the position of the target gray pixel in the raw data is first determined or generated based on the gray pixels in the compressed image; then, based on that position, the preset parameter value corresponding to the target gray pixel is determined, thereby obtaining the preset parameter value corresponding to the target gray pixel.
  • The size ratio between the raw data and the compressed image can first be determined according to the size of the raw data and the size of the compressed image; then, according to the positions of the gray pixels in the compressed image and the size ratio, the positions of the corresponding gray pixels in the raw data are determined. The gray pixels in the raw data are recorded here as the target gray pixels, so that the positions of the target gray pixels in the raw data are determined or generated.
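The position mapping via the size ratio can be sketched as follows; the integer truncation of the scaled coordinates is a simplifying assumption (the application does not say how sub-pixel positions are rounded), and the sizes used are example values:

```python
# Illustrative sketch: map a gray pixel's (x, y) position in the compressed
# (e.g. JPEG) image to the corresponding position in the raw data using the
# size ratio between the two. Integer truncation is a simplifying assumption.
def raw_position(jpeg_pos, jpeg_size, raw_size):
    """Scale a position in the compressed image into raw-data coordinates."""
    scale_x = raw_size[0] / jpeg_size[0]
    scale_y = raw_size[1] / jpeg_size[1]
    x, y = jpeg_pos
    return (int(x * scale_x), int(y * scale_y))

# A gray pixel found at (100, 50) in a 1000x750 JPEG preview maps into a
# hypothetical 4000x3000 raw frame:
print(raw_position((100, 50), (1000, 750), (4000, 3000)))  # (400, 200)
```

The raw-data value at the mapped position (or a small neighborhood around it, to be robust to rounding) then supplies the preset parameter value for that target gray pixel.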
  • the target parameter value can be determined or generated based on the target spectrum of the shooting environment.
  • At least one parameter value corresponding to the target spectrum can be determined or generated based on the preset mapping relationship between spectra and parameter values, and the target parameter value is determined or generated from the at least one parameter value.
  • The preset mapping relationship between spectra and parameter values can be received from other electronic devices, retrieved from local storage, or obtained in other ways, which can be set according to actual needs.
  • To construct the preset mapping relationship between spectra and parameter values, the target parameter values corresponding to each spectrum can first be collected in a full-spectrum scenario; after the target parameter values corresponding to each spectrum have been collected, the preset mapping relationship between spectra and parameter values can be determined or generated according to these target parameter values.
  • When determining or generating the target parameter value, the target shooting brightness can be obtained first.
  • the target shooting brightness can be understood as an expected shooting brightness when shooting an image, and a target parameter value is determined or generated from at least one parameter value according to the target shooting brightness.
  • The grayscale image is captured at different shooting brightnesses, and different shooting brightnesses correspond to different target parameter values. Therefore, the target parameter value can be determined or generated from the at least one parameter value according to the target shooting brightness.
  • Specifically, based on the preset mapping relationship between shooting brightness and parameter values, the parameter value corresponding to the target shooting brightness can be determined or generated from the at least one parameter value, and that parameter value is determined as the target parameter value, thereby obtaining the target parameter value corresponding to the shooting environment.
  • preset processing can be performed on the shooting picture corresponding to the shooting environment based on the target parameter value, that is, the following S30 is executed:
  • In the present application, the preset data of the shooting picture is obtained; since the preset data contains enough neutral color information, the target parameter value corresponding to the shooting environment of the shooting picture can be accurately determined or generated based on the preset data; then, based on the target parameter value, preset processing is performed on the shooting picture corresponding to the shooting environment, which can effectively improve the processing effect.
  • FIG. 10 is a schematic structural diagram of an image processing device 1000 provided in the present application.
  • the image processing device 1000 may include:
  • the determining unit 1001 is configured to determine or generate target parameter values corresponding to the shooting environment according to preset parameter values.
  • the processing unit 1002 is configured to perform preset processing on the shooting picture corresponding to the shooting environment based on the target parameter value.
  • the determining unit 1001 includes a first determining module and a second determining module.
  • the first determination module is configured to determine or generate a target spectrum corresponding to the shooting environment according to preset parameter values.
  • the second determining module is configured to determine or generate target parameter values based on the target spectrum.
  • The second determination module is specifically configured to determine or generate at least one parameter value corresponding to the target spectrum based on the preset mapping relationship between spectra and parameter values, and to determine or generate the target parameter value from the at least one parameter value.
  • the second determining module is specifically configured to acquire target shooting brightness; determine or generate a target parameter value from at least one parameter value according to the target shooting brightness.
  • The second determination module is specifically configured to determine or generate a parameter value corresponding to the target shooting brightness from the at least one parameter value based on a preset mapping relationship between shooting brightness and parameter values, and to determine that parameter value as the target parameter value.
  • The first determining module is specifically configured to determine or generate a spectrum corresponding to the preset parameter value based on a preset mapping relationship between parameter values and spectra, and to determine the spectrum corresponding to the preset parameter value as the target spectrum.
  • the device further includes a collection unit and a generation unit.
  • the collection unit is used to collect target parameter values corresponding to each spectrum in a full-spectrum scenario.
  • the generating unit is configured to determine or generate a preset mapping relationship between spectra and parameter values according to target parameter values corresponding to each spectrum.
  • The image processing device 1000 shown in this application can execute the technical solution of the image processing method in the above-mentioned embodiments; its implementation principle and beneficial effects are similar to those of the image processing method, so please refer to the image processing method, and details are not repeated here.
  • FIG. 11 is a schematic structural diagram of another image processing device 1100 provided by the present application.
  • the image processing device 1100 may include:
  • the acquiring unit 1101 is configured to acquire preset data of a shooting picture.
  • the determining unit 1102 is configured to determine or generate a target parameter value corresponding to a shooting environment corresponding to a shooting frame based on preset data.
  • the processing unit 1103 is configured to perform preset processing on the shooting picture corresponding to the shooting environment based on the target parameter value.
  • the determining unit 1102 includes a first determining module and a second determining module;
  • the first determination module is configured to determine or generate the target spectrum of the shooting environment based on preset data.
  • the second determining module is configured to determine or generate target parameter values based on the target spectrum of the shooting environment.
  • The first determining module is specifically configured to determine or generate preset parameter values corresponding to target gray pixels in the raw data in the preset data based on the gray pixels in the compressed image in the preset data, and to determine or generate the target spectrum of the shooting environment based on the preset parameter values corresponding to the target gray pixels.
  • The first determination module is specifically configured to determine or generate the position of the target gray pixel in the raw data based on the gray pixels in the compressed image, and to determine the preset parameter value corresponding to the target gray pixel based on that position.
  • The first determination module is specifically configured to determine the size ratio between the raw data and the compressed image according to the size of the raw data and the size of the compressed image, and to determine or generate the position of the target gray pixel in the raw data according to the positions of the gray pixels in the compressed image and the size ratio.
  • The second determination module is specifically configured to determine or generate at least one parameter value corresponding to the target spectrum based on the preset mapping relationship between spectra and parameter values, and to determine or generate the target parameter value from the at least one parameter value.
  • The second determination module is further specifically configured to determine or generate a parameter value corresponding to the target shooting brightness from the at least one parameter value based on a preset mapping relationship between shooting brightness and parameter values, and to determine that parameter value as the target parameter value.
  • The image processing device 1100 shown in this application can execute the technical solution of the image processing method in the above-mentioned embodiments; its implementation principle and beneficial effects are similar to those of the image processing method, so please refer to the image processing method, and details are not repeated here.
  • the present application also provides an intelligent terminal.
  • the intelligent terminal includes a memory and a processor.
  • the image processing method program is stored in the memory.
  • the image processing method program is executed by the processor, the steps of the image processing method in any of the above embodiments are implemented.
  • the present application also provides a computer-readable storage medium, on which an image processing method program is stored, and when the image processing method program is executed by a processor, the steps of the image processing method in any of the foregoing embodiments are implemented.
  • The embodiments of the smart terminal and the computer-readable storage medium provided in this application may contain all the technical features of any of the above-mentioned image processing method embodiments; details are not repeated here.
  • the present application also provides a computer program product, the computer program product includes computer program code, and when the computer program code is run on the computer, the computer is made to execute the methods in the above various possible implementation manners.
  • The present application also provides a chip, including a memory and a processor; the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that the device installed with the chip executes the methods in the above various possible implementation manners.
  • Units in the device of the present application can be combined, divided and deleted according to actual needs.
  • The methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation.
  • Based on this understanding, the technical solution of the present application, in essence or the part contributing to the prior art, can be embodied in the form of a software product; the computer software product is stored in one of the above storage media (such as ROM/RAM, magnetic disk, or optical disc) and includes several instructions to make a terminal device (which may be a mobile phone, computer, server, controlled terminal, or network device, etc.) execute the method of each embodiment of the present application.
  • all or part of them may be implemented by software, hardware, firmware or any combination thereof.
  • When implemented using software, it may be implemented in whole or in part in the form of a computer program product.
  • A computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the present application are generated in whole or in part.
  • The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus.
  • Computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center by wire (such as coaxial cable, optical fiber, or digital subscriber line) or wirelessly (such as infrared, radio, or microwave).
  • The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center integrating one or more available media.
  • Available media may be magnetic media (e.g., floppy disk, hard disk, magnetic tape), optical media (e.g., DVD), or semiconductor media (e.g., solid state disk (SSD)), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present application relates to an image processing method, an intelligent terminal, and a storage medium. The method comprises the following steps: when preset processing is performed on a captured picture, since the collected preset parameter value of the shooting environment is an accurate preset parameter value, a target parameter value corresponding to the shooting environment can be accurately determined or generated according to the accurate preset parameter value; then, based on the target parameter value, preset processing is performed on the captured picture corresponding to the shooting environment, so that the processing effect can be effectively improved.
PCT/CN2021/138110 2021-12-14 2021-12-14 Procédé de traitement d'images, terminal intelligent et support de stockage WO2023108443A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/138110 WO2023108443A1 (fr) 2021-12-14 2021-12-14 Procédé de traitement d'images, terminal intelligent et support de stockage


Publications (1)

Publication Number Publication Date
WO2023108443A1 true WO2023108443A1 (fr) 2023-06-22

Family

ID=86774948

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/138110 WO2023108443A1 (fr) 2021-12-14 2021-12-14 Procédé de traitement d'images, terminal intelligent et support de stockage

Country Status (1)

Country Link
WO (1) WO2023108443A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090102943A1 (en) * 2007-10-22 2009-04-23 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and image sensing apparatus
JP2017212640A (ja) * 2016-05-26 2017-11-30 日本放送協会 色調整装置および色調整システム
CN107846554A (zh) * 2017-10-31 2018-03-27 努比亚技术有限公司 一种图像处理方法、终端和计算机可读存储介质
CN110121064A (zh) * 2019-05-15 2019-08-13 深圳市道通智能航空技术有限公司 一种图像色彩调节方法、装置及无人机
US20210281713A1 (en) * 2018-06-12 2021-09-09 Omron Corporation Image processing device, image processing method, image sensor, and non-transitory computer readable recording medium


Similar Documents

Publication Publication Date Title
CN113179369B (zh) 拍摄画面的显示方法、移动终端及存储介质
WO2023005060A1 (fr) Procédé de capture d'image, terminal mobile et support de stockage
CN113268298A (zh) 应用显示方法、移动终端及可读存储介质
CN113325981B (zh) 处理方法、移动终端及存储介质
CN107239208B (zh) 处理屏幕截图的方法、设备及计算机可读存储介质
WO2024001853A1 (fr) Procédé de traitement, terminal intelligent et support de stockage
WO2023108444A1 (fr) Procédé de traitement d'image, terminal intelligent et support de stockage
WO2023284218A1 (fr) Procédé de commande de photographie, terminal mobile et support de stockage
WO2022262259A1 (fr) Procédé et appareil de traitement d'image, et dispositif, support et puce
CN112532786B (zh) 图像显示方法、终端设备和存储介质
WO2022252158A1 (fr) Procédé de photographie, terminal mobile et support de stockage lisible
CN115914719A (zh) 投屏显示方法、智能终端及存储介质
WO2023108443A1 (fr) Procédé de traitement d'images, terminal intelligent et support de stockage
CN108335301B (zh) 一种拍照方法及移动终端
CN107566745B (zh) 一种拍摄方法、终端和计算机可读存储介质
CN112230825A (zh) 分享方法、移动终端及存储介质
WO2023108442A1 (fr) Procédé de traitement d'image, terminal intelligent, et support de stockage
WO2023050413A1 (fr) Procédé de traitement d'image, terminal intelligent et support de stockage
WO2023102934A1 (fr) Procédé de traitement de données, terminal intelligent et support de stockage
CN114125151B (zh) 图像处理方法、移动终端及存储介质
WO2024045145A1 (fr) Procédé de commande d'affichage d'icône, terminal mobile et support de stockage
CN110110138B (zh) 保存方法、移动终端及计算机可读存储介质
WO2022261897A1 (fr) Procédé de traitement, terminal mobile et support de stockage
CN107479747B (zh) 一种触控显示方法、设备及计算机存储介质
WO2024055333A1 (fr) Procédé de traitement d'images, dispositif intelligent et support de stockage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21967583

Country of ref document: EP

Kind code of ref document: A1