WO2023050413A1 - Image processing method, intelligent terminal and storage medium - Google Patents

Image processing method, intelligent terminal and storage medium

Info

Publication number
WO2023050413A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
information
image
data stream
target
Prior art date
Application number
PCT/CN2021/122430
Other languages
English (en)
French (fr)
Inventor
肖龙安
尚国强
蓝建梁
Original Assignee
深圳传音控股股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳传音控股股份有限公司 filed Critical 深圳传音控股股份有限公司
Priority to CN202180102485.3A priority Critical patent/CN118104247A/zh
Priority to PCT/CN2021/122430 priority patent/WO2023050413A1/zh
Publication of WO2023050413A1 publication Critical patent/WO2023050413A1/zh
Priority to US18/614,714 priority patent/US20240242411A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/64 Circuits for processing colour signals

Definitions

  • the present application relates to the technical field of image processing, and in particular to an image processing method, an intelligent terminal and a storage medium.
  • Computational photography is a comprehensive technology that uses image processing algorithms combined with sensors, modern optics and other technologies to create new photographic equipment and applications. It can create novel image applications through software and hardware collaboration.
  • the present application provides an image processing method, a smart terminal and a storage medium, which can quickly acquire captured images and improve image acquisition efficiency.
  • the present application provides an image processing method, which can optionally be applied to a smart terminal, and the image processing method may include:
  • S1: Acquire an image data stream.
  • S2: Process the image data stream according to an image processing instruction, determine or generate corresponding image processing information, and obtain a target image.
  • the determining or generating the corresponding image processing information to obtain the target image includes at least one of the following:
  • the target image is obtained first, and then the corresponding image processing information is determined or generated.
  • the image data stream includes image data
  • the S2 includes:
  • the image data is processed according to the image processing instruction to obtain the target image data, and the corresponding image processing information is determined or generated.
  • the image data stream includes basic image information
  • the S2 includes:
  • if the basic image information changes, determine or generate target image basic information, and update the image data stream based on the target image basic information, the target image data and the image processing information to obtain a target image data stream, so as to obtain the target image according to the target image data stream; and/or,
  • the image processing information includes at least one of the following features:
  • the image processing information is located between the target image basic information and the target image data in the target image data stream.
  • the image processing information is located between the basic image information and the target image data in the target image data stream.
  • the image data stream further includes an identification of imaging information; wherein, updating the image data stream based on the target image basic information, the target image data, and the image processing information includes:
  • if the imaging information changes, determine or generate target imaging information, and update the image data stream based on the target image basic information, the target imaging information, the target image data, and the image processing information; and/or,
  • if the imaging information does not change, the image data stream is updated based on the target image basic information, the target image data, and the image processing information.
  • the target imaging information and the image processing information are located between the target image basic information and the target image data in the target image data stream, and the target imaging information is located before the image processing information.
  • the imaging information and the image processing information are both located between the target image basic information and the target image data in the target image data stream, and the imaging information is located before the image processing information.
  • the image data processing method also includes:
  • if the image data stream includes original image processing information, the image processing information is added to the original image processing information in the image data stream.
  • the original image processing information includes a processing reversible identification
  • the image data processing method further includes:
  • if the processing reversible flag indicates reversible processing, the image data is removed from the target image data stream.
  • the basic image information includes at least one of the following:
  • the length of the basic image information, the type identification of the image data, the length of the image data, the width of the image data, the color space of the image data, the bit width of the image data, or the storage method of the image data.
  • the image processing information includes at least one of the following:
  • the length of the image processing information, the processing identification, the processing reversible identification, the processing description type information, the pre-processing data storage identification, or the image data.
  • the present application also provides an image processing method.
  • it can be applied to a smart terminal.
  • the image processing method may include:
  • S10: Acquire an image data stream, where the image data stream includes image data and basic image information.
  • S20: Process the image data according to an image processing instruction, determine or generate basic information of a target image, so as to update the image data stream, and obtain a target image.
  • the S20 includes:
  • the image data is processed according to the image processing instruction to obtain target image data, and corresponding image processing information is determined or generated.
  • the image data stream is updated based on the target image basic information, the target image data and the image processing information to obtain a target image data stream, so as to obtain the target image according to the target image data stream.
  • the image data stream further includes an identification of imaging information; wherein, updating the image data stream based on the target image basic information, the target image data, and the image processing information includes:
  • if the imaging information changes, determine or generate target imaging information, and update the image data stream based on the target image basic information, the target imaging information, the target image data, and the image processing information; and/or,
  • if the imaging information does not change, the image data stream is updated based on the target image basic information, the target image data, and the image processing information.
  • the target imaging information and the image processing information are located between the target image basic information and the target image data in the target image data stream, and the target imaging information is located before the image processing information.
  • the imaging information and the image processing information are both located between the target image basic information and the target image data in the target image data stream, and the imaging information is located before the image processing information.
  • the imaging information includes at least one of the following:
  • the length of the imaging information, the shutter time of the imaging device, the sensitivity of the imaging device, the aperture of the imaging device, the focal length of the imaging device, the gyroscope information of the imaging device, the acceleration of the imaging device, the geographic location information of the imaging device, or the image rotation angle information of the imaging device.
  • the present application also provides an image processing device, which may include:
  • the obtaining unit is used to obtain the image data stream.
  • the processing unit is configured to process the image data stream according to an image processing instruction, determine or generate corresponding image processing information, and obtain a target image.
  • the determining or generating the corresponding image processing information to obtain the target image includes at least one of the following:
  • the target image is obtained first, and then the corresponding image processing information is determined or generated.
  • the image data stream includes image data.
  • the processing unit is specifically configured to process the image data according to an image processing instruction to obtain target image data, and determine or generate corresponding image processing information.
  • the image data stream includes basic image information.
  • the processing unit is specifically configured to judge whether the processing will cause the basic image information to change; if it is determined that the basic image information changes, then determine or generate target image basic information, and update the image data stream based on the target image basic information, the target image data and the image processing information to obtain the target image data stream, so as to obtain the target image according to the target image data stream; and/or, if it is determined that the basic image information does not change, update the image data stream based on the target image data and the image processing information to obtain a target image data stream, so as to obtain the target image according to the target image data stream.
  • the image processing information includes at least one of the following features:
  • the image processing information is located between the target image basic information and the target image data in the target image data stream.
  • the image processing information is located between the basic image information and the target image data in the target image data stream.
  • the image data stream further includes an identification of imaging information.
  • the processing unit is specifically configured to judge whether the processing will cause the imaging information to change; if it is determined that the imaging information changes, then determine or generate target imaging information, and update the image data stream based on the target image basic information, the target imaging information, the target image data and the image processing information; and/or, if it is determined that the imaging information has not changed, update the image data stream based on the target image basic information, the target image data and the image processing information.
  • the target imaging information and the image processing information are located between the target image basic information and the target image data in the target image data stream, and the target imaging information is located before the image processing information.
  • the imaging information and the image processing information are both located between the target image basic information and the target image data in the target image data stream, and the imaging information is located before the image processing information.
  • the image processing device further includes an adding unit.
  • the adding unit is configured to add the image processing information to the original image processing information in the image data stream if the image data stream includes original image processing information.
  • the original image processing information includes a processing reversibility flag
  • the image processing apparatus further includes a culling unit.
  • the removing unit is configured to remove the image data from the target image data stream if the processing reversible flag indicates reversible processing.
  • the basic image information includes at least one of the following:
  • the length of the basic image information, the type identification of the image data, the length of the image data, the width of the image data, the color space of the image data, the bit width of the image data, or the storage method of the image data.
  • the image processing information includes at least one of the following:
  • the length of the image processing information, the processing identification, the processing reversible identification, the processing description type information, the pre-processing data storage identification, or the image data.
  • the embodiment of the present application also provides an image processing device, which may include:
  • the acquiring unit is configured to acquire an image data stream, where the image data stream includes image data and basic image information.
  • the processing unit is configured to process the image data according to an image processing instruction, determine or generate basic information of a target image, so as to update the image data stream, and obtain a target image.
  • the processing unit is specifically configured to process the image data according to an image processing instruction to obtain target image data, and determine or generate corresponding image processing information; and to update the image data stream based on the target image basic information, the target image data and the image processing information to obtain a target image data stream, so as to obtain the target image according to the target image data stream.
  • the image data stream further includes an identification of imaging information.
  • the processing unit is specifically configured to judge whether the processing will cause the imaging information to change; if it is determined that the imaging information changes, then determine or generate target imaging information, and update the image data stream based on the target image basic information, the target imaging information, the target image data and the image processing information; and/or, if it is determined that the imaging information has not changed, update the image data stream based on the target image basic information, the target image data and the image processing information.
  • the target imaging information and the image processing information are located between the target image basic information and the target image data in the target image data stream, and the target imaging information is located before the image processing information.
  • the imaging information and the image processing information are both located between the target image basic information and the target image data in the target image data stream, and the imaging information is located before the image processing information.
  • the imaging information includes at least one of the following:
  • the length of the imaging information, the shutter time of the imaging device, the sensitivity of the imaging device, the aperture of the imaging device, the focal length of the imaging device, the gyroscope information of the imaging device, the acceleration of the imaging device, the geographic location information of the imaging device, or the image rotation angle information of the imaging device.
  • the present application also provides an intelligent terminal.
  • the intelligent terminal includes a memory and a processor, wherein an image data processing program is stored in the memory, and when the image data processing program is executed by the processor, the steps of any one of the above image processing methods are implemented.
  • the present application also provides a readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the steps of any one of the above image processing methods are realized.
  • the embodiment of the present application also provides a computer program product, where the computer program product includes a computer program; when the computer program is executed, the steps of any one of the above image processing methods are realized.
  • when acquiring the target image, the image data stream can be acquired first, and the image data stream can be processed according to the image processing instruction and the corresponding image processing information determined or generated to obtain the target image.
  • the target image can be acquired quickly, thereby effectively improving the acquisition efficiency of the target image.
  • FIG. 1 is a schematic diagram of a hardware structure of an intelligent terminal implementing various embodiments of the present application
  • FIG. 2 is a system architecture diagram of a communication network provided by an embodiment of the present application.
  • FIG. 3 is a schematic flow diagram of an image processing method provided in an embodiment of the present application.
  • FIG. 4 is a schematic frame diagram of a target image acquired by a photographing device provided in an embodiment of the present application
  • FIG. 5 is a schematic flow diagram of another image processing method provided in the embodiment of the present application.
  • FIG. 6 is a schematic flowchart of another image processing method provided in the embodiment of the present application.
  • FIG. 7 is a schematic flow diagram of an image processing method provided by an embodiment of the present application.
  • FIG. 8 is a schematic flow chart of another image processing method provided by the embodiment of the present application.
  • FIG. 9 is a schematic flowchart of another image processing method provided in the embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of an image processing device provided by an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of another image processing device provided by an embodiment of the present application.
  • first, second, third, etc. may be used herein to describe various information, the information should not be limited to these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of this document, first information may also be called second information, and similarly, second information may also be called first information.
  • the word “if” as used herein may be interpreted as “at” or “when” or “in response to a determination”.
  • the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise.
  • "A, B, C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C".
  • "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C". Exceptions to this definition will only arise when combinations of elements, functions, steps or operations are inherently mutually exclusive in some way.
  • the word "if" as used herein may be interpreted as "at", "when", "in response to determining" or "in response to detecting".
  • the phrases "if determined" or "if detected (the stated condition or event)" could be interpreted as "when determined", "in response to the determination", "when detected (the stated condition or event)" or "in response to detection of (the stated condition or event)".
  • step codes such as S301 and S302 are used herein to express the corresponding content more clearly and concisely, and do not constitute a substantive limitation on the order; in a specific implementation, S302 may be executed first and S301 afterwards, and such cases still fall within the scope of protection of this application.
  • Smart terminals can be implemented in various forms.
  • the smart terminals described in this application may include mobile phones, tablet computers, notebook computers, palmtop computers, personal digital assistants (PDA), portable media players (PMP), navigation devices, wearable devices, smart bracelets, pedometers and other smart terminals, as well as fixed terminals such as digital TVs and desktop computers.
  • a smart terminal will be taken as an example, and those skilled in the art will understand that, in addition to elements specially used for mobile purposes, the configurations according to the embodiments of the present application can also be applied to fixed-type terminals.
  • FIG. 1 is a schematic diagram of a hardware structure of an intelligent terminal implementing various embodiments of the present application.
  • the intelligent terminal 100 may include components such as an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (Audio/Video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111.
  • the radio frequency unit 101 can be used for sending and receiving information or receiving and sending signals during a call. Specifically, after receiving the downlink information of the base station, it is processed by the processor 110; in addition, the uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through wireless communication.
  • the above wireless communication can use any communication standard or protocol, including but not limited to GSM (Global System for Mobile Communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution), TDD-LTE (Time Division Duplexing-Long Term Evolution), 5G, and so on.
  • WiFi is a short-distance wireless transmission technology.
  • the smart terminal can help users send and receive emails, browse web pages, and access streaming media, etc., and it provides users with wireless broadband Internet access.
  • Although Fig. 1 shows the WiFi module 102, it can be understood that it is not an essential component of the smart terminal and can be omitted as required without changing the essence of the invention.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound when the smart terminal 100 is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, or the like.
  • the audio output unit 103 may also provide audio output related to specific functions performed by the smart terminal 100 (for example, call signal receiving sound, message receiving sound, etc.).
  • the audio output unit 103 may include a speaker, a buzzer, and the like.
  • the A/V input unit 104 is used to receive audio or video signals.
  • the A/V input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042, and the graphics processing unit 1041 is used to process the image data of still pictures or video.
  • the processed image frames may be displayed on the display unit 106 .
  • the image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage media) or sent via the radio frequency unit 101 or the WiFi module 102 .
  • the microphone 1042 can receive sound (audio data) in operating modes such as a phone call mode, a recording mode, and a voice recognition mode, and can process such sound into audio data.
  • the processed audio (voice) data can be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 for output in case of a phone call mode.
  • the microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the process of receiving and transmitting audio signals.
  • the smart terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the smart terminal 100 moves close to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (generally three axes) and can detect the magnitude and direction of gravity when stationary; it can be used for applications that identify the posture of the mobile phone (such as switching between horizontal and vertical screens, related games, and magnetometer attitude calibration) and for vibration-recognition related functions (such as pedometer and tap detection). Other sensors that can also be configured on the mobile phone, such as fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers and infrared sensors, will not be described in detail here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like.
  • the user input unit 107 can be used to receive input numbers or character information, and generate key signal input related to user settings and function control of the smart terminal.
  • the user input unit 107 may include a touch panel 1071 and other input devices 1072 .
  • the touch panel 1071, also referred to as a touch screen, can collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive the corresponding connection device according to a preset program.
  • the touch panel 1071 may include two parts, a touch detection device and a touch controller.
  • the touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 110, and it can also receive commands sent by the processor 110 and execute them.
  • the touch panel 1071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072 .
  • other input devices 1072 may include, but are not limited to, one or more of physical keyboards, function keys (such as volume control buttons and switch buttons), trackballs, mice, joysticks, and the like, which are not specifically limited here.
  • the touch panel 1071 may cover the display panel 1061.
  • when the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides corresponding visual output on the display panel 1061 according to the type of the touch event.
  • Although the touch panel 1071 and the display panel 1061 are used as two independent components to realize the input and output functions of the smart terminal, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to realize the input and output functions of the smart terminal; this is not specifically limited here.
  • the interface unit 108 is used as an interface through which at least one external device can be connected with the smart terminal 100 .
  • an external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) ports, video I/O ports, headphone ports, and more.
  • the interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and to transfer data between the smart terminal 100 and the external device.
  • the memory 109 can be used to store software programs as well as various data.
  • the memory 109 can mainly include a storage program area and a storage data area.
  • the storage program area can store an operating system, application programs required by at least one function (such as a sound playback function, an image playback function), and the like.
  • the storage data area can store data (such as audio data, a phone book, etc.) created according to the use of the mobile phone.
  • the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 110 is the control center of the smart terminal; it uses various interfaces and lines to connect the various parts of the whole smart terminal, and, by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, it executes the various functions of the smart terminal and processes data, so as to monitor the smart terminal as a whole.
  • the processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor.
  • the application processor mainly processes operating systems, user interfaces, and application programs, etc.
  • the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may not be integrated into the processor 110.
  • the smart terminal 100 can also include a power supply 111 (such as a battery) for supplying power to various components.
  • the power supply 111 can be logically connected to the processor 110 through a power management system, so as to manage charging, discharging, power consumption and other functions through the power management system.
  • the smart terminal 100 may also include a Bluetooth module, etc., which will not be repeated here.
  • the following describes the communication network system on which the smart terminal of the present application is based.
  • Fig. 2 is an architecture diagram of a communication network system provided by an embodiment of the present application. The communication network system is an LTE system of universal mobile communication technology, and the LTE system includes, communicatively connected in sequence, a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator's IP service 204.
  • the UE 201 may be the above-mentioned terminal 100, which will not be repeated here.
  • E-UTRAN 202 includes eNodeB 2021 and other eNodeB 2022 and so on.
  • the eNodeB 2021 can be connected to other eNodeB 2022 through a backhaul (for example, X2 interface), the eNodeB 2021 is connected to the EPC 203 , and the eNodeB 2021 can provide access from the UE 201 to the EPC 203 .
  • the EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and so on.
  • the MME 2031 is a control node that processes signaling between the UE 201 and the EPC 203, and provides bearer and connection management.
  • the HSS 2032 is used to provide registers to manage functions such as the home location register (not shown in the figure), and to save user-specific information about service features and data rates.
  • the PCRF 2036 is the policy and charging control policy decision point for service data flows and IP bearer resources; it selects and provides available policy and charging control decisions for the policy and charging enforcement functional unit (not shown).
  • the IP service 204 may include Internet, Intranet, IMS (IP Multimedia Subsystem, IP Multimedia Subsystem) or other IP services.
  • Although the LTE system is described above as an example, those skilled in the art should know that this application is not only applicable to the LTE system, but is also applicable to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA and future new network systems (such as 5G), which are not limited here.
  • the image processing method provided in the embodiment of the present application may be applied to a scene of image shooting, for example, a scene of computing and shooting.
  • the inventors found at least the following problem when acquiring captured images through computational imaging: how to quickly acquire the captured images so as to improve the efficiency of image acquisition is an urgent problem in this field.
  • an embodiment of the present application provides an image processing method, and various embodiments of the present application are proposed based on the above-mentioned hardware structure of the smart terminal and the communication network system.
  • the image processing method provided by the present application will be described in detail through specific embodiments. It can be understood that the following specific embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments.
  • Fig. 3 is a schematic flow chart of an image processing method provided by an embodiment of the present application. The image processing method may be executed by software and/or hardware devices; for example, the hardware device may be an image processing device, and the image processing device may be an intelligent terminal.
  • the image processing method may include:
  • the image data stream sent by the optical imaging hardware module of the shooting device can be received, or the image data stream sent by another image processing module of the shooting device can be received; this can be set according to actual needs, and the embodiment of the present application does not specifically limit the manner of acquiring the image data stream.
  • FIG. 4 is a schematic frame diagram of capturing a target image through a shooting device provided in an embodiment of the present application.
  • the shooting device may include an optical imaging hardware module, at least one image processing module, a camera application module, and an image control module, and the target image is obtained through the optical imaging hardware module, the at least one image processing module, the camera application module, and the image control module.
  • at least one image processing module can be respectively marked as image processing module 1, image processing module 2, . . . , image processing module N-1, and image processing module N.
  • the optical imaging hardware module mainly includes a camera module, such as a photosensitive chip, a lens, a focus motor, and an optical image stabilization (OIS) device, as well as imaging auxiliary devices such as a flash device and a color temperature sensor. The image processing module is mainly used to process the code stream data output by the camera hardware; according to the function or the type of code stream processed, it is divided into various image processing modules, such as an image processing module for RAW domain noise reduction processing, an image processing module for dark light enhancement processing, an image processing module for high dynamic range (HDR) image synthesis processing, and so on.
  • the camera application module is mainly used to realize image applications, such as photo shooting, video recording, and video calls; the image control module is mainly used to set and control the optical imaging hardware module according to the output requirements of the camera application module, so that the optical imaging hardware module can collect the image data stream of the shooting picture on demand.
  • the image data stream can be processed according to the image processing instructions, and the corresponding image processing information can be determined or generated to obtain the target image, that is, the following S2 is executed:
  • S2: Process the image data stream according to the image processing instruction, determine or generate corresponding image processing information, and obtain the target image.
  • the image processing instruction may be an image processing instruction generated based on an image processing requirement, or may be a preset image processing instruction, which may be set according to actual needs, which is not specifically limited in this embodiment of the present application.
  • processing the image data stream according to the image processing instruction may include but not limited to: RAW domain noise reduction processing, dark light enhancement processing, and HDR synthesis processing, etc., which may be set according to actual needs.
  • After the image data stream is processed according to the image processing instructions, in addition to obtaining the corresponding target image, the corresponding image processing information will also be determined or generated.
  • the image processing information can be understood as processing information generated based on this processing operation; it is mainly used to describe the current processing operation, so that, by determining or generating the image processing information, subsequent operations can learn about this processing according to the image processing information, which provides a query basis for subsequent image data processing.
  • the image processing information may include at least one of the length of the image processing information, processing identification, processing reversible identification, processing description type information, pre-processing data storage identification, or image data, which can be set according to actual needs.
  • the length of the image processing information is used to identify the total length of the image processing information field; the processing identification is used to identify one item of image processing information; the processing reversible identification is used to identify whether this image processing is reversible; the processing description type information is used to identify the type description of this image processing; the pre-processing data storage flag is used to identify whether to keep the pre-processed image data; the image data is the pre-processed image data itself.
  • the image processing information may also include other information, such as the identification of the image processing information, etc., if the image processing information includes the identification of the image processing information, the length of the above image processing information includes the length of the identification of the image processing information, Specifically, it may be set according to actual needs.
  • the embodiment of the present application is only described by taking the image processing information may include at least one of the foregoing as an example, but it does not mean that the embodiment of the present application is limited thereto.
  • the identifier of the image processing information is used to identify the "image processing information" field.
  • the image processing information can be seen in Table 1 below:
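  • As a purely illustrative sketch of how the image processing information fields described above could be represented in practice (the field names, byte widths and pack() layout below are assumptions for illustration, not the format specified by this application), consider the following Python fragment:

```python
from dataclasses import dataclass
import struct

@dataclass
class ImageProcessingInfo:
    """One record describing a single processing operation (illustrative sketch only)."""
    processing_id: int          # processing identification: identifies this processing operation
    reversible: bool            # processing reversible identification: is this processing reversible?
    description_type: int       # processing description type information (e.g. 1 = RAW noise reduction, assumed code)
    keep_pre_data: bool         # pre-processing data storage identification: keep the pre-processed data?
    pre_image_data: bytes = b"" # the pre-processed image data itself (empty if not kept)

    def pack(self) -> bytes:
        # A leading 4-byte total-length value plays the role of "length of the image processing information".
        body = struct.pack("<IBBB", self.processing_id, int(self.reversible),
                           self.description_type, int(self.keep_pre_data)) + self.pre_image_data
        return struct.pack("<I", 4 + len(body)) + body
```

  • Under these assumptions, a reversible RAW-domain noise-reduction step that keeps the pre-processed data could be recorded as ImageProcessingInfo(1, True, 1, True, pre_image_data=b"...").pack(), where the numeric codes are hypothetical.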
  • when the image data stream is processed according to the image processing instruction and the corresponding image processing information is determined or generated to obtain the target image, the corresponding image processing information can be determined or generated first and then the target image obtained; the target image can also be obtained first and then the corresponding image processing information determined or generated; it is also possible to determine or generate the corresponding image processing information and obtain the target image at the same time, which can be set according to actual needs. The execution order of the two operations of determining or generating the corresponding image processing information and obtaining the target image is not specifically limited in this embodiment of the present application.
  • the image data stream may include image data, so that when the image data stream is processed according to the image processing instruction, the image data in the image data stream may be processed according to the image processing instruction to obtain the target image data, and the corresponding image processing information is determined or generated; the final target image is then generated according to the target image data.
  • the image data in the image data stream may be the image data of the photographed picture collected by the optical imaging hardware module in the photographing device on demand.
  • when the optical imaging hardware module collects the image data of the shooting picture, it will also determine or generate the basic image information corresponding to the image data; the relevant content of the basic image information will be described later.
  • the target image data can be obtained first and then the corresponding image processing information determined or generated; the corresponding image processing information can also be determined or generated first and then the target image data obtained; the target image data can also be obtained while the corresponding image processing information is determined or generated, which can be set according to actual needs. The execution order of the two operations of obtaining the target image data and determining or generating the corresponding image processing information is not specifically limited in this embodiment of the present application.
  • when acquiring the target image, the image data stream can be acquired first, and the image data stream can be processed according to the image processing instruction and the corresponding image processing information determined or generated to obtain the target image; in this way, the target image can be acquired quickly, thereby effectively improving the acquisition efficiency of the target image.
  • the image data stream may also include the basic image information corresponding to the image data, so that the image data stream carries both the image data and the basic image information; in this way, the image data and its corresponding basic image information can be obtained together through one image data stream, without having to acquire the basic image information separately by other means, thereby improving the acquisition efficiency of the basic image information.
  • the basic image information may be the basic image information corresponding to the image data determined or generated by the optical imaging hardware module in the shooting device when collecting the image data of the captured picture.
  • the image data of the shooting picture can first be collected through the optical imaging hardware module, and the basic image information corresponding to the image data can be generated; the image data and its corresponding basic image information are then carried in the same image data stream and transmitted to image processing module 1 among the at least one image processing module, so that image processing module 1 can acquire the image data and the corresponding basic image information together through the same image data stream, without needing to additionally acquire the basic image information by other means, thereby improving the acquisition efficiency of the basic image information.
  • the encapsulation format in which the image data and its corresponding basic image information are carried in the same image data stream is not limited to the optical imaging hardware module in the shooting device; it also applies to the image processing modules in the shooting device. That is, after an image processing module finishes processing the image data, it also carries the processed image data and its corresponding basic image information in the same image data stream and sends the stream to the next image processing module, so that the next image processing module obtains the processed image data and its corresponding basic image information from the same image data stream.
  • image processing module 1 acquires the image data and its corresponding basic image information from the optical imaging hardware module through the same image data stream, while image processing module 2, image processing module 3, ..., and image processing module N obtain the processed image data and its corresponding basic image information from the previous image processing module through the same image data stream.
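  • As an informal sketch of this module chain (the dictionary-based stream representation, the module names and the fields below are assumptions for illustration only), each image processing module could receive the stream, process the image data, append its own image processing information, and forward the updated stream to the next module:

```python
from typing import Callable, Dict, List

Stream = Dict[str, object]  # e.g. {'basic_info': ..., 'image_data': bytes, 'processing_info': list}

def run_pipeline(stream: Stream, modules: List[Callable[[Stream], Stream]]) -> Stream:
    """Pass one image data stream through image processing modules 1..N in order."""
    for module in modules:
        stream = module(stream)
    return stream

def raw_noise_reduction(stream: Stream) -> Stream:
    """Example module: process the image data and record what was done in the same stream."""
    out = dict(stream)
    out["image_data"] = stream["image_data"]  # placeholder for an actual RAW-domain denoising step
    out["processing_info"] = list(stream.get("processing_info", [])) + [
        {"processing_id": 1, "reversible": True, "description": "RAW noise reduction"}
    ]
    return out

# Usage sketch:
# target = run_pipeline({"basic_info": {}, "image_data": b"raw", "processing_info": []},
#                       [raw_noise_reduction])
```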
  • when the same image data stream carries the image data and its corresponding basic image information at the same time, that is, when the image data stream includes both the image data and its corresponding basic image information, optionally, in the encapsulation format of the image data stream, the basic image information corresponding to the image data is located before the image data; of course, it can also be located after the image data, and this can be set according to actual needs.
  • the embodiment of the present application is described only by taking the case where the basic image information is placed before the image data in the image data stream as an example, but this does not mean that the embodiment of the present application is limited thereto.
  • the basic image information may include at least one of the length of the basic image information, the type identifier of the image data, the length of the image data, the width of the image data, the color space of the image data, the bit width of the image data, or the storage method of the image data, which can be set according to actual needs.
  • the length of the image basic information is used to identify the total length of the image basic information field;
  • the type identification of the image data is used to identify whether the image data type is a single-frame image, multi-frame image or video stream;
  • the length of the image data is used to identify the length of the image data itself;
  • the width of the image data is used to identify the width of the image data itself;
  • the color space of the image data is used to identify the color space description of the image data, such as the color space description of the RGGB (Red-Green-Green-Blue) mode, The color space description of RGBW (Red-Green-Blue-White) mode, the color space description of RYYB (Red-Yellow-Yellow-Blue) mode, etc.
  • the bit width of the image data is used to identify the number of bits of each component of the image;
  • the storage method of the image data is used to identify how the pixels of each component in the image color space are arranged in memory.
  • the basic image information may also include other information, such as the identifier of the basic image information. If the basic image information includes the identifier of the basic image information, the length of the basic image information includes the length of that identifier, which may be set according to actual needs. Here, the embodiment of the present application is only described by taking the case where the basic image information includes at least one of the above as an example, but it does not mean that the embodiment of the present application is limited thereto.
  • the identifier of the basic image information is used to identify the "basic description information" field of the image.
  • the basic information of the image can be seen in Table 2 below:
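  • As another illustrative sketch only (the field names, the numeric codes and the byte layout below are assumptions, not the encoding defined by this application), the basic image information could be modeled as follows:

```python
from dataclasses import dataclass
import struct

@dataclass
class ImageBasicInfo:
    """Basic description of the image data carried in the same image data stream (sketch only)."""
    data_type: int     # type identification: e.g. 0 = single-frame, 1 = multi-frame, 2 = video stream (assumed codes)
    length: int        # length of the image data itself
    width: int         # width of the image data itself
    color_space: int   # e.g. 0 = RGGB, 1 = RGBW, 2 = RYYB (assumed codes)
    bit_width: int     # number of bits of each image component
    storage_mode: int  # how the pixels of each component are arranged in memory

    def pack(self) -> bytes:
        # A leading 4-byte total-length value plays the role of "length of the basic image information".
        body = struct.pack("<BIIBBB", self.data_type, self.length, self.width,
                           self.color_space, self.bit_width, self.storage_mode)
        return struct.pack("<I", 4 + len(body)) + body
```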
  • the following describes the case where the image data stream includes image data and its corresponding basic image information, and the image data stream is processed according to the image processing instruction to determine or generate the corresponding image processing information.
  • Fig. 5 is a schematic flow chart of another image processing method provided by the embodiment of the present application.
  • the image processing method can also be executed by software and/or hardware devices.
  • the image processing method may include:
  • S501: Process the image data in the image data stream according to the image processing instruction, and determine or generate corresponding image processing information.
  • the operations of processing the image data in the image data stream according to the image processing instruction and determining or generating the corresponding image processing information here are similar to the operations of processing the image data stream according to the image processing instruction and determining or generating the corresponding image processing information in S12 above; for details, please refer to the relevant description of S12 above, and this embodiment of the present application will not repeat them here.
  • the target image basic information can be used to update the image basic information in the image data stream
  • the target image data can be used to update the image data in the image data stream, and the image processing information is added to the image data stream to update the image data stream, so as to obtain the updated target image data stream.
  • when using the target image basic information to update the image basic information in the image data stream, the target image basic information can be used to directly replace the image basic information, or the target image basic information can be used to update only the parts of the image basic information that differ from the target image basic information while leaving the identical parts unchanged; this may be set according to actual needs, and this embodiment of the present application does not make specific limitations here. It can be understood that when the target image data is used to update the image data, the corresponding update method is similar to the update method of the basic image information; please refer to the related description of the update method of the basic image information, which the embodiment of the present application will not repeat here.
  • the obtained target image data stream may include target image basic information, target image data, and image processing information.
  • the target image data stream includes target image basic information, target image data, and image processing information
  • the image processing information is located in the target image data stream between the target image basic information and the target image data; of course, the image processing information can also be located after the target image basic information and the target image data, or arranged in other encapsulation formats. The embodiment of the present application is described only by taking the encapsulation format in which the image processing information is located between the target image basic information and the target image data as an example, but it does not mean that the embodiment of the present application is limited thereto.
  • the image data stream is updated to obtain the target image data stream, so that the target image basic information, target image data and image processing information are carried together through the target image data stream, and there is no need to additionally obtain the target image basic information and image processing information through other methods.
  • the target image can be quickly acquired based on the target image data flow, thereby effectively improving the acquisition efficiency of the target image.
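  • A rough, non-normative sketch of the update described above is given below; the byte-level concatenation is an assumption for illustration, and only the relative order (target image basic information, then optional imaging information, then image processing information, then target image data) follows this description:

```python
from typing import Optional

def build_target_stream(target_basic_info: bytes,
                        processing_info: bytes,
                        target_image_data: bytes,
                        imaging_info: Optional[bytes] = None) -> bytes:
    """Assemble a target image data stream in the field order used in this description."""
    parts = [target_basic_info]
    if imaging_info is not None:
        parts.append(imaging_info)  # imaging information precedes the image processing information
    parts.append(processing_info)   # image processing information precedes the target image data
    parts.append(target_image_data)
    return b"".join(parts)
```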
  • the image data stream is updated based on the target image data and image processing information, and when the target image data stream is obtained, optionally, the target image data can be used to update the image data in the image data stream, and the image processing information can be added to the image data stream to update the image data stream to obtain an updated target image data stream.
  • the corresponding updating method is similar to the updating method of the basic image information in S503 above; please refer to the relevant description of the updating method of the basic image information, and this embodiment of the present application will not give further details here.
  • the image data stream is updated based on the target image data and image processing information, and the obtained target image data stream may include image basic information, target image data and image processing information.
  • the target image data stream includes image basic information, target image data, and image processing information
  • the image processing information is located in the target image data stream, located in the image basic information and target image between the data; of course, the image processing information can also be located after the basic image information and the target image data, or in other packaging formats.
  • the embodiment of the present application only uses the image processing information to be located between the basic image information and the target image data. This encapsulation format is described as an example, but it does not mean that the embodiment of the present application is limited thereto.
  • the image data stream is updated to obtain the target image data stream, so that the basic image information, target image data and image processing information are carried together through the target image data stream, and there is no need to obtain additional basic image information and image processing information through other methods, so that based on The target image data flow can quickly acquire the target image, thus effectively improving the acquisition efficiency of the target image.
Optionally, the image data stream may also include imaging information, so that the image data, the image basic information and the imaging information are carried together in the image data stream. The image data, the image basic information and the imaging information can thus be obtained together through one image data stream, and there is no need to additionally obtain the image basic information and the imaging information by other means, thereby improving the acquisition efficiency of the image basic information and the imaging information.

Optionally, the imaging information and the image processing information are located in the image data stream between the image basic information and the image data, with the imaging information located before the image processing information; of course, other encapsulation formats may also be used and may be set according to actual needs. This embodiment of the present application is described only by taking this encapsulation format as an example, which does not mean that the embodiment of the present application is limited thereto.
Optionally, the imaging information may include at least one of the following: the length of the imaging information, the shutter time of the imaging device, the sensitivity of the imaging device, the aperture of the imaging device, the focal length of the imaging device, the gyroscope information of the imaging device, the acceleration of the imaging device, the geographic location information of the imaging device, or the image rotation angle information of the imaging device, which may be set according to actual needs.

The length of the imaging information identifies the total length of the imaging information field. The shutter time of the imaging device identifies the time during which light enters the photosensitive surface of the photosensitive material, that is, the shutter speed of the camera. The sensitivity of the imaging device identifies the sensitivity value used when the image was shot, and its value may refer to the definition in ISO 12232. The aperture of the imaging device identifies the aperture value of the lens, whose value is the focal length of the lens divided by the effective aperture diameter of the lens. The focal length of the imaging device identifies the physical focal length of the lens, that is, the distance between the focal point of the lens and the center point of the lens. The gyroscope information of the imaging device identifies the angular velocity of the smart terminal device in the X, Y and Z axis directions. The acceleration of the imaging device identifies the acceleration of the smart terminal device on its three axes. The geographic location information of the imaging device identifies longitude, latitude and altitude. The image rotation angle information of the imaging device identifies the image rotation angle on the smart terminal.

Optionally, the imaging information may also include other information, such as an identification of the imaging information. If the imaging information includes the identification of the imaging information, the length of the imaging information includes the length of this identification, which may be set according to actual needs. This embodiment of the present application is described only by taking the case in which the imaging information includes at least one of the foregoing as an example, which does not mean that the embodiment of the present application is limited thereto. The identification of the imaging information is used to identify the "imaging information" field.
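As a minimal sketch of how the imaging information fields listed above could be grouped, the class and field names below are readability assumptions of this sketch, not identifiers defined by the present application.

```python
# Assumed names; the fields follow the imaging information items described above.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ImagingInfo:
    length: int                                   # total length of the imaging information field
    shutter_time_s: float                         # shutter speed of the camera, in seconds
    iso_sensitivity: int                          # sensitivity value (cf. ISO 12232)
    aperture_f_number: float                      # lens focal length / effective aperture diameter
    focal_length_mm: float                        # physical focal length of the lens
    gyroscope_xyz: Tuple[float, float, float]     # angular velocity on the X, Y and Z axes
    acceleration_xyz: Tuple[float, float, float]  # acceleration on the three axes
    geo_location: Tuple[float, float, float]      # longitude, latitude, altitude
    rotation_angle_deg: float                     # image rotation angle on the smart terminal

info = ImagingInfo(64, 1 / 120, 100, 1.8, 26.0,
                   (0.0, 0.0, 0.0), (0.0, 0.0, 9.8),
                   (114.06, 22.54, 10.0), 90.0)
```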
The basic information of the image can be found in Table 3.
The following takes the case in which the image data stream includes image data, its corresponding image basic information and imaging information as an example, in order to facilitate understanding of how, in the above S12, the image data stream is processed according to the image processing instruction to determine or generate the corresponding image processing information and obtain the target image; this is described in detail below through the embodiment shown in FIG. 6.

FIG. 6 is a schematic flowchart of another image processing method provided by an embodiment of the present application. The image processing method may also be executed by software and/or hardware apparatus. Optionally, the image processing method may include:
S601. Process the image data in the image data stream according to the image processing instruction, and determine or generate the corresponding image processing information.

Optionally, the operation of processing the image data in the image data stream according to the image processing instruction and determining or generating the corresponding image processing information is similar to the operation of processing the image data stream according to the image processing instruction and determining or generating the corresponding image processing information in the above S12; reference may be made to the related description in S12, and details are not repeated here.

It can be understood that after the image data is processed according to the image processing instruction, the image basic information and the imaging information may change. Therefore, after the image data is processed according to the image processing instruction, it may be judged whether this processing changes the image basic information and the imaging information, so as to determine whether the image data stream needs to be updated.

Optionally, when judging whether this processing changes the image basic information and the imaging information, it may first be judged whether this processing causes the image basic information to change and then whether it causes the imaging information to change; it is also possible to first judge whether this processing causes the imaging information to change and then whether it causes the image basic information to change; it is also possible to judge at the same time whether this processing causes the image basic information and the imaging information to change. This may be set according to actual needs.
S602. Judge whether this processing changes the image basic information.

Optionally, when it is determined that the imaging information has also changed, the updated target imaging information needs to be determined. Since both the image basic information and the imaging information have changed, the image data stream needs to be updated based on the changed target image basic information and target imaging information. In addition, given that the image data inevitably changes after being processed according to the image processing instruction, the changed target image data also needs to be determined or generated, and the image data stream is updated based on the target image data; and, since this processing determines or generates the corresponding image processing information, the image data stream also needs to be updated based on the image processing information.

Optionally, the target image basic information may be used to update the image basic information in the image data stream, the target imaging information may be used to update the imaging information in the image data stream, the target image data may be used to update the image data in the image data stream, and the image processing information may be added to the image data stream, so as to obtain the updated target image data stream.

Optionally, when the target imaging information is used to update the imaging information in the image data stream, the target imaging information may directly replace the imaging information, or only the parts of the imaging information that differ from the target imaging information may be updated while the identical parts are left unchanged; this may be set according to actual needs and is not specifically limited in this embodiment of the present application. For how to use the target image basic information to update the image basic information in the image data stream, and how to use the target image data to update the image data in the image data stream, reference may be made to the related description of the embodiment shown in FIG. 5; details are not repeated here.

Optionally, after the image data stream is updated based on the target image basic information, the target imaging information, the target image data and the image processing information, the obtained target image data stream may include the target image basic information, the target imaging information, the target image data and the image processing information. Optionally, the target imaging information and the image processing information are located in the target image data stream between the target image basic information and the target image data, with the target imaging information located before the image processing information, namely: target image basic information, target imaging information, image processing information and target image data. Of course, other encapsulation formats may also be used and may be set according to actual needs. This embodiment of the present application is described only by taking this encapsulation format as an example, which does not mean that the embodiment of the present application is limited thereto.

It can be seen that the image data stream is updated based on the target image basic information, the target imaging information, the target image data and the image processing information to obtain the target image data stream, so that the target image basic information, the target imaging information, the target image data and the image processing information are carried together in the target image data stream, and there is no need to additionally obtain the target image basic information, the target imaging information and the image processing information by other means. The target image can therefore be quickly acquired based on the target image data stream, thereby effectively improving the acquisition efficiency of the target image.
Optionally, if it is determined that the imaging information has not changed, the target image basic information may be used to update the image basic information in the image data stream, the target image data may be used to update the image data in the image data stream, and the image processing information may be added to the image data stream, so as to obtain the updated target image data stream.

Optionally, after the image data stream is updated based on the target image basic information, the target image data and the image processing information, the obtained target image data stream may include the target image basic information, the imaging information, the target image data and the image processing information. Optionally, the imaging information and the image processing information are located in the target image data stream between the target image basic information and the target image data, with the imaging information located before the image processing information; of course, other encapsulation formats may also be used and may be set according to actual needs. This embodiment of the present application is described only by taking this encapsulation format as an example, which does not mean that the embodiment of the present application is limited thereto.

It can be seen that after the image data is processed according to the image processing instruction, if it is determined that this processing has changed the image basic information in the image data stream but has not changed the imaging information, the image data stream is updated based on the target image basic information, the target image data and the image processing information to obtain the target image data stream, so that the target image basic information, the imaging information, the target image data and the image processing information are carried together in the target image data stream. There is no need to additionally obtain the target image basic information, the imaging information and the image processing information by other means, so the target image can be quickly acquired based on the target image data stream, thereby effectively improving the acquisition efficiency of the target image.
Optionally, if it is determined that the image basic information has not changed but the imaging information has changed, the image data stream needs to be updated based on the changed target imaging information; since the image basic information has not changed, the image data stream does not need to be updated based on the image basic information. In addition, given that the image data inevitably changes after being processed according to the image processing instruction, the changed target image data also needs to be determined or generated, and the image data stream is updated based on the target image data; and, since this processing determines or generates the corresponding image processing information, the image data stream also needs to be updated based on the image processing information.

Optionally, the target image data may be used to update the image data in the image data stream, the target imaging information may be used to update the imaging information in the image data stream, and the image processing information may be added to the image data stream, so as to obtain the updated target image data stream.

Optionally, after the image data stream is updated based on the target image data, the target imaging information and the image processing information, the obtained target image data stream may include the image basic information, the target imaging information, the target image data and the image processing information. Optionally, the target imaging information and the image processing information are located in the target image data stream between the image basic information and the target image data, with the target imaging information located before the image processing information; of course, other encapsulation formats may also be used and may be set according to actual needs. This embodiment of the present application is described only by taking this encapsulation format as an example, which does not mean that the embodiment of the present application is limited thereto.

It can be seen that after the image data is processed according to the image processing instruction, if it is determined that this processing has not changed the image basic information in the image data stream but has changed the imaging information, the image data stream is updated based on the target image data, the target imaging information and the image processing information to obtain the target image data stream, so that the image basic information, the target imaging information, the target image data and the image processing information are carried together in the target image data stream. There is no need to additionally obtain the image basic information, the target imaging information and the image processing information by other means, so the target image can be quickly acquired based on the target image data stream, thereby effectively improving the acquisition efficiency of the target image.
Optionally, if it is determined that neither the image basic information nor the imaging information has changed, the target image data may be used to update the image data in the image data stream, and the image processing information may be added to the image data stream, so as to obtain the updated target image data stream.

Optionally, after the image data stream is updated based on the target image data and the image processing information, the obtained target image data stream may include the image basic information, the imaging information, the target image data and the image processing information. Optionally, the imaging information and the image processing information are located in the target image data stream between the image basic information and the target image data, with the imaging information located before the image processing information, namely: image basic information, imaging information, image processing information and target image data. Of course, other encapsulation formats may also be used and may be set according to actual needs. This embodiment of the present application is described only by taking this encapsulation format as an example, which does not mean that the embodiment of the present application is limited thereto.

It can be seen that the image data stream is updated to obtain the target image data stream, so that the image basic information, the imaging information, the target image data and the image processing information are carried together in the target image data stream, and there is no need to additionally obtain the image basic information, the imaging information and the image processing information by other means. The target image can therefore be quickly acquired based on the target image data stream, thereby effectively improving the acquisition efficiency of the target image.
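The four update branches described above can be summarized in a small sketch. It models the stream as a plain dict; all key names are illustrative assumptions rather than formats defined by the present application, and the branch logic simply mirrors the judgments described in this embodiment.

```python
# Hedged sketch of the four update branches: basic information and/or imaging information
# may or may not have changed; image data and processing information are always written.
from typing import Optional

def update_stream(stream: dict,
                  target_image_data: bytes,
                  processing_info: dict,
                  target_basic_info: Optional[dict] = None,
                  target_imaging_info: Optional[dict] = None) -> dict:
    """Update the image data stream after one processing pass.

    target_basic_info / target_imaging_info are None when this processing did not
    change the image basic information / imaging information, respectively.
    """
    if target_basic_info is not None:            # image basic information changed
        stream["basic_info"] = target_basic_info
    if target_imaging_info is not None:          # imaging information changed
        stream["imaging_info"] = target_imaging_info
    stream["image_data"] = target_image_data     # image data always changes
    stream["processing_info"] = processing_info  # record what this processing did
    return stream
```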
Optionally, the image data stream may also include original image processing information corresponding to the image data, so that the image data and the original image processing information are carried together in the image data stream. The image data and the original image processing information can thus be obtained together through one image data stream, and there is no need to additionally obtain the original image processing information by other means, thereby improving the acquisition efficiency of the original image processing information. The original image processing information may be understood as information related to the image processing that the image data has previously undergone.

Optionally, the original image processing information is located in the image data stream before the image data. This embodiment of the present application is described only by taking the encapsulation format in which the original image processing information is located before the image data in the image data stream as an example, which does not mean that the embodiment of the present application is limited thereto.

Optionally, if the image data stream includes original image processing information, the image processing information corresponding to this processing operation may be added to the original image processing information in the image data stream, so as to update the image data stream and obtain the updated target image data stream. In this way, the target image data stream always includes the latest image processing information, so that the operations of this processing can subsequently be known based on the image processing information, providing a query basis for subsequent image data processing.

Optionally, the original image processing information includes a processing reversible flag. If the processing reversible flag indicates that this processing is reversible, the image data before processing can be restored based on the updated target image data and the original image processing information; optionally, the image data may therefore be removed from the updated target image data stream, which avoids the additional memory occupied by storing the image data and reduces memory usage. If the processing reversible flag indicates that this processing is irreversible, the image data before processing cannot be restored based on the updated target image data and the original image processing information, and the image data before processing needs to be carried in the updated target image data stream to ensure that the image data can be accurately found later.
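The reversible-flag decision can be sketched as follows. The field names ("reversible", "pre_processing_image_data") are assumptions chosen for this illustration only.

```python
# Minimal sketch of the processing-reversible decision described above; names are assumed.

def finalize_stream(stream: dict, original_processing_info: dict,
                    pre_processing_image_data: bytes) -> dict:
    if original_processing_info.get("reversible", False):
        # Reversible: the pre-processing image data can be restored later from the target
        # image data plus the original image processing information, so it is removed
        # from the stream to reduce memory usage.
        stream.pop("pre_processing_image_data", None)
    else:
        # Irreversible: keep the pre-processing image data in the stream so that it can
        # still be found accurately later.
        stream["pre_processing_image_data"] = pre_processing_image_data
    return stream
```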
FIG. 7 is a schematic flowchart of an image processing method provided by an embodiment of the present application. The image processing method may be executed by software and/or hardware apparatus; for example, the hardware apparatus may be an image processing device, and the image processing device may be a smart terminal. Optionally, the image processing method may include:

S10. Acquire an image data stream, where the image data stream includes image data and image basic information.
Optionally, the image basic information may include at least one of the following: the length of the image basic information, the type identification of the image data, the length of the image data, the width of the image data, the color space of the image data, the bit width of the image data, or the storage method of the image data, which may be set according to actual needs; details are not repeated here in this embodiment of the present application.

Optionally, the image data stream sent by the optical imaging hardware module of the shooting device may be received, or the image data stream sent by another image processing module of the shooting device may be received; this may be set according to actual needs, and the manner of acquiring the image data stream is not specifically limited in this embodiment of the present application.

Optionally, when the image data stream sent by the optical imaging hardware module is received, the optical imaging hardware module may first collect the image data of the shooting picture and generate the image basic information corresponding to the image data, and the image data and its corresponding image basic information are carried in the same image data stream.

Optionally, when the same image data stream carries the image data and its corresponding image basic information at the same time, that is, when the image data stream includes both the image data and its corresponding image basic information, in the encapsulation format of the image data stream the image basic information corresponding to the image data may be located before the image data; of course, it may also be located after the image data, and this may be set according to actual needs. This embodiment of the present application is described only by taking the case in which the image basic information is placed before the image data in the image data stream as an example, which does not mean that the embodiment of the present application is limited thereto.
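As an illustration of image basic information generated by the capture side and placed before the image data in one stream, the sketch below uses assumed class and field names; the concrete values and the "basic information first, then data" ordering are only the running example of this description.

```python
# Assumed names; fields follow the image basic information items described above.
from dataclasses import dataclass

@dataclass
class ImageBasicInfo:
    info_length: int     # length of the image basic information itself
    data_type: str       # type identification of the image data, e.g. "RAW" or "YUV"
    data_length: int     # length of the image data (per the chosen convention, e.g. rows)
    data_width: int      # width of the image data in pixels
    color_space: str     # color space of the image data, e.g. "RGGB" or "YUV420"
    bit_width: int       # bit width of the image data
    storage_method: str  # storage method of the image data, e.g. "planar" or "packed"

def make_capture_stream(pixels: bytes, basic_info: ImageBasicInfo) -> list:
    # Example encapsulation order: basic information first, then the image data.
    return [basic_info, pixels]
```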
It can be understood that after the image data is acquired, the image data can be processed according to the image processing instruction. Given that the image basic information usually changes after the image data is processed according to the image processing instruction, the changed target image basic information may be determined or generated to update the image data stream and obtain the target image, that is, the following S20 is performed:

S20. Process the image data according to the image processing instruction, and determine or generate the target image basic information, so as to update the image data stream and obtain the target image.

Optionally, the image processing instruction may be an image processing instruction generated based on an image processing requirement, or may be a preset image processing instruction, which may be set according to actual needs and is not specifically limited in this embodiment of the present application. Optionally, the processing of the image data stream according to the image processing instruction may include, but is not limited to, RAW domain noise reduction processing, dark light enhancement processing, HDR synthesis processing, and the like, which may be set according to actual needs; an illustrative dispatch of such instructions is sketched below.

Optionally, when the image data is processed according to the image processing instruction, the image data may be processed according to the image processing instruction to obtain the target image data, and the corresponding target image processing information is determined or generated; the image data stream is then updated based on the target image basic information, the target image data and the image processing information to obtain the target image data stream, so as to obtain the target image according to the target image data stream.

Optionally, when the image data is processed according to the image processing instruction to obtain the target image data and the corresponding target image processing information is determined or generated, the target image data may be obtained first and then the corresponding target image processing information is determined or generated; the corresponding target image processing information may be determined or generated first and then the target image data is obtained; or the target image data may be obtained and the corresponding target image processing information determined or generated at the same time. This may be set according to actual needs, and the execution order of the two operations of obtaining the target image data and determining or generating the corresponding target image processing information is not specifically limited in this embodiment of the present application.
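The sketch below shows one possible way to dispatch an image processing instruction and produce both the target image data and the image processing information for that pass. The handler functions are empty placeholders standing in for RAW domain noise reduction, dark light enhancement and HDR synthesis; none of the names or structures are defined by the present application.

```python
# Hedged sketch: instruction dispatch producing (target image data, image processing information).
from typing import Callable, Dict, Tuple

def raw_denoise(data: bytes) -> bytes: return data          # placeholder
def dark_light_enhance(data: bytes) -> bytes: return data   # placeholder
def hdr_synthesize(data: bytes) -> bytes: return data       # placeholder

HANDLERS: Dict[str, Callable[[bytes], bytes]] = {
    "raw_denoise": raw_denoise,
    "dark_light_enhance": dark_light_enhance,
    "hdr_synthesis": hdr_synthesize,
}

def process(image_data: bytes, instruction: str) -> Tuple[bytes, dict]:
    """Return the target image data and the image processing information for this pass."""
    target_data = HANDLERS[instruction](image_data)
    processing_info = {"processing_id": instruction, "reversible": False}
    return target_data, processing_info
```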
Optionally, the target image basic information may be used to update the image basic information in the image data stream, the target image data may be used to update the image data in the image data stream, and the image processing information may be added to the image data stream, so as to obtain the updated target image data stream.

Optionally, when the target image basic information is used to update the image basic information in the image data stream, the target image basic information may directly replace the image basic information, or only the parts of the image basic information that differ from the target image basic information may be updated while the identical parts are left unchanged; this may be set according to actual needs and is not specifically limited in this embodiment of the present application. It can be understood that when the target image data is used to update the image data, the corresponding update method is similar to that of the image basic information, and reference may be made to the related description; details are not repeated here.

Optionally, after the image data stream is updated based on the target image basic information, the target image data and the image processing information, the obtained target image data stream may include the target image basic information, the target image data and the image processing information. Optionally, the image processing information may be located in the target image data stream between the target image basic information and the target image data; of course, the image processing information may also be located after the target image basic information and the target image data, or other encapsulation formats may be used. This embodiment of the present application is described only by taking the encapsulation format in which the image processing information is located between the target image basic information and the target image data as an example, which does not mean that the embodiment of the present application is limited thereto.

It can be seen that the image data stream is updated to obtain the target image data stream, so that the target image basic information, the target image data and the image processing information are carried together in the target image data stream, and there is no need to additionally obtain the target image basic information and the image processing information by other means; the target image can therefore be quickly acquired based on the target image data stream, thereby effectively improving the acquisition efficiency of the target image.

It can be seen that in this embodiment of the present application, when the target image is acquired, the image data stream may be acquired first, where the image data stream includes the image data and the image basic information; the image data is then processed according to the image processing instruction to determine or generate the target image basic information, so as to update the image data stream and obtain the target image. In this way, the image data and the image basic information are carried together in the image data stream, and the image data stream is updated with the target image basic information without additionally obtaining the target image basic information by other means, so that the target image can be quickly acquired based on the updated image data stream, thereby effectively improving the acquisition efficiency of the target image.
Optionally, the image data stream may also include imaging information, so that the image data, the image basic information and the imaging information are carried together in the image data stream. The image data, the image basic information and the imaging information can thus be obtained together through one image data stream, and there is no need to additionally obtain the image basic information and the imaging information by other means, thereby improving the acquisition efficiency of the image basic information and the imaging information.

Optionally, the imaging information and the image processing information are located in the image data stream between the image basic information and the image data, with the imaging information located before the image processing information; of course, other encapsulation formats may also be used and may be set according to actual needs. This embodiment of the present application is described only by taking this encapsulation format as an example, which does not mean that the embodiment of the present application is limited thereto.

Optionally, the imaging information may include at least one of the following: the length of the imaging information, the shutter time of the imaging device, the sensitivity of the imaging device, the aperture of the imaging device, the focal length of the imaging device, the gyroscope information of the imaging device, the acceleration of the imaging device, the geographic location information of the imaging device, or the image rotation angle information of the imaging device, which may be set according to actual needs. For the related description of the imaging information, reference may be made to the third embodiment above; details are not repeated here in this embodiment of the present application.
The following takes the case in which the image data stream includes image data, its corresponding image basic information and imaging information as an example, in order to facilitate understanding of how, in the above S20, the image data stream is updated with the image processing information; this is described in detail below through the embodiment shown in FIG. 8.

FIG. 8 is a schematic flowchart of another image processing method provided by an embodiment of the present application. The image processing method may also be executed by software and/or hardware apparatus. Optionally, the image processing method may include:
S801. Process the image data according to the image processing instruction, and determine or generate the corresponding image processing information.

It can be understood that after the image data is processed according to the image processing instruction, the image basic information and the imaging information may change. Therefore, after the image data is processed according to the image processing instruction, it may be judged whether this processing causes the image basic information to change, so as to determine whether the image data stream needs to be updated. If it is determined that the image basic information has changed, the following S802 is executed; and/or, if it is determined that the image basic information has not changed, the following S803 is executed.

Optionally, when it is determined that the imaging information has also changed, the updated target imaging information needs to be determined, and the image data stream needs to be updated based on the changed target image basic information and target imaging information. In addition, given that the image data inevitably changes after being processed according to the image processing instruction, the changed target image data also needs to be determined or generated, and the image data stream is updated based on the target image data; and, since this processing determines or generates the corresponding image processing information, the image data stream also needs to be updated based on the image processing information.
Optionally, the target image basic information may be used to update the image basic information in the image data stream, the target imaging information may be used to update the imaging information in the image data stream, the target image data may be used to update the image data in the image data stream, and the image processing information may be added to the image data stream, so as to obtain the updated target image data stream.

Optionally, when the target imaging information is used to update the imaging information in the image data stream, the target imaging information may directly replace the imaging information, or only the parts of the imaging information that differ from the target imaging information may be updated while the identical parts are left unchanged; this may be set according to actual needs and is not specifically limited in this embodiment of the present application. For how to use the target image basic information to update the image basic information in the image data stream, and how to use the target image data to update the image data in the image data stream, reference may be made to the related description of the embodiment shown in FIG. 5; details are not repeated here.

Optionally, after the image data stream is updated based on the target image basic information, the target imaging information, the target image data and the image processing information, the obtained target image data stream may include the target image basic information, the target imaging information, the target image data and the image processing information. Optionally, the target imaging information and the image processing information are located in the target image data stream between the target image basic information and the target image data, with the target imaging information located before the image processing information, namely: target image basic information, target imaging information, image processing information and target image data. Of course, other encapsulation formats may also be used and may be set according to actual needs. This embodiment of the present application is described only by taking this encapsulation format as an example, which does not mean that the embodiment of the present application is limited thereto.

It can be seen that the image data stream is updated based on the target image basic information, the target imaging information, the target image data and the image processing information to obtain the target image data stream, so that the target image basic information, the target imaging information, the target image data and the image processing information are carried together in the target image data stream, and there is no need to additionally obtain the target image basic information, the target imaging information and the image processing information by other means; the target image can therefore be quickly acquired based on the target image data stream, thereby effectively improving the acquisition efficiency of the target image.
Optionally, if it is determined that the imaging information has not changed, the image data stream still needs to be updated based on the changed target image basic information. In addition, given that the image data inevitably changes after being processed according to the image processing instruction, the changed target image data also needs to be determined or generated, and the image data stream is updated based on the target image data; and, since this processing determines or generates the corresponding image processing information, the image data stream also needs to be updated based on the image processing information.

Optionally, the target image basic information may be used to update the image basic information in the image data stream, the target image data may be used to update the image data in the image data stream, and the image processing information may be added to the image data stream, so as to obtain the updated target image data stream.

Optionally, after the image data stream is updated based on the target image basic information, the target image data and the image processing information, the obtained target image data stream may include the target image basic information, the imaging information, the target image data and the image processing information. Optionally, the imaging information and the image processing information are located in the target image data stream between the target image basic information and the target image data, with the imaging information located before the image processing information, namely: target image basic information, imaging information, image processing information and target image data. Of course, other encapsulation formats may also be used and may be set according to actual needs. This embodiment of the present application is described only by taking this encapsulation format as an example, which does not mean that the embodiment of the present application is limited thereto.

It can be seen that after the image data is processed according to the image processing instruction, if it is determined that this processing has changed the image basic information in the image data stream but has not changed the imaging information, the image data stream is updated based on the target image basic information, the target image data and the image processing information to obtain the target image data stream, so that the target image basic information, the imaging information, the target image data and the image processing information are carried together in the target image data stream. There is no need to additionally obtain the target image basic information, the imaging information and the image processing information by other means, so the target image can be quickly acquired based on the target image data stream, thereby effectively improving the acquisition efficiency of the target image.
Optionally, the image data stream may also include original image processing information, so that the image data, the image basic information and the original image processing information are carried together in the image data stream. The image data, the image basic information and the original image processing information can thus be obtained together through one image data stream, and there is no need to additionally obtain the image basic information and the original image processing information by other means, thereby improving the acquisition efficiency of the image basic information and the original image processing information. The original image processing information may be understood as information related to the image processing that the image data has previously undergone.

Optionally, the original image processing information is located in the image data stream between the image basic information and the image data; of course, the original image processing information may also be located after the image basic information and the image data, or other encapsulation formats may be used. This embodiment of the present application is described only by taking the encapsulation format in which the original image processing information is located between the image basic information and the image data as an example, which does not mean that the embodiment of the present application is limited thereto.

Optionally, if the image data stream includes original image processing information, the image processing information corresponding to this processing operation may be added to the original image processing information in the image data stream, so as to update the image data stream and obtain the updated target image data stream. In this way, the target image data stream always includes the latest image processing information, so that the operations of this processing can subsequently be known based on the image processing information, providing a query basis for subsequent image data processing.

Optionally, the original image processing information includes a processing reversible flag. If the processing reversible flag indicates that this processing is reversible, the image data before processing can be restored based on the updated target image data and the original image processing information; optionally, the image data may therefore be removed from the updated target image data stream, which avoids the additional memory occupied by storing the image data and reduces memory usage. If the processing reversible flag indicates that this processing is irreversible, the image data before processing cannot be restored based on the updated target image data and the original image processing information, and the image data before processing needs to be carried in the updated target image data stream to ensure that the image data can be accurately found later.
FIG. 9 is a schematic flowchart of another image processing method provided by an embodiment of the present application. Optionally, the image basic information in the image data stream may be parsed first; after the image basic information is parsed, it is judged whether the image data stream includes the identification of the imaging information. If it is determined that the identification of the imaging information is included, the imaging information is parsed, and it is then judged whether the image data stream includes the identification of the original image processing information; and/or, if it is determined that the identification of the imaging information is not included, it is directly judged whether the image data stream includes the identification of the original image processing information. If it is determined that the identification of the original image processing information is included, the original image processing information is parsed and the image data is processed; if it is determined that the identification of the original image processing information is not included, the image data is processed.

After the image data is processed, it is judged whether this processing has changed the image basic information. If it is determined that the image basic information has changed, the changed target image basic information is written into the image data stream, and it is then judged whether this processing has caused the imaging information to change; if it is determined that the image basic information has not changed, it is directly judged whether this processing has caused the imaging information to change. If it is determined that the imaging information has changed, the changed target imaging information is written into the image data stream, and the image processing information generated by this processing is written into the image data stream; if it is determined that the imaging information has not changed, the image processing information generated by this processing is written into the image data stream. Finally, the processed target image data is written into the image data stream, and after the updated target image data stream is obtained, the target image data stream is output.
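The parse, process and write-back flow outlined above can be sketched as follows. The dict keys, the placeholder processing step, and the "history" field are all assumptions made for this illustration; they are not structures defined by the present application.

```python
# Hedged sketch of the FIG. 9-style flow: parse, process, write back changed parts, output.

def handle_stream(stream: dict, instruction: str) -> dict:
    # --- parse phase -------------------------------------------------------
    basic_info = stream["basic_info"]                       # image basic information, parsed first
    imaging_info = stream.get("imaging_info")               # parsed only if its identification is present
    original_info = stream.get("original_processing_info")  # parsed only if its identification is present

    # --- processing phase (placeholder: a real pipeline would transform the pixels) ---
    target_data = stream["image_data"]
    new_processing_info = {"processing_id": instruction}

    # --- write-back phase --------------------------------------------------
    new_basic_info = dict(basic_info)                       # placeholder recomputation of basic info
    if new_basic_info != basic_info:
        stream["basic_info"] = new_basic_info               # write the changed target image basic info
    if imaging_info is not None:
        pass                                                # write changed target imaging info here if needed

    if original_info is not None:
        # Append this pass's processing info to the original image processing information.
        original_info.setdefault("history", []).append(new_processing_info)
    else:
        stream["processing_info"] = new_processing_info     # write this pass's image processing info
    stream["image_data"] = target_data                       # write the processed target image data
    return stream                                            # output the updated target image data stream
```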
FIG. 10 is a schematic structural diagram of an image processing device 100 provided by an embodiment of the present application. Optionally, the image processing device 100 may include:

an acquiring unit 1001, configured to acquire an image data stream; and

a processing unit 1002, configured to process the image data stream according to an image processing instruction, determine or generate corresponding image processing information, and obtain a target image.

Optionally, the image data stream includes image data. The processing unit 1002 is specifically configured to process the image data according to the image processing instruction to obtain target image data, and determine or generate the corresponding image processing information.

Optionally, the image data stream includes image basic information. The processing unit 1002 is specifically configured to judge whether the processing changes the image basic information; if it is determined that the image basic information has changed, determine or generate target image basic information, and update the image data stream based on the target image basic information, the target image data and the image processing information to obtain a target image data stream, so as to obtain the target image according to the target image data stream; and/or, if it is determined that the image basic information has not changed, update the image data stream based on the target image data and the image processing information to obtain a target image data stream, so as to obtain the target image according to the target image data stream.

Optionally, the image processing information includes at least one of the following features: the image processing information is located in the target image data stream between the target image basic information and the target image data; the image processing information is located in the target image data stream between the image basic information and the target image data.

Optionally, the image data stream further includes an identification of imaging information. The processing unit 1002 is specifically configured to judge whether the processing causes the imaging information to change; if it is determined that the imaging information has changed, determine or generate target imaging information, and update the image data stream based on the target image basic information, the target imaging information, the target image data and the image processing information; and/or, if it is determined that the imaging information has not changed, update the image data stream based on the target image basic information, the target image data and the image processing information.

Optionally, the target imaging information and the image processing information are located in the target image data stream between the target image basic information and the target image data, with the target imaging information located before the image processing information. Optionally, the imaging information and the image processing information are located in the target image data stream between the target image basic information and the target image data, with the imaging information located before the image processing information.

Optionally, the image processing device 100 further includes an adding unit 1003. The adding unit 1003 is configured to add the image processing information to the original image processing information in the image data stream if the image data stream includes original image processing information.

Optionally, the original image processing information includes a processing reversible flag, and the device further includes a culling unit 1004. The culling unit 1004 is configured to remove the image data from the target image data stream if the processing reversible flag indicates reversible processing.
Optionally, the image basic information includes at least one of the following: the length of the image basic information, the type identification of the image data, the length of the image data, the width of the image data, the color space of the image data, the bit width of the image data, or the storage method of the image data.

Optionally, the image processing information includes at least one of the following: the length of the image processing information, a processing identifier, a processing reversible flag, processing description type information, a pre-processing data storage identifier, or the image data.
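As a minimal sketch of how the image processing information fields listed above could be grouped, the class and field names below are readability assumptions of this sketch, not identifiers defined by the present application.

```python
# Assumed names; fields follow the image processing information items described above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImageProcessingInfo:
    info_length: int                # length of the image processing information
    processing_id: str              # processing identifier (which operation was applied)
    reversible: bool                # processing reversible flag
    description_type: str           # processing description type information
    keep_pre_processing_data: bool  # pre-processing data storage identifier
    pre_processing_data: Optional[bytes] = None  # the image data before processing, if kept
```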
The image processing device 100 shown in this embodiment of the present application can implement the technical solution of the image processing method in the above embodiments; its implementation principle and beneficial effects are similar to those of the image processing method, and reference may be made to the implementation principle and beneficial effects of the image processing method; details are not repeated here.
FIG. 11 is a schematic structural diagram of another image processing device 110 provided by an embodiment of the present application. Optionally, the image processing device 110 may include:

an acquiring unit 1101, configured to acquire an image data stream, where the image data stream includes image data and image basic information; and

a processing unit 1102, configured to process the image data according to an image processing instruction, and determine or generate target image basic information, so as to update the image data stream and obtain a target image.

Optionally, the processing unit 1102 is specifically configured to process the image data according to the image processing instruction to obtain target image data, and determine or generate corresponding image processing information; and update the image data stream based on the target image basic information, the target image data and the image processing information to obtain a target image data stream, so as to obtain the target image according to the target image data stream.

Optionally, the image data stream further includes an identification of imaging information. The processing unit 1102 is specifically configured to judge whether the processing changes the imaging information; if it is determined that the imaging information has changed, determine or generate target imaging information, and update the image data stream based on the target image basic information, the target imaging information, the target image data and the image processing information; and/or, if it is determined that the imaging information has not changed, update the image data stream based on the target image basic information, the target image data and the image processing information.

Optionally, the target imaging information and the image processing information are located in the target image data stream between the target image basic information and the target image data, with the target imaging information located before the image processing information. Optionally, the imaging information and the image processing information are located in the target image data stream between the target image basic information and the target image data, with the imaging information located before the image processing information.

Optionally, the imaging information includes at least one of the following: the length of the imaging information, the shutter time of the imaging device, the sensitivity of the imaging device, the aperture of the imaging device, the focal length of the imaging device, the gyroscope information of the imaging device, the acceleration of the imaging device, the geographic location information of the imaging device, or the image rotation angle information of the imaging device.

The image processing device 110 shown in this embodiment of the present application can implement the technical solution of the image processing method in the above embodiments; its implementation principle and beneficial effects are similar to those of the image processing method, and reference may be made to the implementation principle and beneficial effects of the image processing method; details are not repeated here.
An embodiment of the present application further provides a smart terminal. The smart terminal includes a memory and a processor, and an image data processing program is stored in the memory; when the image data processing program is executed by the processor, the steps of the image processing method in any of the above embodiments are implemented.

An embodiment of the present application further provides a computer-readable storage medium on which an image data processing program is stored; when the image data processing program is executed by a processor, the steps of the image processing method in any of the above embodiments are implemented.

An embodiment of the present application further provides a computer program product. The computer program product includes computer program code; when the computer program code is run on a computer, the computer is caused to execute the methods in the above various possible implementations.

An embodiment of the present application further provides a chip, including a memory and a processor. The memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device in which the chip is installed executes the methods in the above various possible implementations.
Units in the device of the embodiments of the present application may be combined, divided and deleted according to actual needs.

Through the description of the above implementations, those skilled in the art can clearly understand that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in one of the above storage media (such as a ROM/RAM, a magnetic disk or an optical disc) and includes several instructions to cause a terminal device (which may be a mobile phone, a computer, a server, a controlled terminal or a network device, etc.) to execute the method of each embodiment of the present application.

The above embodiments may be implemented in whole or in part by software, hardware, firmware or any combination thereof. When implemented by software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, from one website, computer, server or data center to another website, computer, server or data center in a wired (such as coaxial cable, optical fiber or digital subscriber line) or wireless (such as infrared, radio or microwave) manner. The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or a data center integrating one or more available media. The available medium may be a magnetic medium (such as a floppy disk, a hard disk or a magnetic tape), an optical medium (such as a DVD), or a semiconductor medium (such as a solid state disk (SSD)), among others.

Abstract

The present application provides an image processing method, an intelligent terminal, and a storage medium. The processing method is applied to a processing device and includes the following steps: when acquiring a target image, an image data stream may first be acquired, the image data stream is processed according to an image processing instruction, corresponding image processing information is determined or generated, and the target image is obtained. In this way, the target image can be acquired quickly, which effectively improves the efficiency of acquiring the target image.

Description

图像处理方法、智能终端及存储介质 技术领域
本申请涉及图像处理技术领域,具体涉及一种图像处理方法、智能终端及存储介质。
背景技术
计算摄影是利用图像处理算法结合传感器、现代光学等技术创造出新型摄影设备及应用的综合技术,能够通过软硬件协同,创造新颖的图像应用。
以计算摄像场景为例,在通过计算摄像获取拍摄画面的图像时,在构思及实现本申请过程中,发明人发现至少存在如下问题:如何快速地获取拍摄画面的图像,以提高图像的获取效率是本领域亟待解决的问题。
前面的叙述在于提供一般的背景信息,并不一定构成现有技术。
发明内容
针对上述技术问题,本申请提供一种图像处理方法、智能终端及存储介质,可以快速获取拍摄的图像,提高了图像的获取效率。
为解决上述技术问题,本申请提供一种图像处理方法,可选地,可应用于智能终端,该图像处理方法可以包括:
S1:获取图像数据流。
S2:根据图像处理指令对所述图像数据流进行处理,确定或生成对应的图像处理信息,得到目标图像。
可选地,所述确定或生成对应的图像处理信息,得到目标图像,包括以下至少一种:
同时确定或生成对应的图像处理信息以及得到目标图像;
先确定或生成对应的图像处理信息,后得到目标图像;
先得到目标图像,后确定或生成对应的图像处理信息。
可选地,所述图像数据流中包括图像数据,所述S2包括:
根据图像处理指令对所述图像数据进行处理,得到目标图像数据,确定或生成对应的所述图像处理信息。
可选地,所述图像数据流中包括图像基本信息,所述S2包括:
判断所述处理是否会使所述图像基本信息发生变化。
若确定所述图像基本信息发生变化,则确定或生成目标图像基本信息,基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新,得到目标图像数据流,以根据所述目标图像数据流得到所述目标图像;和/或,
若确定所述图像基本信息未发生变化,则基于所述目标图像数据和所述图像处理信息对所述图像数据流进行更新,得到目标图像数据流,以根据所述目标图像数据流得到所述目标图像。
可选地,所述图像处理信息包括以下至少一种特征:
所述图像处理信息在所述目标图像数据流中,位于所述目标图像基本信息和所述目标图像数据之间。
所述图像处理信息在所述目标图像数据流中,位于所述图像基本信息和所述目标图像数据之间。
可选地,所述图像数据流还包括成像信息的标识;其中,所述基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新,包括:
判断所述处理是否会使所述成像信息发生变化。
若确定所述成像信息发生变化,则确定或生成目标成像信息,并基于所述目标图像基本信息、所述目标成像信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新;和/或,
若确定所述成像信息未发生变化,则基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新。
可选地,所述目标成像信息和所述图像处理信息在所述目标图像数据流中,位于所述目标图像基本信息和所述目标图像数据之间,所述目标成像信息位于所述图像处理信息之前。
所述成像信息和所述图像处理信息在所述目标图像数据流中,均位于所述目标图像基本信息和所述目标图像数据之间,且所述成像信息位于所述图像处理信息之前。
可选地,该图像数据处理方法还包括:
若所述图像数据流包括原始图像处理信息,则将所述图像处理信息添加在所述图像数据流中的原始图像处理信息中。
可选地,所述原始图像处理信息包括处理可逆标识,该图像数据处理方法还包括:
若所述处理可逆标识指示为可逆处理,则从所述目标图像数据流中剔除所述图像数据。
可选地,所述图像基本信息包括下述至少一种:
所述图像基本信息的长度、所述图像数据的类型标识、所述图像数据的长度、所述图像数据的宽度、所述图像数据的色彩空间、所述图像数据的位宽、或者所述图像数据的存储方式。
可选地,所述图像处理信息包括下述至少一种:
所述图像处理信息的长度、处理标识、处理可逆标识、处理描述类型信息、处理前数据保存标识、或者所述图像数据。
本申请还提供一种图像处理方法,可选地,可应用于智能终端,该图像处理方法可以包括:
S10、获取图像数据流,所述图像数据流包括图像数据以及图像基本信息。
S20、根据图像处理指令对所述图像数据进行处理,确定或生成目标图像基本信息,以对所述图像数据流进行更新,得到目标图像。
可选地，所述S20包括：
根据图像处理指令对所述图像数据进行处理,得到目标图像数据,确定或生成对应的图像处理信息。
基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新,得到目标图像数据流,以根据所述目标图像数据流得到所述目标图像。
可选地,若所述图像数据流还包括成像信息的标识;其中,所述基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新,包括:
判断所述处理是否会使所述成像信息发生变化。
若确定所述成像信息发生变化,则确定或生成目标成像信息,并基于所述目标图像基本信息、所述目标成像信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新;和/或,
若确定所述成像信息未发生变化,则基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新。
可选地,所述目标成像信息和所述图像处理信息在所述目标图像数据流中,位于所述目标图像基本信息和所述目标图像数据之间,所述目标成像信息位于所述图像处理信息之前。
所述成像信息和所述图像处理信息在所述目标图像数据流中,均位于所述目标图像基本信息和所述目标图像数据之间,且所述成像信息位于所述图像处理信息之前。
可选地,所述成像信息包括下述至少一种:
所述成像信息的长度、成像设备的快门时间、所述成像设备的感光度、所述成像设备的光圈、所述成像设备的焦距、所述成像设备的陀螺仪信息、所述成像设备的加速度、所述成像设备的地理位置信息、或者所述成像设备的影像旋转角度信息。
本申请还提供了一种图像处理装置,该图像处理装置可以包括:
获取单元,用于获取图像数据流。
处理单元,用于根据图像处理指令对所述图像数据流进行处理,确定或生成对应的图像处理信息,得到目标图像。
可选地,所述确定或生成对应的图像处理信息,得到目标图像,包括以下至少一种:
同时确定或生成对应的图像处理信息以及得到目标图像;
先确定或生成对应的图像处理信息,后得到目标图像;
先得到目标图像,后确定或生成对应的图像处理信息。
可选地,所述图像数据流中包括图像数据。
所述处理单元,具体用于根据图像处理指令对所述图像数据进行处理,得到目标图像数据,确定或生成对应的所述图像处理信息。
可选地,所述图像数据流中包括图像基本信息。
所述处理单元,具体用于判断所述处理是否会使所述图像基本信息发生变化;若确定所述图像基本信息发生变化,则确定或生成目标图像基本信息,基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新,得到目标图像数据流,以根据所述目标图像数据流得到所述目标图像;和/或,若确定所述图像基本信息未发生变化,则基于所述目标图像数据和所述图像处理信息对所述图 像数据流进行更新,得到目标图像数据流,以根据所述目标图像数据流得到所述目标图像。
可选地,所述图像处理信息包括以下至少一种特征:
所述图像处理信息在所述目标图像数据流中,位于所述目标图像基本信息和所述目标图像数据之间。
所述图像处理信息在所述目标图像数据流中,位于所述图像基本信息和所述目标图像数据之间。
可选地,所述图像数据流还包括成像信息的标识。
所述处理单元,具体用于判断所述处理是否会使所述成像信息发生变化;若确定所述成像信息发生变化,则确定或生成目标成像信息,并基于所述目标图像基本信息、所述目标成像信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新;和/或,若确定所述成像信息未发生变化,则基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新。
可选地,所述目标成像信息和所述图像处理信息在所述目标图像数据流中,位于所述目标图像基本信息和所述目标图像数据之间,所述目标成像信息位于所述图像处理信息之前。
所述成像信息和所述图像处理信息在所述目标图像数据流中,均位于所述目标图像基本信息和所述目标图像数据之间,且所述成像信息位于所述图像处理信息之前。
可选地,该图像处理装置还包括添加单元。
所述添加单元,用于若所述图像数据流包括原始图像处理信息,则将所述图像处理信息添加在所述图像数据流中的原始图像处理信息中。
可选地,所述原始图像处理信息包括处理可逆标识,所述图像处理装置还包括剔除单元。
所述剔除单元,用于若所述处理可逆标识指示为可逆处理,则从所述目标图像数据流中剔除所述图像数据。
可选地,所述图像基本信息包括下述至少一种:
所述图像基本信息的长度、所述图像数据的类型标识、所述图像数据的长度、所述图像数据的宽度、所述图像数据的色彩空间、所述图像数据的位宽、或者所述图像数据的存储方式。
可选地,所述图像处理信息包括下述至少一种:
所述图像处理信息的长度、处理标识、处理可逆标识、处理描述类型信息、处理前数据保存标识、或者所述图像数据。
本申请实施例还提供了一种图像处理装置,该图像处理装置可以包括:
获取单元,用于获取图像数据流,所述图像数据流包括图像数据以及图像基本信息。
处理单元,用于根据图像处理指令对所述图像数据进行处理,确定或生成目标图像基本信息,以对所述图像数据流进行更新,得到目标图像。
可选地,所述处理单元,具体用于根据图像处理指令对所述图像数据进行处理,得到目标图像数据,确定或生成对应的图像处理信息;基于所述目标图像基本信息、 所述目标图像数据和所述图像处理信息对所述图像数据流进行更新,得到目标图像数据流,以根据所述目标图像数据流得到所述目标图像。
可选地,若所述图像数据流还包括成像信息的标识。
所述处理单元,具体用于判断所述处理是否会使所述成像信息发生变化;若确定所述成像信息发生变化,则确定或生成目标成像信息,并基于所述目标图像基本信息、所述目标成像信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新;和/或,若确定所述成像信息未发生变化,则基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新。
可选地,所述目标成像信息和所述图像处理信息在所述目标图像数据流中,位于所述目标图像基本信息和所述目标图像数据之间,所述目标成像信息位于所述图像处理信息之前。
所述成像信息和所述图像处理信息在所述目标图像数据流中,均位于所述目标图像基本信息和所述目标图像数据之间,且所述成像信息位于所述图像处理信息之前。
可选地,所述成像信息包括下述至少一种:
所述成像信息的长度、成像设备的快门时间、所述成像设备的感光度、所述成像设备的光圈、所述成像设备的焦距、所述成像设备的陀螺仪信息、所述成像设备的加速度、所述成像设备的地理位置信息、或者所述成像设备的影像旋转角度信息。
本申请还提供了一种智能终端,所述智能终端包括:存储器、处理器,其中,所述存储器上存储有图像数据的处理程序,所述图像数据的处理程序被所述处理器执行时实现如上任一所述的图像处理方法的步骤。
本申请还提供了一种可读存储介质,所述可读存储介质上存储有计算机程序,所述计算机程序被处理器执行时实现如上任一所述的图像处理方法的步骤。
本申请实施例还提供了一种计算机程序产品,所述计算机程序产品包括计算机程序;所述计算机程序被执行时,实现如上任一所述的图像处理方法的步骤。
如上所述,本申请的图像处理方法,在获取目标图像时,可以先获取图像数据流,并根据图像处理指令对图像数据流进行处理,确定或生成对应的图像处理信息,得到目标图像,这样可以快速地获取目标图像,从而有效地提高了目标图像的获取效率。
附图说明
此处的附图被并入说明书中并构成本说明书的一部分,示出了符合本申请的实施例,并与说明书一起用于解释本申请的原理。为了更清楚地说明本申请实施例的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,对于本领域普通技术人员而言,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1为实现本申请各个实施例的一种智能终端的硬件结构示意图;
图2为本申请实施例提供的一种通信网络系统架构图;
图3为本申请实施例提供的一种图像处理方法的流程示意图;
图4为本申请实施例提供的一种通过拍摄装置获取目标图像的框架示意图;
图5为本申请实施例提供的另一种图像处理方法的流程示意图;
图6为本申请实施例提供的又一种图像处理方法的流程示意图;
图7为本申请实施例提供的一种图像处理方法的流程示意图;
图8为本申请实施例提供的另一种图像处理方法的流程示意图;
图9为本申请实施例提供的又一种图像处理方法的流程示意图;
图10是本申请实施例提供的一种图像处理装置的结构示意图;
图11是本申请实施例提供的另一种图像处理装置的结构示意图。
本申请目的的实现、功能特点及优点将结合实施例,参照附图做进一步说明。通过上述附图,已示出本申请明确的实施例,后文中将有更详细的描述。这些附图和文字描述并不是为了通过任何方式限制本申请构思的范围,而是通过参考特定实施例为本领域技术人员说明本申请的概念。
具体实施方式
这里将详细地对示例性实施例进行说明,其示例标识在附图中。下面的描述涉及附图时,除非另有标识,不同附图中的相同数字标识相同或相似的要素。以下示例性实施例中所描述的实施方式并不代表与本申请相一致的所有实施方式。相反,它们仅是与如所附权利要求书中所详述的、本申请的一些方面相一致的装置和方法的例子。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素,此外,本申请不同实施例中具有同样命名的部件、特征、要素可能具有相同含义,也可能具有不同含义,其具体含义需以其在该具体实施例中的解释或者进一步结合该具体实施例中上下文进行确定。
应当理解,尽管在本文可能采用术语第一、第二、第三等来描述各种信息,但这些信息不应限于这些术语。这些术语仅用来将同一类型的信息彼此区分开。例如,在不脱离本文范围的情况下,第一信息也可以被称为第二信息,类似地,第二信息也可以被称为第一信息。取决于语境,如在此所使用的词语"如果"可以被解释成为"在……时"或"当……时"或"响应于确定"。再者,如同在本文中所使用的,单数形式“一”、“一个”和“该”旨在也包括复数形式,除非上下文中有相反的指示。应当进一步理解,术语“包含”、“包括”表明存在所述的特征、步骤、操作、元件、组件、项目、种类、和/或组,但不排除一个或多个其他特征、步骤、操作、元件、组件、项目、种类、和/或组的存在、出现或添加。本申请使用的术语“或”、“和/或”、“包括以下至少一个”等可被解释为包括性的,或意味着任一个或任何组合。例如,“包括以下至少一个:A、B、C”意味着“以下任一个:A;B;C;A和B;A和C;B和C;A和B和C”,再如,“A、B或C”或者“A、B和/或C”意味着“以下任一个:A;B;C;A和B;A和C;B和C;A和B和C”。仅当元件、功能、步骤或操作的组合在某些方式下内在地互相排斥时,才会出现该定义的例外。
应该理解的是,虽然本申请实施例中的流程图中的各个步骤按照箭头的指示依次显示,但是这些步骤并不是必然按照箭头指示的顺序依次执行。除非本文中有明确的说明,这些步骤的执行并没有严格的顺序限制,其可以以其他的顺序执行。而且,图中的至少一部分步骤可以包括多个子步骤或者多个阶段,这些子步骤或者阶段并不必 然是在同一时刻执行完成,而是可以在不同的时刻执行,其执行顺序也不必然是依次进行,而是可以与其他步骤或者其他步骤的子步骤或者阶段的至少一部分轮流或者交替地执行。
取决于语境,如在此所使用的词语“如果”、“若”可以被解释成为“在……时”或“当……时”或“响应于确定”或“响应于检测”。类似地,取决于语境,短语“如果确定”或“如果检测(陈述的条件或事件)”可以被解释成为“当确定时”或“响应于确定”或“当检测(陈述的条件或事件)时”或“响应于检测(陈述的条件或事件)”。
需要说明的是,在本文中,采用了诸如S301、S302等步骤代号,其目的是为了更清楚简要地表述相应内容,不构成顺序上的实质性限制,本领域技术人员在具体实施时,可能会先执行S302后执行S301等,但这些均应在本申请的保护范围之内。
应当理解,此处所描述的具体实施例仅仅用以解释本申请,并不用于限定本申请。
在后续的描述中,使用用于表示元件的诸如“模块”、“部件”或者“单元”的后缀仅为了有利于本申请的说明,其本身没有特定的意义。因此,“模块”、“部件”或者“单元”可以混合地使用。
智能终端可以以各种形式来实施。例如,本申请中描述的智能终端可以包括诸如手机、平板电脑、笔记本电脑、掌上电脑、个人数字助理(Personal Digital Assistant,PDA)、便捷式媒体播放器(Portable Media Player,PMP)、导航装置、可穿戴设备、智能手环、计步器等智能终端,以及诸如数字TV、台式计算机等固定终端。
后续描述中将以智能终端为例进行说明,本领域技术人员将理解的是,除了特别用于移动目的的元件之外,根据本申请的实施方式的构造也能够应用于固定类型的终端。
请参阅图1,图1为实现本申请各个实施例的一种智能终端的硬件结构示意图,该智能终端100可以包括:RF(Radio Frequency,射频)单元101、WiFi模块102、音频输出单元103、A/V(音频/视频)输入单元104、传感器105、显示单元106、用户输入单元107、接口单元108、存储器109、处理器110、以及电源111等部件。本领域技术人员可以理解,图1中示出的智能终端结构并不构成对智能终端的限定,智能终端可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
下面结合图1对智能终端的各个部件进行具体的介绍:
射频单元101可用于收发信息或通话过程中,信号的接收和发送,具体的,将基站的下行信息接收后,给处理器110处理;另外,将上行的数据发送给基站。通常,射频单元101包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外,射频单元101还可以通过无线通信与网络和其他设备通信。上述无线通信可以使用任一通信标准或协议,包括但不限于GSM(Global System of Mobile communication,全球移动通讯系统)、GPRS(General Packet Radio Service,通用分组无线服务)、CDMA2000(Code Division Multiple Access 2000,码分多址2000)、WCDMA(Wideband Code Division Multiple Access,宽带码分多址)、TD-SCDMA(Time Division-Synchronous Code Division Multiple Access,时分同步码分多址)、FDD-LTE(Frequency Division Duplexing-Long Term Evolution,频分双工长期演 进)、TDD-LTE(Time Division Duplexing-Long Term Evolution,分时双工长期演进)和5G等。
WiFi属于短距离无线传输技术,智能终端通过WiFi模块102可以帮助用户收发电子邮件、浏览网页和访问流式媒体等,它为用户提供了无线的宽带互联网访问。虽然图1示出了WiFi模块102,但是可以理解的是,其并不属于智能终端的必须构成,完全可以根据需要在不改变发明的本质的范围内而省略。
音频输出单元103可以在智能终端100处于呼叫信号接收模式、通话模式、记录模式、语音识别模式、广播接收模式等等模式下时,将射频单元101或WiFi模块102接收的或者在存储器109中存储的音频数据转换成音频信号并且输出为声音。而且,音频输出单元103还可以提供与智能终端100执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出单元103可以包括扬声器、蜂鸣器等等。
A/V输入单元104用于接收音频或视频信号。A/V输入单元104可以包括图形处理器(Graphics Processing Unit,GPU)1041和麦克风1042,图形处理器1041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。处理后的图像帧可以显示在显示单元106上。经图形处理器1041处理后的图像帧可以存储在存储器109(或其它存储介质)中或者经由射频单元101或WiFi模块102进行发送。麦克风1042可以在电话通话模式、记录模式、语音识别模式等等运行模式中经由麦克风1042接收声音(音频数据),并且能够将这样的声音处理为音频数据。处理后的音频(语音)数据可以在电话通话模式的情况下转换为可经由射频单元101发送到移动通信基站的格式输出。麦克风1042可以实施各种类型的噪声消除(或抑制)算法以消除(或抑制)在接收和发送音频信号的过程中产生的噪声或者干扰。
智能终端100还包括至少一种传感器105,比如光传感器、运动传感器以及其他传感器。可选地,光传感器包括环境光传感器及接近传感器,可选地,环境光传感器可根据环境光线的明暗来调节显示面板1061的亮度,接近传感器可在智能终端100移动到耳边时,关闭显示面板1061和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别手机姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;至于手机还可配置的指纹传感器、压力传感器、虹膜传感器、分子传感器、陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不再赘述。
显示单元106用于显示由用户输入的信息或提供给用户的信息。显示单元106可包括显示面板1061,可以采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板1061。
用户输入单元107可用于接收输入的数字或字符信息,以及产生与智能终端的用户设置以及功能控制有关的键信号输入。可选地,用户输入单元107可包括触控面板1071以及其他输入设备1072。触控面板1071,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板1071 上或在触控面板1071附近的操作),并根据预先设定的程式驱动相应的连接装置。触控面板1071可包括触摸检测装置和触摸控制器两个部分。可选地,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器110,并能接收处理器110发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板1071。除了触控面板1071,用户输入单元107还可以包括其他输入设备1072。可选地,其他输入设备1072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆等中的一种或多种,具体此处不做限定。
可选地,触控面板1071可覆盖显示面板1061,当触控面板1071检测到在其上或附近的触摸操作后,传送给处理器110以确定触摸事件的类型,随后处理器110根据触摸事件的类型在显示面板1061上提供相应的视觉输出。虽然在图1中,触控面板1071与显示面板1061是作为两个独立的部件来实现智能终端的输入和输出功能,但是在某些实施例中,可以将触控面板1071与显示面板1061集成而实现智能终端的输入和输出功能,具体此处不做限定。
接口单元108用作至少一个外部装置与智能终端100连接可以通过的接口。例如,外部装置可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的装置的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。接口单元108可以用于接收来自外部装置的输入(例如,数据信息、电力等等)并且将接收到的输入传输到智能终端100内的一个或多个元件或者可以用于在智能终端100和外部装置之间传输数据。
存储器109可用于存储软件程序以及各种数据。存储器109可主要包括存储程序区和存储数据区,可选地,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据手机的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器109可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。
处理器110是智能终端的控制中心,利用各种接口和线路连接整个智能终端的各个部分,通过运行或执行存储在存储器109内的软件程序和/或模块,以及调用存储在存储器109内的数据,执行智能终端的各种功能和处理数据,从而对智能终端进行整体监控。处理器110可包括一个或多个处理单元;优选的,处理器110可集成应用处理器和调制解调处理器,可选地,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器110中。
智能终端100还可以包括给各个部件供电的电源111(比如电池),优选的,电源111可以通过电源管理系统与处理器110逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
尽管图1未示出,智能终端100还可以包括蓝牙模块等,在此不再赘述。
为了便于理解本申请实施例,下面对本申请的智能终端所基于的通信网络系统进 行描述。
请参阅图2,图2为本申请实施例提供的一种通信网络系统架构图,该通信网络系统为通用移动通信技术的LTE系统,该LTE系统包括依次通讯连接的UE(User Equipment,用户设备)201,E-UTRAN(Evolved UMTS Terrestrial Radio Access Network,演进式UMTS陆地无线接入网)202,EPC(Evolved Packet Core,演进式分组核心网)203和运营商的IP业务204。
可选地,UE201可以是上述终端100,此处不再赘述。
E-UTRAN202包括eNodeB2021和其它eNodeB2022等。可选地,eNodeB2021可以通过回程(backhaul)(例如X2接口)与其它eNodeB2022连接,eNodeB2021连接到EPC203,eNodeB2021可以提供UE201到EPC203的接入。
EPC203可以包括MME(Mobility Management Entity,移动性管理实体)2031,HSS(Home Subscriber Server,归属用户服务器)2032,其它MME2033,SGW(Serving Gate Way,服务网关)2034,PGW(PDN Gate Way,分组数据网络网关)2035和PCRF(Policy and Charging Rules Function,政策和资费功能实体)2036等。可选地,MME2031是处理UE201和EPC203之间信令的控制节点,提供承载和连接管理。HSS2032用于提供一些寄存器来管理诸如归属位置寄存器(图中未示)之类的功能,并且保存有一些有关服务特征、数据速率等用户专用的信息。所有用户数据都可以通过SGW2034进行发送,PGW2035可以提供UE201的IP地址分配以及其它功能,PCRF2036是业务数据流和IP承载资源的策略与计费控制策略决策点,它为策略与计费执行功能单元(图中未示)选择及提供可用的策略和计费控制决策。
IP业务204可以包括因特网、内联网、IMS(IP Multimedia Subsystem,IP多媒体子系统)或其它IP业务等。
虽然上述以LTE系统为例进行了介绍,但本领域技术人员应当知晓,本申请不仅仅适用于LTE系统,也可以适用于其他无线通信系统,例如GSM、CDMA2000、WCDMA、TD-SCDMA以及未来新的网络系统(如5G)等,此处不做限定。
本申请实施例提供的图像处理方法可以应用于图像拍摄的场景中,例如计算摄像场景。以计算摄像场景为例,在通过计算摄像获取拍摄画面的图像时,发明人发现至少存在如下问题:如何快速地获取拍摄画面的图像,以提高图像的获取效率是本领域亟待解决的问题。为了解决该问题,本申请实施例提供了一种图像处理方法,基于上述智能终端硬件结构以及通信网络系统,提出本申请各个实施例。下面,将通过具体的实施例对本申请提供的图像处理方法进行详细地说明。可以理解的是,下面这几个具体的实施例可以相互结合,对于相同或相似的概念或过程可能在某些实施例不再赘述。
实施例一
图3为本申请实施例提供的一种图像处理方法的流程示意图，该图像处理方法可以由软件和/或硬件装置执行，例如，该硬件装置可以为图像处理装置，该图像处理装置可以为智能终端。可选地，请参见图3所示，该图像处理方法可以包括：
S1:获取图像数据流。
可选地,在获取图像数据流时,可以接收拍摄装置的光学成像硬件模块发送的图 像数据流;也可以接收拍摄装置的其它影像处理模块发送的图像数据流;具体可以根据实际需要进行设置,在此,对于图像数据流的获取方式,本申请实施例不做具体限制。
可选地,可参见图4所示,图4为本申请实施例提供的一种通过拍摄装置获取目标图像的框架示意图,结合图4可以看出,拍摄装置可以包括光学成像硬件模块和至少一个影像处理模块、相机应用模块以及影像控制模块,以通过光学成像硬件模块和至少一个影像处理模块、相机应用模块以及影像控制模块获取目标图像。可选地,至少一个影像处理模块可分别记为影像处理模块1、影像处理模块2、…、影像处理模块N-1,以及影像处理模块N。
可选地,光学成像硬件模块主要包括摄像头模组,例如感光芯片、镜头、对焦马达、光学防抖(Optical Image Stabilization,OIS)器件等;此外还带有成像辅助器件,例如闪光灯器件,色温传感器等;影像处理模块主要用于对摄像头硬件输出的码流数据进行处理,按照功能或者处理码流类型的不同,分成多种影像处理模块,例如用于进行RAW域降噪处理的影像处理模块,用于进行暗光增强处理的影像处理模块,用于进行高动态范围图像(High Dynamic Range,HDR)合成处理的影像处理模块等。相机应用模块主要用于实现图像的一些应用,如照片拍摄,视频录制,视频通话等;影像控制模块主要用于根据相机应用模块的输出需求设置和控制光学成像硬件模块,以使光学成像硬件模块可以按需采集拍摄画面的图像数据流。
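The module chain described above can be sketched as a simple pipeline: the optical imaging hardware module produces one image data stream, and each image processing module consumes such a stream and returns an updated one. This is a minimal illustration only; the class and function names (ImageDataStream, run_pipeline, and so on) are assumptions made for the example and are not defined by this application.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class ImageDataStream:
    """Hypothetical container for one image data stream."""
    basic_info: dict                                   # basic image information (type, width, color space, ...)
    image_data: bytes = b""                            # pixel payload captured by the camera module
    imaging_info: Optional[dict] = None                # imaging information (shutter time, ISO, ...), if present
    processing_info: List[dict] = field(default_factory=list)  # one entry per processing step applied so far

# Each image processing module consumes a stream and returns an updated stream.
ProcessingModule = Callable[[ImageDataStream], ImageDataStream]

def run_pipeline(stream: ImageDataStream,
                 modules: List[ProcessingModule]) -> ImageDataStream:
    """Pass the stream produced by the optical imaging hardware module through
    image processing module 1..N in order; every module receives the image data
    together with its accompanying information in a single stream object."""
    for module in modules:
        stream = module(stream)
    return stream
```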
在获取到图像数据流后,就可以根据图像处理指令对图像数据流进行处理,确定或生成对应的图像处理信息,得到目标图像,即执行下述S2:
S2:根据图像处理指令对图像数据流进行处理,确定或生成对应的图像处理信息,得到目标图像。
可选地,图像处理指令可以为基于图像处理需求生成的图像处理指令,也可以为预设的图像处理指令,具体可以根据实际需要进行设置,在此,本申请实施例不做具体限制。
可选地,根据图像处理指令对图像数据流进行处理可以包括但不限于:RAW域降噪处理、暗光增强处理、以及HDR合成处理等,具体可以根据实际需要进行设置。
需要说明的是,本申请实施例中,在根据图像处理指令对图像数据流进行RAW域降噪处理时,可参见现有技术中RAW域降噪处理的相关描述;在根据图像处理指令对图像数据流进行暗光增强处理时,也可以参见现有技术中暗光增强处理的相关描述、以及根据图像处理指令对图像数据流进行HDR合成处理时,可参见现有技术中HDR合成处理的相关描述,在此,本申请实施例不再进行赘述。
在根据图像处理指令对图像数据流进行处理后,除了会得到对应的目标图像之外,还会一并确定或生成对应的图像处理信息,该图像处理信息可以理解为:基于本次处理操作生成的处理信息,主要用于描述本次处理操作,这样通过确定或生成图像处理信息,使得后续可以根据该图像处理信息,获知本次处理的相关操作,为后续的图像数据处理提供查询依据。
可选地,图像处理信息可以包括图像处理信息的长度、处理标识、处理可逆标识、处理描述类型信息、处理前数据保存标识、或者图像数据中的至少一种,具体可以根 据实际需要进行设置。
可选地,图像处理信息的长度用于标识图像处理信息字段的总长度;处理标识用于标识其中一次图像处理信息;处理可逆标识用于标识本次图像处理是否为可逆处理;处理描述类型信息用于标识本次图像处理的类型描述;处理前数据保存标识用于标识是否保留处理前的图像数据;图像数据用于标识处理前的影像数据本身。
可以理解的是,图像处理信息还可以包括其它信息,例如图像处理信息的标识等,若图像处理信息包括图像处理信息的标识,则上述图像处理信息的长度包括了图像处理信息的标识的长度,具体可以根据实际需要进行设置,在此,本申请实施例只是以图像处理信息可以包括上述至少一种为例进行说明,但并不代表本申请实施例仅局限于此。可选地,图像处理信息的标识用于标识“图像处理信息”字段。可选地,图像处理信息可参见下述表1所示:
表1
Figure PCTCN2021122430-appb-000001
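As a rough illustration of the image processing information fields described above (total length, processing identifier, reversible flag, processing description type, pre-processing data save flag, and the pre-processing image data), the following sketch packs them into a length-prefixed binary record. The field widths, byte order, and function name are assumptions made for the example only; the application does not fix a concrete byte layout here.

```python
import struct

def pack_processing_info(process_id: int,
                         reversible: bool,
                         description: bytes,
                         keep_original: bool,
                         original_data: bytes = b"") -> bytes:
    """Pack one image-processing-information record.

    Assumed layout (illustrative only):
      4 bytes  total length of this record (including this field)
      4 bytes  processing identifier for this step
      1 byte   reversible flag (1 = this processing is reversible)
      1 byte   keep-pre-processing-data flag
      4 bytes  length of the processing description that follows
      N bytes  processing description type information
      M bytes  image data before processing (only if the keep flag is set)
    """
    body = struct.pack(">IBBI", process_id, int(reversible),
                       int(keep_original), len(description))
    body += description
    if keep_original:
        body += original_data
    return struct.pack(">I", 4 + len(body)) + body
```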
可选地,根据图像处理指令对图像数据流进行处理,确定或生成对应的图像处理信息,且得到目标图像时,可以先确定或生成对应的图像处理信息,再得到目标图像;也可以先得到目标图像,再确定或生成对应的图像处理信息;也可以同时确定或生成对应的图像处理,且得到目标图像,具体可以根据实际需要进行设置,在此,对于确定或生成对应的图像处理信息,和得到目标图像这两个操作的执行顺序,本申请实施例不做具体限制。
可选地,在本申请实施例中,图像数据流中可以包括图像数据,这样根据图像处理指令对图像数据流进行处理时,可以根据图像处理指令对图像数据流中的图像数据进行处理,得到目标图像数据,确定或生成对应的图像处理信息;以根据目标图像数据生成最终的目标图像。可选地,结合上述图4所示的拍摄装置,图像数据流中的图像数据可以为拍摄装置中的光学成像硬件模块按需采集拍摄画面的图像数据。此外, 光学成像硬件模块在采集拍摄画面的图像数据时,还会一并确定或生成该图像数据对应的图像基本信息,该图像基本信息的相关内容将在后续进行描述。
可选地,根据图像处理指令对图像数据进行处理,得到目标图像数据,且确定或生成对应的图像处理信息时,可以先得到目标图像数据,再确定或生成对应的图像处理信息;也可以先确定或生成对应的图像处理信息,再得到目标图像数据;也可以同时得到目标图像数据,且确定或生成对应的图像处理信息,具体可以根据实际需要进行设置,在此,对于得到目标图像数据,和确定或生成对应的图像处理信息这两个操作的执行顺序,本申请实施例不做具体限制。
可以看出,本申请实施例中,在获取目标图像时,可以先获取图像数据流,并根据图像处理指令对图像数据流进行处理,确定或生成对应的图像处理信息,得到目标图像,这样可以快速地获取目标图像,从而有效地提高了目标图像的获取效率。
实施例二
基于上述图3所示的实施例,本申请实施例中,可选地,图像数据流中除了包括图像数据之外,还可以包括该图像数据对应的图像基本信息,这样通过在图像数据流中携带图像数据和图像基本信息,使得通过一个图像数据流,可以一并获取到图像数据和其对应的图像基本信息,无需再通过其它方式额外获取图像基本信息,从而提高了图像基本信息的获取效率。可选地,结合上述图4所示,图像基本信息可以为拍摄装置中的光学成像硬件模块在采集拍摄画面的图像数据时,一并确定或生成的该图像数据对应的图像基本信息。
可选地,可结合上述图4所示,在通过拍摄装置获取拍摄画面的目标图像时,可以先通过光学成像硬件模块在采集拍摄画面的图像数据时,会一并生成图像数据对应的图像基本信息;并将图像数据和其对应的图像基本信息携带在同一个图像数据流中,传输至至少一个影像处理模块中的影像处理模块1,以使影像处理模块1通过同一个图像数据流,可以一并获取图像数据和其对应的图像基本信息,这样影像处理模块1无需再通过其它方式额外获取图像基本信息,从而提高了图像基本信息的获取效率。
需要说明的是,本申请实施例中,通过同一个图像数据流携带图像数据和其对应的图像基本信息的封装格式,不局限于拍摄装置中的光学成像硬件模块,也同样适应于拍摄装置中的影像处理模块,即影像处理模块在对图像数据处理完成后,同样需要将处理后的图像数据和其对应的图像基本信息携带在同一个图像数据流中,发送给下一个影像处理模块,以便下一个影像处理模块基于该同一个图像数据流获取到处理后的图像数据和其对应的图像基本信息。不同的是,影像处理模块1是从光学成像硬件模块处通过同一个图像数据流获取图像数据和其对应的图像基本信息,而影像处理模块2、影像处理模块3、…、以及影像处理模块N是从其前一个影像处理模块处通过同一个图像数据流获取处理后的图像数据和其对应的图像基本信息。
结合上述描述,当同一个图像数据流同时携带图像数据和其对应的图像基本信息时,即图像数据流中同时包括图像数据和其对应的图像基本信息,可选地,图像数据流的封装格式中,图像数据流对应的图像基本信息在图像数据流中,位于图像数据之前;当然,也可以位于图像数据之后,具体可以根据实际需要进行设置,在此,本申请实施例只是以图像基本信息在图像数据流中,位于图像数据之前为例进行说明,但 并不代表本申请实施例仅局限于此。
可选地,本申请实施例中,图像基本信息可以包括图像基本信息的长度、图像数据的类型标识、图像数据的长度、图像数据的宽度、图像数据的色彩空间、图像数据的位宽、或者图像数据的存储方式中的至少一种,具体可以根据实际需要进行设置。
可选地,图像基本信息的长度用于标识图像基本信息字段的总长度;图像数据的类型标识用于标识影像数据类型是单帧图像、多帧图像或视频流;图像数据的长度用于标识图像数据本身的长度;图像数据的宽度用于标识图像数据本身的宽度;图像数据的色彩空间用于标识图像数据色彩空间描述,例如RGGB(Red-Green-Green-Blue)模式的色彩空间描述,RGBW(Red-Green-Blue-White)模式的色彩空间描述,RYYB(Red-Yellow-Yellow-Blue)模式的色彩空间描述等,图像数据的位宽用于标识图像每个分量的Bit数;图像数据的存储方式用于标识图像色彩空间中,每个分量的每个像素在内存中的排列方式。
可以理解的是,图像基本信息还可以包括其它信息,例如图像基本信息的标识等,若图像基本信息包括图像基本信息的标识,则上述图像基本信息的长度包括了图像基本信息的标识的长度,具体可以根据实际需要进行设置,在此,本申请实施例只是以图像基本信息可以包括上述至少一种为例进行说明,但并不代表本申请实施例仅局限于此。可选地,图像基本信息的标识用于标识图像“基本描述信息”字段。可选地,图像基本信息可参见下述表2所示:
表2
Figure PCTCN2021122430-appb-000002
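A minimal sketch of the basic image information fields listed above, written as a Python dataclass. The enumeration values for the data type and color space, and the field names, are placeholders chosen for illustration rather than values defined by this application.

```python
from dataclasses import dataclass
from enum import Enum

class DataType(Enum):        # type identifier of the image data
    SINGLE_FRAME = 0
    MULTI_FRAME = 1
    VIDEO_STREAM = 2

class ColorSpace(Enum):      # color space description of the image data
    RGGB = 0
    RGBW = 1
    RYYB = 2

@dataclass
class BasicImageInfo:
    info_length: int         # total length of the basic image information field
    data_type: DataType      # single frame, multi-frame or video stream
    data_length: int         # length of the image data itself
    data_width: int          # width of the image data itself
    color_space: ColorSpace  # e.g. RGGB / RGBW / RYYB
    bit_depth: int           # number of bits per component
    storage_layout: str      # how each pixel of each component is arranged in memory
```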
结合上述描述，当图像数据流中包括图像数据和其对应的图像基本信息时，为了便于理解在上述S2中，如何根据图像处理指令对图像数据流进行处理，确定或生成对应的图像处理信息，得到目标图像，下面，将通过下述图5所示的实施例进行详细描述。
图5为本申请实施例提供的另一种图像处理方法的流程示意图,该图像处理方法同样可以由软件和/或硬件装置执行,可选地,请参见图5所示,该图像处理方法可以包括:
S501、根据图像处理指令对图像数据流中的图像数据进行处理,确定或生成对应 的图像处理信息。
需要说明的是，在S501中，根据图像处理指令对图像数据流中的图像数据进行处理，确定或生成对应的图像处理信息的相关操作，与上述S2中根据图像处理指令对图像数据流进行处理，确定或生成对应的图像处理信息的相关操作类似，具体可参见上述S2中根据图像处理指令对图像数据流进行处理，确定或生成对应的图像处理信息的相关描述，在此，本申请实施例不再进行赘述。
鉴于通常情况下,根据图像处理指令对图像数据进行处理之后,可能会引起图像基本信息发生变化,因此,在根据图像处理指令对图像数据进行处理后,可以判断本次处理是否会使图像基本信息发生变化,即执行下述S502:
S502、判断处理是否会使图像基本信息发生变化。
若确定图像基本信息发生变化,则执行下述S503;和/或,若确定图像基本信息未发生变化,则执行下述S504:
S503、若确定图像基本信息发生变化,则确定或生成目标图像基本信息,并基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,以根据目标图像数据流得到目标图像。
可以理解的是,当图像基本信息发生变化时,则需要确定或生成变化后的目标图像基本信息,并基于目标图像基本信息对图像数据流进行更新;此外,鉴于根据图像处理指令对图像数据进行处理之后,图像数据必然会发生变化,则同样需要确定或生成变化后的目标图像数据,并基于目标图像数据对图像数据流进行更新;并且,鉴于本次处理会确定或生成对应的图像处理信息,也同样需要基于图像处理信息对图像数据流进行更新。
上述基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新时,可选地,可以采用目标图像基本信息更新图像数据流中的图像基本信息、采用目标图像数据更新图像数据流中的图像数据,并将图像处理信息添加在图像数据流中,以对图像数据流进行更新,从而得到更新后的目标图像数据流。
可选地,采用目标图像基本信息更新图像数据流中的图像基本信息时,可以直接采用目标图像基本信息替换图像基本信息,也可以采用目标图像基本信息仅更新图像基本信息中与目标图像基本信息不同的部分,相同部分可不更新,具体可以根据实际需要进行设置,在此,本申请实施例不做具体限制。可以理解的是,采用目标图像数据更新图像数据时,其对应的更新方法与图像基本信息的更新方法类似,可参见图像基本信息的更新方法的相关描述,在此,本申请实施例不再进行赘述。
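The update logic described in the preceding paragraphs (replace the basic information only when it has changed, always replace the image data, and always add the processing information) could look roughly like the following sketch, where the stream is modeled as a plain dictionary; the key names are assumptions made for the example.

```python
import copy
from typing import Optional

def update_stream(stream: dict,
                  target_basic_info: Optional[dict],
                  target_image_data: bytes,
                  processing_info: dict) -> dict:
    """Build the target image data stream from the input stream.

    If the processing changed the basic image information, the target basic
    information replaces the original; otherwise the original is kept. The
    image data is always replaced by the target image data, and the processing
    information describing this step is always appended.
    """
    target = copy.deepcopy(stream)
    if target_basic_info is not None:        # the basic information changed
        target["basic_info"] = target_basic_info
    target["image_data"] = target_image_data
    target.setdefault("processing_info", []).append(processing_info)
    return target
```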
上述基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新之后,得到的目标图像数据流中可以包括目标图像基本信息、目标图像数据和图像处理信息。当目标图像数据流中包括目标图像基本信息、目标图像数据和图像处理信息时,可选地,目标图像数据流的封装格式中,图像处理信息在目标图像数据流中,位于目标图像基本信息和目标图像数据之间;当然,图像处理信息也可以位于目标图像基本信息和目标图像数据之后,或者其它的封装格式,在此,本申请实施例只是以图像处理信息位于目标图像基本信息和目标图像数据之间的这种封装格式为例进行描述,但并不代表本申请实施例仅局限于此。
在该种场景下,在根据图像处理指令对图像数据进行处理后,若确定本次处理使得图像数据流中的图像基本信息发生变化,则基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,这样通过目标图像数据流一并携带目标图像基本信息、目标图像数据和图像处理信息,无需再通过其它方式额外获取目标图像基本信息和图像处理信息,使得基于目标图像数据流可以快速地获取目标图像,从而有效地提高了目标图像的获取效率。
S504、若确定图像基本信息未发生变化,则基于目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,以根据目标图像数据流得到目标图像。
可以理解的是,当图像基本信息未发生变化时,则无需更新图像数据流中的图像基本信息;但鉴于根据图像处理指令对图像数据进行处理之后,图像数据必然会发生变化,则需要确定或生成新的目标图像数据,并基于目标图像数据对图像数据流进行更新;并且,鉴于本次处理会确定或生成对应的图像处理信息,也同样需要基于图像处理信息对图像数据流进行更新。
上述基于目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流时,可选地,可以采用目标图像数据更新图像数据流中的图像数据,并将图像处理信息添加在图像数据流中,以对图像数据流进行更新,从而得到更新后的目标图像数据流。
可以理解的是,采用目标图像数据更新图像数据时,其对应的更新方法与上述S503中图像基本信息的更新方法类似,可参见图像基本信息的更新方法的相关描述,在此,本申请实施例不再进行赘述。
上述基于目标图像数据和图像处理信息对图像数据流进行更新,得到的目标图像数据流中可以包括图像基本信息、目标图像数据和图像处理信息。当目标图像数据流中包括图像基本信息、目标图像数据和图像处理信息时,可选地,目标图像数据流的封装格式中,图像处理信息在目标图像数据流中,位于图像基本信息和目标图像数据之间;当然,图像处理信息也可以位于图像基本信息和目标图像数据之后,或者其它的封装格式,在此,本申请实施例只是以图像处理信息位于图像基本信息和目标图像数据之间的这种封装格式为例进行描述,但并不代表本申请实施例仅局限于此。
在该种场景下,在根据图像处理指令对图像数据进行处理后,若确定本次处理使得图像数据流中的图像基本信息未发生变化,则基于图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,这样通过目标图像数据流一并携带图像基本信息、目标图像数据和图像处理信息,无需再通过其它方式额外获取图像基本信息和图像处理信息,使得基于目标图像数据流可以快速地获取目标图像,从而有效地提高了目标图像的获取效率。
实施例三
基于上述图5所示的实施例,本申请实施例中,可选地,图像数据流中除了包括图像数据和其对应的图像基本信息之外,还可以包括成像信息,这样通过在图像数据流中携带图像数据、图像基本信息和成像信息,使得通过一个图像数据流,可以一并获取到图像数据、图像基本信息和成像信息,无需再通过其它方式额外获取图像基本信息和成像信息,从而提高了图像基本信息和成像信息的获取效率。
结合上述描述,当同一个图像数据流同时携带图像数据、图像基本信息和成像信息时,可选地,图像数据流的封装格式中,成像信息和图像处理信息在图像数据流中,均位于图像基本信息和图像数据之间,且成像信息位于图像处理信息之前;当然,也可以为其它的封装格式,具体可以根据实际需要进行设置,在此,本申请实施例只是以这种封装格式为例进行说明,但并不代表本申请实施例仅局限于此。
可选地,本申请实施例中,成像信息可以包括成像信息的长度、成像设备的快门时间、成像设备的感光度、成像设备的光圈、成像设备的焦距、成像设备的陀螺仪信息、成像设备的加速度、成像设备的地理位置信息、或者成像设备的影像旋转角度信息中的至少一种,具体可以根据实际需要进行设置。
可选地,成像信息的长度用于标识成像信息字段的总长度;成像设备的快门时间用于标识拍照时光进入感光材料的感光面时间,即相机快门速度;成像设备的感光度用于标识图像拍摄时的感光度的值,可参考ISO 12232所定义;成像设备的光圈用于标识镜头的光圈数值,取值为镜头焦距/镜头有效口径直径;成像设备的焦距用于标识镜头的物理焦距,即透镜焦点到透镜中心点之间的距离;成像设备的陀螺仪信息用于标识智能终端设备的角速度,包含X,Y和Z轴方向;成像设备的加速度用于标识智能终端设备的三个轴的加速度大小;成像设备的地理位置信息用于标识经度、维度、高度;成像设备的影像旋转角度信息用于标识智能终端上影像旋转角度的信息。
可以理解的是，成像信息还可以包括其它信息，例如成像信息的标识等，若成像信息包括成像信息的标识，则上述成像信息的长度包括了成像信息的标识的长度，具体可以根据实际需要进行设置，在此，本申请实施例只是以成像信息可以包括上述至少一种为例进行说明，但并不代表本申请实施例仅局限于此。可选地，成像信息的标识用于标识“成像信息”字段。可选地，成像信息可参见下述表3所示：
表3
Figure PCTCN2021122430-appb-000003
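A dataclass rendering of the imaging information fields listed above; the units and types chosen here are assumptions made for illustration only.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ImagingInfo:
    info_length: int                           # total length of the imaging information field
    shutter_time_s: float                      # shutter speed of the imaging device, in seconds
    iso: int                                   # sensitivity at capture time (see ISO 12232)
    aperture_f_number: float                   # focal length / effective aperture diameter
    focal_length_mm: float                     # physical focal length of the lens
    gyroscope: Tuple[float, float, float]      # angular velocity on the X, Y and Z axes
    acceleration: Tuple[float, float, float]   # acceleration on the three axes
    location: Tuple[float, float, float]       # longitude, latitude and altitude
    rotation_deg: float                        # image rotation angle reported by the terminal
```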
结合上述描述，当图像数据流中包括图像数据、其对应的图像基本信息和成像信息时，为了便于理解在上述S2中，如何根据图像处理指令对图像数据流进行处理，确定或生成对应的图像处理信息，得到目标图像，下面，将通过下述图6所示的实施例进行详细描述。
图6为本申请实施例提供的又一种图像处理方法的流程示意图,该图像处理方法同样可以由软件和/或硬件装置执行,可选地,请参见图6所示,该图像处理方法可以包括:
S601、根据图像处理指令对图像数据流中的图像数据进行处理,确定或生成对应的图像处理信息。
需要说明的是，在S601中，根据图像处理指令对图像数据流中的图像数据进行处理，确定或生成对应的图像处理信息的相关操作，与上述S2中根据图像处理指令对图像数据流进行处理，确定或生成对应的图像处理信息的相关操作类似，具体可参见上述S2中根据图像处理指令对图像数据流进行处理，确定或生成对应的图像处理信息的相关描述，在此，本申请实施例不再进行赘述。
鉴于通常情况下,根据图像处理指令对图像数据进行处理之后,可能会引起图像基本信息和成像信息发生变化,因此,在根据图像处理指令对图像数据进行处理后,可以判断本次处理是否会使图像基本信息和成像信息发生变化,以确定是否需要对图像数据流进行更新。
可选地,在判断本次处理是否会使图像基本信息和成像信息发生变化时,可以先判断本次处理是否会使图像基本信息发生变化,再判断本次处理是否会使成像信息发生变化;也可以先判断本次处理是否会使成像信息发生变化,再判断本次处理是否会使图像基本信息发生变化;也可以同时判断本次处理是否会使图像基本信息和成像信息发生变化,具体可以根据实际需要进行设置。
本申请实施例中,以先判断本次处理是否会使图像基本信息发生变化,再判断本次处理是否会使成像信息发生变化为例,在根据图像处理指令对图像数据进行处理后,可以先判断本次处理是否会使图像基本信息发生变化,即执行下述S602:
S602、判断处理是否会使图像基本信息发生变化。
若确定图像基本信息发生变化,则执行下述S603;和/或,若确定图像基本信息未发生变化,则执行下述S604:
S603、若确定图像基本信息发生变化,则确定或生成目标图像基本信息,并判断本次处理操作是否会使得成像信息发生变化,并根据判断结果对图像数据流进行更新,得到目标图像数据流,以根据目标图像数据流得到目标图像。
在判断本次处理操作是否会使得成像信息发生变化时,可以包括下述两种情况,分别为成像信息发生变化和成像信息未发生变化。
在一种情况下,当确定成像信息发生变化,则需要确定更新后的目标成像信息,鉴于图像基本信息和成像信息均发生了变化,因此,需要基于变化后的目标图像基本信息和目标成像信息,对图像数据流进行更新,此外,鉴于根据图像处理指令对图像数据进行处理之后,图像数据必然会发生变化,则同样需要确定或生成变化后的目标图像数据,并基于目标图像数据对图像数据流进行更新;并且,鉴于本次处理会确定 或生成对应的图像处理信息,也同样需要基于图像处理信息对图像数据流进行更新。
上述基于目标图像基本信息、目标成像信息、目标图像数据和图像处理信息对图像数据流进行更新时,可选地,可以采用目标图像基本信息更新图像数据流中的图像基本信息、采用目标成像信息更新图像数据流中的成像信息、采用目标图像数据更新图像数据流中的图像数据,并将图像处理信息添加在图像数据流中,以对图像数据流进行更新,从而得到更新后的目标图像数据流。
可选地,采用目标成像信息更新图像数据流中的成像信息时,可以直接采用目标成像信息替换成像信息,也可以采用目标成像信息仅更新成像信息中与目标成像信息不同的部分,相同部分可不更新,具体可以根据实际需要进行设置,在此,本申请实施例不做具体限制。至于如何采用目标图像基本信息更新图像数据流中的图像基本信息、以及如何采用目标图像数据更新图像数据流中的图像数据,可参见上述图5所示的实施例的相关描述,在此,本申请实施例不再进行赘述。
上述基于目标图像基本信息、目标成像信息、目标图像数据和图像处理信息对图像数据流进行更新,得到的目标图像数据流中可以包括目标图像基本信息、目标成像信息、目标图像数据和图像处理信息。当目标图像数据流中包括目标图像基本信息、目标成像信息、目标图像数据和图像处理信息时,可选地,目标图像数据流的封装格式中,目标成像信息和图像处理信息在目标图像数据流中,均位于目标图像基本信息和目标图像数据之间,且目标成像信息位于图像处理信息之前,即依次为:目标图像基本信息、目标成像信息、图像处理信息以及目标图像数据。当然,也可以为其它的封装格式,具体可以根据实际需要进行设置,在此,本申请实施例只是以这种封装格式为例进行说明,但并不代表本申请实施例仅局限于此。
在该种情况下,在根据图像处理指令对图像数据进行处理后,若确定本次处理使得图像数据流中的图像基本信息和成像信息均发生变化,则基于目标图像基本信息、目标成像信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,这样通过目标图像数据流一并携带目标图像基本信息、目标成像信息、目标图像数据和图像处理信息,无需再通过其它方式额外获取目标图像基本信息、目标成像信息和图像处理信息,使得基于目标图像数据流可以快速地获取目标图像,从而有效地提高了目标图像的获取效率。
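The section order described above for the target image data stream (target basic image information, then target imaging information, then image processing information, then target image data) could be serialized as in the following sketch, where each section is assumed to be a length-prefixed byte blob; this framing is illustrative only and is not mandated by the application.

```python
import struct
from typing import List

def pack_target_stream(basic_info: bytes,
                       imaging_info: bytes,
                       processing_info: bytes,
                       image_data: bytes) -> bytes:
    """Concatenate the four sections in the order used by the target image
    data stream; each section is prefixed with its own 4-byte length."""
    sections = (basic_info, imaging_info, processing_info, image_data)
    return b"".join(struct.pack(">I", len(s)) + s for s in sections)

def unpack_target_stream(blob: bytes) -> List[bytes]:
    """Recover the length-prefixed sections in the order they were packed."""
    sections, offset = [], 0
    while offset < len(blob):
        (length,) = struct.unpack_from(">I", blob, offset)
        offset += 4
        sections.append(blob[offset:offset + length])
        offset += length
    return sections
```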
在另一种情况下,当确定成像信息未发生变化,则无需更新图像数据流中的成像信息,但鉴于图像基本信息发生了变化,则需要基于变化后的目标图像基本信息对图像数据流进行更新,此外,鉴于根据图像处理指令对图像数据进行处理之后,图像数据必然会发生变化,则同样需要确定或生成变化后的目标图像数据,并基于目标图像数据对图像数据流进行更新;并且,鉴于本次处理会确定或生成对应的图像处理信息,也同样需要基于图像处理信息对图像数据流进行更新。
上述基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新时,可选地,可以采用目标图像基本信息更新图像数据流中的图像基本信息、采用目标图像数据更新图像数据流中的图像数据,并将图像处理信息添加在图像数据流中,以对图像数据流进行更新,从而得到更新后的目标图像数据流。
需要说明的是,在S603中,关于如何采用目标图像基本信息更新图像数据流中的 图像基本信息、以及如何采用目标图像数据更新图像数据流中的图像数据,可参见上述图5所示的实施例的相关描述,在此,本申请实施例不再进行赘述。
上述基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新,得到的目标图像数据流中可以包括目标图像基本信息、成像信息、目标图像数据和图像处理信息。当目标图像数据流中包括目标图像基本信息、成像信息、目标图像数据和图像处理信息时,可选地,目标图像数据流的封装格式中,成像信息和图像处理信息在目标图像数据流中,均位于目标图像基本信息和目标图像数据之间,且成像信息位于图像处理信息之前;当然,也可以为其它的封装格式,具体可以根据实际需要进行设置,在此,本申请实施例只是以这种封装格式为例进行说明,但并不代表本申请实施例仅局限于此。
在该种情况下,在根据图像处理指令对图像数据进行处理后,若确定本次处理使得图像数据流中的图像基本信息发生变化,成像信息未发生变化,则基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,这样通过目标图像数据流一并携带目标图像基本信息、成像信息、目标图像数据和图像处理信息,无需再通过其它方式额外获取目标图像基本信息、成像信息和图像处理信息,使得基于目标图像数据流可以快速地获取目标图像,从而有效地提高了目标图像的获取效率。
S604、若确定图像基本信息未发生变化,则判断本次处理操作是否会使得成像信息发生变化,并根据判断结果对图像数据流进行更新,得到目标图像数据流,以根据目标图像数据流得到目标图像。
在判断本次处理操作是否会使得成像信息发生变化时,可以包括下述两种情况,分别为成像信息发生变化和成像信息未发生变化。
在一种情况下,当确定成像信息发生变化,则需要确定更新后的目标成像信息,因此,需要基于变化后的目标成像信息,对图像数据流进行更新,但鉴于图像基本信息未发生变化,则无需基于图像基本信息对图像数据流进行更新;此外,鉴于根据图像处理指令对图像数据进行处理之后,图像数据必然会发生变化,则同样需要确定或生成变化后的目标图像数据,并基于目标图像数据对图像数据流进行更新;并且,鉴于本次处理会确定或生成对应的图像处理信息,也同样需要基于图像处理信息对图像数据流进行更新。
上述基于目标图像数据、目标成像信息、和图像处理信息对图像数据流进行更新时,可选地,可以采用目标图像数据更新图像数据流中的图像数据,采用目标成像信息更新图像数据流中的成像信息,并将图像处理信息添加在图像数据流中,以对图像数据流进行更新,从而得到更新后的目标图像数据流。
需要说明的是,在S604中,关于如何采用目标图像数据更新图像数据流中的图像数据,以及如何采用目标成像信息更新图像数据流中的成像信息,可参见上述S603中的相关描述,在此,本申请实施例不再进行赘述。
上述基于目标图像数据、目标成像信息、和图像处理信息对图像数据流进行更新,得到的目标图像数据流中可以包括图像基本信息、目标成像信息、目标图像数据和图像处理信息。当目标图像数据流中包括图像基本信息、目标成像信息、目标图像数据 和图像处理信息时,可选地,目标图像数据流的封装格式中,目标成像信息和图像处理信息在目标图像数据流中,均位于图像基本信息和目标图像数据之间,且目标成像信息位于图像处理信息之前;当然,也可以为其它的封装格式,具体可以根据实际需要进行设置,在此,本申请实施例只是以这种封装格式为例进行说明,但并不代表本申请实施例仅局限于此。
在该种情况下,在根据图像处理指令对图像数据进行处理后,若确定本次处理使得图像数据流中的图像基本信息未发生变化,成像信息发生变化,则基于目标图像数据、目标成像信息、和图像处理信息对图像数据流进行更新,得到目标图像数据流,这样通过目标图像数据流一并携带图像基本信息、目标成像信息、目标图像数据和图像处理信息,无需再通过其它方式额外获取图像基本信息、目标成像信息和图像处理信息,使得基于目标图像数据流可以快速地获取目标图像,从而有效地提高了目标图像的获取效率。
在另一种情况下,当确定成像信息未发生变化时,鉴于图像基本信息和成像信息均未发生了变化,因此,无需基于图像基本信息和成像信息,对图像数据流进行更新,但鉴于根据图像处理指令对图像数据进行处理之后,图像数据必然会发生变化,则同样需要确定或生成变化后的目标图像数据,并基于目标图像数据对图像数据流进行更新;并且,鉴于本次处理会确定或生成对应的图像处理信息,也同样需要基于图像处理信息对图像数据流进行更新。
上述基于目标图像数据和图像处理信息对图像数据流进行更新时,可选地,可以采用目标图像数据更新图像数据流中的图像数据,并将图像处理信息添加在图像数据流中,以对图像数据流进行更新,从而得到更新后的目标图像数据流。
需要说明的是,关于如何采用目标图像数据更新图像数据流中的图像数据,可参见上述图5所示的实施例的相关描述,在此,本申请实施例不再进行赘述。
上述基于目标图像数据和图像处理信息对图像数据流进行更新,得到的目标图像数据流中可以包括图像基本信息、成像信息、目标图像数据和图像处理信息。当目标图像数据流中包括图像基本信息、成像信息、目标图像数据和图像处理信息时,可选地,目标图像数据流的封装格式中,成像信息和图像处理信息在目标图像数据流中,均位于图像基本信息和目标图像数据之间,且成像信息位于图像处理信息之前,即依次为:图像基本信息、成像信息、图像处理信息以及目标图像数据。当然,也可以为其它的封装格式,具体可以根据实际需要进行设置,在此,本申请实施例只是以这种封装格式为例进行说明,但并不代表本申请实施例仅局限于此。
在该种情况下,在根据图像处理指令对图像数据进行处理后,若确定本次处理使得图像数据流中的图像基本信息和成像信息均未发生变化,则基于目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,这样通过目标图像数据流一并携带图像基本信息、成像信息、目标图像数据和图像处理信息,无需再通过其它方式额外获取图像基本信息、成像信息和图像处理信息,使得基于目标图像数据流可以快速地获取目标图像,从而有效地提高了目标图像的获取效率。
实施例四
基于上述任一实施例,可选地,在本申请实施例中,图像数据流中还可以包括图 像数据对应的原始图像处理信息,这样通过在图像数据流中携带图像数据和原始图像处理信息,使得通过一个图像数据流,可以一并获取到图像数据和原始图像处理信息,无需再通过其它方式额外获取原始图像处理信息,从而提高了原始图像处理信息的获取效率。可选地,原始图像处理信息可以理解为图像数据之前经过的图像处理的相关信息。
结合上述描述,当同一个图像数据流同时携带图像数据和其对应的原始图像处理信息时,可选地,图像数据流的封装格式中,原始图像处理信息在图像数据流中,位于图像数据之前;当然,也可以位于图像数据之后,具体可以根据实际需要进行设置,在此,本申请实施例只是以原始图像处理信息在图像数据流中,位于图像数据之前这种封装格式为例进行说明,但并不代表本申请实施例仅局限于此。
当图像数据流包括原始图像处理信息时,在根据图像处理指令对图像数据流进行处理,确定或生成本次处理操作对应的图像处理信息后,可以将本次处理操作对应的图像处理信息添加在图像数据流中的原始图像处理信息中,以对图像数据流进行更新,从而得到更新后的目标图像数据流,这样目标图像数据流中总会包括最新的图像处理信息,使得后续可以根据该图像处理信息,获知本次处理的相关操作,为后续的图像数据处理提供查询依据。
可选地，原始图像处理信息中还可以包括处理可逆标识。当处理可逆标识指示为可逆处理时，说明后续可以基于更新后的目标图像数据和原始图像处理信息恢复出处理之前的图像数据，在该种情况下，为了避免因存储图像数据占用较多的内存，因此，在将目标图像数据添加在图像数据流中后，可以从更新后的目标图像数据流中剔除图像数据，这样可以避免因存储图像数据占用较多的内存，降低了内存的占用。和/或，当处理可逆标识指示当前处理为不可逆处理时，说明后续无法基于更新后的目标图像数据和原始图像处理信息恢复出处理之前的图像数据，则需要在更新后的目标图像数据流中携带处理之前的图像数据，以保证后续可以准确地查找到该图像数据。
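A compact sketch of the bookkeeping described above: the processing information of the current step is appended to the original processing information carried in the stream, and when the reversibility flag indicates a reversible operation, the pre-processing image data can be removed from the target stream to reduce memory usage. The dictionary keys used here are assumptions made for the example.

```python
def append_processing_info(stream: dict, new_info: dict) -> dict:
    """Append this step's processing information to the original processing
    information already carried by the stream; if the step is marked as
    reversible, the pre-processing image data is dropped from the target
    stream, otherwise it is kept so it can still be looked up later."""
    stream.setdefault("processing_info", []).append(new_info)
    if new_info.get("reversible", False):
        # Reversible processing: the earlier image data can be recovered from
        # the target image data plus the processing description, so it does
        # not need to stay in the target stream (this saves memory).
        new_info.pop("pre_processing_image_data", None)
    return stream
```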
实施例五
图7为本申请实施例提供的一种图像处理方法的流程示意图，该图像处理方法可以由软件和/或硬件装置执行，例如，该硬件装置可以为图像处理装置，该图像处理装置可以为智能终端。可选地，请参见图7所示，该图像处理方法可以包括：
S10、获取图像数据流,图像数据流包括图像数据以及图像基本信息。
可选地,本申请实施例中,图像基本信息可以包括图像基本信息的长度、图像数据的类型标识、图像数据的长度、图像数据的宽度、图像数据的色彩空间、图像数据的位宽、或者图像数据的存储方式中的至少一种,具体可以根据实际需要进行设置,其相关描述可参见上述图3所示的实施例二中关于图像基本信息的相关描述,在此,本申请实施例不再进行赘述。
可选地,在获取图像数据流时,可以接收拍摄装置的光学成像硬件模块发送的图像数据流;也可以接收拍摄装置的其它影像处理模块发送的图像数据流;具体可以根据实际需要进行设置,在此,对于图像数据流的获取方式,本申请实施例不做具体限制。可选地,可结合上述图4所示,在通过拍摄装置获取拍摄画面的目标图像时,可以先通过光学成像硬件模块在采集拍摄画面的图像数据时,会一并生成图像数据对应 的图像基本信息;并将图像数据和其对应的图像基本信息携带在同一个图像数据流中。
结合上述描述,当同一个图像数据流同时携带图像数据和其对应的图像基本信息时,即图像数据流中同时包括图像数据和其对应的图像基本信息,可选地,图像数据流的封装格式中,图像数据流对应的图像基本信息在图像数据流中,位于图像数据之前;当然,也可以位于图像数据之后,具体可以根据实际需要进行设置,在此,本申请实施例只是以图像基本信息在图像数据流中,位于图像数据之前为例进行说明,但并不代表本申请实施例仅局限于此。
在获取到图像数据后,就可以根据图像处理指令对图像数据进行处理,此外,鉴于根据图像处理指令对图像数据进行处理后,通常会引起图像基本信息发生变化,因此,在根据图像处理指令对图像数据进行处理后,可以确定或生成变化后的目标图像基本信息,以对图像数据流进行更新,得到目标图像,即执行下述S20:
S20、根据图像处理指令对图像数据进行处理,确定或生成目标图像基本信息,以对图像数据流进行更新,得到目标图像。
可选地,图像处理指令可以为基于图像处理需求生成的图像处理指令,也可以为预设的图像处理指令,具体可以根据实际需要进行设置,在此,本申请实施例不做具体限制。
可选地,根据图像处理指令对图像数据流进行处理可以包括但不限于:RAW域降噪处理、暗光增强处理、以及HDR合成处理等,具体可以根据实际需要进行设置,其相关描述可参见上述S2中图像处理指令的相关描述,在此,本申请实施例不再进行赘述。
可选地,根据图像处理指令对图像数据进行处理时,可以先根据图像处理指令对图像数据进行处理,得到目标图像数据,确定或生成对应的目标图像处理信息;再基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,以根据目标图像数据流得到目标图像。
可选地,根据图像处理指令对图像数据进行处理,得到目标图像数据,且确定或生成对应的目标图像处理信息时,可以先得到目标图像数据,再确定或生成对应的目标图像处理信息;也可以先确定或生成对应的目标图像处理信息,再得到目标图像数据;也可以同时得到目标图像数据,且确定或生成对应的目标图像处理信息,具体可以根据实际需要进行设置,在此,对于得到目标图像数据,和确定或生成对应的目标图像处理信息这两个操作的执行顺序,本申请实施例不做具体限制。
可选地,基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新时,可选地,可以采用目标图像基本信息更新图像数据流中的图像基本信息、采用目标图像数据更新图像数据流中的图像数据,并将图像处理信息添加在图像数据流中,以对图像数据流进行更新,从而得到更新后的目标图像数据流。
可选地,采用目标图像基本信息更新图像数据流中的图像基本信息时,可以直接采用目标图像基本信息替换图像基本信息,也可以采用目标图像基本信息仅更新图像基本信息中与目标图像基本信息不同的部分,相同部分可不更新,具体可以根据实际需要进行设置,在此,本申请实施例不做具体限制。可以理解的是,采用目标图像数据更新图像数据时,其对应的更新方法与图像基本信息的更新方法类似,可参见图像 基本信息的更新方法的相关描述,在此,本申请实施例不再进行赘述。
上述基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新之后,得到的目标图像数据流中可以包括目标图像基本信息、目标图像数据和图像处理信息。当目标图像数据流中包括目标图像基本信息、目标图像数据和图像处理信息时,可选地,目标图像数据流的封装格式中,图像处理信息在目标图像数据流中,位于目标图像基本信息和目标图像数据之间;当然,图像处理信息也可以位于目标图像基本信息和目标图像数据之后,或者其它的封装格式,在此,本申请实施例只是以图像处理信息位于目标图像基本信息和目标图像数据之间的这种封装格式为例进行描述,但并不代表本申请实施例仅局限于此。
在该种场景下,在根据图像处理指令对图像数据进行处理后,若确定本次处理使得图像数据流中的图像基本信息发生变化,则基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,这样通过目标图像数据流一并携带目标图像基本信息、目标图像数据和图像处理信息,无需再通过其它方式额外获取目标图像基本信息和图像处理信息,使得基于目标图像数据流可以快速地获取目标图像,从而有效地提高了目标图像的获取效率。
可以看出,本申请实施例中,在获取目标图像时,可以先获取图像数据流,图像数据流包括图像数据以及图像基本信息;并根据图像处理指令对图像数据进行处理,确定或生成目标图像基本信息,以对图像数据流进行更新,得到目标图像,这样通过图像数据流一并携带图像数据和图像基本信息,并通过目标图像基本信息对图像数据流进行更新,无需再通过其它方式额外获取目标图像基本信息,使得基于更新后的图像数据流可以快速地获取目标图像,从而有效地提高了目标图像的获取效率。
实施例六
基于上述图7所示的实施例,本申请实施例中,可选地,图像数据流中除了包括图像数据和其对应的图像基本信息之外,还可以包括成像信息,这样通过在图像数据流中携带图像数据、图像基本信息和成像信息,使得通过一个图像数据流,可以一并获取到图像数据、图像基本信息和成像信息,无需再通过其它方式额外获取图像基本信息和成像信息,从而提高了图像基本信息和成像信息的获取效率。
结合上述描述,当同一个图像数据流同时携带图像数据、图像基本信息和成像信息时,可选地,图像数据流的封装格式中,成像信息和图像处理信息在图像数据流中,均位于图像基本信息和图像数据之间,且成像信息位于图像处理信息之前;当然,也可以为其它的封装格式,具体可以根据实际需要进行设置,在此,本申请实施例只是以这种封装格式为例进行说明,但并不代表本申请实施例仅局限于此。
可选地,本申请实施例中,成像信息可以包括成像信息的长度、成像设备的快门时间、成像设备的感光度、成像设备的光圈、成像设备的焦距、成像设备的陀螺仪信息、成像设备的加速度、成像设备的地理位置信息、或者成像设备的影像旋转角度信息中的至少一种,具体可以根据实际需要进行设置,其相关描述可参见上述实施例三中成像信息的相关描述,在此,本申请实施例不再进行赘述。
结合上述描述,当图像数据流中包括图像数据、其对应的图像基本信息和成像信息时,为了便于理解在上述S20中,如何基于所述目标图像基本信息、所述目标图像 数据和所述图像处理信息对所述图像数据流进行更新,下面,将通过下述图8所示的实施例进行详细描述。
图8为本申请实施例提供的另一种图像处理方法的流程示意图，该图像处理方法同样可以由软件和/或硬件装置执行，可选地，请参见图8所示，该图像处理方法可以包括：
S801、判断处理是否会使成像信息发生变化。
鉴于通常情况下，根据图像处理指令对图像数据进行处理之后，可能会引起成像信息发生变化，因此，在根据图像处理指令对图像数据进行处理后，可以判断本次处理是否会使成像信息发生变化，以确定是否需要对图像数据流进行更新。若确定成像信息发生变化，则执行下述S802；和/或，若确定成像信息未发生变化，则执行下述S803：
S802、若确定成像信息发生变化,则确定或生成目标成像信息,并基于目标图像基本信息、目标成像信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流。
当确定成像信息发生变化,则需要确定更新后的目标成像信息,鉴于图像基本信息和成像信息均发生了变化,因此,需要基于变化后的目标图像基本信息和目标成像信息,对图像数据流进行更新,此外,鉴于根据图像处理指令对图像数据进行处理之后,图像数据必然会发生变化,则同样需要确定或生成变化后的目标图像数据,并基于目标图像数据对图像数据流进行更新;并且,鉴于本次处理会确定或生成对应的图像处理信息,也同样需要基于图像处理信息对图像数据流进行更新。
上述基于目标图像基本信息、目标成像信息、目标图像数据和图像处理信息对图像数据流进行更新时,可选地,可以采用目标图像基本信息更新图像数据流中的图像基本信息、采用目标成像信息更新图像数据流中的成像信息、采用目标图像数据更新图像数据流中的图像数据,并将图像处理信息添加在图像数据流中,以对图像数据流进行更新,从而得到更新后的目标图像数据流。
可选地,采用目标成像信息更新图像数据流中的成像信息时,可以直接采用目标成像信息替换成像信息,也可以采用目标成像信息仅更新成像信息中与目标成像信息不同的部分,相同部分可不更新,具体可以根据实际需要进行设置,在此,本申请实施例不做具体限制。至于如何采用目标图像基本信息更新图像数据流中的图像基本信息、以及如何采用目标图像数据更新图像数据流中的图像数据,可参见上述图5所示的实施例的相关描述,在此,本申请实施例不再进行赘述。
上述基于目标图像基本信息、目标成像信息、目标图像数据和图像处理信息对图像数据流进行更新,得到的目标图像数据流中可以包括目标图像基本信息、目标成像信息、目标图像数据和图像处理信息。当目标图像数据流中包括目标图像基本信息、目标成像信息、目标图像数据和图像处理信息时,可选地,目标图像数据流的封装格式中,目标成像信息和图像处理信息在目标图像数据流中,均位于目标图像基本信息和目标图像数据之间,且目标成像信息位于图像处理信息之前,即依次为:目标图像基本信息、目标成像信息、图像处理信息以及目标图像数据。当然,也可以为其它的封装格式,具体可以根据实际需要进行设置,在此,本申请实施例只是以这种封装格 式为例进行说明,但并不代表本申请实施例仅局限于此。
在该种情况下,在根据图像处理指令对图像数据进行处理后,若确定本次处理使得图像数据流中的图像基本信息和成像信息均发生变化,则基于目标图像基本信息、目标成像信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,这样通过目标图像数据流一并携带目标图像基本信息、目标成像信息、目标图像数据和图像处理信息,无需再通过其它方式额外获取目标图像基本信息、目标成像信息和图像处理信息,使得基于目标图像数据流可以快速地获取目标图像,从而有效地提高了目标图像的获取效率。
S803、若确定成像信息未发生变化,则基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流。
当确定成像信息未发生变化,则无需更新图像数据流中的成像信息,但鉴于图像基本信息发生了变化,则需要基于变化后的目标图像基本信息对图像数据流进行更新,此外,鉴于根据图像处理指令对图像数据进行处理之后,图像数据必然会发生变化,则同样需要确定或生成变化后的目标图像数据,并基于目标图像数据对图像数据流进行更新;并且,鉴于本次处理会确定或生成对应的图像处理信息,也同样需要基于图像处理信息对图像数据流进行更新。
上述基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新时,可选地,可以采用目标图像基本信息更新图像数据流中的图像基本信息、采用目标图像数据更新图像数据流中的图像数据,并将图像处理信息添加在图像数据流中,以对图像数据流进行更新,从而得到更新后的目标图像数据流。
需要说明的是,在S803中,关于如何采用目标图像基本信息更新图像数据流中的图像基本信息、以及如何采用目标图像数据更新图像数据流中的图像数据,可参见上述图5所示的实施例的相关描述,在此,本申请实施例不再进行赘述。
上述基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新,得到的目标图像数据流中可以包括目标图像基本信息、成像信息、目标图像数据和图像处理信息。当目标图像数据流中包括目标图像基本信息、成像信息、目标图像数据和图像处理信息时,可选地,目标图像数据流的封装格式中,成像信息和图像处理信息在目标图像数据流中,均位于目标图像基本信息和目标图像数据之间,且成像信息位于图像处理信息之前,即依次为:目标图像基本信息、成像信息、图像处理信息以及目标图像数据。当然,也可以为其它的封装格式,具体可以根据实际需要进行设置,在此,本申请实施例只是以这种封装格式为例进行说明,但并不代表本申请实施例仅局限于此。
在该种情况下,在根据图像处理指令对图像数据进行处理后,若确定本次处理使得图像数据流中的图像基本信息发生变化,成像信息未发生变化,则基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,这样通过目标图像数据流一并携带目标图像基本信息、成像信息、目标图像数据和图像处理信息,无需再通过其它方式额外获取目标图像基本信息、成像信息和图像处理信息,使得基于目标图像数据流可以快速地获取目标图像,从而有效地提高了目标图像的获取效率。
实施例七
基于上述图7或图8所示的实施例,可选地,在本申请实施例中,图像数据流中除了包括图像数据和其对应的图像基本信息之外,还可以包括图像数据对应的原始图像处理信息,这样通过在图像数据流中携带图像数据、图像基本信息和原始图像处理信息,使得通过一个图像数据流,可以一并获取到图像数据、图像基本信息和原始图像处理信息,无需再通过其它方式额外获取图像基本信息和原始图像处理信息,从而提高了图像基本信息和原始图像处理信息的获取效率。可选地,原始图像处理信息可以理解为图像数据之前经过的图像处理的相关信息。
结合上述描述,当同一个图像数据流同时携带图像数据、图像基本信息、以及原始图像处理信息时,可选地,图像数据流的封装格式中,原始图像处理信息在图像数据流中,位于图像基本信息和图像数据之间;当然,原始图像处理信息也可以位于图像基本信息和图像数据之后,或者其它的封装格式,在此,本申请实施例只是以原始图像处理信息位于图像基本信息和图像数据之间的这种封装格式为例进行描述,但并不代表本申请实施例仅局限于此。
当图像数据流包括原始图像处理信息时,在根据图像处理指令对图像数据进行处理,确定或生成本次处理操作对应的图像处理信息后,可以将本次处理操作对应的图像处理信息添加在图像数据流中的原始图像处理信息中,以对图像数据流进行更新,从而得到更新后的目标图像数据流,这样目标图像数据流中总会包括最新的图像处理信息,使得后续可以根据该图像处理信息,获知本次处理的相关操作,为后续的图像数据处理提供查询依据。
可选地，原始图像处理信息中还可以包括处理可逆标识。当处理可逆标识指示为可逆处理时，说明后续可以基于更新后的目标图像数据和原始图像处理信息恢复出处理之前的图像数据，在该种情况下，为了避免因存储图像数据占用较多的内存，因此，在将目标图像数据添加在图像数据流中后，可以从更新后的目标图像数据流中剔除图像数据，这样可以避免因存储图像数据占用较多的内存，降低了内存的占用。和/或，当处理可逆标识指示当前处理为不可逆处理时，说明后续无法基于更新后的目标图像数据和原始图像处理信息恢复出处理之前的图像数据，则需要在更新后的目标图像数据流中携带处理之前的图像数据，以保证后续可以准确地查找到该图像数据。
实施例八
基于上述任一实施例，在实际计算摄像场景中，假设图像数据流中包括图像数据和图像基本信息，但并不确定是否包括成像信息和原始图像处理信息，则对图像数据进行处理时，可选地，可参见图9所示，图9为本申请实施例提供的又一种图像处理方法的流程示意图，在获取到图像数据流后，可以先解析图像数据流中的图像基本信息，并在解析得到图像基本信息后，再判断图像数据流中是否包括成像信息的标识；若确定包括成像信息的标识，则解析成像信息，并继续判断图像数据流中是否包括原始图像处理信息的标识；和/或，若确定不包括成像信息的标识，则继续判断图像数据流中是否包括原始图像处理信息的标识；若确定包括原始图像处理信息的标识，则解析原始图像处理信息，并对图像数据进行处理；若确定不包括原始图像处理信息的标识，则对图像数据进行处理。
在对图像数据进行处理后,可以判断本次处理是否使得图像基本信息发生变化,若确定图像基本信息发生变化,则将变化后的目标图像基本信息写入图像数据流中,并继续判断本次处理是否使得成像信息发生变化;若确定图像基本信息未发生变化,则继续判断本次处理是否使得成像信息发生变化;若确定成像信息发生变化,则将变化后的目标成像信息写入图像数据流中,并将本次处理对应生成的图像处理信息写入图像数据流中;若确定成像信息未发生变化,则将本次处理对应生成的图像处理信息写入图像数据流中;再将本次处理得到的目标图像数据写入到图像数据流中,得到更新后的目标图像数据流后,并输出目标图像数据流。
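The parsing and update flow of this embodiment (see Fig. 9) can be summarized by the following sketch. The processing step itself is passed in as a callable; its name and signature are assumptions made for the example, and the point of the sketch is the order of the checks, which mirrors the text above.

```python
from typing import Callable, Optional, Tuple

# A processing step returns: (target image data, processing info for this step,
# new basic info or None if unchanged, new imaging info or None if unchanged).
ProcessFn = Callable[[bytes, dict, Optional[dict]],
                     Tuple[bytes, dict, Optional[dict], Optional[dict]]]

def handle_stream(stream: dict, process: ProcessFn) -> dict:
    """Parse, process and update one image data stream, following the order of
    checks shown in Fig. 9."""
    basic_info = stream["basic_info"]              # always parsed first
    imaging_info = stream.get("imaging_info")      # parsed only if its identifier is present
    history = stream.get("processing_info", [])    # original processing info, if carried

    target_data, step_info, new_basic, new_imaging = process(
        stream["image_data"], basic_info, imaging_info)

    if new_basic is not None:      # this processing changed the basic image information
        stream["basic_info"] = new_basic
    if new_imaging is not None:    # this processing changed the imaging information
        stream["imaging_info"] = new_imaging
    history.append(step_info)      # always record the processing information for this step
    stream["processing_info"] = history
    stream["image_data"] = target_data
    return stream
```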
实施例九
图10是本申请实施例提供的一种图像处理装置100的结构示意图,可选地,请参见图10所示,该图像处理装置100可以包括:
获取单元1001,用于获取图像数据流。
处理单元1002,用于根据图像处理指令对图像数据流进行处理,确定或生成对应的图像处理信息,得到目标图像。
可选地,图像数据流中包括图像数据。
处理单元1002,具体用于根据图像处理指令对图像数据进行处理,得到目标图像数据,确定或生成对应的图像处理信息。
可选地,图像数据流中包括图像基本信息。
处理单元1002,具体用于判断处理是否会使图像基本信息发生变化;若确定图像基本信息发生变化,则确定或生成目标图像基本信息,基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,以根据目标图像数据流得到目标图像;和/或,若确定图像基本信息未发生变化,则基于目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,以根据目标图像数据流得到目标图像。
可选地,图像处理信息包括以下至少一种特征:
图像处理信息在目标图像数据流中,位于目标图像基本信息和目标图像数据之间。
图像处理信息在目标图像数据流中,位于图像基本信息和目标图像数据之间。
可选地,图像数据流还包括成像信息的标识。
处理单元1002,具体用于判断处理是否会使成像信息发生变化;若确定成像信息发生变化,则确定或生成目标成像信息,并基于目标图像基本信息、目标成像信息、目标图像数据和图像处理信息对图像数据流进行更新;和/或,若确定成像信息未发生变化,则基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新。
可选地,目标成像信息和图像处理信息在目标图像数据流中,位于目标图像基本信息和目标图像数据之间,目标成像信息位于图像处理信息之前。
成像信息和图像处理信息在目标图像数据流中,均位于目标图像基本信息和目标图像数据之间,且成像信息位于图像处理信息之前。
可选地,图像处理装置100还包括添加单元1003。
添加单元1003,用于若图像数据流包括原始图像处理信息,则将图像处理信息添 加在图像数据流中的原始图像处理信息中。
可选地,原始图像处理信息包括处理可逆标识,装置还包括剔除单元1004。
剔除单元1004,用于若处理可逆标识指示为可逆处理,则从目标图像数据流中剔除图像数据。
可选地,图像基本信息包括下述至少一种:
图像基本信息的长度、图像数据的类型标识、图像数据的长度、图像数据的宽度、图像数据的色彩空间、图像数据的位宽、或者图像数据的存储方式。
可选地,图像处理信息包括下述至少一种:
图像处理信息的长度、处理标识、处理可逆标识、处理描述类型信息、处理前数据保存标识、或者图像数据。
本申请实施例所示的图像处理装置100,可以执行上述实施例中图像处理方法的技术方案,其实现原理以及有益效果与图像处理方法的实现原理及有益效果类似,可参见图像处理方法的实现原理及有益效果,此处不再进行赘述。
实施例十
图11是本申请实施例提供的另一种图像处理装置110的结构示意图,可选地,请参见图11所示,该图像处理装置110可以包括:
获取单元1101,用于获取图像数据流,图像数据流包括图像数据以及图像基本信息。
处理单元1102,用于根据图像处理指令对图像数据进行处理,确定或生成目标图像基本信息,以对图像数据流进行更新,得到目标图像。
可选地,处理单元1102,具体用于根据图像处理指令对图像数据进行处理,得到目标图像数据,确定或生成对应的图像处理信息;基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新,得到目标图像数据流,以根据目标图像数据流得到目标图像。
可选地,若图像数据流还包括成像信息的标识。
处理单元1102,具体用于判断处理是否会使成像信息发生变化;若确定成像信息发生变化,则确定或生成目标成像信息,并基于目标图像基本信息、目标成像信息、目标图像数据和图像处理信息对图像数据流进行更新;和/或,若确定成像信息未发生变化,则基于目标图像基本信息、目标图像数据和图像处理信息对图像数据流进行更新。
可选地,目标成像信息和图像处理信息在目标图像数据流中,位于目标图像基本信息和目标图像数据之间,目标成像信息位于图像处理信息之前。
成像信息和图像处理信息在目标图像数据流中,均位于目标图像基本信息和目标图像数据之间,且成像信息位于图像处理信息之前。
可选地,成像信息包括下述至少一种:
成像信息的长度、成像设备的快门时间、成像设备的感光度、成像设备的光圈、成像设备的焦距、成像设备的陀螺仪信息、成像设备的加速度、成像设备的地理位置信息、或者成像设备的影像旋转角度信息。
本申请实施例所示的图像处理装置110,可以执行上述实施例中图像处理方法的 技术方案,其实现原理以及有益效果与图像处理方法的实现原理及有益效果类似,可参见图像处理方法的实现原理及有益效果,此处不再进行赘述。
本申请实施例还提供一种智能终端,智能终端包括存储器、处理器,存储器上存储有图像数据的处理程序,图像数据的处理程序被处理器执行时实现上述任一实施例中的图像处理方法的步骤。
本申请实施例还提供一种计算机可读存储介质,计算机可读存储介质上存储有图像数据的处理程序,图像数据的处理程序被处理器执行时实现上述任一实施例中的图像处理方法的步骤。
在本申请实施例提供的智能终端和计算机可读存储介质的实施例中,包含了上述图像处理方法各实施例的全部技术特征,说明书拓展和解释内容与上述方法的各实施例基本相同,在此不做再赘述。
本申请实施例还提供一种计算机程序产品,计算机程序产品包括计算机程序代码,当计算机程序代码在计算机上运行时,使得计算机执行如上各种可能的实施方式中的方法。
本申请实施例还提供一种芯片,包括存储器和处理器,存储器用于存储计算机程序,处理器用于从存储器中调用并运行计算机程序,使得安装有芯片的设备执行如上各种可能的实施方式中的方法。
可以理解,上述场景仅是作为示例,并不构成对于本申请实施例提供的技术方案的应用场景的限定,本申请的技术方案还可应用于其他场景。例如,本领域普通技术人员可知,随着系统架构的演变和新业务场景的出现,本申请实施例提供的技术方案对于类似的技术问题,同样适用。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
本申请实施例方法中的步骤可以根据实际需要进行顺序调整、合并和删减。
本申请实施例设备中的单元可以根据实际需要进行合并、划分和删减。
在本申请中,对于相同或相似的术语概念、技术方案和/或应用场景描述,一般只在第一次出现时进行详细描述,后面再重复出现时,为了简洁,一般未再重复阐述,在理解本申请技术方案等内容时,对于在后未详细描述的相同或相似的术语概念、技术方案和/或应用场景描述等,可以参考其之前的相关详细描述。
在本申请中,对各个实施例的描述都各有侧重,某个实施例中没有详述或记载的部分,可以参见其它实施例的相关描述。
本申请技术方案的各技术特征可以进行任意的组合,为使描述简洁,未对上述实施例中的各个技术特征所有可能的组合都进行描述,然而,只要这些技术特征的组合不存在矛盾,都应当认为是本申请记载的范围。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在如上的一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端设 备(可以是手机,计算机,服务器,被控终端,或者网络设备等)执行本申请每个实施例的方法。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行计算机程序指令时,全部或部分地产生按照本申请实施例的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络,或者其他可编程装置。计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。可用介质可以是磁性介质,(例如,软盘、存储盘、磁带)、光介质(例如,DVD),或者半导体介质(例如固态存储盘Solid State Disk(SSD))等。
以上仅为本申请的优选实施例,并非因此限制本申请的专利范围,凡是利用本申请说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本申请的专利保护范围内。

Claims (17)

  1. 一种图像处理方法,其特征在于,包括:
    S1:获取图像数据流;
    S2:根据图像处理指令对所述图像数据流进行处理,确定或生成对应的图像处理信息,得到目标图像。
  2. 根据权利要求1所述的方法,其特征在于,所述图像数据流中包括图像数据,所述S2包括:
    根据图像处理指令对所述图像数据进行处理,得到目标图像数据,确定或生成对应的所述图像处理信息。
  3. 根据权利要求2所述的方法,其特征在于,所述图像数据流中包括图像基本信息,所述S2包括:
    判断所述处理是否会使所述图像基本信息发生变化;
    若确定所述图像基本信息发生变化,则确定或生成目标图像基本信息,基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新,得到目标图像数据流,以根据所述目标图像数据流得到所述目标图像;和/或,
    若确定所述图像基本信息未发生变化,则基于所述目标图像数据和所述图像处理信息对所述图像数据流进行更新,得到目标图像数据流,以根据所述目标图像数据流得到所述目标图像。
  4. 根据权利要求3所述的方法,其特征在于,所述图像处理信息包括以下至少一种特征:
    所述图像处理信息在所述目标图像数据流中,位于所述目标图像基本信息和所述目标图像数据之间;
    所述图像处理信息在所述目标图像数据流中,位于所述图像基本信息和所述目标图像数据之间。
  5. 根据权利要求3或4所述的方法,其特征在于,所述图像数据流还包括成像信息的标识;其中,所述基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新,包括:
    判断所述处理是否会使所述成像信息发生变化;
    若确定所述成像信息发生变化,则确定或生成目标成像信息,并基于所述目标图像基本信息、所述目标成像信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新;和/或,
    若确定所述成像信息未发生变化,则基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新。
  6. 根据权利要求5所述的方法,其特征在于,
    所述目标成像信息和所述图像处理信息在所述目标图像数据流中,位于所述目标图像基本信息和所述目标图像数据之间,所述目标成像信息位于所述图像处理信息之前;
    所述成像信息和所述图像处理信息在所述目标图像数据流中,均位于所述目标图像基本信息和所述目标图像数据之间,且所述成像信息位于所述图像处理信息之前。
  7. 根据权利要求1至4中任一项所述的方法,其特征在于,所述方法还包括:
    若所述图像数据流包括原始图像处理信息,则将所述图像处理信息添加在所述图像数据流中的原始图像处理信息中。
  8. 根据权利要求7所述的方法,其特征在于,所述原始图像处理信息包括处理可逆标识,所述方法还包括:
    若所述处理可逆标识指示为可逆处理,则从所述目标图像数据流中剔除所述图像数据。
  9. 根据权利要求2至4中任一项所述的方法,其特征在于,所述图像基本信息包括下述至少一种:
    所述图像基本信息的长度、所述图像数据的类型标识、所述图像数据的长度、所述图像数据的宽度、所述图像数据的色彩空间、所述图像数据的位宽、或者所述图像数据的存储方式。
  10. 根据权利要求1至4中任一项所述的方法,其特征在于,所述图像处理信息包括下述至少一种:
    所述图像处理信息的长度、处理标识、处理可逆标识、处理描述类型信息、处理前数据保存标识、或者所述图像数据。
  11. 一种图像处理方法,其特征在于,包括:
    S10、获取图像数据流,所述图像数据流包括图像数据以及图像基本信息;
    S20、根据图像处理指令对所述图像数据进行处理,确定或生成目标图像基本信息,以对所述图像数据流进行更新,得到目标图像。
  12. 根据权利要求11所述的方法，其特征在于，所述S20包括：
    根据图像处理指令对所述图像数据进行处理,得到目标图像数据,确定或生成对应的图像处理信息;
    基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新,得到目标图像数据流,以根据所述目标图像数据流得到所述目标图像。
  13. 根据权利要求12所述的方法,其特征在于,若所述图像数据流还包括成像信 息的标识;其中,所述基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新,包括:
    判断所述处理是否会使所述成像信息发生变化;
    若确定所述成像信息发生变化,则确定或生成目标成像信息,并基于所述目标图像基本信息、所述目标成像信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新;和/或,
    若确定所述成像信息未发生变化,则基于所述目标图像基本信息、所述目标图像数据和所述图像处理信息对所述图像数据流进行更新。
  14. 根据权利要求13所述的方法,其特征在于,
    所述目标成像信息和所述图像处理信息在所述目标图像数据流中,位于所述目标图像基本信息和所述目标图像数据之间,所述目标成像信息位于所述图像处理信息之前;
    所述成像信息和所述图像处理信息在所述目标图像数据流中,均位于所述目标图像基本信息和所述目标图像数据之间,且所述成像信息位于所述图像处理信息之前。
  15. 根据权利要求13所述的方法,其特征在于,所述成像信息包括下述至少一种:
    所述成像信息的长度、成像设备的快门时间、所述成像设备的感光度、所述成像设备的光圈、所述成像设备的焦距、所述成像设备的陀螺仪信息、所述成像设备的加速度、所述成像设备的地理位置信息、或者所述成像设备的影像旋转角度信息。
  16. 一种智能终端,其特征在于,所述智能终端包括:存储器、处理器,其中,所述存储器上存储有图像处理程序,所述图像处理程序被所述处理器执行时实现如权利要求1至15中任一项所述的图像处理方法的步骤。
  17. 一种可读存储介质,其特征在于,所述可读存储介质上存储有计算机程序,所述计算机程序被处理器执行时实现如权利要求1至15中任一项所述的图像处理方法的步骤。
PCT/CN2021/122430 2021-09-30 2021-09-30 图像处理方法、智能终端及存储介质 WO2023050413A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202180102485.3A CN118104247A (zh) 2021-09-30 2021-09-30 图像处理方法、智能终端及存储介质
PCT/CN2021/122430 WO2023050413A1 (zh) 2021-09-30 2021-09-30 图像处理方法、智能终端及存储介质
US18/614,714 US20240242411A1 (en) 2021-09-30 2024-03-24 Image processing method, intelligent terminal, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/122430 WO2023050413A1 (zh) 2021-09-30 2021-09-30 图像处理方法、智能终端及存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/614,714 Continuation US20240242411A1 (en) 2021-09-30 2024-03-24 Image processing method, intelligent terminal, and storage medium

Publications (1)

Publication Number Publication Date
WO2023050413A1 true WO2023050413A1 (zh) 2023-04-06

Family

ID=85781207

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/122430 WO2023050413A1 (zh) 2021-09-30 2021-09-30 图像处理方法、智能终端及存储介质

Country Status (3)

Country Link
US (1) US20240242411A1 (zh)
CN (1) CN118104247A (zh)
WO (1) WO2023050413A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1728801A (zh) * 2004-07-28 2006-02-01 奥林巴斯株式会社 数字照相机及图像数据记录方法
CN1929535A (zh) * 2005-09-07 2007-03-14 索尼株式会社 成像装置、图像处理装置、图像处理方法及计算机程序
US20130201349A1 (en) * 2012-02-02 2013-08-08 Apple Inc. Digital camera raw image support
CN105979235A (zh) * 2016-05-30 2016-09-28 努比亚技术有限公司 一种图像处理方法及终端
CN107277351A (zh) * 2017-06-30 2017-10-20 维沃移动通信有限公司 一种图像数据的处理方法和移动终端
CN110062159A (zh) * 2019-04-09 2019-07-26 Oppo广东移动通信有限公司 基于多帧图像的图像处理方法、装置、电子设备


Also Published As

Publication number Publication date
CN118104247A (zh) 2024-05-28
US20240242411A1 (en) 2024-07-18

Similar Documents

Publication Publication Date Title
WO2022166765A1 (zh) 图像处理方法、移动终端及存储介质
CN113179370B (zh) 拍摄方法、移动终端及可读存储介质
CN113179369B (zh) 拍摄画面的显示方法、移动终端及存储介质
CN109739602A (zh) 一种移动终端壁纸设置方法及装置、移动终端及存储介质
CN107230065B (zh) 一种二维码显示方法、设备及计算机可读存储介质
WO2023005060A1 (zh) 拍摄方法、移动终端及存储介质
WO2022266907A1 (zh) 处理方法、终端设备及存储介质
CN107896304B (zh) 一种图像拍摄方法、装置及计算机可读存储介质
CN113301253B (zh) 一种天文图像的辅助拍摄方法、移动终端及存储介质
WO2024001853A1 (zh) 处理方法、智能终端及存储介质
CN113347372A (zh) 拍摄补光方法、移动终端及可读存储介质
CN112423211A (zh) 一种多音频传输控制方法、设备及计算机可读存储介质
CN109684020B (zh) 一种主题切换方法、设备及计算机可读存储介质
WO2022262259A1 (zh) 一种图像处理方法、装置、设备、介质和芯片
WO2023108444A1 (zh) 图像处理方法、智能终端及存储介质
WO2023050413A1 (zh) 图像处理方法、智能终端及存储介质
WO2023284218A1 (zh) 摄像控制方法、移动终端及存储介质
CN112822548B (zh) 一种投屏显示方法及装置、移动终端、存储介质
CN114143467A (zh) 一种基于自动对焦变焦的拍摄方法、移动终端及存储介质
CN112532786B (zh) 图像显示方法、终端设备和存储介质
CN107566745B (zh) 一种拍摄方法、终端和计算机可读存储介质
CN108335301B (zh) 一种拍照方法及移动终端
WO2023108443A1 (zh) 图像处理方法、智能终端及存储介质
WO2022133967A1 (zh) 拍摄的方法、终端及计算机存储介质
CN220673847U (zh) 摄像装置及智能终端

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21958981

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180102485.3

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE