CN114630153B - Parameter transmission method and device for application processor and storage medium - Google Patents

Parameter transmission method and device for application processor and storage medium

Info

Publication number
CN114630153B
Authority
CN
China
Prior art keywords
image
parameter
processor
packet
parameters
Prior art date
Legal status
Active
Application number
CN202011472314.3A
Other languages
Chinese (zh)
Other versions
CN114630153A (en)
Inventor
刘君
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202011472314.3A
Publication of CN114630153A
Application granted
Publication of CN114630153B


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/258 Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N21/25808 Management of client data

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Computer Graphics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses a parameter transmission method and device for an application processor, and a storage medium. The method comprises the following steps: receiving an image statistics parameter packet sent by an image processor, wherein the image statistics parameter packet is generated based on the image statistics parameters of at least one image block in the image obtained by the image processor; if the number of parameter packets of a specified type of image statistical parameters in the received image statistics parameter packets reaches a preset threshold, receiving index information corresponding to the specified type of image statistical parameters sent by the image processor; and acquiring display parameters of the image based on the index information and the received parameter packets of the specified type of image statistical parameters. By adopting the embodiments of the application, the reliability of parameter transmission of the image processor can be ensured, and real-time transmission of image statistical information can be realized.

Description

Parameter transmission method and device for application processor and storage medium
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to a method and apparatus for transmitting parameters of an application processor, and a storage medium.
Background
Along with the wide popularization and application of electronic devices (such as mobile phones, tablet computers, smart watches and the like), the electronic devices can support more and more applications, have more and more functions, and develop towards diversification and individuation, so that the electronic devices become indispensable electronic articles in the life of users.
In the prior art, 3A image statistics parameters (auto exposure (AE) image statistics parameters, auto focus (AF) image statistics parameters, and auto white balance (AWB) image statistics parameters) are transmitted to the application processor (application processor, AP) only after processing of the current frame is completed, so the image statistics information cannot be received in real time and processed promptly, which affects the 3A convergence performance.
Disclosure of Invention
The embodiment of the application provides a parameter transmission method, a device and a storage medium for an application processor, which can transmit image statistical parameters in real time.
In a first aspect, an embodiment of the present application provides a parameter transmission method for an application processor, where the method includes:
receiving an image statistics parameter packet sent by an image processor, wherein the image statistics parameter packet is generated by the image processor based on the image statistics parameters of at least one image block that the image processor acquires from an image;
if the number of the parameter packets of the specified type of image statistical parameters in the received image statistical parameter packets reaches a preset threshold, receiving index information corresponding to the specified type of image statistical parameters sent by the image processor, wherein the index information is used for representing the corresponding relation between the parameter packets of the specified type of image statistical parameters and the image blocks;
and acquiring display parameters of the image based on the index information and the received parameter packet of the specified type of image statistical parameters.
In a second aspect, embodiments of the present application provide an application processor, where the application processor is configured to perform the following operations:
receiving an image statistics parameter packet sent by an image processor, wherein the image statistics parameter packet is generated by the image processor based on the image statistics parameters of at least one image block that the image processor acquires from an image;
if the number of the parameter packets of the specified type of image statistical parameters in the received image statistical parameter packets reaches a preset threshold, receiving index information corresponding to the specified type of image statistical parameters sent by the image processor, wherein the index information is used for representing the corresponding relation between the parameter packets of the specified type of image statistical parameters and the image blocks;
and acquiring display parameters of the image based on the index information and the received parameter packet of the specified type of image statistical parameters.
In a third aspect, embodiments of the present application provide an electronic device comprising an application processor, an image processor, and a memory, the memory storing one or more programs configured to be executed by the application processor, the one or more programs comprising instructions for performing part or all of the steps described in the first aspect; alternatively, the electronic device comprises the application processor described in the second aspect.
In a fourth aspect, embodiments of the present application provide another electronic device, including an image processor and an application processor, wherein,
the image processor is used for acquiring image statistical parameters of at least one image block in the image;
the image processor is used for generating an image statistical parameter packet based on the image statistical parameter of the at least one image block and sending the image statistical parameter packet to the application processor;
the application processor is used for receiving the image statistical parameter packet sent by the image processor;
the image processor is configured to send index information corresponding to a specified type of image statistics parameter to the application processor if the number of parameter packets of the specified type of image statistics parameter in the sent image statistics parameter packets reaches a predetermined threshold, where the index information is used to characterize a correspondence between the parameter packets of the specified type of image statistics parameter and the image block;
The application processor is used for acquiring the display parameters of the image based on the index information and the received parameter packet of the specified type of image statistical parameters.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program causes a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application.
In a sixth aspect, embodiments of the present application provide a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
By implementing the embodiment of the application, the following beneficial effects are achieved:
it can be seen that, in the parameter transmission method and device for an application processor and the storage medium described in the embodiments of the present application, the image statistics parameters of at least one image block in an image are obtained, an image statistics parameter packet is generated based on the image statistics parameters of the at least one image block and sent to the application processor, and, if the number of parameter packets of the specified type of image statistics parameters among the sent image statistics parameter packets reaches a predetermined threshold, index information corresponding to the specified type of image statistics parameters is sent to the application processor, so that the application processor obtains the display parameters of the image based on the index information and the received parameter packets of the specified type of image statistics parameters, wherein the index information is used for characterizing the correspondence between the parameter packets of the specified type of image statistics parameters and the image blocks. In this way, the reliability of parameter transmission of the image processor can be ensured, and real-time transmission of the image statistical information can be realized.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic software structure of an electronic device according to an embodiment of the present application;
fig. 3A is a flowchart of a parameter transmission method for an image processor according to an embodiment of the present application;
fig. 3B is a schematic structural diagram of a data packet according to an embodiment of the present application;
FIG. 3C is a schematic illustration of a parameter transmission for an image processor provided in an embodiment of the present application;
FIG. 3D is a schematic illustration of another parameter transmission for an image processor provided in an embodiment of the present application;
FIG. 3E is a schematic illustration of an unpacking operation provided by an embodiment of the present application;
FIG. 3F is a schematic flow chart of an unpacking operation provided in an embodiment of the present application;
FIG. 4 is a flowchart of another method for parameter transmission of an application processor according to an embodiment of the present application;
fig. 5 is a flowchart of another parameter transmission method according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of another electronic device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of another electronic device according to an embodiment of the present application;
fig. 8 is a functional unit block diagram of a parameter transmission device according to an embodiment of the present application;
fig. 9 is a block diagram of functional units of another parameter transmission apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
For a better understanding of aspects of embodiments of the present application, related terms and concepts that may be related to embodiments of the present application are described below.
In particular implementations, the electronic device may include various devices with computer functions, such as a handheld device (smart phone, tablet, etc.), a vehicle-mounted device (navigator, auxiliary back-up system, automobile data recorder, automobile refrigerator, etc.), a wearable device (smart bracelet, wireless headset, smart watch, smart glasses, etc.), a computing device or other processing device connected to a wireless modem, and various forms of user equipment (UE), a mobile station (MS), a virtual reality/augmented reality device, a terminal device (terminal device), etc., and the electronic device may also be a base station or a server.
The electronic device may further include an intelligent home device, where the intelligent home device may be at least one of: the intelligent sound box, the intelligent camera, the intelligent electric cooker, the intelligent wheelchair, the intelligent massage chair, the intelligent furniture, the intelligent dish washer, the intelligent television, the intelligent refrigerator, the intelligent electric fan, the intelligent warmer, the intelligent clothes hanger, the intelligent lamp, the intelligent router, the intelligent switch board, the intelligent humidifier, the intelligent air conditioner, the intelligent door, the intelligent window, the intelligent cooking bench, the intelligent disinfection cabinet, the intelligent toilet, the sweeping robot and the like are not limited herein.
In the first part, the software and hardware operation environment of the technical scheme disclosed in the application is introduced as follows.
Fig. 1 shows a schematic structural diagram of an electronic device 100. Electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a compass 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor AP, a modem processor, a graphics processor GPU, an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor NPU, etc. Wherein the different processing units may be separate components or may be integrated in one or more processors. In some embodiments, the electronic device 101 may also include one or more processors 110. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution. In other embodiments, memory may also be provided in the processor 110 for storing instructions and data. Illustratively, the memory in the processor 110 may be a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it may be called directly from memory. This avoids repeated accesses and reduces the latency of the processor 110, thereby improving the efficiency of the electronic device 101 in processing data or executing instructions. The processor may also include an image processor, which may be an image preprocessor (preprocess image signal processor, pre-ISP), which may be understood as a simplified ISP, which may also perform some image processing operations, e.g. may obtain image statistics.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include inter-integrated circuit (inter-integrated circuit, I2C) interfaces, inter-integrated circuit audio (inter-integrated circuit sound, I2S) interfaces, pulse code modulation (pulse code modulation, PCM) interfaces, universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interfaces, mobile industry processor interfaces (mobile industry processor interface, MIPI), general-purpose input/output (GPIO) interfaces, SIM card interfaces, and/or USB interfaces, among others. The USB interface 130 is an interface conforming to the USB standard, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 101, or may be used to transfer data between the electronic device 101 and a peripheral device. The USB interface 130 may also be used to connect headphones through which audio is played.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle times, battery health (leakage, impedance), and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G/6G, etc. applied on the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a mini light-emitting diode (mini-LED), a micro-LED, a micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or more display screens 194.
The electronic device 100 may implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, so that the electrical signal is converted into an image visible to naked eyes. ISP can also perform algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature, etc. of the photographed scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or more cameras 193.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may cause the electronic device 101 to execute the methods of displaying page elements, and various applications, data processing, and the like provided in some embodiments of the present application by executing the above-described instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area can store an operating system; the storage program area may also store one or more applications (such as gallery, contacts, etc.), etc. The storage data area may store data created during use of the electronic device 101 (e.g., photos, contacts, etc.), and so on. In addition, the internal memory 121 may include high-speed random access memory, and may also include nonvolatile memory, such as one or more disk storage units, flash memory units, universal flash memory (universal flash storage, UFS), and the like. In some embodiments, processor 110 may cause electronic device 101 to perform the methods of displaying page elements provided in embodiments of the present application, as well as other applications and data processing, by executing instructions stored in internal memory 121, and/or instructions stored in a memory provided in processor 110. The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may comprise at least two parallel plates with conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the touch operation intensity according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location, but at different touch operation strengths, may correspond to different operation instructions. For example: when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., X, Y and Z axis) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through the reverse motion, so as to realize anti-shake. The gyro sensor 180B may also be used for navigating, somatosensory game scenes.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied in scenarios such as landscape/portrait switching and pedometer applications.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is for detecting temperature. In some embodiments, the electronic device 100 performs a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by temperature sensor 180J exceeds a threshold, electronic device 100 performs a reduction in the performance of a processor located in the vicinity of temperature sensor 180J in order to reduce power consumption to implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid the low temperature causing the electronic device 100 to be abnormally shut down. In other embodiments, when the temperature is below a further threshold, the electronic device 100 performs boosting of the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperatures.
The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
By way of example, fig. 2 shows a block diagram of the software architecture of the electronic device 100. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, Android Runtime and system libraries, and a kernel layer, respectively. The application layer may include a series of application packages.
As shown in fig. 2, the application layer may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. Such as notification manager is used to inform that the download is complete, message alerts, etc. The notification manager may also be a notification in the form of a chart or scroll bar text that appears on the system top status bar, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, a text message is prompted in a status bar, a prompt tone is emitted, the electronic device vibrates, and an indicator light blinks, etc.
Android Runtime includes a core library and a virtual machine. Android Runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used for executing functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
In the second part, the parameter transmission method, the parameter transmission device and the storage medium for the image processor disclosed in the embodiment of the application are described as follows.
Further, based on the structure of fig. 1 or fig. 2, reference is made to fig. 3A, which is a schematic flowchart of a parameter transmission method for an image processor according to an embodiment of the present application. The method is applied to an electronic device such as the one shown in fig. 1. As shown in the figure, the parameter transmission method for an image processor includes:
301. Image statistical parameters of at least one image block in the image are obtained.
The image may be original image data. For a frame of image, the image processor may acquire the original image data pixel by pixel, that is, the image processor scans line by line to acquire the original image data; it may be understood that all pixels of the original image data need to be scanned. In this embodiment of the present application, the area of an image block may be divided according to a preset rule: for example, every 5 rows of pixels may be taken as one image block; for another example, the area (location) of each image block may be planned in advance, and when all pixels in the area corresponding to an image block have been scanned, that image block is obtained. Ultimately, the original image data may be understood as a plurality of image blocks, each of which may contain a portion of the original image data.
Of course, in a specific implementation, the image processor may acquire, during the image acquisition, image statistical parameters of at least one image block in the image, where the image statistical parameters may be at least one of the following: auto exposure (AE) image statistics, auto focus (AF) image statistics, auto white balance (AWB) image statistics, lens shading correction (lens shading correction, LSC) image statistics, flicker (FLK) image statistics, and the like, which are not limited herein. Thus, the type of the image statistical parameters may be at least one of: AE, AF, AWB, LSC, FLK, and the like, which is not limited herein.
In a specific implementation, the original image data may be raw data of one or more frames of images. The image statistics parameter packet may contain part or all of the image statistics of the original image data. The image processor may start acquiring the target image statistics parameters of an image block as soon as that image block has been successfully acquired. For example, when the original image data has been loaded up to the j-th line, an image block is considered to have been acquired, and the image statistics parameters of that image block may then be acquired. For another example, when a preset number of pixels of the original image data have been loaded, an image block is considered to have been acquired, and the corresponding image statistics information may then be acquired; the preset number may be set by the user or by system default. That is, in the embodiment of the present application, the image statistics parameters may be obtained, packaged and transmitted while the original image data is still being loaded.
Wherein each image block may correspond to at least one image statistics packet, or a plurality of image blocks may correspond to at least one image statistics packet. The image statistics package may also include at least one data package, and each data package may include one or more types of image statistics parameters, for example, one data package may include only AE type image statistics parameters, for example, one data package may include AE type and AF type image statistics parameters, and for example, one data package may include AE type, AWB type and AF type image statistics parameters.
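As an illustration of the block-wise statistics acquisition described in step 301, the following C sketch treats every ROWS_PER_BLOCK scanned lines as one image block and accumulates a simple AE-style luminance sum while the raw frame is loaded line by line. The structure names, the block size of 5 rows and the use of a plain luminance sum are illustrative assumptions rather than the patent's exact statistics.

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch (not from the patent): every ROWS_PER_BLOCK scanned
 * lines form one image block, and an AE-style luminance sum is accumulated
 * for the block while the raw frame is being loaded line by line. */
#define ROWS_PER_BLOCK 5u

typedef struct {
    uint32_t block_index;   /* which block of the frame this is          */
    uint64_t luma_sum;      /* running sum of pixel values (AE proxy)    */
    uint32_t pixel_count;   /* pixels accumulated so far                 */
} ae_block_stats_t;

/* Called once per scanned line; returns 1 when the current block is
 * complete, i.e. its statistics can now be packetized and sent. */
static int accumulate_line(ae_block_stats_t *blk,
                           const uint8_t *line, size_t width,
                           uint32_t line_index)
{
    for (size_t x = 0; x < width; ++x) {
        blk->luma_sum += line[x];
    }
    blk->pixel_count += (uint32_t)width;

    /* Block boundary: every ROWS_PER_BLOCK lines form one image block. */
    return ((line_index + 1u) % ROWS_PER_BLOCK) == 0u;
}
```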
302. An image statistical parameter packet is generated based on the image statistical parameters of the at least one image block, and the image statistical parameter packet is sent to an application processor.
In particular implementations, the image processor may generate a corresponding image statistics package based on the image statistics of one or more image blocks and send the image statistics package to the application processor. An image statistics package may be understood as one or more data packages.
Further, the image processor may send the image statistics packet to the application processor using at least one data transmission channel. For example, the image processor may send the image statistics packets to the application processor in turn using a data transmission channel. For another example, the image processor may divide the image statistics packet into a plurality of data sets, each data set corresponding to one or a portion of the image statistics packet, and send the plurality of data sets to the application processor using at least one data transmission channel, where the data transmission channels may correspond to the data sets one to one, and each data transmission channel may correspond to a process or thread.
In a specific implementation, the image processor not only can send the image statistics parameter packet to the application processor, but can also send the original image data to the application processor. For example, the image statistics parameter packet can be loaded in real time into the transmission channel corresponding to the original image data and transmitted to the application processor over MIPI. In a specific implementation, the original image data and the image statistics parameter packet may also be sent to the application processor separately: for example, the image statistics parameter packet may be sent to the application processor first, and the original image data may be sent after the image statistics parameter packet has been sent. In this way, after receiving the image statistics parameter packet, the application processor may perform an unpacking operation on it and call the corresponding algorithm based on the unpacked image statistics parameters; that is, the corresponding algorithm is prepared in advance, before the original image data has been completely sent to the application processor, which helps to improve image processing efficiency.
In one possible example, any parameter packet in the image statistics parameter packet includes: a packet header (PH), effective image statistics parameters, and a packet trailer (PF), where the effective image statistics parameters in each packet may correspond to at least one type of image statistics parameter.
As shown in fig. 3B, the image statistics parameter packet may include a packet header PH, effective image statistics parameters, and a packet trailer PF. The packet header may be used to mark the start position of a data packet; the effective image statistics parameters are a portion of one type of image statistics parameter, and all the effective image statistics parameters in the data packets corresponding to all image blocks of the original image data together form the complete image statistics parameters of the original image data; the packet trailer may be used to mark the end position of the data packet. The length of the effective image statistics parameters in the data packets of each type of image statistics parameter may be the same or different.
In one possible example, the packet header includes a packet header flag, an index packet flag, and a packet data length.
Wherein, the packet header may include: a packet header flag used for indicating the statistics type of the current data packet (image statistics parameter packet), an index packet flag used for indicating whether the current data packet carries statistics data or an independent index, and a packet data length used for indicating the data length of the current data packet. The specific structure is shown in the following table:
Packet header structure      Byte length
Packet header flag           Byte3
Index packet flag            Byte2
Packet data length           Byte1+Byte0
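As a rough sketch of the header layout in the table above, the following C code packs the three header fields into one 32-bit word, taking Byte3 as the most significant byte. The byte ordering and the struct and function names are assumptions for illustration only, not an encoding defined by the patent.

```c
#include <stdint.h>

/* Sketch of the 4-byte packet header described above: Byte3 = packet header
 * flag (statistics type), Byte2 = index packet flag, Byte1+Byte0 = packet
 * data length. Byte3-most-significant ordering is an assumption. */
typedef struct {
    uint8_t  header_flag;   /* statistics type of this packet            */
    uint8_t  index_flag;    /* 0: statistics data, 1: independent index  */
    uint16_t data_length;   /* payload length of this packet             */
} stats_packet_header_t;

static uint32_t encode_header(const stats_packet_header_t *h)
{
    return ((uint32_t)h->header_flag << 24) |
           ((uint32_t)h->index_flag  << 16) |
            (uint32_t)h->data_length;
}

static void decode_header(uint32_t word, stats_packet_header_t *h)
{
    h->header_flag = (uint8_t)(word >> 24);
    h->index_flag  = (uint8_t)(word >> 16);
    h->data_length = (uint16_t)(word & 0xFFFFu);
}
```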
In one possible example, the packet trailer includes: a packet trailer flag, a parameter packet count, and a frame count.
Wherein, the packet trailer may include: a packet trailer flag used for marking the trailer position, a parameter packet count used for representing the count (number) of parameter packets of the current statistics type, and a frame count indicating which frame of original image data the parameter packet comes from. The specific structure is shown in the following table:
Packet trailer structure     Byte length
Packet trailer flag          Byte3
Parameter packet count       Byte2
Frame count                  Byte1+Byte0
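A matching sketch for the packet trailer in the table above, again assuming Byte3 is the most significant byte of a 32-bit word; the names are illustrative only.

```c
#include <stdint.h>

/* Sketch of the 4-byte packet trailer described above: Byte3 = packet
 * trailer flag, Byte2 = parameter packet count for the current statistics
 * type, Byte1+Byte0 = frame count. Byte ordering mirrors the header sketch
 * and is likewise an assumption. */
typedef struct {
    uint8_t  trailer_flag;  /* marks the end of the packet                */
    uint8_t  packet_count;  /* running count of packets of this type      */
    uint16_t frame_count;   /* which frame the statistics come from       */
} stats_packet_trailer_t;

static uint32_t encode_trailer(const stats_packet_trailer_t *t)
{
    return ((uint32_t)t->trailer_flag << 24) |
           ((uint32_t)t->packet_count << 16) |
            (uint32_t)t->frame_count;
}
```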
Optionally, the step 302 of sending the image statistics packet to the application processor may be implemented as follows:
and sending the image statistical parameter packet to the application processor through a preset virtual channel.
The preset virtual channel (virtual channel) may be set by the user or default by the system, and the image processor may send the image statistics parameter packet to the application processor through the preset virtual channel. In a specific implementation, one or more virtual channels may be used to send the image statistics parameter packet to the application processor, where each virtual channel may correspond to a thread or process.
Optionally, the step 302 of sending the image statistics parameter packet to the application processor may include the following steps:
21. acquiring target attribute information corresponding to the image statistics parameter packet;
22. determining a target channel corresponding to the target attribute information according to a mapping relation between preset attribute information and the channel;
23. and transmitting the image statistical parameter packet through the target channel.
In this embodiment of the present application, the attribute information may be at least one of the following: the data type of the data in the parameter packet, the number of data bits (data length) of the data in the parameter packet, the type of the parameter packet (image statistics parameter type), and the like, which is not limited herein. The data type may be at least one of the following: floating point type (single precision, double precision), integer type, and the like, which is not limited herein; the type of the parameter packet may be at least one of the following: AE, AF, AWB, LSC, FLK, and the like, which is not limited herein. As shown in fig. 3C, different types of image statistics parameter packets may correspond to different channels, or different data lengths may correspond to different channels. For example, a mobile industry processor interface (mobile industry processor interface, MIPI) channel may include 3 image data interfaces (image data interface, IDI), namely IDI0, IDI1 and IDI2; image statistics parameters of type k1 may correspond to IDI0, and image statistics parameters of type k2 may correspond to IDI1.
In a specific implementation, a memory of the electronic device may store a mapping relationship between preset attribute information and a channel in advance. Taking the parameter packet i as an example, the parameter packet i is any parameter packet in the image statistics parameter packet, and the image processor can acquire target attribute information corresponding to the parameter packet i, so that a target channel corresponding to the target attribute information can be determined according to a mapping relation between preset attribute information and channels, and the parameter packet i can be transmitted through the target channel, so that a corresponding channel can be allocated according to the attribute of the parameter packet, and the data transmission efficiency can be improved.
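A minimal C sketch of such an attribute-to-channel mapping is given below, keyed only on the image statistics parameter type and returning one of the three IDI channels mentioned above. The concrete type-to-channel assignments are assumptions, not values given by the patent.

```c
#include <stdint.h>

/* Illustrative mapping from a packet's statistics type to a MIPI image
 * data interface (IDI0/IDI1/IDI2), in the spirit of the attribute-to-
 * channel mapping described above. The assignments are assumed. */
enum stat_type { STAT_AE, STAT_AF, STAT_AWB, STAT_LSC, STAT_FLK, STAT_TYPE_MAX };

static const uint8_t k_channel_of_type[STAT_TYPE_MAX] = {
    [STAT_AE]  = 0,   /* IDI0 */
    [STAT_AF]  = 0,   /* IDI0 */
    [STAT_AWB] = 1,   /* IDI1 */
    [STAT_LSC] = 2,   /* IDI2 */
    [STAT_FLK] = 2,   /* IDI2 */
};

/* Look up the target channel for a packet of the given statistics type. */
static uint8_t select_channel(enum stat_type type)
{
    return k_channel_of_type[type];
}
```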
Further, the image processor may send part of the parameter packets of the image statistics parameter packet to the application processor through the preset virtual channel, and send the remaining parameter packets to the application processor through the channel selected according to the attribute information. These two modes may run synchronously or asynchronously; for example, one thread or process may be used to send part of the parameter packets to the application processor through the preset virtual channel, while another thread or process is used to send the remaining parameter packets to the application processor through the channel selected according to the attribute information.
In one possible example, the generating of the image statistics package based on the image statistics of the at least one image block in step 302 may be implemented as follows:
and packaging the image statistical parameters of each image block of the at least one image block to obtain the image statistical parameter package.
In a specific implementation, the image processor packages the image statistics parameters of each image block of the at least one image block to obtain the image statistics parameter packet; that is, as soon as an image block arrives, the image statistics parameters of that image block can be packaged to obtain an image statistics parameter packet.
Further, optionally, the packaging the image statistics parameters of each image block of the at least one image block to obtain the image statistics parameter packet includes:
after the complete image statistical parameters of one image block are obtained, the image statistical parameter packet is generated based on the image statistical parameters of the one image block.
In a specific implementation, after the complete image statistical parameter of any image block is obtained, the image statistical parameter of any image block can be packaged to obtain an image statistical parameter package. The one image block is any one of the at least one image block.
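The following C sketch illustrates packaging the complete statistics of one image block into a single parameter packet framed by the 4-byte header and 4-byte trailer from the tables above. The trailer flag value, the byte order and the buffer handling are illustrative assumptions.

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Sketch: once the complete statistics of one image block are available,
 * frame them as header (4 bytes) + effective statistics payload + trailer
 * (4 bytes). Field values such as the trailer flag 0xA5 are assumed. */
static size_t pack_block_stats(uint8_t *out, size_t out_cap,
                               uint8_t stat_type, uint8_t packet_count,
                               uint16_t frame_count,
                               const uint8_t *stats, uint16_t stats_len)
{
    const size_t total = 4u + (size_t)stats_len + 4u;
    if (total > out_cap)
        return 0;                       /* not enough room, caller retries */

    /* Packet header: statistics type, index flag = 0 (data), payload len. */
    out[0] = stat_type;
    out[1] = 0x00;
    out[2] = (uint8_t)(stats_len >> 8);
    out[3] = (uint8_t)(stats_len & 0xFF);

    /* Effective image statistics of this image block. */
    memcpy(out + 4, stats, stats_len);

    /* Packet trailer: trailer flag, per-type packet count, frame count. */
    out[4 + stats_len] = 0xA5;          /* trailer flag value is assumed   */
    out[5 + stats_len] = packet_count;
    out[6 + stats_len] = (uint8_t)(frame_count >> 8);
    out[7 + stats_len] = (uint8_t)(frame_count & 0xFF);

    return total;
}
```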
303. If the number of the parameter packets of the specified type of image statistical parameters in the sent image statistical parameter packets reaches a predetermined threshold, index information corresponding to the specified type of image statistical parameters is sent to the application processor, so that the application processor obtains the display parameters of the image based on the index information and the received parameter packets of the specified type of image statistical parameters, wherein the index information is used for representing the correspondence between the parameter packets of the specified type of image statistical parameters and the image blocks.
Wherein the predetermined threshold may be set by the user himself or by default. The specified type of image statistics parameters may be set by the user himself or by default in the system. The display parameter of the image may be at least one of: display brightness, pixel values, display color, resolution, contrast, sharpness, etc., are not limited herein. The specified type of image statistics may be at least one type of image statistics in the image.
In a specific implementation, if the number of parameter packets of the specified type of image statistics parameters in the sent image statistics parameter packets reaches a predetermined threshold, transmission of the remaining image statistics parameter packets corresponding to the specified type of image statistics parameters to the application processor can be stopped, so that power consumption of the device can be reduced to a certain extent.
In addition, when the number of parameter packets corresponding to the specified type of image statistics parameter sent to the application processor reaches the predetermined threshold, index information corresponding to the specified type of image statistics parameter may be sent to the application processor. The index information may be placed in an index data packet, and the index data packet may include a target index table recording related information of the data packets corresponding to the specified type of image statistics parameter, where the related information may include at least one of the following: the index packet flag of the data packet, the storage location of the image statistics parameters in the data packet, and the like, which is not limited herein. In addition, the offset of the target index table of the specified type of image statistics parameter within the total data packets (all data packets of the original image data) can be pre-stored in the image processor. When the sending of the specified type of image statistics information is completed, the offset corresponding to the target index table can be sent to the application processor, so that the application processor can acquire the target index table of the specified type of image statistics information, acquire the image statistics parameter packets corresponding to the target index table from the data packets it has already received, and perform the unpacking operation on those parameter packets according to the index order of the target index table to obtain the specified type of image statistics parameters, and then call the algorithm corresponding to that type of image statistics information to realize the corresponding image processing operation. In this way, the data packets corresponding to any type of image statistics parameter can be unpacked without waiting for the transmission of all data packets to be completed, which improves the image processing efficiency.
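A minimal sketch of the image-processor-side bookkeeping is shown below: each sent packet of a given type is recorded in a target index table, and once the predetermined threshold is reached the index information (represented here by the table offset) is pushed to the application processor. The structure layout, the per-type capacity and the send_index_info_to_ap() call are illustrative assumptions.

```c
#include <stdint.h>
#include <stdbool.h>

#define MAX_PACKETS_PER_TYPE 64   /* assumption: per-type capacity */

/* One entry of the target index table: which data packet holds the parameter and where. */
typedef struct {
    uint32_t index_packet_flag;    /* flag of the data packet */
    uint32_t storage_offset;       /* storage location of the statistics within the packet stream */
} index_entry_t;

typedef struct {
    index_entry_t entries[MAX_PACKETS_PER_TYPE];
    uint32_t count;
    uint32_t table_offset;         /* pre-stored offset of this table in the total packet stream */
} index_table_t;

extern void send_index_info_to_ap(uint32_t table_offset);   /* hypothetical transport call */

/* Record one sent packet of a given type; once the predetermined threshold is reached,
 * push the index information (here, the table offset) to the application processor. */
bool record_sent_packet(index_table_t *t, uint32_t flag, uint32_t offset,
                        uint32_t predetermined_threshold) {
    if (t->count < MAX_PACKETS_PER_TYPE) {
        t->entries[t->count].index_packet_flag = flag;
        t->entries[t->count].storage_offset = offset;
        t->count++;
    }
    if (t->count == predetermined_threshold) {
        send_index_info_to_ap(t->table_offset);
        return true;               /* caller may stop sending further packets of this type */
    }
    return false;
}
```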
Further, in one possible example, after the sending, to the application processor, index information corresponding to the specified type of image statistics parameter, the method may further include the following steps:
and sending a notification message to the application processor in a preset interrupt mode, wherein the notification message is used for indicating that the number of parameter packets of the specified type of image statistical parameters reaches the preset threshold value.
The preset interrupt mode can be set by the user or by system default. The preset interrupt mode may be a general-purpose input/output (GPIO) interrupt or a mobile industry processor interface (MIPI) channel. For example, an additional data packet is sent to the AP through the MIPI channel, and this additional data packet notifies the AP that the transmission task for the specified type of image statistics parameter is completed.
In a specific implementation, when the number of parameter packets of the specified type of image statistics parameter transmitted to the application processor reaches the predetermined threshold, the image processor may send a notification message to the application processor in the preset interrupt mode, where the notification message is used to indicate that the number of parameter packets of the specified type of image statistics parameter has reached the predetermined threshold.
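A sketch of this notification step is given below, assuming hypothetical platform primitives platform_gpio_raise() and mipi_send_packet() (neither is named in this application); it covers both the GPIO interrupt mode and the additional-MIPI-packet mode mentioned above.

```c
#include <stdint.h>

/* Hypothetical platform primitives; the application does not name any API. */
extern void platform_gpio_raise(int line);                    /* trigger a GPIO interrupt */
extern void mipi_send_packet(const void *data, uint32_t len); /* send an extra MIPI packet */

enum notify_mode { NOTIFY_GPIO, NOTIFY_MIPI_PACKET };

/* Notify the AP that the packet count for one statistics type reached the threshold. */
void notify_type_complete(enum notify_mode mode, int gpio_line, uint8_t stat_type) {
    if (mode == NOTIFY_GPIO) {
        platform_gpio_raise(gpio_line);
    } else {
        uint8_t done_marker[2] = { 0xD0, stat_type };  /* illustrative payload layout */
        mipi_send_packet(done_marker, sizeof done_marker);
    }
}
```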
For example, as shown in fig. 3C, taking the 3A image statistics parameters as an example, the 3A image statistics parameters may include the AE image statistics parameter, the AF image statistics parameter and the AWB image statistics parameter. These three types of image statistics parameters may be formed into 3A image statistics parameter data packets, that is, at least one data packet is obtained. Each type of image statistics parameter in each data packet may correspond to an image statistics parameter index table, or each type of image statistics parameter of each frame image may correspond to an image statistics parameter index table, where the index table may be an AE image statistics parameter index table, an AWB image statistics parameter index table, or an AF image statistics parameter index table. During transmission, the data packets may be sent according to the index order of the image statistics parameter index table. After the sending of each type of image statistics parameter is completed, the image statistics parameter index table corresponding to that type may be sent to the application processor, and the application processor may perform the unpacking operation according to the index table, specifically according to the index order of the index table. When the at least one data packet includes multiple types of image statistics parameters, the data packets may be sent out of order or in order: out of order means that a data packet of one type of image statistics parameter is sent at one moment and a data packet of another type is sent at the next moment; in order means that the data packets of one type of image statistics parameter are sent within a time period, and the data packets of another type are sent only after the sending of that type is completed.
For example, as shown in fig. 3D, the image processor may send the image statistics parameter packets to the application processor one by one through MIPI, recording the index of each parameter packet during sending. When the last parameter packet of a specified type of image statistics parameter is sent, so that the predetermined threshold number is reached, the image processor may trigger the GPIO interrupt. The application processor may then look up the indexes corresponding to the specified type of image statistics parameter through the index table, parse the corresponding image statistics parameters from the image statistics parameter packets according to the index table, and arrange them according to the index order of the index table to obtain the specified type of image statistics parameter, and then call the algorithm corresponding to that type. That is, whenever the transmission of any type of image statistics parameter finishes early, the application processor can immediately start the corresponding algorithm without waiting for all the image statistics parameters to be received, thereby improving the image processing efficiency.
Further, as shown in fig. 3E, when the transmission of each type of image statistics parameter by the image processor is completed, the offset of the index table corresponding to that type may be sent to the application processor. The application processor may locate the index table according to the offset, unpack the corresponding parameter packets, and arrange the unpacked effective image statistics parameters according to the order of the index table to obtain the final image statistics parameters of that type, for example, the AF type. If the index table corresponding to the AF type includes data packets f, j, n, q and t, the corresponding data packets are unpacked according to that index table. Specifically, as shown in fig. 3F, when the application processor detects that a certain type of image statistics parameter has been completely received, it may read the index packet corresponding to the completed parameter packets, for example as follows: read the content of the index packet sequentially in units of 32 bits to obtain an index position index_n; read 32 bits of packed data starting from index_n as a packet header (PH); parse the PH content and read the length of the data segment corresponding to the index; copy that piece of statistical data into a target cache; detect whether the traversal of the index packet is completed; if yes, the unpacking of the current image statistics parameters is finished, otherwise return to the step of reading 32 bits of packed data starting from index_n as the packet header (PH) and its subsequent steps, until the traversal of the index packet is completed.
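The unpacking loop of fig. 3F can be sketched in C as follows, under the assumption that the index packet is a sequence of 32-bit offsets into the received packet stream and that the 32-bit packet header (PH) carries the data-segment length in its low 16 bits; both layout details are assumptions, since the application describes the fields only qualitatively.

```c
#include <stdint.h>
#include <string.h>

/* Assumed PH layout: data-segment length in the low 16 bits (illustrative only). */
static uint32_t ph_data_length(uint32_t ph) { return ph & 0xFFFFu; }

size_t unpack_type_statistics(const uint32_t *index_pkt, size_t index_count,
                              const uint8_t *packet_stream,
                              uint8_t *target_cache, size_t cache_cap) {
    size_t written = 0;
    for (size_t i = 0; i < index_count; i++) {           /* traverse the index packet */
        uint32_t index_n = index_pkt[i];                 /* next 32-bit index position */
        uint32_t ph;
        memcpy(&ph, packet_stream + index_n, sizeof ph); /* read 32 bits as the PH */
        uint32_t seg_len = ph_data_length(ph);           /* parse PH: data segment length */
        if (written + seg_len > cache_cap) break;
        /* copy this piece of statistical data into the target cache, in index order */
        memcpy(target_cache + written, packet_stream + index_n + sizeof ph, seg_len);
        written += seg_len;
    }
    return written;   /* traversal finished: the statistics of this type are unpacked */
}
```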
Further, after receiving the GPIO interrupt, the AP queries, through the secure digital input/output (SDIO) interface, which image statistics parameter has currently been completed and obtains the starting position of the index packet of that image statistics parameter; the AP side can then quickly locate each type of image statistics parameter by means of the index packet and immediately start the corresponding algorithm. The algorithm may be a white balance algorithm, an image enhancement algorithm, a deblurring algorithm, an image segmentation algorithm, an interpolation algorithm, or the like, which is not limited herein.
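A possible shape of the AP-side interrupt handling and algorithm dispatch is sketched below; the SDIO query functions and the algorithm entry points are hypothetical placeholders for whatever the platform actually provides.

```c
#include <stdint.h>

/* Hypothetical SDIO queries and algorithm entry points; names are illustrative only. */
extern uint32_t sdio_read_completed_type(void);      /* which statistics type just completed */
extern uint32_t sdio_read_index_packet_start(void);  /* start position of its index packet */
extern void run_white_balance(uint32_t index_start);
extern void run_autofocus(uint32_t index_start);
extern void run_auto_exposure(uint32_t index_start);

enum stat_type { STAT_AE = 0, STAT_AF = 1, STAT_AWB = 2 };

/* Called from the GPIO interrupt handler on the AP side. */
void on_statistics_ready_irq(void) {
    uint32_t type  = sdio_read_completed_type();
    uint32_t start = sdio_read_index_packet_start();
    switch (type) {                   /* start the matching algorithm immediately */
    case STAT_AE:  run_auto_exposure(start); break;
    case STAT_AF:  run_autofocus(start);     break;
    case STAT_AWB: run_white_balance(start); break;
    default: break;                   /* other statistics types */
    }
}
```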
In one possible example, the step 301 of obtaining the image statistics of at least one image block in the image may include the following steps:
a11, acquiring target shooting parameters;
a12, determining a first target image statistic parameter type corresponding to the target shooting parameter according to a mapping relation between a preset shooting parameter and an image statistic parameter type;
a13, acquiring a target image statistical parameter corresponding to the first target image statistical parameter type from at least one image block of the image.
In this embodiment of the present application, the shooting parameter may be at least one of the following: exposure time, photographing mode, sensitivity ISO, white balance parameter, focal length, focus, region of interest, and the like, are not limited herein.
In a specific implementation, a mapping relationship between a preset shooting parameter and an image statistics parameter type may be stored in a memory of the electronic device in advance, where the mapping relationship is shown in the following table:
Shooting parameter          Image statistics parameter type
Shooting parameter a1       Image statistics parameter type A1
Shooting parameter a2       Image statistics parameter type A2
...                         ...
Shooting parameter an       Image statistics parameter type An
I.e. different shooting parameters correspond to different image statistics parameter types.
Furthermore, the image processor may acquire the target shooting parameter, determine a first target image statistical parameter type corresponding to the target shooting parameter according to a mapping relationship between a preset shooting parameter and an image statistical parameter type, and acquire a target image statistical parameter corresponding to the first target image statistical parameter type from at least one image block of the image, so that the corresponding image statistical parameter may be selected according to the shooting requirement.
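A minimal lookup corresponding to the table above might look as follows in C; the concrete parameter/type pairs are placeholders that mirror the table in form only, and the same pattern applies to the environment-parameter mapping described next.

```c
#include <stddef.h>
#include <string.h>

/* Illustrative mapping table; the pairs (a1 -> A1, ...) are placeholders. */
typedef struct {
    const char *shooting_parameter;  /* e.g. photographing mode, ISO, focus */
    const char *statistics_type;     /* image statistics parameter type to collect */
} param_type_mapping_t;

static const param_type_mapping_t k_map[] = {
    { "shooting parameter a1", "image statistics parameter type A1" },
    { "shooting parameter a2", "image statistics parameter type A2" },
    { "shooting parameter an", "image statistics parameter type An" },
};

const char *statistics_type_for(const char *target_shooting_parameter) {
    for (size_t i = 0; i < sizeof k_map / sizeof k_map[0]; i++)
        if (strcmp(k_map[i].shooting_parameter, target_shooting_parameter) == 0)
            return k_map[i].statistics_type;
    return NULL;   /* no preset mapping found */
}
```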
In one possible example, the step 301 of obtaining the image statistics of at least one image block in the image may include the following steps:
B11, acquiring target environment parameters;
b12, determining a second target image statistical parameter type corresponding to the target environment parameter according to a mapping relation between a preset environment parameter and an image statistical parameter type;
b13, acquiring a target image statistical parameter corresponding to the second target image statistical parameter type from at least one image block of the image.
In this embodiment of the present application, the environmental parameters may include external environmental parameters, and/or internal environmental parameters, where the external environmental parameters may be understood as an objectively existing physical environment, that is, a natural environment, and the external environmental parameters may be at least one of the following: ambient temperature, ambient humidity, ambient light, barometric pressure, geographic location, magnetic field disturbance strength, shaking parameters, and the like, are not limited herein. Wherein the environmental parameter may be collected by an environmental sensor, which may be at least one of: temperature sensor, humidity sensor, ambient light sensor, meteorological sensor, location sensor, magnetic field detection sensor. The internal environment parameter may be understood as an environment parameter generated by the operation of the respective modules of the electronic device, and may be at least one of the following: CPU temperature, GPU temperature, dithering parameters, CPU core number, etc., are not limited herein.
In a specific implementation, the electronic device may include a memory, where a mapping relationship between a preset environmental parameter and an image statistical parameter type may be stored in the memory in advance, where the mapping relationship is shown in the following table:
Environmental parameter      Image statistics parameter type
Environmental parameter b1   Image statistics parameter type B1
Environmental parameter b2   Image statistics parameter type B2
...                          ...
Environmental parameter bn   Image statistics parameter type Bn
I.e. different environmental parameters correspond to different image statistics parameter types.
Furthermore, the image processor may acquire the target environmental parameter, determine a second target image statistical parameter type corresponding to the target environmental parameter according to the mapping relationship, and acquire the target image statistical parameter corresponding to the second target image statistical parameter type from at least one image block of the image, so that the corresponding image statistical parameter may be acquired according to the shooting environment.
In one possible example, before the step 301, the following steps may be further included:
c1, the image processor acquires first original image data, wherein the first original image data is part of original image data of a current processing image frame;
c2, the image processor determines a target image quality evaluation value of the first original image data;
And C3, when the target image quality evaluation value is larger than a preset image quality evaluation value, the image processor executes step 301.
Wherein the first raw image data may be the part of the raw image data of the currently processed image frame that is available before the complete raw image data has been loaded. The preset image quality evaluation value may be set by the user or by system default. In a specific implementation, the image processor may acquire the first original image data and perform image quality evaluation on it using at least one image quality evaluation index to obtain a target image quality evaluation value, where the image quality evaluation index may be at least one of the following: information entropy, average gradient, average gray level, contrast, etc., which is not limited herein. Step 301 may be performed when the target image quality evaluation value is greater than the preset image quality evaluation value; otherwise, the camera may be invoked to re-photograph.
Further, in the step C2, the determining, by the image processor, the target image quality evaluation value of the first original image data may include the steps of:
c21, determining the distribution density of target feature points and the target signal-to-noise ratio of the first original image data;
C22, determining a first image quality evaluation value corresponding to the target feature point distribution density according to a mapping relation between the preset feature point distribution density and the image quality evaluation value;
c23, determining a target image quality deviation value corresponding to the target signal-to-noise ratio according to a mapping relation between a preset signal-to-noise ratio and the image quality deviation value;
c24, acquiring first shooting parameters of the first original image data;
c25, determining a target optimization coefficient corresponding to the first shooting parameter according to a mapping relation between a preset shooting parameter and the optimization coefficient;
and C26, adjusting the first image quality evaluation value according to the target optimization coefficient and the target image quality deviation value to obtain the target image quality evaluation value.
In a specific implementation, a memory in the electronic device may store a mapping relationship between a preset feature point distribution density and an image quality evaluation value, a mapping relationship between a preset signal-to-noise ratio and an image quality deviation value, and a mapping relationship between a preset shooting parameter and an optimization coefficient in advance, where a value range of the image quality evaluation value may be 0-1, or may be 0-100. The image quality deviation value may be a positive real number, for example, 0 to 1, or may be greater than 1. The value range of the optimization coefficient can be between-1 and 1, for example, the optimization coefficient can be between-0.1 and 0.1. In this embodiment of the present application, the shooting parameter may be at least one of the following: exposure time, photographing mode, sensitivity ISO, white balance parameter, focal length, focus, region of interest, and the like, are not limited herein.
In a specific implementation, the electronic device may determine the target feature point distribution density and the target signal-to-noise ratio of the first original image data, and determine, according to the mapping relationship between the preset feature point distribution density and the image quality evaluation value, the first image quality evaluation value corresponding to the target feature point distribution density. The feature point distribution density reflects the image quality to a certain extent and may be understood as the ratio between the total number of feature points of the first original image data and the image area of the first original image data. Furthermore, the electronic device may determine the target image quality deviation value corresponding to the target signal-to-noise ratio according to the mapping relationship between the preset signal-to-noise ratio and the image quality deviation value. When an image is generated, some noise is introduced by external causes (weather, light, angle, jitter, etc.) or internal causes (system, GPU), and this noise has some influence on the image quality; the image quality evaluation value may therefore be adjusted to a certain extent to ensure an objective evaluation of the image quality.
Further, the electronic device may also obtain a first shooting parameter of the first original image data and determine a target optimization coefficient corresponding to the first shooting parameter according to the mapping relationship between the preset shooting parameters and the optimization coefficients. Since the setting of the shooting parameters may also have a certain influence on the image quality evaluation, the influence component of the shooting parameters on the image quality needs to be determined. Finally, the first image quality evaluation value is adjusted according to the target optimization coefficient and the target image quality deviation value to obtain the target image quality evaluation value, which may be obtained according to the following formulas:
In the case where the image quality evaluation value is on a 0-100 scale, the specific calculation formula is as follows:
target image quality evaluation value = (first image quality evaluation value + target image quality deviation value) × (1 + target optimization coefficient)
In the case where the image quality evaluation value is on a 0-1 scale, the specific calculation formula is as follows:
target image quality evaluation value = first image quality evaluation value × (1 + target image quality deviation value) × (1 + target optimization coefficient)
Therefore, the image quality can be objectively evaluated by combining the influences of internal and external environment factors, shooting setting factors and the like, and the image quality evaluation accuracy is improved.
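The adjustment described above can be summarized in a short C sketch; the three mapping lookups are assumed to exist as functions, and the two branches follow the two formulas given above (0-100 scale and 0-1 scale respectively).

```c
#include <stdbool.h>

/* Hypothetical lookups standing in for the three preset mapping relationships. */
extern double quality_from_density(double feature_point_density);   /* density -> first evaluation value */
extern double deviation_from_snr(double snr);                        /* SNR -> quality deviation value */
extern double coefficient_from_params(int shooting_param_id);        /* shooting params -> optimization coefficient */

double target_quality(double density, double snr, int shooting_param_id, bool scale_0_100) {
    double first = quality_from_density(density);
    double dev   = deviation_from_snr(snr);
    double coef  = coefficient_from_params(shooting_param_id);
    if (scale_0_100)
        return (first + dev) * (1.0 + coef);      /* 0-100 scale: additive deviation */
    return first * (1.0 + dev) * (1.0 + coef);    /* 0-1 scale: multiplicative deviation */
}
```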
It can be seen that, in the parameter transmission method for an image processor described in the embodiments of the present application, the image statistics parameters of at least one image block in an image are obtained, based on the image statistics parameters of at least one image block, an image statistics parameter packet is generated and sent to an application processor, if the number of parameter packets of the specified type of image statistics parameters in the sent image statistics parameter packet reaches a predetermined threshold, index information corresponding to the specified type of image statistics parameters is sent to the application processor, so that the application processor obtains the display parameters of the image based on the index information and the received parameter packets of the specified type of image statistics parameters, wherein the index information is used for characterizing the correspondence between the parameter packets of the specified type of image statistics parameters and the image blocks.
Fig. 4 is a flow chart of a parameter transmission method for an application processor, which is provided in an embodiment of the present application, and is applied to an electronic device, where the electronic device includes an image processor and an application processor, and as shown in the drawing, the parameter transmission method for the image processor includes:
401. And receiving an image statistics parameter packet sent by an image processor, wherein the image statistics parameter packet is generated by the image processor based on the image statistics parameters of at least one image block in an image.
402. And if the number of the parameter packets of the specified type of image statistical parameters in the received image statistical parameter packets reaches a preset threshold, receiving index information corresponding to the specified type of image statistical parameters sent by the image processor, wherein the index information is used for representing the corresponding relation between the parameter packets of the specified type of image statistical parameters and the image blocks.
403. And acquiring display parameters of the image based on the index information and the received parameter packet of the specified type of image statistical parameters.
In a possible example, the step 403, which obtains the display parameters of the image based on the index information and the received parameter packet of the specified type of image statistical parameters, may include the steps of:
31. unpacking the received image statistical parameter packet based on the index information to obtain a target image statistical parameter of the specified type of image statistical parameter;
32. invoking a target image processing algorithm corresponding to the target image statistical parameter;
33. and processing the image according to the target image processing algorithm to obtain the display parameters of the image.
Wherein different image statistics parameters correspond to different image processing algorithms, and the image processing algorithm may be at least one of the following: an image enhancement algorithm, a white balance algorithm, an image noise reduction algorithm, an image defogging algorithm, etc., which is not limited herein. In a specific implementation, a mapping relationship between image statistics parameter types and image processing algorithms may be stored in advance, and then an initial image processing algorithm corresponding to the specified type of image statistics parameter and an initial algorithm control parameter corresponding to that image processing algorithm may be determined according to the mapping relationship. The initial algorithm control parameter is used to adjust the processing degree of the algorithm; different algorithms may correspond to different initial algorithm control parameters, which may be preset or system default. Furthermore, the target image statistics parameters may be processed to obtain corresponding target feature parameters, where the target feature parameter may be at least one of the following: mean, mean square error, entropy, etc., which is not limited herein. The application processor may also obtain a pre-stored mapping relationship between feature parameters and optimization parameters, determine a target optimization parameter corresponding to the target feature parameter according to that mapping relationship, adjust the initial algorithm control parameter according to the target optimization parameter to obtain a corresponding reference algorithm control parameter, take the reference algorithm control parameter and the initial image processing algorithm as the target image processing algorithm, and process the image based on the target image processing algorithm to obtain the display parameters of the image.
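The selection and tuning of the algorithm can be sketched as follows; the type-to-algorithm map, the control-parameter lookups and the multiplicative adjustment of the control parameter are all illustrative assumptions, and the mean is used here as the target feature parameter.

```c
#include <stddef.h>

typedef void (*image_algorithm_fn)(const float *stats, size_t n, float control_param);

/* Hypothetical preset mappings; names are illustrative only. */
extern image_algorithm_fn algorithm_for_type(int statistics_type);  /* type -> initial algorithm */
extern float initial_control_param_for(int statistics_type);        /* preset/default control parameter */
extern float optimization_param_for(float feature_param);           /* feature -> optimization parameter */

static float mean_of(const float *stats, size_t n) {
    float s = 0.0f;
    for (size_t i = 0; i < n; i++) s += stats[i];
    return n ? s / (float)n : 0.0f;
}

void process_with_target_algorithm(int statistics_type, const float *stats, size_t n) {
    image_algorithm_fn algo = algorithm_for_type(statistics_type);
    float initial = initial_control_param_for(statistics_type);
    float feature = mean_of(stats, n);                /* target feature parameter (mean) */
    float opt     = optimization_param_for(feature);  /* target optimization parameter */
    float control = initial * (1.0f + opt);           /* reference control parameter (assumed adjustment form) */
    algo(stats, n, control);                          /* run the target image processing algorithm */
}
```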
Further, in one possible example, before receiving the index information corresponding to the specified type of image statistics parameter sent by the image processor, step 402 may further include the following steps:
and receiving a notification message sent by the image processor, wherein the notification message is sent to the application processor by the image processor in a preset interrupt mode, and the notification message is used for indicating that the number of parameter packets of the specified type of image statistical parameters reaches the preset threshold value.
In one possible example, the image statistics parameter packet includes: the system comprises a packet head, effective image statistical parameters and a packet tail, wherein the effective image statistical parameters correspond to at least one type of image statistical parameters.
In one possible example, the packet header includes a packet header flag, an index packet flag, and a packet data length.
In one possible example, the package tail includes: end of packet flag, parameter packet count, and frame count.
In one possible example, the step 401 of receiving the image statistics packet sent by the image processor may be implemented as follows:
and receiving the image statistical parameter packet sent by the image processor through a preset virtual channel.
In one possible example, the step 401 of receiving the image statistics packet sent by the image processor may be implemented as follows:
and receiving the image statistics parameter packet through a target channel, wherein the target channel is determined by the image processor acquiring target attribute information corresponding to the image statistics parameter packet and determining the target channel corresponding to the target attribute information according to a mapping relation between preset attribute information and channels.
The detailed description of all the steps in the embodiment corresponding to fig. 4 may refer to the related description of the parameter transmission method for the image processor described in fig. 3A, which is not repeated herein.
It can be seen that, in the parameter transmission method for an application processor described in the embodiments of the present application, since the image statistics parameters are acquired, packaged and sent to the application processor during the acquisition of the original image data, the application processor does not need to wait until all parameter packets have been transmitted; when the number of parameter packets of the specified type of image statistics parameter reaches a certain number, those parameter packets can be unpacked so as to invoke the algorithm corresponding to the specified type of image statistics parameter, thereby ensuring the reliability of parameter transmission for the image processor, realizing real-time transmission of the image statistics information and improving the image processing efficiency.
Referring to fig. 5, fig. 5 is a flowchart of a parameter transmission method provided in an embodiment of the present application, where the electronic device includes an image processor and an application processor, and as shown in the drawing, the parameter transmission method for the image processor includes:
501. the image processor obtains image statistics parameters of at least one image block in the image.
502. The image processor generates an image statistics parameter packet based on the image statistics parameters of the at least one image block and sends the image statistics parameter packet to an application processor.
503. The application processor receives an image statistics packet sent by the image processor.
504. And if the number of the parameter packets of the specified type of image statistical parameters in the received image statistical parameter packets reaches a preset threshold, the application processor receives index information corresponding to the specified type of image statistical parameters sent by the image processor, wherein the index information is used for representing the corresponding relation between the parameter packets of the specified type of image statistical parameters and the image blocks.
505. The application processor acquires display parameters of the image based on the index information and the received parameter packet of the specified type of image statistical parameters.
The specific description of the above steps 501-505 may refer to the related description of the parameter transmission method for the image processor described in fig. 3A, which is not repeated herein.
It can be seen that, in the parameter transmission method described in the embodiment of the present application, since the image statistics parameters are acquired, packaged and sent to the application processor during the acquisition of the original image data, the application processor does not need to wait until all parameter packets have been transmitted; when the number of parameter packets of the specified type of image statistics parameter reaches a certain number, those parameter packets can be unpacked so as to invoke the algorithm corresponding to the specified type of image statistics parameter, thereby ensuring the reliability of parameter transmission for the image processor, realizing real-time transmission of the image statistics information and improving the image processing efficiency.
In accordance with the above-described embodiments, referring to fig. 6, fig. 6 is a schematic structural diagram of an electronic device provided in an embodiment of the present application, as shown in the fig. 6, the electronic device includes an application processor, an image processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the image processor, and in the embodiment of the present application, the programs include instructions for executing the following steps:
Acquiring image statistical parameters of at least one image block in an image;
generating an image statistics parameter packet based on the image statistics parameters of the at least one image block, and sending the image statistics parameter packet to an application processor;
and if the number of the parameter packets of the specified type of image statistical parameters in the sent image statistical parameter packets reaches a preset threshold, sending index information corresponding to the specified type of image statistical parameters to the application processor, so that the application processor obtains the display parameters of the image based on the index information and the received parameter packets of the specified type of image statistical parameters, wherein the index information is used for representing the corresponding relation between the parameter packets of the specified type of image statistical parameters and the image blocks.
In one possible example, in terms of generating an image statistics package based on the image statistics of the at least one image block, the program includes instructions for:
and packaging the image statistical parameters of each image block of the at least one image block to obtain the image statistical parameter package.
In one possible example, in said packaging the image statistics of each image block of said at least one image block to obtain said image statistics package, the program comprises instructions for:
After the complete image statistical parameters of one image block are obtained, the image statistical parameter packet is generated based on the image statistical parameters of the one image block.
In one possible example, after the sending, to the application processor, index information corresponding to the specified type of image statistics parameters, the program further includes instructions for:
and sending a notification message to the application processor in a preset interrupt mode, wherein the notification message is used for indicating that the number of parameter packets of the specified type of image statistical parameters reaches the preset threshold value.
In one possible example, the image statistics parameter packet includes: the system comprises a packet head, effective image statistical parameters and a packet tail, wherein the effective image statistical parameters correspond to at least one type of image statistical parameters.
In one possible example, the packet header includes a packet header flag, an index packet flag, and a packet data length.
In one possible example, the package tail includes: end of packet flag, parameter packet count, and frame count.
In one possible example, in said sending said image statistics parameter packet to an application processor, the above program comprises instructions for:
And sending the image statistical parameter packet to the application processor through a preset virtual channel.
In one possible example, in said sending said image statistics parameter packet to an application processor, the above program comprises instructions for:
acquiring target attribute information corresponding to the image statistics parameter packet;
determining a target channel corresponding to the target attribute information according to a mapping relation between preset attribute information and the channel;
and transmitting the image statistical parameter packet through the target channel.
In one possible example, in terms of image statistics of at least one image block in the acquired image, the above-described program comprises instructions for performing the steps of:
acquiring target shooting parameters;
determining a first target image statistical parameter type corresponding to the target shooting parameter according to a mapping relation between a preset shooting parameter and an image statistical parameter type;
and acquiring an image statistical parameter corresponding to the first target image statistical parameter type from at least one image block in the image.
In one possible example, in terms of image statistics of at least one image block in the acquired image, the above-described program comprises instructions for performing the steps of:
Acquiring a target environment parameter;
determining a second target image statistical parameter type corresponding to the target environment parameter according to a mapping relation between a preset environment parameter and an image statistical parameter type;
and acquiring an image statistical parameter corresponding to the second target image statistical parameter type from at least one image block in the image.
Further, the one or more programs may be further configured to be executed by the application processor, where in an embodiment of the application, the program includes instructions for performing the following steps:
receiving an image statistics parameter packet sent by an image processor, wherein the image statistics parameter packet is generated by the image processor based on image statistics parameters of at least one image block in an image, the image statistics parameters being acquired by the image processor;
if the number of the parameter packets of the specified type of image statistical parameters in the received image statistical parameter packets reaches a preset threshold, receiving index information corresponding to the specified type of image statistical parameters sent by the image processor;
and acquiring display parameters of the image based on the index information and the received parameter packet of the specified type image statistical parameters, wherein the index information is used for representing the corresponding relation between the parameter packet of the specified type image statistical parameters and the image block.
In one possible example, in terms of the acquiring of the display parameters of the image based on the index information and the received parameter packet of the specified type of image statistics parameters, the program includes instructions for:
unpacking the received image statistical parameter packet based on the index information to obtain a target image statistical parameter of the specified type of image statistical parameter;
invoking a target image processing algorithm corresponding to the target image statistical parameter;
and processing the image according to the target image processing algorithm to obtain the display parameters of the image.
In one possible example, before the receiving of index information corresponding to the specified type of image statistics parameter transmitted by the image processor, the program further includes instructions for performing the following step:
and receiving a notification message sent by the image processor, wherein the notification message is sent to the application processor by the image processor in a preset interrupt mode, and the notification message is used for indicating that the number of parameter packets of the specified type of image statistical parameters reaches the preset threshold value.
In one possible example, the image statistics parameter packet includes: the system comprises a packet head, effective image statistical parameters and a packet tail, wherein the effective image statistical parameters correspond to at least one type of image statistical parameters.
In one possible example, the packet header includes a packet header flag, an index packet flag, and a packet data length.
In one possible example, the package tail includes: end of packet flag, parameter packet count, and frame count.
In one possible example, in the receiving of the image statistics parameter packet sent by the image processor, the program comprises instructions for:
and receiving the image statistical parameter packet sent by the image processor through a preset virtual channel.
In one possible example, in the receiving of the image statistics parameter packet sent by the image processor, the program comprises instructions for:
and receiving the image statistics parameter packet through a target channel, wherein the target channel is determined by the image processor acquiring target attribute information corresponding to the image statistics parameter packet and determining the target channel corresponding to the target attribute information according to a mapping relation between preset attribute information and channels.
In one possible example, the image processor and the application processor are integrated on the same chip, or the image processor and the application processor are two independent modules, respectively.
The foregoing description of the embodiments of the present application has been presented primarily in terms of a method-side implementation. It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied as hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application may divide the functional units of the electronic device according to the above method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated in one processing unit. The integrated units may be implemented in hardware or in software functional units. It should be noted that, in the embodiment of the present application, the division of the units is schematic, which is merely a logic function division, and other division manners may be implemented in actual practice.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an electronic device 700 according to an embodiment of the present application, where the electronic device 700 includes an image processor 701 and an application processor 702, as shown in the figure, and in this way,
the image processor 701 is configured to obtain an image statistics parameter of at least one image block in an image;
the image processor 701 is configured to generate an image statistics parameter packet based on the image statistics parameters of the at least one image block, and send the image statistics parameter packet to an application processor;
the application processor 702 is configured to receive the image statistics parameter packet sent by the image processor;
the image processor 701 is configured to send index information corresponding to a specified type of image statistics parameter to the application processor if the number of parameter packets of the specified type of image statistics parameter in the sent image statistics parameter packets reaches a predetermined threshold, where the index information is used to characterize a correspondence between the parameter packets of the specified type of image statistics parameter and the image block;
the application processor 702 is configured to obtain a display parameter of the image based on the index information and the received parameter packet of the specified type of image statistical parameter.
It can be seen that, in the electronic device described in the embodiments of the present application, which includes an image processor and an application processor, the image statistics information is acquired, packaged and sent to the application processor during the acquisition of the original image data, so that the reliability of parameter transmission for the image processor can be ensured and real-time transmission of the image statistics information is realized.
Wherein the image processor 701 and the application processor 702 are capable of implementing the functions or steps of any of the methods described above.
Fig. 8 is a functional unit block diagram of a parameter transmission apparatus 800 for an image processor, which is referred to in the embodiment of the present application. The parameter transmission apparatus 800 for an image processor is applied to an electronic device including an image processor and an application processor, and the apparatus 800 includes: an acquisition unit 801, a generation unit 802, and a transmission unit 803, wherein,
the acquiring unit 801 is configured to acquire an image statistics parameter of at least one image block in an image;
the generating unit 802 is configured to generate an image statistics parameter packet based on the image statistics parameters of the at least one image block, and send the image statistics parameter packet to an application processor;
The sending unit 803 is configured to send index information corresponding to a specified type of image statistics parameter to the application processor if the number of parameter packets of the specified type of image statistics parameter in the sent image statistics parameter packets reaches a predetermined threshold, so that the application processor obtains a display parameter of the image based on the index information and the received parameter packets of the specified type of image statistics parameter, where the index information is used to characterize a correspondence between the parameter packets of the specified type of image statistics parameter and the image block.
In one possible example, in terms of generating an image statistics parameter packet based on the image statistics parameters of the at least one image block, the generating unit 802 is specifically configured to include:
and packaging the image statistical parameters of each image block of the at least one image block to obtain the image statistical parameter package.
In one possible example, in the aspect of packaging the image statistics parameters of each image block of the at least one image block to obtain the image statistics parameter packet, the generating unit 802 is specifically configured to include:
after the complete image statistical parameters of one image block are obtained, the image statistical parameter packet is generated based on the image statistical parameters of the one image block.
In one possible example, after the sending, to the application processor, index information corresponding to the specified type of image statistics parameter, the sending unit 803 is further specifically configured to:
and sending a notification message to the application processor in a preset interrupt mode, wherein the notification message is used for indicating that the number of parameter packets of the specified type of image statistical parameters reaches the preset threshold value.
In one possible example, the image statistics parameter packet includes: the system comprises a packet head, effective image statistical parameters and a packet tail, wherein the effective image statistical parameters correspond to at least one type of image statistical parameters.
In one possible example, the packet header includes a packet header flag, an index packet flag, and a packet data length.
In one possible example, the package tail includes: end of packet flag, parameter packet count, and frame count.
In one possible example, in terms of said sending the image statistics parameter packet to an application processor, the sending unit 803 is specifically configured to:
and sending the image statistical parameter packet to the application processor through a preset virtual channel.
In one possible example, in terms of said sending the image statistics parameter packet to an application processor, the sending unit 803 is specifically configured to:
Acquiring target attribute information corresponding to the image statistics parameter packet;
determining a target channel corresponding to the target attribute information according to a mapping relation between preset attribute information and the channel;
and transmitting the image statistical parameter packet through the target channel.
In one possible example, in terms of image statistics parameters of at least one image block in the acquired image, the acquiring unit 801 is specifically configured to:
acquiring target shooting parameters;
determining a first target image statistical parameter type corresponding to the target shooting parameter according to a mapping relation between a preset shooting parameter and an image statistical parameter type;
and acquiring an image statistical parameter corresponding to the first target image statistical parameter type from at least one image block in the image.
In one possible example, in terms of image statistics parameters of at least one image block in the acquired image, the acquiring unit 801 is specifically configured to:
acquiring a target environment parameter;
determining a second target image statistical parameter type corresponding to the target environment parameter according to a mapping relation between a preset environment parameter and an image statistical parameter type;
and acquiring an image statistical parameter corresponding to the second target image statistical parameter type from at least one image block in the image.
It should be noted that the electronic device described in the embodiments of the present application is presented in the form of functional units. The term "unit" as used herein should be understood in the broadest possible sense, and the objects used to implement the functions described by the various "units" may be, for example, an integrated circuit ASIC, a single circuit, a processor (shared, dedicated, or group) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
The acquiring unit 801, the generating unit 802, and the transmitting unit 803 may be an image processor circuit or an application processor, and the functions or steps of any of the methods described above can be implemented based on the above unit modules.
Fig. 9 is a block diagram of functional units of a parameter transmission apparatus 900 for an application processor according to an embodiment of the present application. The parameter transmission apparatus 900 for an application processor is applied to an electronic device, the electronic device including an image processor and an application processor, the apparatus 900 includes: a receiving unit 901 and an acquiring unit 902, wherein,
the receiving unit 901 is configured to receive an image statistics parameter packet sent by an image processor, where the image statistics parameter packet is generated by the image processor based on the image statistics parameters of at least one image block in an image;
The receiving unit 901 is further configured to receive index information corresponding to a specified type of image statistics parameter sent by the image processor, where the index information is used to characterize a correspondence between a parameter packet of the specified type of image statistics parameter and the image block, if the number of parameter packets of the specified type of image statistics parameter in the received image statistics parameter packets reaches a predetermined threshold;
the acquiring unit 902 is configured to acquire a display parameter of the image based on the index information and the received parameter packet of the specified type of image statistical parameter.
In one possible example, in terms of the acquiring the display parameters of the image based on the index information and the received parameter packet of the specified type of image statistical parameters, the acquiring unit 902 is specifically configured to:
unpacking the received image statistical parameter packet based on the index information to obtain a target image statistical parameter of the specified type of image statistical parameter;
invoking a target image processing algorithm corresponding to the target image statistical parameter;
and processing the image according to the target image processing algorithm to obtain the display parameters of the image.
Further, in one possible example, before the receiving index information corresponding to the specified type of image statistics parameter sent by the image processor, the receiving unit 901 is further specifically configured to:
and receiving a notification message sent by the image processor, wherein the notification message is sent to the application processor by the image processor in a preset interrupt mode, and the notification message is used for indicating that the number of parameter packets of the specified type of image statistical parameters reaches the preset threshold value.
In one possible example, the image statistics parameter packet includes: the system comprises a packet head, effective image statistical parameters and a packet tail, wherein the effective image statistical parameters correspond to at least one type of image statistical parameters.
In one possible example, the packet header includes a packet header flag, an index packet flag, and a packet data length.
In one possible example, the package tail includes: end of packet flag, parameter packet count, and frame count.
In one possible example, in the aspect of receiving the image statistics parameter packet sent by the image processor, the receiving unit 901 is specifically configured to:
and receiving the image statistical parameter packet sent by the image processor through a preset virtual channel.
In one possible example, in the aspect of receiving the image statistics parameter packet sent by the image processor, the receiving unit 901 is specifically configured to:
and receiving the image statistics parameter packet through a target channel, wherein the target channel is determined by the image processor acquiring target attribute information corresponding to the image statistics parameter packet and determining the target channel corresponding to the target attribute information according to a mapping relation between preset attribute information and channels.
The receiving unit 901 and the acquiring unit 902 may be an image processor circuit or an application processor, and the functions or steps of any of the methods described above can be implemented based on the above unit modules.
It can be seen that, in the parameter transmission apparatus or the electronic device described in the embodiment of the present application, since the image statistics parameters are acquired, packaged and sent to the application processor during the acquisition of the original image data, the application processor does not need to wait until all parameter packets have been transmitted; when the number of parameter packets of the specified type of image statistics parameter reaches a certain number, those parameter packets can be unpacked so as to invoke the algorithm corresponding to the specified type of image statistics parameter, thereby ensuring the reliability of parameter transmission for the image processor, realizing real-time transmission of the image statistics information and improving the image processing efficiency.
In addition, the embodiment of the application also provides an image processor, which is used for executing the following operations:
acquiring image statistical parameters of at least one image block in an image;
generating an image statistics parameter packet based on the image statistics parameters of the at least one image block, and sending the image statistics parameter packet to an application processor;
and if the number of the parameter packets of the specified type of image statistical parameters in the sent image statistical parameter packets reaches a preset threshold, sending index information corresponding to the specified type of image statistical parameters to the application processor, so that the application processor obtains the display parameters of the image based on the index information and the received parameter packets of the specified type of image statistical parameters, wherein the index information is used for representing the corresponding relation between the parameter packets of the specified type of image statistical parameters and the image blocks.
And, the embodiment of the application also provides an application processor, which is used for:
receiving an image statistics parameter packet sent by an image processor, wherein the image statistics parameter packet is generated by the image processor based on image statistics parameters of at least one image block in an image, the image statistics parameters being acquired by the image processor;
If the number of the parameter packets of the specified type of image statistical parameters in the received image statistical parameter packets reaches a preset threshold, receiving index information corresponding to the specified type of image statistical parameters sent by the image processor;
and acquiring display parameters of the image based on the index information and the received parameter packet of the specified type image statistical parameters, wherein the index information is used for representing the corresponding relation between the parameter packet of the specified type image statistical parameters and the image block.
The present embodiment also provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute some or all of the steps of any of the methods described in the above embodiments.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the above-described relevant steps to implement any of the methods of the above-described embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component, or a module, and may include a processor and a memory connected to each other; the memory is configured to store computer-executable instructions that, when the device is operated, are executable by the processor to cause the chip to perform any one of the method embodiments described above.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment is configured to execute the corresponding method provided above; for its beneficial effects, reference may be made to the beneficial effects of the corresponding method provided above, and details are not repeated here.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, or the part contributing to the prior art, or all or part of the technical solutions, may essentially be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method for parameter transmission for an application processor, the method comprising:
receiving an image statistics parameter packet sent by an image processor, wherein the image statistics parameter packet is generated by the image processor based on image statistics parameters of at least one image block in an image, the image statistics parameters being acquired by the image processor;
if the number of the parameter packets of the specified type of image statistical parameters in the received image statistical parameter packets reaches a preset threshold, receiving index information corresponding to the specified type of image statistical parameters sent by the image processor, wherein the index information is used for representing the corresponding relation between the parameter packets of the specified type of image statistical parameters and the image blocks;
And acquiring display parameters of the image based on the index information and the received parameter packet of the specified type of image statistical parameters.
2. The method according to claim 1, wherein the acquiring the display parameters of the image based on the index information and the received parameter packet of the specified type of image statistical parameters includes:
unpacking the received image statistical parameter packet based on the index information to obtain a target image statistical parameter of the specified type of image statistical parameter;
invoking a target image processing algorithm corresponding to the target image statistical parameter;
and processing the image according to the target image processing algorithm to obtain the display parameters of the image.
3. The method of claim 2, wherein prior to said receiving index information corresponding to said specified type of image statistics parameters transmitted by said image processor, said method further comprises:
and receiving a notification message sent by the image processor, wherein the notification message is sent to the application processor by the image processor in a preset interrupt mode, and the notification message is used for indicating that the number of parameter packets of the specified type of image statistical parameters reaches the preset threshold value.
4. The method according to any one of claims 1-3, wherein the image statistics parameter packet comprises: a packet header, valid image statistics parameters, and a packet tail, wherein the valid image statistics parameters correspond to at least one type of image statistics parameters.
5. The method of claim 4, wherein the packet header comprises a packet header flag, an index packet flag, and a packet data length.
6. The method of claim 4, wherein the packet tail comprises: an end-of-packet flag, a parameter packet count, and a frame count.
7. The method according to any one of claims 1-3, wherein said receiving an image statistics parameter packet sent by an image processor comprises:
receiving the image statistics parameter packet sent by the image processor through a preset virtual channel.
8. A method according to any one of claims 1-3, wherein prior to said receiving an image statistics parameter packet sent by an image processor, the method further comprises:
transmitting and receiving the image statistics parameter packet through a target channel, wherein the target channel is determined by the image processor acquiring target attribute information corresponding to the image statistics parameter packet and determining, according to a preset mapping relation between attribute information and channels, the target channel corresponding to the target attribute information.
9. An application processor, wherein the application processor is configured to perform the following operations:
receiving an image statistics parameter packet sent by an image processor, wherein the image statistics parameter packet is generated by the image processor based on image statistics parameters of at least one image block in an image, the image statistics parameters being acquired by the image processor;
if the number of the parameter packets of the specified type of image statistical parameters in the received image statistical parameter packets reaches a preset threshold, receiving index information corresponding to the specified type of image statistical parameters sent by the image processor, wherein the index information is used for representing the corresponding relation between the parameter packets of the specified type of image statistical parameters and the image blocks;
and acquiring display parameters of the image based on the index information and the received parameter packet of the specified type of image statistical parameters.
10. An electronic device, comprising an application processor, an image processor, and a memory, the memory being configured to store one or more programs configured to be executed by the application processor, the one or more programs comprising instructions for performing the steps of the method according to any one of claims 1-8.
CN202011472314.3A 2020-12-11 2020-12-11 Parameter transmission method and device for application processor and storage medium Active CN114630153B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011472314.3A CN114630153B (en) 2020-12-11 2020-12-11 Parameter transmission method and device for application processor and storage medium

Publications (2)

Publication Number Publication Date
CN114630153A CN114630153A (en) 2022-06-14
CN114630153B true CN114630153B (en) 2023-12-19

Family

ID=81896568

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011472314.3A Active CN114630153B (en) 2020-12-11 2020-12-11 Parameter transmission method and device for application processor and storage medium

Country Status (1)

Country Link
CN (1) CN114630153B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014125289A1 (en) * 2013-02-15 2014-08-21 University Of Warwick Multisensory data compression
CN107578372A (en) * 2017-10-31 2018-01-12 广东欧珀移动通信有限公司 Image processing method, device, computer-readable recording medium and electronic equipment
CN111768352A (en) * 2020-06-30 2020-10-13 Oppo广东移动通信有限公司 Image processing method and device

Also Published As

Publication number Publication date
CN114630153A (en) 2022-06-14

Similar Documents

Publication Publication Date Title
CN115473957B (en) Image processing method and electronic equipment
CN114650363B (en) Image display method and electronic equipment
CN111553846B (en) Super-resolution processing method and device
CN111768352B (en) Image processing method and device
CN111882642B (en) Texture filling method and device for three-dimensional model
CN115550556B (en) Exposure intensity adjusting method and related device
US20230335081A1 (en) Display Synchronization Method, Electronic Device, and Readable Storage Medium
CN114945019B (en) Data transmission method, device and storage medium
CN115119048B (en) Video stream processing method and electronic equipment
CN116048831B (en) Target signal processing method and electronic equipment
WO2023000745A1 (en) Display control method and related device
CN114172596B (en) Channel noise detection method and related device
CN114630152A (en) Parameter transmission method and device for image processor and storage medium
CN114630153B (en) Parameter transmission method and device for application processor and storage medium
CN111836226B (en) Data transmission control method, device and storage medium
CN116561085A (en) Picture sharing method and electronic equipment
CN114336998A (en) Charging control method, charging control device and storage medium
CN116095512B (en) Photographing method of terminal equipment and related device
CN114596819B (en) Brightness adjusting method and related device
CN113311380B (en) Calibration method, device and storage medium
CN114339607B (en) Ranging method, ranging device and storage medium
CN116703741B (en) Image contrast generation method and device and electronic equipment
CN116266159B (en) Page fault exception handling method and electronic equipment
CN115175164B (en) Communication control method and related device
CN116193275B (en) Video processing method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant