WO2022170866A1 - Data transmission method, apparatus, and storage medium - Google Patents

Data transmission method, apparatus, and storage medium

Info

Publication number: WO2022170866A1
Authority: WIPO (PCT)
Prior art keywords: image, data, statistical data, statistical, image data
Application number: PCT/CN2021/141257
Other languages: English (en), French (fr)
Inventor: 刘君
Original assignee: Oppo广东移动通信有限公司
Application filed by Oppo广东移动通信有限公司
Publication of WO2022170866A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/80: Responding to QoS
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L49/00: Packet switching elements
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/14: Picture signal circuitry for video frequency region

Definitions

  • the present application relates to the field of electronic technology, and in particular, to a data transmission method, device, and storage medium.
  • video noise reduction algorithms are mostly implemented in the application processor (AP) of a mobile phone, and the statistical decision-making for the image signal processor (ISP) also takes place entirely within the AP.
  • APs generally adopt architectures that combine a general-purpose central processing unit (CPU), a neural-network processing unit (NPU), and a digital signal processor (DSP), allowing such algorithms to achieve a very high energy-efficiency ratio.
  • the statistical information produced by the chip's ISP is generally packaged and transmitted to the AP at the end of each frame. Because the ISP and the NPU themselves introduce large delays, packaging the statistical information at the end of the frame in a common format seriously degrades the real-time performance of reception at the AP and delays the AP's decision-making. How to realize real-time transmission of image statistical information is therefore a problem that urgently needs to be solved.
  • Embodiments of the present application provide a data transmission method, device, and storage medium, which can transmit image statistical data in real time.
  • an embodiment of the present application provides a data transmission method, the method includes:
  • the data packet group is transmitted to the application processor.
  • an embodiment of the present application provides a data transmission method, the method includes:
  • the group of data packets includes one or more statistical data packets generated based on image block statistics of one or more image data blocks contained in the image data stream;
  • an unpacking operation is performed on the statistics packet to obtain image block statistics for the one or more image data blocks.
  • an image processing apparatus including:
  • a statistical unit for generating image block statistical data based on the image data blocks contained in the image data stream
  • a packing unit for generating statistical data packets based on the image block statistical data of one or more image data blocks, and packing one or more of the statistical data packets into a data packet group;
  • a transmission unit configured to transmit the data packet group to the application processor.
  • an application processor including:
  • a receiving unit configured to receive a data packet group, wherein the data packet group includes one or more statistical data packets, and the statistical data packets are generated based on image block statistical data of one or more image data blocks included in the image data stream;
  • an unpacking unit configured to, in response to the interrupt information, perform an unpacking operation on the statistical data packets to obtain image block statistical data of the one or more image data blocks.
  • an electronic device including:
  • the image processing device is configured to generate image block statistical data based on the image data blocks contained in the image data stream, generate statistical data packets based on the image block statistical data of one or more image data blocks, and pack one or more of the statistical data packets into a data packet group;
  • an application processor for receiving a data packet group from the image processing device
  • the image processing device transmits interrupt information to the application processor, so that the application processor unpacks the statistical data packets to obtain image block statistical data of the one or more image data blocks.
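The pack/transmit/unpack flow described above can be sketched in Python. The packet layout used here (a small header carrying a block index, a statistics type, and a payload length, followed by the raw statistics bytes) is a hypothetical illustration, not a format defined by this application:

```python
import struct

# Hypothetical header: block_index (u16), stats_type (u16), payload_length (u32),
# little-endian. The real on-wire format is not specified in this text.
HDR = struct.Struct("<HHI")

def pack_block_stats(block_index, stats_type, payload):
    """Wrap one image block's statistics into a statistical data packet."""
    return HDR.pack(block_index, stats_type, len(payload)) + payload

def pack_group(packets):
    """Pack one or more statistical data packets into a data packet group."""
    return b"".join(packets)

def unpack_group(group):
    """Unpacking operation: recover (block_index, stats_type, payload) tuples."""
    out, off = [], 0
    while off < len(group):
        block_index, stats_type, length = HDR.unpack_from(group, off)
        off += HDR.size
        out.append((block_index, stats_type, group[off:off + length]))
        off += length
    return out
```

Because each statistical data packet covers only one image block, a group can be transmitted as soon as the corresponding blocks are processed, rather than waiting for the end of the frame.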
  • an embodiment of the present application provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute some or all of the steps described in the first aspect of the embodiments of the present application.
  • an embodiment of the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to execute some or all of the steps described in the first aspect of the embodiments of the present application.
  • the computer program product may be a software installation package.
  • FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a software structure of an electronic device provided by an embodiment of the present application.
  • FIG. 3A is a schematic flowchart of a data transmission method provided by an embodiment of the present application.
  • FIG. 3B is a schematic structural diagram of a data packet provided by an embodiment of the present application.
  • FIG. 3C is a schematic diagram illustrating data transmission for an image processor provided by an embodiment of the present application.
  • FIG. 3D is a schematic diagram illustrating another data transmission provided by an embodiment of the present application.
  • FIG. 3E is a schematic diagram illustrating an unpacking operation provided by an embodiment of the present application.
  • FIG. 3F is a schematic flowchart of an unpacking operation provided by an embodiment of the present application.
  • FIG. 3G is a schematic flowchart of another data transmission method provided by an embodiment of the present application.
  • FIG. 3H is a schematic diagram illustrating another data transmission provided by an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of another data transmission method provided by an embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of another electronic device provided by an embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of another electronic device provided by an embodiment of the present application.
  • FIG. 8 is a block diagram of functional units of an image processing apparatus provided by an embodiment of the present application.
  • FIG. 9 is a block diagram of functional units of an application processor provided by an embodiment of the present application.
  • the electronic devices may include various devices with computer functions, such as handheld devices (smartphones, tablet computers, etc.), vehicle-mounted devices (navigators, auxiliary reversing systems, driving recorders, vehicle-mounted refrigerators, etc.), wearable devices (smart bracelets, wireless headsets, smart watches, smart glasses, etc.), computing devices or other processing devices connected to wireless modems, various forms of user equipment (UE) and mobile stations (MS), virtual reality/augmented reality devices, terminal devices, etc.
  • the electronic device may also be a base station or a server.
  • the electronic devices may also include smart home devices, and the smart home devices may be at least one of the following: smart speakers, smart cameras, smart rice cookers, smart wheelchairs, smart massage chairs, smart furniture, smart dishwashers, smart TVs, smart refrigerators, smart fans, smart heaters, smart drying racks, smart lights, smart routers, smart switches, smart switch panels, smart humidifiers, smart air conditioners, smart doors, smart windows, smart cooktops, smart disinfection cabinets, smart toilets, sweeping robots, etc., which are not limited here.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, a compass 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processor (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processor (NPU), etc. Different processing units may be independent components, or may be integrated in one or more processors.
  • electronic device 100 may also include one or more processors 110 .
  • the controller can generate an operation control signal according to the instruction operation code and the timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in the processor 110 may be a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses repeatedly. If the processor 110 needs to use the instruction or data again, it can be fetched directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency with which the electronic device 100 processes data or executes instructions.
  • the processor may also include an image processor, and the image processor may be an image preprocessor (preprocess image signal processor, Pre-ISP), which can be understood as a simplified ISP that can also perform some image processing operations and obtain image statistical information.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM card interface, and/or a USB interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices.
  • the USB interface 130 can also be used to connect an earphone, and play audio through the earphone.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, the wireless communication module 160, and the like.
  • the power management module 141 can also be used to monitor data such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be used to cover a single communication frequency band or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G/6G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and others.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini light-emitting diode (Mini LED), a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • electronic device 100 may include one or more display screens 194 .
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • when the shutter is opened, light is transmitted through the lens to the camera's photosensitive element, where the optical signal is converted into an electrical signal; the photosensitive element passes the electrical signal to the ISP for processing, converting it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other data of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • an optical image of the object is projected through the lens onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
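As an illustration of the RGB-to-YUV conversion mentioned above (not a method defined by this application), a full-range BT.601 conversion for one pixel can be sketched as:

```python
def rgb_to_yuv(r, g, b):
    """Full-range BT.601 RGB -> YUV for one pixel (illustrative sketch).

    Y is luma; U and V are chroma offsets centered at 128 for 8-bit data.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.500 * b + 128
    v = 0.500 * r - 0.419 * g - 0.081 * b + 128
    return y, u, v
```

For example, pure white (255, 255, 255) maps to maximum luma with neutral chroma (Y = 255, U = V = 128).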
  • the electronic device 100 may include one or more cameras 193 .
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy and so on.
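The frequency-point energy analysis mentioned above can be illustrated with a direct (unoptimized) discrete Fourier transform term; real DSPs would use an FFT, and this sketch is only meant to show what "energy at a frequency point" computes:

```python
import math

def dft_bin_energy(samples, k):
    """Energy at DFT frequency bin k of a real-valued sample sequence."""
    n = len(samples)
    re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(samples))
    im = -sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(samples))
    return re * re + im * im
```

A pure cosine at bin 1 concentrates its energy there, with (near) zero energy at bin 0.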
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, such as: Moving Picture Experts Group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, saving files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store one or more computer programs including instructions.
  • the processor 110 may execute the above-mentioned instructions stored in the internal memory 121, thereby causing the electronic device 100 to execute the method for displaying page elements, various applications and data processing provided in some embodiments of the present application.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the stored program area may store the operating system; the stored program area may also store one or more applications (such as gallery, contacts, etc.) and the like.
  • the storage data area may store data (such as photos, contacts, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage components, flash memory components, universal flash storage (UFS), and the like.
  • the processor 110 may cause the electronic device 100 to execute the methods for displaying page elements provided in the embodiments of the present application, as well as other applications and data processing, by executing the instructions stored in the internal memory 121 and/or the instructions stored in the memory provided in the processor 110.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may consist of at least two parallel plates of conductive material. When force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
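The intensity-dependent dispatch in the short-message example above can be sketched as follows; the threshold value is purely illustrative, since the text does not specify one:

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized intensity threshold

def dispatch_touch(intensity):
    """Map a touch on the short-message icon to an instruction by intensity."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view short message"
    # intensity >= threshold
    return "create new short message"
```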
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
  • the angular velocity of the electronic device 100 about three axes (i.e., the X, Y, and Z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by the low temperature.
  • the electronic device 100 may also boost the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
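The temperature processing strategy described in the preceding bullets can be sketched as a simple policy function; the threshold values are hypothetical placeholders, not values given in the text:

```python
HIGH_TEMP_C = 45.0  # hypothetical upper threshold
LOW_TEMP_C = 0.0    # hypothetical lower threshold

def thermal_policy(temp_c):
    """Sketch of the temperature processing strategy described above."""
    if temp_c > HIGH_TEMP_C:
        # thermal protection: reduce performance of the nearby processor
        return "reduce processor performance"
    if temp_c < LOW_TEMP_C:
        # low-temperature handling: heat the battery / boost its output voltage
        return "heat battery or boost battery output voltage"
    return "normal operation"
```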
  • the touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
  • FIG. 2 shows a software structural block diagram of the electronic device 100 .
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application layer can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short messages.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • Data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide the communication function of the electronic device 100 .
  • the management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part contains the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules, for example: a surface manager, a media library, a 3D graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • FIG. 3A is a schematic flowchart of a data transmission method provided by an embodiment of the present application.
  • the data transmission method is applied to an electronic device as shown in FIG. 1 or FIG. 2, or to the image processor of the electronic device; as shown in the figure, the data transmission method includes:
  • the image data may be at least one of the following: original image data, pixel data (PD), processed image data, etc., which are not limited herein.
  • the above image can be any image in the video stream.
  • the processed image data may be the original image data processed by a preset image processing algorithm, and the preset image processing algorithm may be one or more of various image processing algorithms; for example, it may be at least one of the following: a white balance algorithm, a wavelet transform algorithm, a histogram equalization algorithm, a neural network algorithm, etc., which are not limited here.
  • the image processor may be an image preprocessor or a companion chip.
  • the image processor may acquire image block statistical data of at least one image data block in the image, that is, image statistical data, during the image acquisition process, and the image block statistical data may be at least one of the following: automatic exposure (AE) image statistics, automatic focus (AF) image statistics, AWB image statistics, automatic lens shading correction (LSC) image statistics, automatic water ripple (flicker, FLK) image statistics, etc., which are not limited herein. Therefore, the type of image statistical data may be at least one of the following: AE, AF, AWB, LSC, FLK, etc., which are not limited herein.
  • the image may be original image data
  • the image processor may acquire the original image data pixel by pixel, that is, the image processor acquires the original image data by scanning line by line; it can be understood that, to obtain all the original image data, all the pixel points must be scanned.
  • the area of an image data block can be divided according to preset rules. For example, every 5 rows of pixel points can be treated as an image data block; as another example, the area (position) of each image data block can be pre-planned, and when all the pixels in the area corresponding to an image data block have been scanned, that image data block is obtained.
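As a rough illustration of the row-based division rule above, the following sketch computes block boundaries for a frame. The block height of 5 rows mirrors the example in the text; the function name and the handling of a partial final block are assumptions for illustration only.

```python
# Hypothetical sketch: dividing a raw frame into image data blocks of
# ROWS_PER_BLOCK scan lines each. The constant and partial-block handling
# are assumptions, not part of the patent's specification.
ROWS_PER_BLOCK = 5

def block_ranges(total_rows, rows_per_block=ROWS_PER_BLOCK):
    """Return (start_row, end_row) pairs covering the frame; the last
    block may be shorter if total_rows is not an exact multiple."""
    ranges = []
    for start in range(0, total_rows, rows_per_block):
        ranges.append((start, min(start + rows_per_block, total_rows)))
    return ranges
```

Under this scheme, block statistics for a given block can start as soon as its last row has been scanned, matching the line-by-line acquisition described above.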
  • the original image data can be understood as a plurality of image data blocks, and each image data block may contain part of the original image data.
  • the original image data may be raw data of one frame or multiple frames of images.
  • the image processor can start to acquire the image statistical data of any image data block after successfully acquiring that image data block. For example, when the original image data has been loaded up to the jth row, an image data block is considered obtained, and the acquisition of that block's image statistical data can begin. As another example, every time a preset number of pixels of the original image data is loaded, an image data block can be considered obtained and the corresponding image statistical data can be acquired; the preset number can be set by the user or by system default. That is, in this embodiment of the present application, the image statistical data may be acquired during the loading process of the original image data.
  • the image processor may package the image data at time T and package the image statistical data at time T+1, and further, multiple data packets may be obtained. Based on the arbitration and packaging mechanism, the statistical decision flow and the video image processing flow can be completely separated to ensure real-time decision-making. Furthermore, the various statistical information packets are transmitted to the AP in real time in an out-of-order manner, avoiding storage overhead on the companion chip and ensuring, to the greatest extent, the real-time reception by the AP.
  • step 302 generating a statistical data package based on the image block statistical data of one or more image data blocks, can be implemented as follows:
  • the data to be packaged is selected from the image data and the statistical data of the image block in a random manner, and the data to be packaged is packaged to obtain the statistical data package.
  • the image processor may randomly select the statistical data of the image blocks or the data in the image data for packaging, and further, the statistical data packet may be obtained.
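The random selection for packaging described above can be sketched as a simple arbiter that drains two pending queues in random order. The queue representation and per-item packet granularity here are illustrative assumptions, not the patent's actual packet format.

```python
import random

# Illustrative sketch of the arbitration-and-packaging step: the data to
# be packaged is picked at random from the pending image-data queue and
# the pending block-statistics queue, yielding an interleaved stream.
def arbitrate(image_queue, stats_queue, rng=random):
    packets = []
    while image_queue or stats_queue:
        candidates = [q for q in (image_queue, stats_queue) if q]
        src = rng.choice(candidates)      # random arbitration between sources
        packets.append(src.pop(0))        # preserve order within each source
    return packets
```

Note that the order within each source queue is preserved; only the interleaving between image data and statistics is randomized, which is what lets the statistics travel without waiting for the image stream.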
  • the image processor may also generate a corresponding statistical data packet based on the image block statistical data of one or more image data blocks, and send the statistical data packet to the application processor.
  • Statistics packets can be understood as one or more packets.
  • the image processor may use at least one data transmission channel to send the image statistics data packet to the application processor. For example, the image processor may sequentially send the image statistics packets to the application processor using a data transmission channel. For another example, the image processor may divide the image statistical data packet into multiple data sets, each data set corresponds to one or part of the image statistical data packet, and use at least one data transmission channel to send the multiple data sets to the application processor. Transmission channels may correspond to data sets one-to-one, and each data transmission channel may correspond to a process or thread.
  • the image processor can not only send the image statistics data package to the application processor, but also send the original image data to the application processor.
  • the image statistical data packets can be loaded into the transmission channel corresponding to the original image data in real time and transmitted to the application processor using MIPI.
  • the original image data and the image statistics data packets can be sent to the application processor separately.
  • the image statistical data packets can be sent to the application processor first, and after the image statistical data packets have been sent, the original image data is sent to the application processor.
  • the application processor can perform an unpacking operation on the image statistical data packets and invoke the corresponding algorithm based on the unpacked image statistical data. In this way, the corresponding algorithm can be prepared in advance, before the original image data has been completely sent to the application processor, which helps to improve the efficiency of image processing.
  • any data packet in the image statistical data packets includes: a packet header (PH), effective image statistical data, and a packet footer (PF); the effective image statistical data of each data packet can correspond to at least one type of image statistics.
  • the image statistical data packet may include a packet header PH, valid image statistical data and a packet tail PF
  • the packet header may be used to mark the start position of a data packet
  • the effective image statistical data is part of a class of image statistical data
  • all valid image statistical data in the data packets corresponding to all image data blocks of the original image data constitute complete image statistical data of the original image data
  • the end of the packet can be used to indicate the end position of the data packet.
  • the lengths of valid image statistics in the data packets of each type of image statistics may be the same or different.
  • the packet header includes a packet header mark, an index packet mark and a packet data length.
  • the packet header may include: packet header mark, index packet mark and packet data length
  • the packet header mark is used to indicate the statistical type of the current data packet (image statistical data packet)
  • the index packet mark is used to indicate whether the current data packet is statistical data or an independent index packet
  • the packet data length is used to indicate the data length of the current data packet
  • the packet tail includes: a packet tail marker, a data packet count and a frame count.
  • the packet tail may include: a packet tail mark, a data packet count, and a frame count; the packet tail mark is used to indicate the position of the packet tail, the data packet count is used to indicate the ordinal number of the data packet among the data packets of the current statistical type, and the frame count indicates which frame of original image data the data packet comes from, as shown in the following table:
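A minimal sketch of packing and parsing a packet with the header (PH) and tail (PF) fields just described. The 32-bit little-endian field widths, the tail-mark value, and the field ordering are assumptions for illustration; the patent does not fix a concrete binary layout here.

```python
import struct

# Assumed layouts: three 32-bit little-endian words each for PH and PF.
PH_FMT = "<III"   # header mark, index-packet mark, packet data length
PF_FMT = "<III"   # tail mark, data packet count, frame count

def make_packet(header_mark, is_index, payload, pkt_count, frame_count):
    ph = struct.pack(PH_FMT, header_mark, int(is_index), len(payload))
    pf = struct.pack(PF_FMT, 0xFFFF, pkt_count, frame_count)  # assumed tail mark
    return ph + payload + pf

def parse_packet(raw):
    hm, idx, length = struct.unpack_from(PH_FMT, raw, 0)
    payload = raw[12:12 + length]                  # PH is 12 bytes
    tm, cnt, frm = struct.unpack_from(PF_FMT, raw, 12 + length)
    return {"type": hm, "is_index": bool(idx), "payload": payload,
            "count": cnt, "frame": frm}
```

The packet data length in the header is what lets a receiver walk a stream of variable-length statistics packets, and the frame count in the tail ties each packet back to its source frame.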
  • the above step 302 generating a statistical data package based on the image block statistical data of one or more image data blocks, may include the following steps:
  • the original image data, the processed image data and the image block statistical data are arbitrated and packaged to obtain the statistical data package.
  • when the image data includes the original image data and the processed image data, the original image data, the processed image data, and the image block statistical data are arbitrated and packaged to obtain the statistical data packets; that is, data is randomly selected from the original image data, the processed image data, and the image block statistical data for packaging.
  • the raw image can be bypassed from the image processing path and transmitted to the AP in real time using the MIPI bandwidth, without the need for a large-bandwidth interface; at the same time, the AP can take pictures with zero delay.
  • step 302 generating a statistical data package based on image block statistical data of one or more image data blocks, may further include the following steps:
  • the system data, the image data and the one or more image data blocks are arbitrated and packaged to obtain the statistical data package.
  • the system data may be at least one of the following: log data, metadata (MD), etc., which are not limited here.
  • the image processor can arbitrate and package the system data, the image data, and the one or more image data blocks to obtain the statistical data packets; that is, data is randomly selected from the system data, the processed image data, and the image block statistical data for packaging.
  • the preset number may be set by the user or the system defaults.
  • the preset number may be the number of a part of the image data blocks, or may also be the number of all the image data blocks.
  • the image processor may transmit a data packet group to the application processor; the data packet group has a predetermined size and includes one or more statistical data packets. After the application processor receives the block statistical data packets of the predetermined number of image data blocks, based on the interrupt information from the image processor (the interrupt information is used to notify the application processor that the block statistical data has been transmitted), a reconstruction operation can be performed on the predetermined number of block statistical data packets to obtain the image statistical data.
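The reconstruction step above can be sketched as follows: once the interrupt signals that all block statistical data packets have arrived, the packets are ordered by their packet count and their payloads concatenated to recover the complete image statistical data. The dictionary packet shape and field names are assumptions for illustration.

```python
# Hedged sketch of the reconstruction operation on received block
# statistics packets. Out-of-order arrival is tolerated because the
# per-packet count field (see the PF description above) gives the order.
def reconstruct(block_packets):
    ordered = sorted(block_packets, key=lambda p: p["count"])
    return b"".join(p["payload"] for p in ordered)
```

This is why out-of-order transmission costs nothing on the receiving side: ordering is deferred to a single sort at reconstruction time.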
  • the data packet group further includes system data packets and/or image data packets of the one or more image data blocks.
  • system data packet may be a data packet obtained by packaging system data
  • the system data may be at least one of the following: log data, metadata (MD), etc., which are not limited herein.
  • the image data package includes raw image data and/or processed image data.
  • the image processor can also package at least one type of data among the original image data of the image and the processed image data corresponding to the original image data to obtain an image data packet; the processed image data may be the original image data processed by a preset image processing algorithm.
  • a statistical data packet is generated based on image block statistical data of one image data block, and the statistical data packet is packaged into the data packet group.
  • when receiving the statistical data of an image data block, the image processor can generate a statistical data packet based on the received statistical data stream, add the statistical data packet to the data packet group, and send the data packet group to the application processor; in this way, timely data transmission can be ensured.
  • a statistical data packet is generated based on image block statistical data of Q image data blocks, and the statistical data packet is packaged into the data packet group, where Q is a number greater than 1.
  • when the statistical data of Q image data blocks has been accumulated, the image processor may generate a statistical data packet based on the received image block statistical data of the Q image data blocks, add the statistical data packet to the data packet group (Q is an integer greater than 1), and send the data packet group to the application processor. Accumulating a certain amount of data before packaging can reduce the power consumption of the device.
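The Q-block accumulation described above can be sketched as a simple batching loop; the stream representation and the return of leftover blocks are assumptions for illustration.

```python
# Sketch of accumulating the statistics of Q image data blocks before
# generating one statistical data packet (Q > 1), as described above.
def batch_blocks(block_stats_stream, q):
    batches, pending = [], []
    for stats in block_stats_stream:
        pending.append(stats)
        if len(pending) == q:          # Q blocks accumulated: package them
            batches.append(list(pending))
            pending.clear()
    return batches, pending            # pending holds any leftover blocks
```

The trade-off named in the text is visible here: larger Q means fewer packets (lower power), at the cost of higher latency before the AP sees each batch.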
  • the image processor can also send the data packets to the application processor one by one, or send them in a batch when the number of accumulated data packets reaches a set number.
  • the set number can be set by the user or the system defaults.
  • the image processor can send data packets through virtual channels or data channels.
  • the image processor can use MIPI long packets and DataType to transmit debug log information to the AP for debugging, without the need for a debug port, and can also use the short packets reserved by the MIPI protocol to transmit interrupt sources and interrupt information in real time, without additional port communication.
  • the above step 303, transmitting the data packet group to the application processor may be implemented as follows:
  • the data packet group is sent to the application processor through a preset virtual channel.
  • the preset virtual channel can be set by the user or the system defaults, and the image processor can send the data packet group to the application processor through the preset virtual channel.
  • one or more virtual channels may be used to send the data packet group to the application processor, and each virtual channel may correspond to a thread or process.
  • interrupt information is provided to the application processor to notify the application processor to perform unpacking of the statistics packets.
  • the interrupt method of the interrupt information can be a general purpose input output (GPIO) interrupt method, and the interrupt information can be used to notify the application processor that the block statistical data has been transmitted. After the statistical data packets of the predetermined number of image data blocks are received, interrupt information is provided to the application processor to notify it that the block statistical data has been transmitted; further, the application processor thereby knows that the image processor has completed the data transmission.
  • interruption information and the statistical data packet are located in different data packet groups.
  • the interrupt information and the statistical data packets may be located in different data packet groups; further, the interrupt information may be placed after the statistical data packets, so that the application processor can be notified immediately after the statistical data packets have been transmitted and can quickly know that the block statistics have been transferred.
  • step 303 transmitting the data packet group to the application processor, may include the following steps:
  • the attribute information may be at least one of the following: the data type of the data in the data packet, the number of data bits (data length) of the data in the data packet, the type of the data packet (image statistics data type) etc., which are not limited here, where the data type can be at least one of the following: floating point (single-precision, double-precision), integer, etc., which are not limited here, and the data packet type can be At least one of the following: AE, AF, AWB, LSC, FLK, etc., which are not limited here.
  • different types of image statistics data packets may correspond to different channels, or different data lengths may correspond to different channels.
  • a mobile industry processor interface (MIPI) channel may include 3 image data interfaces (IDI); the corresponding channels can be IDI0, IDI1, and IDI2, where k1-type image statistics can correspond to IDI0 and k2-type image statistics can correspond to IDI1.
  • a preset mapping relationship between attribute information and channels may be pre-stored in the memory of the electronic device.
  • the data packet i is any data packet in the image statistics data packet
  • the image processor can obtain the target attribute information corresponding to data packet i, determine the target channel corresponding to the target attribute information from the preset mapping relationship between attribute information and channels, and transmit data packet i through the target channel. In this way, a channel can be allocated according to the attributes of each data packet, which helps to improve the data transmission efficiency.
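The attribute-to-channel lookup just described can be sketched as a plain mapping, following the example above (k1-type statistics to IDI0, k2-type to IDI1). The concrete attribute keys and the IDI2 entry are assumptions for illustration.

```python
# Illustrative preset mapping from data-packet attribute information to a
# MIPI IDI channel. Keys are (packet kind, packet type) pairs; the exact
# attribute schema is an assumption, not the patent's specification.
CHANNEL_MAP = {
    ("stats", "k1"): "IDI0",
    ("stats", "k2"): "IDI1",
    ("image", "raw"): "IDI2",
}

def route(packet_kind, packet_type, mapping=CHANNEL_MAP):
    """Return the target channel for a packet, or None if unmapped."""
    return mapping.get((packet_kind, packet_type))
```

A per-attribute table like this is what allows different statistics types (or different data lengths) to travel on different channels concurrently.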
  • the image processor can send some of the image statistical data packets to the application processor through a preset virtual channel, and send the remaining data packets to the application processor through the channels selected according to their attribute information.
  • the two different sending methods can be performed synchronously or asynchronously.
  • one thread or process can be used to send some of the image statistical data packets to the application processor through the preset virtual channel, and another thread or process can be used to send the remaining data packets to the application processor through the channels selected according to their attribute information.
  • if the number of data packets of the designated type of image statistical data among the sent statistical data packets reaches a predetermined threshold, the index information corresponding to the designated type of image statistical data is sent to the application processor, so that the application processor obtains the display data of the image based on the index information and the received data packets of the designated type of image statistical data, where the index information is used to represent the correspondence between the designated type of image statistical data packets and the image data blocks.
  • the predetermined threshold may be set by the user or the system defaults.
  • the specified type of image statistics can be set by the user or the system defaults.
  • the display data of the image may be at least one of the following: display brightness, pixel value, display color, resolution, contrast, sharpness, etc., which are not limited herein.
  • the specified type of image statistical data may be at least one type of image statistical data in the image.
  • the transmission of the remaining statistical data packets corresponding to the designated type of image statistical data to the application processor can be stopped; in this way, the power consumption of the device can be reduced.
  • the index information corresponding to the designated type of image statistical information may also be sent to the application processor.
  • the index information can be placed in an index data packet; the index data packet can contain a target index table, and the target index table records the relevant information of the data packets corresponding to the designated type of image statistical information. The relevant information can include at least one of the following: the index packet mark of the data packet, the storage location of the image statistical data in the data packet, etc., which are not limited here.
  • the offset of the target index table of the specified type of image statistical data in the total data packets can be stored in the image processor in advance.
  • the offset corresponding to the target index table can be given to the application processor, and the application processor can thereby obtain the target index table of the designated type of image statistical information. Further, from the multiple data packets the application processor has already received, the image statistical data packets corresponding to the target index table are unpacked according to the index order of the target index table to obtain the designated type of image statistical data, so that the algorithm corresponding to this type of image statistical information can be retrieved and the corresponding image processing operations performed. In this way, the data packets corresponding to any type of image statistical data can be unpacked without waiting for all the data packets to be transmitted, which helps to improve image processing efficiency.
  • the following steps may be further included:
  • the preset interruption mode may be set by the user or the system defaults.
  • the preset interrupt mode may be a general-purpose input/output interrupt mode or a mobile industry processor interface (MIPI) channel.
  • an additional data packet is sent to the AP through the MIPI channel, and the additional data packet is used to notify the AP that the specified type of image statistical data has completed the transmission task.
  • the image processor can send a notification message to the application processor through a preset interrupt method; the notification message is used to indicate that the number of data packets of the specified type of image statistical data has reached the predetermined threshold.
  • the 3A image statistical data may include AE image statistical data, AF image statistical data, and AWB image statistical data; these three types of image statistical data can be formed into 3A image statistical data packets, that is, at least one data packet is obtained. Each type of image statistical data in each data packet can correspond to an image statistical data index table; alternatively, each type of image statistical data in each frame of image can correspond to an image statistical data index table. The image statistical data index table can be an AE image statistical data index table, an AWB image statistical data index table, an AF image statistical data index table, etc.
  • packets are sent sequentially according to the index of the index table. After each type of image statistical data has been sent, the image statistical data index table corresponding to that image statistical data can be sent to the application processor, and the application processor can perform the unpacking operation according to the index order corresponding to the image statistical data index table.
  • the at least one data packet may be sent out of order or in order. Out-of-order can be understood as sending a data packet of one type of image statistical data at one moment and a data packet of another type of image statistical data at the next moment; in-order can be understood as sending the data packets of one type of image statistical data within one time period, and sending the data packets of another type of image statistical data once that type has been completely sent.
  • the image processor can send the statistical data packets one by one to the application processor through MIPI.
  • the index of each data packet is recorded.
  • the GPIO interrupt can be set, and the application processor can search the index corresponding to the specified type of image statistics data through the index table.
  • as shown in FIG. 3E, when the transmission of each type of image statistical data by the image processor is completed, the offset of the index table corresponding to that type of image statistical data can be sent to the application processor. The application processor can locate the index table according to the offset, unpack the corresponding data packets, and arrange the unpacked effective image statistical data in the order of the index table to obtain the final image statistical data of that type. For example, if the corresponding index table includes data packets f, j, n, q, and t, those data packets can be unpacked according to the index table.
  • when the application processor has completely received a certain type of image statistical data, it can read the index packet corresponding to the data packets whose statistics are complete.
  • the index packet content can be read sequentially in units of 32 bits: jump to the index position index_n, read the 32 bits of the packed data starting from index_n as the packet header mark (PH), parse the PH content, read the length of the data segment corresponding to the index, copy the statistical data segment to the target buffer, and check whether the traversal of the index packet has finished; if so, the unpacking of the current image statistical data is complete; otherwise, repeat from reading the 32 bits of the packed data starting from the next index position, until the index packet traversal is completed.
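The index-driven unpacking loop can be sketched as follows: for each 32-bit index entry, jump to that offset in the packed data, read a 32-bit packet header word, take the data-segment length from it, and copy the statistical data segment to the target buffer. The header encoding (length stored in the low 16 bits of the PH word) is an assumption for illustration; the patent does not fix the PH bit layout.

```python
import struct

# Sketch of traversing an index packet and unpacking the referenced
# statistical data segments from the packed byte stream.
def unpack_by_index(packed, index_entries):
    out = bytearray()
    for index_n in index_entries:                 # traverse the index packet
        (ph,) = struct.unpack_from("<I", packed, index_n)
        seg_len = ph & 0xFFFF                     # assumed PH length field
        out += packed[index_n + 4: index_n + 4 + seg_len]
    return bytes(out)                             # traversal finished
```

Because the index packet lists every segment offset, the AP can unpack one statistics type as soon as its index arrives, without scanning the whole stream.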
  • the AP uses secure digital input and output (SDIO) to inquire which image statistical data is currently complete and, at the same time, obtains the starting position of the index packet of that image statistical data; the AP side can then quickly locate each type of image statistical data by virtue of this index packet, so that the corresponding algorithm can be started immediately.
  • the algorithm can be at least one of the following: a white balance algorithm, an image enhancement algorithm, a deblurring algorithm, an image segmentation algorithm, an interpolation algorithm, etc., which are not limited here.
  • the image processor may also be a companion chip.
  • the companion chip includes an ISP and an NPU.
  • the companion chip can receive the raw image data transmitted by the camera and transmit it to the ISP, which can then transmit it to the NPU for processing.
  • the raw image data, the data processed by the NPU, and the statistical data of the image can be arbitrated and packaged to obtain out-of-order small packets, which are then sent to the application processor through the MIPI transmission channel.
  • the application processor can unpack the received data.
  • raw image data is stored in the Buf-Queue
  • PD data and image statistical data can be stored in DDR
  • data processed by the NPU can also be stored in DDR by the ISP
  • the out-of-order statistical data in DDR is restored in the recovery process.
  • the statistical decision stream + raw image stream and the video processing stream are separated and transmitted to the AP using high-bandwidth time-division multiplexing of MIPI TX, so that the AP side does not perceive the influence of the companion chip.
  • the mechanism of out-of-order small packets is adopted.
  • as the statistical data is generated in the video stream, it is not stored on the companion chip side; instead, the statistical data is loaded into the raw image channel in real time and transmitted to the AP using MIPI.
  • the transmission of a normal video stream transmits regular images line by line to the AP side, while in the embodiment of the present application, as shown in FIG. 3H, the statistical values (AE, AWB, AF, LSC, FLK) generated by the video stream are produced concurrently at different positions of the image at different times.
  • the various generated statistical data can be sent out as out-of-order small packets in the form of MIPI TX long packets with virtual channel + data type (VC + DT); at the same time, the PD data of the image captured by the camera, as well as the log and metadata generated by the chip system, can be sent in the same way.
  • the image statistical data of each frame is sent to the AP side in the above-mentioned way, and the short packets reserved by the MIPI protocol can be used to transmit interrupts and interrupt information to the AP side without query-backs, at high speed and in real time; a short packet (Int) can appear at any position in FIG. 3H. Further, these out-of-order packets are split at the MIPI RX unpacking stage on the AP side and routed to the ISP, Buf_Queue, DDR, and PDAF, tending toward order; the various statistical out-of-order packets sent to DDR are restored in the CPU recovery process.
  • the AP side does not perceive the existence of the companion chip, and the statistical decision flow is completely separated from the video image processing flow.
  • the AP can obtain, in real time, the automatic white balance statistics, automatic exposure statistics, automatic focus statistics, automatic lens correction statistics, automatic water ripple statistics, and other information of the raw image. This guarantees the AP's decision-making timing and reduces the probability of oscillation and non-convergence caused by decision errors; no additional high-bandwidth interfaces (such as PCIE or USB) are needed to realize raw transmission, ensuring that the AP side can achieve zero-latency photography. Moreover, without additional ports, the interrupt response time on the AP side can be greatly improved, providing the basis for the AP side to start working before all the information has arrived. Finally, no additional interfaces such as Trace or UART to the AP side are needed, and the transported log is easy to debug.
  • in step 301, generating image block statistical data based on the image data blocks included in the image data stream may include the following steps:
  • A12. Determine a first target image statistical data type corresponding to the target shooting data according to a preset mapping relationship between the shooting data and the image statistical data type;
  • the shooting data may be at least one of the following: exposure duration, shooting mode, ISO sensitivity, white balance data, focal length, focus, area of interest, etc., which are not limited here.
  • the memory of the electronic device may pre-store a preset mapping relationship between shooting data and image statistical data types, as shown in the following table:
  • the image processor may acquire the target shooting data, determine the first target image statistical data type corresponding to the target shooting data according to the preset mapping relationship between shooting data and image statistical data types, and obtain, from at least one image data block of the image, the image block statistical data corresponding to the first target image statistical data type. In this way, the appropriate image statistical data can be selected according to the shooting requirements.
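The preset mapping lookup above amounts to a simple table query followed by a per-block filter. A minimal sketch follows; the table contents are hypothetical, since the patent does not reproduce the actual mapping table here.

```python
# Hypothetical preset mapping from shooting data to a first target image
# statistical data type (table entries invented for illustration).
SHOOTING_TO_STAT_TYPE = {
    "long_exposure": "AE",    # exposure duration -> auto-exposure statistics
    "portrait_mode": "AWB",   # shooting mode    -> auto-white-balance stats
    "macro_focus":   "AF",    # focus setting    -> autofocus statistics
}

def select_stat_type(target_shooting_data):
    """Step A12: look up the first target image statistical data type."""
    return SHOOTING_TO_STAT_TYPE.get(target_shooting_data)

def stats_for(blocks, stat_type):
    """Collect the matching statistic from each image data block that has it."""
    return [b[stat_type] for b in blocks if stat_type in b]
```

The environment-data mapping described further below works the same way, only keyed on environmental data instead of shooting data.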
  • in step 301, generating image block statistical data based on the image data blocks included in the image data stream may include the following steps:
  • the environmental data may include external environmental data and/or internal environmental data.
  • the external environmental data may be understood as the objectively existing physical environment, that is, the natural environment, and may be at least one of the following: ambient temperature, ambient humidity, ambient light brightness, atmospheric pressure, geographic location, magnetic field interference strength, jitter data, etc., which are not limited here.
  • the environmental data can be collected by an environmental sensor, and the environmental sensor can be at least one of the following: a temperature sensor, a humidity sensor, an ambient light sensor, a weather sensor, a positioning sensor, and a magnetic field detection sensor.
  • the internal environment data can be understood as the environment data generated by the operation of each module of the electronic device, and the internal environment data can be at least one of the following: CPU temperature, GPU temperature, jitter data, number of CPU cores, etc., which are not limited here.
  • the electronic device may include a memory, and the memory may pre-store a preset mapping relationship between environmental data and image statistical data types, and the mapping relationship is shown in the following table:
  • the image processor may obtain the target environment data, determine the second target image statistical data type corresponding to the target environment data according to the mapping relationship, and obtain, from at least one image data block of the image, the image block statistical data corresponding to the second target image statistical data type. Thus, the corresponding image statistical data can be obtained according to the shooting environment.
  • before step 301, generating the image block statistical data based on the image data blocks included in the image data stream, the following steps may be further included:
  • the image processor obtains first original image data, and the first original image data is part of the original image data of the currently processed image frame;
  • the image processor determines the target image quality evaluation value of the first original image data;
  • the image processor executes step 301 when the target image quality evaluation value is greater than the preset image quality evaluation value.
  • the first original image data may be part of the original image data of the currently processed image frame before the original image data is loaded.
  • the preset image quality evaluation value can be set by the user or the system defaults.
  • the image processor may acquire the first original image data and use at least one image quality evaluation index to evaluate its image quality, obtaining a target image quality evaluation value. The image quality evaluation index may be at least one of the following: information entropy, average gradient, average grayscale, contrast, etc., which are not limited here.
  • when the target image quality evaluation value is greater than the preset image quality evaluation value, step 301 may be performed; otherwise, the camera may be called to shoot again.
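The gating step above can be sketched as a threshold check on a partial-frame quality score. The sketch below scores the partial raw data with information entropy, which is one of the indices listed; the normalization, threshold value, and function names are assumptions for illustration only.

```python
# Sketch of the pre-step-301 quality gate: score part of the raw frame and
# only proceed when the score beats a preset evaluation value (0..1 range).
import math
from collections import Counter

def entropy_score(pixels):
    """Information entropy of 8-bit samples, normalized to 0..1."""
    counts = Counter(pixels)
    n = len(pixels)
    h = -sum(c / n * math.log2(c / n) for c in counts.values())
    return h / 8.0                 # max entropy of 8-bit data is 8 bits

def should_generate_statistics(first_raw, preset=0.3):
    """True -> run step 301; False -> call the camera to shoot again."""
    return entropy_score(first_raw) > preset
```

A nearly constant block (e.g. a fully dark or saturated partial frame) scores close to zero and is rejected, while varied content passes.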
  • the image processor determines the target image quality evaluation value of the first original image data, which may include the following steps:
  • the memory in the electronic device may pre-store a preset mapping relationship between feature point distribution density and image quality evaluation value, a preset mapping relationship between signal-to-noise ratio and image quality deviation value, and a preset mapping relationship between shooting data and optimization coefficient. The value range of the image quality evaluation value may be 0-1, or alternatively 0-100.
  • the image quality deviation value may be a positive real number, for example, 0 to 1, or may be greater than 1.
  • the value range of the optimization coefficient may be between -1 and 1, for example, the optimization coefficient may be -0.1 to 0.1.
  • the shooting data may be at least one of the following: exposure duration, shooting mode, ISO sensitivity, white balance data, focal length, focus, area of interest, etc., which are not limited here.
  • the electronic device may determine the target feature point distribution density and the target signal-to-noise ratio of the first original image data, and determine the first image quality evaluation value corresponding to the target feature point distribution density according to the preset mapping relationship between feature point distribution density and image quality evaluation value. The feature point distribution density reflects the image quality to a certain extent, and can be understood as the ratio between the total number of feature points of the first original image data and the image area of the first original image data.
  • the electronic device can determine the target image quality deviation value corresponding to the target signal-to-noise ratio according to the preset mapping relationship between signal-to-noise ratio and image quality deviation value. Due to external reasons (e.g., shooting angle, jitter, etc.) or internal reasons (system, GPU), some noise will be generated, and this noise will have some impact on the image quality. Therefore, the image quality evaluation can be adjusted to a certain extent to ensure an objective evaluation of the image quality.
  • the electronic device can also obtain the first shooting data of the first original image data and determine the target optimization coefficient corresponding to the first shooting data according to the preset mapping relationship between shooting data and optimization coefficient. The shooting data settings may also have a certain impact on the image quality evaluation; therefore, it is necessary to determine the contribution of the shooting data to the image quality.
  • the first image quality evaluation value is adjusted according to the target optimization coefficient and the target image quality deviation value to obtain the target image quality evaluation value, where the target image quality evaluation value can be obtained according to either of the following formulas:
  • Target image quality evaluation value = (first image quality evaluation value + target image quality deviation value) * (1 + target optimization coefficient)
  • or: Target image quality evaluation value = first image quality evaluation value * (1 + target image quality deviation value) * (1 + target optimization coefficient)
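The two alternative formulas above translate directly into code. The inputs follow the ranges stated earlier (evaluation value 0-1, deviation value a positive real, optimization coefficient between -1 and 1); the function names are illustrative.

```python
# The two alternative adjustment formulas for the target image quality
# evaluation value, computed directly from the stated quantities.
def target_quality_additive(first_eval, deviation, coeff):
    """(first evaluation + deviation) * (1 + optimization coefficient)."""
    return (first_eval + deviation) * (1 + coeff)

def target_quality_multiplicative(first_eval, deviation, coeff):
    """first evaluation * (1 + deviation) * (1 + optimization coefficient)."""
    return first_eval * (1 + deviation) * (1 + coeff)
```

For example, with a first evaluation value of 0.6, a deviation of 0.1, and an optimization coefficient of 0.1, the additive form yields 0.77 and the multiplicative form 0.726.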
  • the image quality can be objectively evaluated by combining the influence of internal and external environmental factors and shooting setting factors, which helps to improve the accuracy of image quality evaluation.
  • image block statistical data is generated based on the image data blocks included in the image data stream; statistical data packets are generated based on the image block statistical data of one or more image data blocks; one or more of the statistical data packets are packed into a data packet group; and the data packet group is transmitted to the application processor. On the one hand, the image statistical data can be transmitted in real time, ensuring the reliability of data transmission; on the other hand, the user does not perceive the existence of the image processor, which improves the user experience.
  • FIG. 4 is a schematic flowchart of a data transmission method provided by an embodiment of the present application, which is applied to the electronic device shown in FIG. 1 or 2 or an application processor in the electronic device, as shown in the figure.
  • this data transmission method includes:
  • the data transmission method described in the embodiments of the present application can transmit image statistical data in real time to ensure the reliability of data transmission; moreover, the user does not perceive the existence of the image processor, which improves the user experience.
  • FIG. 5 is a schematic flowchart of a data transmission method provided by an embodiment of the present application, which is applied to an electronic device, where the electronic device includes an image processor and an application processor.
  • the data transmission method applied to the image processor and the application processor includes:
  • the image processor generates image block statistical data based on the image data blocks included in the image data stream; generates statistical data packets based on the image block statistical data of one or more image data blocks; packs one or more of the statistical data packets into a data packet group; and transmits the data packet group to the application processor.
  • the application processor receives a data packet group; and in response to the interrupt information, performs an unpacking operation on the statistical data packet to obtain image block statistical data of the one or more image data blocks.
  • the data transmission method described in the embodiments of the present application can transmit image statistical data in real time to ensure the reliability of data transmission; moreover, the user does not perceive the existence of the image processor, which improves the user experience.
  • FIG. 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the electronic device includes an application processor, an image processor, a memory, a communication interface, and one or more programs, wherein the one or more programs are stored in the memory and are configured to be executed by the image processor.
  • the program includes instructions for executing the following steps:
  • the packet group is communicated to the application processor.
  • the data packet group further includes system data packets and/or image data packets of the one or more image data blocks.
  • the image data package includes original image data and/or processed image data.
  • the above program includes instructions for performing the following steps:
  • a statistical data packet is generated based on image block statistical data of one image data block, and the statistical data packet is packaged into the data packet group.
  • the above program also includes instructions for performing the following steps:
  • a statistical data packet is generated based on the image block statistical data of Q image data blocks, and the statistical data packet is packed into the data packet group, where Q is an integer greater than 1.
  • the above program also includes instructions for performing the following steps:
  • interrupt information is provided to the application processor to notify the application processor to perform unpacking of the statistics packets.
  • the interrupt information and the statistical data packets are located in different data packet groups.
  • the above-mentioned one or more programs can also be configured to be executed by the above-mentioned application processor, and in the embodiment of the present application, the above-mentioned program includes an instruction for executing the following steps:
  • the group of data packets includes one or more statistical data packets generated based on image block statistics of one or more image data blocks contained in the image data stream;
  • an unpacking operation is performed on the statistics packet to obtain image block statistics for the one or more image data blocks.
  • the image processor and the application processor are integrated into the same chip, or the image processor and the application processor are two independent modules respectively.
  • the electronic device includes corresponding hardware structures and/or software modules for executing each function.
  • the present application can be implemented in hardware or in the form of a combination of hardware and computer software, in combination with the units and algorithm steps of each example described in the embodiments provided herein. Whether a function is performed by hardware or computer software driving hardware depends on the specific application and design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each particular application, but such implementations should not be considered beyond the scope of this application.
  • the electronic device may be divided into functional units according to the foregoing method examples.
  • each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units. It should be noted that the division of units in the embodiments of the present application is illustrative, and is only a logical function division, and other division methods may be used in actual implementation.
  • FIG. 7 is a schematic structural diagram of an electronic device 700 provided by an embodiment of the present application.
  • the electronic device 700 includes an image processor 701 and an application processor 702. As shown in the figure, wherein,
  • the image processor 701 is configured to generate image block statistical data based on the image data blocks included in an image data stream; generate statistical data packets based on the image block statistical data of one or more image data blocks; pack one or more of the statistical data packets into data packet groups; and transmit the data packet groups to the application processor 702;
  • an application processor 702 configured to receive a data packet group; in response to the interrupt information, perform an unpacking operation on the statistical data packet to obtain image block statistical data of the one or more image data blocks,
  • the image processor 701 transmits interrupt information to the application processor 702 after transmitting the image block statistical data of a predetermined number of image data blocks to the application processor 702, so that the application processor 702 can unpack the statistical data packets of the predetermined number of image data blocks to obtain the image statistical data.
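The interrupt-driven handshake between the image processor 701 and the application processor 702 can be sketched as follows: the sender emits an interrupt after every predetermined number of statistics packets, and the receiver unpacks only on interrupt. The packet framing and class names here are invented for illustration, not taken from the patent.

```python
# Hedged sketch of the interrupt-driven unpack flow between the image
# processor and application processor sides.
class APSide:
    """Application-processor side: buffers packets, unpacks on interrupt."""
    def __init__(self):
        self.pending = []        # packets received but not yet unpacked
        self.block_stats = []    # unpacked image block statistical data

    def receive(self, packet):
        self.pending.append(packet)

    def on_interrupt(self):
        # unpack everything accumulated since the last interrupt
        self.block_stats.extend(p["stats"] for p in self.pending)
        self.pending.clear()

def transmit(ap, stat_packets, predetermined=4):
    """Image-processor side: interrupt after every `predetermined` packets."""
    for i, pkt in enumerate(stat_packets, 1):
        ap.receive(pkt)
        if i % predetermined == 0:
            ap.on_interrupt()    # short packet notifies the AP to unpack
```

Batching the interrupts this way keeps the AP from being woken per packet while still letting it consume statistics well before the end of the frame.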
  • the electronic device includes an image processor and an application processor.
  • on the one hand, it can transmit image statistical data in real time to ensure the reliability of data transmission; on the other hand, users do not perceive the presence of the image processor, which improves the user experience.
  • the image processor 701 and the application processor 702 can implement the functions or steps of any of the above methods.
  • FIG. 8 is a block diagram of functional units of the image processing apparatus 800 involved in the embodiment of the present application.
  • it can be applied to an electronic device; the electronic device further includes an application processor, and the image processing apparatus 800 includes a statistics unit 801, a packing unit 802, and a transmission unit 803, wherein,
  • the statistical unit 801 is configured to generate image block statistical data based on the image data blocks contained in the image data stream;
  • the packing unit 802 is configured to generate a statistical data packet based on the image block statistical data of one or more image data blocks, and pack one or more of the statistical data packets into a data packet group;
  • the transmitting unit 803 is configured to transmit the data packet group to the application processor.
  • the data packet group further includes system data packets and/or image data packets of the one or more image data blocks.
  • the image data package includes original image data and/or processed image data.
  • packaging unit 802 is also used for:
  • a statistical data packet is generated based on image block statistical data of one image data block, and the statistical data packet is packaged into the data packet group.
  • the packing unit 802 is further configured to: generate a statistical data packet based on the image block statistical data of Q image data blocks, and pack the statistical data packet into the data packet group, where Q is an integer greater than 1.
  • the transmitting unit 803 is further configured to:
  • interrupt information is provided to the application processor to notify the application processor to perform unpacking of the statistics packets.
  • the interrupt information and the statistical data packets are located in different data packet groups.
  • each unit may be, for example, an application-specific integrated circuit (ASIC), a single circuit, a processor (shared, dedicated, or chipset) and memory for executing one or more software or firmware programs, combinational logic circuits, and/or other suitable components that provide the functions described above.
  • the statistics unit 801 , the packing unit 802 and the transmission unit 803 may be image processor circuits, and the functions or steps of any of the above methods can be implemented based on the above unit modules.
  • FIG. 9 is a block diagram of functional units of the application processor 900 involved in the embodiment of the present application.
  • the application processor 900 is applied to an electronic device, the electronic device may further include an image processor, and the application processor 900 includes: a receiving unit 901 and an unpacking unit 902, wherein,
  • the receiving unit 901 is configured to receive a data packet group, wherein the data packet group includes one or more statistical data packets, and the statistical data packets are based on images of one or more image data blocks included in the image data stream generated from block statistics;
  • the unpacking unit 902 is configured to, in response to the interrupt information, perform an unpacking operation on the statistical data packets to obtain the image block statistical data of the one or more image data blocks.
  • the receiving unit 901 and the unpacking unit 902 may be application processors, and the functions or steps of any of the above methods can be implemented based on the above unit modules.
  • the device for data transmission or the electronic device described in the embodiments of the present application, on the one hand, can transmit image statistical data in real time to ensure the reliability of data transmission; on the other hand, the user does not perceive the existence of the image processor, which improves the user experience.
  • an embodiment of the present application also provides an image processor, where the image processor is configured to perform the following operations:
  • the packet group is communicated to the application processor.
  • the embodiment of the present application also provides an application processor, where the application processor is used for:
  • the group of data packets includes one or more statistical data packets generated based on image block statistics of one or more image data blocks contained in the image data stream;
  • an unpacking operation is performed on the statistics packet to obtain image block statistics for the one or more image data blocks.
  • This embodiment also provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute the relevant steps described above, so as to realize any of the methods in the above embodiments.
  • This embodiment also provides a computer program product, which when the computer program product runs on a computer, causes the computer to execute the above-mentioned relevant steps, so as to implement any of the methods in the above-mentioned embodiments.
  • the embodiments of the present application also provide an apparatus, which may specifically be a chip, a component or a module, and the apparatus may include a connected processor and a memory; wherein, the memory is used for storing computer execution instructions, and when the apparatus is running, The processor can execute the computer-executed instructions stored in the memory, so that the chip executes any one of the foregoing method embodiments.
  • the electronic device, computer storage medium, computer program product, or chip provided in this embodiment are all used to execute the corresponding method provided above. Therefore, for the beneficial effects that can be achieved, reference may be made to the beneficial effects of the corresponding method provided above, which will not be repeated here.
  • the disclosed apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
  • Units described as separate components may or may not be physically separated, and components shown as units may be one physical unit or multiple physical units, that is, may be located in one place, or may be distributed in multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium.
  • the technical solution may be stored in a readable storage medium including several instructions to cause a device (which may be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods in the various embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present application discloses a data transmission method, apparatus, and storage medium. The method includes: generating image block statistical data based on image data blocks contained in an image data stream; generating statistical data packets based on the image block statistical data of one or more image data blocks; packing one or more of the statistical data packets into a data packet group; and transmitting the data packet group to an application processor. By adopting the embodiments of the present application, the reliability of data transmission can be guaranteed and real-time transmission of image information can be realized.

Description

Data Transmission Method, Apparatus, and Storage Medium — Technical Field

The present application relates to the field of electronic technology, and in particular, to a data transmission method, apparatus, and storage medium.

Background

With the widespread adoption of electronic devices (such as mobile phones, tablet computers, smart watches, etc.), electronic devices support more and more applications and become increasingly powerful. They are developing in a diversified and personalized direction and have become indispensable electronic products in users' lives.

At present, video noise reduction algorithms are mostly implemented on the application processor (AP) side of the mobile phone, and the statistical decision-making of the image signal processor (ISP) is performed entirely inside the AP. However, the AP internally adopts general-purpose central processing unit (CPU), neural-network processing unit (NPU), and digital signal processor (DSP) architectures, whose energy efficiency for algorithm implementation is very low. The statistical information of the companion-chip ISP is generally packed together at the end of each frame and transmitted to the AP. Since the latency of the ISP and NPU themselves is already large, and the statistical information is additionally packed at the frame tail in a generic format, the real-time performance of reception at the AP is severely affected, which in turn affects the AP's decision timing. Therefore, the problem of how to realize real-time transmission of image statistical information needs to be solved urgently.
Summary

The embodiments of the present application provide a data transmission method, apparatus, and storage medium, capable of transmitting image statistical data in real time.

In a first aspect, an embodiment of the present application provides a data transmission method, the method including:

generating image block statistical data based on image data blocks contained in an image data stream;

generating statistical data packets based on the image block statistical data of one or more image data blocks;

packing one or more of the statistical data packets into a data packet group; and

transmitting the data packet group to an application processor.

In a second aspect, an embodiment of the present application provides a data transmission method, the method including:

receiving a data packet group, wherein the data packet group includes one or more statistical data packets, the statistical data packets being generated based on the image block statistical data of one or more image data blocks contained in an image data stream;

in response to interrupt information, performing an unpacking operation on the statistical data packets to obtain the image block statistical data of the one or more image data blocks.

In a third aspect, an embodiment of the present application provides an image processing apparatus, including:

a statistics unit configured to generate image block statistical data based on image data blocks contained in an image data stream;

a packing unit configured to generate statistical data packets based on the image block statistical data of one or more image data blocks, and to pack one or more of the statistical data packets into a data packet group; and

a transmission unit configured to transmit the data packet group to an application processor.

In a fourth aspect, an embodiment of the present application provides an application processor, including:

a receiving unit configured to receive a data packet group, wherein the data packet group includes one or more statistical data packets, the statistical data packets being generated based on the image block statistical data of one or more image data blocks contained in an image data stream;

an unpacking unit configured to, in response to interrupt information, perform an unpacking operation on the statistical data packets to obtain the image block statistical data of the one or more image data blocks.

In a fifth aspect, an embodiment of the present application provides an electronic device, including:

an image processing apparatus configured to generate image block statistical data based on image data blocks contained in an image data stream, generate statistical data packets based on the image block statistical data of one or more image data blocks, and pack one or more of the statistical data packets into a data packet group;

an application processor configured to receive the data packet group from the image processing apparatus;

wherein, after transmitting a predetermined number of statistical data packets to the application processor, the image processing apparatus transmits interrupt information to the application processor, so that the application processor unpacks the statistical data packets to obtain the image block statistical data of the one or more image data blocks.

In a sixth aspect, an embodiment of the present application provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute some or all of the steps described in the first aspect of the embodiments of the present application.

In a seventh aspect, an embodiment of the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to execute some or all of the steps described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
Brief Description of the Drawings

In order to illustrate the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings based on these drawings without creative effort.

FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;

FIG. 2 is a schematic diagram of the software structure of an electronic device provided by an embodiment of the present application;

FIG. 3A is a schematic flowchart of a data transmission method provided by an embodiment of the present application;

FIG. 3B is a schematic structural diagram of a data packet provided by an embodiment of the present application;

FIG. 3C is a schematic demonstration diagram of data transmission for an image processor provided by an embodiment of the present application;

FIG. 3D is a schematic demonstration diagram of another data transmission provided by an embodiment of the present application;

FIG. 3E is a schematic demonstration diagram of an unpacking operation provided by an embodiment of the present application;

FIG. 3F is a schematic flowchart of an unpacking operation provided by an embodiment of the present application;

FIG. 3G is a schematic flowchart of another data transmission method provided by an embodiment of the present application;

FIG. 3H is a schematic demonstration diagram of another data transmission provided by an embodiment of the present application;

FIG. 4 is a schematic flowchart of another data transmission method provided by an embodiment of the present application;

FIG. 5 is a schematic flowchart of another data transmission method provided by an embodiment of the present application;

FIG. 6 is a schematic structural diagram of another electronic device provided by an embodiment of the present application;

FIG. 7 is a schematic structural diagram of another electronic device provided by an embodiment of the present application;

FIG. 8 is a block diagram of the functional units of an image processing apparatus provided by an embodiment of the present application;

FIG. 9 is a block diagram of the functional units of an application processor provided by an embodiment of the present application.
Detailed Description

The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.

For a better understanding of the solutions of the embodiments of the present application, related terms and concepts that may be involved in the embodiments of the present application are first introduced below.

In a specific implementation, the electronic device may include various devices with computing functions, for example, handheld devices (smartphones, tablet computers, etc.), vehicle-mounted devices (navigators, reversing assistance systems, dashboard cameras, in-vehicle refrigerators, etc.), wearable devices (smart bands, wireless earphones, smart watches, smart glasses, etc.), computing devices or other processing devices connected to a wireless modem, as well as various forms of user equipment (UE), mobile stations (MS), virtual reality/augmented reality devices, terminal devices, and so on. The electronic device may also be a base station or a server.

The electronic device may also include smart home devices, which may be at least one of the following: a smart speaker, smart camera, smart rice cooker, smart wheelchair, smart massage chair, smart furniture, smart dishwasher, smart television, smart refrigerator, smart electric fan, smart heater, smart clothes-drying rack, smart lamp, smart router, smart switch, smart switch panel, smart humidifier, smart air conditioner, smart door, smart window, smart cooktop, smart sterilizer, smart toilet, sweeping robot, etc., which are not limited here.
In the first part, the software and hardware operating environment of the technical solutions disclosed in the present application is introduced as follows.

As shown in the figure, FIG. 1 shows a schematic structural diagram of the electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, a compass 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.

It can be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.

The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent components, or may be integrated into one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110. The controller may generate operation control signals according to instruction opcodes and timing signals, completing the control of instruction fetching and execution. In some other embodiments, a memory may also be provided in the processor 110 for storing instructions and data. Exemplarily, the memory in the processor 110 may be a cache, which can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory. This avoids repeated access, reduces the waiting time of the processor 110, and thus improves the efficiency of the electronic device 100 in processing data or executing instructions. The processor may also include an image processor, which may be an image pre-processor (Pre-ISP). The Pre-ISP can be understood as a simplified ISP that can also perform some image processing operations, for example, acquiring image statistical information.

In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM card interface, and/or a USB interface, etc. The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and peripheral devices. The USB interface 130 may also be used to connect headphones and play audio through them.

It can be understood that the interface connection relationships between the modules illustrated in the embodiments of the present application are only schematic illustrations and do not constitute a structural limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt interface connection modes different from those in the above embodiments, or a combination of multiple interface connection modes.

The charging management module 140 is used to receive charging input from a charger, which may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive the charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive wireless charging input through the wireless charging coil of the electronic device 100. While charging the battery 142, the charging management module 140 may also supply power to the electronic device through the power management module 141.

The power management module 141 is used to connect the battery 142, the charging management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery health status (leakage, impedance). In some other embodiments, the power management module 141 may also be provided in the processor 110. In still other embodiments, the power management module 141 and the charging management module 140 may also be provided in the same component.
The wireless communication function of the electronic device 100 may be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.

The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas may also be multiplexed to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antennas may be used in combination with tuning switches.

The mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G/6G applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc. The mobile communication module 150 may receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor and convert it into electromagnetic waves for radiation through the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 and at least some modules of the processor 110 may be provided in the same component.

The wireless communication module 160 may provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more components integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive signals to be sent from the processor 110, perform frequency modulation and amplification on them, and convert them into electromagnetic waves for radiation through the antenna 2.

The electronic device 100 implements the display function through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.

The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini light-emitting diode (mini-LED), a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc. In some embodiments, the electronic device 100 may include one or more display screens 194.

The electronic device 100 may implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.

The ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, light is transmitted to the camera's photosensitive element through the lens, the light signal is converted into an electrical signal, and the camera's photosensitive element transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP may also perform algorithmic optimization on image noise, brightness, and skin tone, and may optimize parameters such as exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.

The camera 193 is used to capture still images or videos. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include one or more cameras 193.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy.

The video codec is used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.

The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example, the transmission pattern between neurons of the human brain, it rapidly processes input information and can also continuously self-learn. Applications such as intelligent cognition of the electronic device 100, for example, image recognition, face recognition, speech recognition, and text understanding, can be realized through the NPU.

The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos in the external memory card.

The internal memory 121 may be used to store one or more computer programs, which include instructions. By running the above instructions stored in the internal memory 121, the processor 110 can cause the electronic device 100 to execute the methods for displaying page elements provided in some embodiments of the present application, as well as various applications and data processing. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system and may also store one or more applications (such as Gallery, Contacts, etc.). The data storage area may store data (such as photos, contacts, etc.) created during the use of the electronic device 100. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more disk storage components, flash memory components, universal flash storage (UFS), etc. In some embodiments, the processor 110 may cause the electronic device 100 to execute the methods for displaying page elements provided in the embodiments of the present application, as well as other applications and data processing, by running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110. The electronic device 100 may implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone jack 170D, the application processor, and the like.
传感器模块180可以包括压力传感器180A、陀螺仪传感器180B、气压传感器180C、磁传感器180D、加速度传感器180E、距离传感器180F、接近光传感器180G、指纹传感器180H、温度传感器180J、触摸传感器180K、环境光传感器180L、骨传导传感器180M等。
其中,压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即X、Y和Z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,电子设备100降低位于温度传感器180J附近的处理器的性能,以便降低功耗、实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备100对电池142加热,以避免低温导致电子设备100异常关机。在其他一些实施例中,当温度低于又一阈值时,电子设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
示例性的,图2示出了电子设备100的软件结构框图。分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。应用程序层可以包括一系列应用程序包。
如图2所示,应用程序层可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图2所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(media libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
第二部分,本申请实施例所公开的用于图像处理器的数据传输方法、装置及存储介质介绍如下。
进一步地,请参阅图3A,图3A是本申请实施例提供的一种数据传输方法的流程示意图,该数据传输方法应用于包括如图1或图2所示的电子设备或者该电子设备的图像处理器,如图所示,本数据传输方法包括:
301、基于图像数据流包含的图像数据块生成图像块统计数据。
其中,本申请实施例中,图像数据可以为以下至少一种:原始图像数据、像素数据(pixel data,PD)、处理后的图像数据等等,在此不作限定。上述图像可以为视频流中的任一图像。处理后的图像数据可以为原始图像数据经过预设图像处理算法处理后的图像数据,预设图像处理算法为各种图像处理算法中的一种或者多种,例如,可以为以下至少一种:白平衡算法、小波变换算法、直方图均衡化算法、神经网络算法等等,在此不作限定。本申请实施例中图像处理器可以为图像预处理器或者伴随芯片。
当然,在具体实现中,图像处理器可以在图像获取的过程中,获取图像中至少一个图像数据块的图像块统计数据,即图像统计数据,图像块统计数据可以为以下至少一种:自动曝光AE图像统计数据、自动聚焦AF图像统计数据、AWB图像统计数据、自动镜头阴影校正(lens shading correction,LSC)图像统计数据、自动水波纹(flicker,FLK)图像统计数据等等,在此不做限定。故而,图像统计数据的类型可以为以下至少一种:AE、AF、AWB、LSC、FLK等等,在此不做限定。
其中,本申请实施例中,图像可以为原始图像数据,对于一帧图像而言,图像处理器可以是一个像素一个像素地获取原始图像数据,即图像处理器是逐行扫描以获取原始图像数据,可以理解为,获取全部原始图像数据则需要扫描完其全部的像素点。本申请实施例中,则可以是按照预设规则划分好图像数据块的区域,例如,可以将每5行像素点作为一个图像数据块,又例如,可以预先规划好图像数据块的区域(位置),当对应图像数据块的区域内的像素点全部扫描完,则可以得到一个图像数据块。最终,可以将原始图像数据理解为多个图像数据块,每一图像数据块均可以包含部分原始图像数据。
具体实现中,原始图像数据可以为一帧或者多帧图像的raw数据。图像处理器可以在成功获取到任一图像数据块之后,就开始获取该任一图像数据块的图像统计数据,例如,当原始图像数据加载到第j行的时候,便认为得到一个图像数据块,便可以开始获取该任一图像数据块的图像统计数据,又例如,原始图像数据每加载预设数量像素点,则可以认为得到一个图像数据块,便可以获取其对应的图像统计数据,预设数量可以由用户自行设置或者系统默认。即本申请实施例,可以在原始图像数据加载过程中,便开始获取图像统计数据。
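上述"边加载、边分块、边统计"的流程可以用如下Python片段示意(仅为流程示意,并非本申请方案的实现;每5行像素构成一个图像数据块、以块内像素均值代表AE类统计,均为示例性假设):

```python
def iter_block_stats(rows, rows_per_block=5):
    """逐行接收原始图像数据,每累积 rows_per_block 行即产出一个
    图像数据块的块统计数据(此处以块内像素均值示意 AE 类统计)。"""
    block = []
    for row in rows:
        block.append(row)
        if len(block) == rows_per_block:      # 一个图像数据块加载完成
            pixels = [p for r in block for p in r]
            yield {"type": "AE", "mean": sum(pixels) / len(pixels)}
            block = []
    if block:                                 # 末尾不足一块的剩余行
        pixels = [p for r in block for p in r]
        yield {"type": "AE", "mean": sum(pixels) / len(pixels)}

# 示例:10 行、每行 4 像素的"图像",可得到 2 个块统计数据
frame = [[i] * 4 for i in range(10)]
stats = list(iter_block_stats(frame))
```

如此,生成器在第5行、第10行各加载完成时即可产出块统计数据,而无需等整帧加载完毕,与上文"在原始图像数据加载过程中便开始获取图像统计数据"的思路一致。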
302、基于一个或多个图像数据块的图像块统计数据生成统计数据包。
具体实现中,图像处理器可以在T时刻对图像数据进行打包,在T+1时刻对图像统计数据进行打包,进而,可以得到多个数据包。基于仲裁打包机制,能够把统计决策流跟视频图像处理流完全分开,保证决策实时;进而,把各种统计信息小包乱序实时传输给AP,避免伴随芯片的存储开销,同时最大限度保证AP接收的实时性。
可选地,上述步骤302,基于一个或多个图像数据块的图像块统计数据生成统计数据包,可以按照如下方式实施:
采用随机方式从图像数据和所述图像块统计数据中选取待打包数据,对所述待打包数据进行打包,得到所述统计数据包。
具体实现中,图像处理器可以以随机方式选取图像块统计数据或者图像数据中的数据,用以进行打包,进而,可以得到统计数据包。
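"以随机方式选取待打包数据"的仲裁过程可示意如下(队列结构、仲裁粒度与函数名均为示例性假设):

```python
import random

def arbitrate_pack(image_queue, stats_queue, rng=random):
    """在图像数据与图像块统计数据两条队列之间随机仲裁,
    依次取出队首数据作为待打包数据,返回打包顺序(流程示意)。"""
    packets = []
    while image_queue or stats_queue:
        candidates = [q for q in (image_queue, stats_queue) if q]
        q = rng.choice(candidates)   # 随机选取一条非空队列
        packets.append(q.pop(0))     # 取该队列队首数据打包
    return packets

rng = random.Random(0)               # 固定随机种子,便于复现
out = arbitrate_pack(["img0", "img1"], ["stat0", "stat1"], rng)
```

注意仲裁只打乱两条流之间的交织顺序,单条流内部仍保持先后次序。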
以图像统计数据为例,图像处理器还可以基于一个或者多个图像数据块的图像块统计数据,生成相应的统计数据包,并向应用处理器发送该统计数据包。统计数据包可以理解为一个或者多个数据包。
进一步地,图像处理器可以采用至少一个数据传输通道将图像统计数据包发送应用处理器。例如,图像处理器可以采用一个数据传输通道依次将该图像统计数据包发送给应用处理器。又例如,图像处理器可以将图像统计数据包划分为多个数据集,每一数据集对应一个或者部分图像统计数据包,采用至少一个数据传输通道将多个数据集发送给应用处理器,数据传输通道可以与数据集一一对应,每一数据传 输通道可以对应一个进程或者线程。
具体实现中,图像处理器不仅可以将图像统计数据包发送给应用处理器,还可以将原始图像数据发送给应用处理器,例如,可以将图像统计数据包实时加载在原始图像数据对应的传输通道,利用MIPI传送给应用处理器。具体实现中,可以将原始图像数据和图像统计数据包分开发送给应用处理器,例如,可以将图像统计数据包先发送给应用处理器,在该图像统计数据包发送完成之后,则可以将原始图像数据发送给应用处理器,如此,在图像统计数据包发送给应用处理器之后,应用处理器可以对该图像统计数据包进行解包操作,并基于解包后得到的图像统计数据调取相应的算法,即可以实现原始图像数据完全发送给应用处理器之前,提前准备好相应的算法,有助于提升图像处理效率。
可选地,图像统计数据包中的任一数据包均包括:包头(packet header,PH)、有效图像统计数据和包尾(packet footer,PF),每一数据包的有效图像统计数据可以对应至少一种类型的图像统计数据。
其中,如图3B所示,图像统计数据包可以包括包头PH、有效图像统计数据以及包尾PF,包头可以用于标记一个数据包的开始位置,有效图像统计数据为一类图像统计数据的部分图像统计数据,该原始图像数据的所有图像数据块对应的数据包中的所有有效图像统计数据构成原始图像数据的完整的图像统计数据,包尾可以用于表示数据包的结束位置。每一类型的图像统计数据的数据包中的有效图像统计数据的长度可以相同或者不同。
可选地,所述包头包括包头标记、索引包标记和包数据长度。
其中,包头可以包括:包头标记、索引包标记和包数据长度,包头标记用于表示当前数据包(图像统计数据包)的统计类型,索引包标记用于表示当前数据包是统计数据还是独立的索引,包数据长度用于表示当前数据包的数据长度,具体结构如下表所示:
包头结构 字节长度
包头标记 Byte3
索引包标记 Byte2
包数据长度 Byte1+Byte0
可选地,所述包尾包括:包尾标记、数据包计数和帧计数。
其中,包尾可以包括:包尾标记、数据包计数和帧计数,包尾标记用于表示包尾位置,数据包计数用于表示该数据包为当前统计类型的数据包的计数(第几个),帧计数表示该数据包来自哪一帧图像的原始图像数据,具体如下表所示:
包尾结构 字节长度
包尾标记 Byte3
数据包计数 Byte2
帧计数 Byte1+Byte0
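上表所述包头(包头标记 Byte3、索引包标记 Byte2、包数据长度 Byte1+Byte0)与包尾(包尾标记、数据包计数、帧计数)各占4字节的布局,可以用struct示意性地打包(字节序按大端假设,各标记取值亦为假设):

```python
import struct

def pack_header(marker, index_flag, length):
    """Byte3=包头标记, Byte2=索引包标记, Byte1+Byte0=包数据长度(大端)。"""
    return struct.pack(">BBH", marker, index_flag, length)

def pack_footer(marker, pkt_count, frame_count):
    """Byte3=包尾标记, Byte2=数据包计数, Byte1+Byte0=帧计数(大端)。"""
    return struct.pack(">BBH", marker, pkt_count, frame_count)

def pack_stat_packet(stat_type, payload, pkt_count, frame_count):
    """包头 + 有效图像统计数据 + 包尾,构成一个统计数据包。"""
    ph = pack_header(stat_type, 0x00, len(payload))
    pf = pack_footer(0xFE, pkt_count, frame_count)  # 0xFE 为假设的包尾标记
    return ph + payload + pf

pkt = pack_stat_packet(0xA1, b"\x01\x02\x03\x04", pkt_count=3, frame_count=17)
```

接收侧据此可由包头标记判断统计类型,由包尾的数据包计数与帧计数判断该包是第几个包、来自哪一帧。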
可选地,在所述图像数据包括原始图像数据和处理后的图像数据时,上述步骤302,基于一个或多个图像数据块的图像块统计数据生成统计数据包,可以包括如下步骤:
将所述原始图像数据、所述处理后的图像数据和所述图像块统计数据进行仲裁打包,得到所述统计数据包。
具体实现中,在图像数据包括原始图像数据和处理后的图像数据时,将原始图像数据、处理后的图像数据和图像块统计数据进行仲裁打包,得到统计数据包,即以随机方式从原始图像数据、处理后的图像数据和图像块统计数据中选取数据,以用于打包。另外,还可以把Raw图像绕开图像处理路径,借用MIPI带宽实时传送给AP,无需大带宽接口,同时保证了AP的零延迟拍照。
可选地,步骤302,基于一个或多个图像数据块的图像块统计数据生成统计数据包,还可以包括如下步骤:
获取所述图像处理器的系统数据;
将所述系统数据、所述图像数据和所述一个或多个图像数据块进行仲裁打包,得到所述统计数据包。
其中,系统数据可以为以下至少一种:Log数据、元数据(metadata,MD)等等,在此不作限定。
具体实现中,图像处理器可以将系统数据、图像数据和一个或多个图像数据块进行仲裁打包,得到统计数据包,即以随机方式从系统数据、图像数据和一个或多个图像数据块中选取数据,以用于打包。
303、将一个或多个所述统计数据包打包到数据包组中;以及向应用处理器传送所述数据包组。
其中,具体实现中,预设数目可以由用户自行设置或者系统默认,例如,预设数目可以为一部分图像数据块的数量,或者全部图像数据块的数量。具体实现中,图像处理器可以向应用处理器传送数据包组,该数据包组具有预定大小并且包括一个或多个统计数据包;应用处理器在接收到预定数目图像数据块的块统计数据包后,可以基于来自图像处理器的中断信息(中断信息用于通知应用处理器块统计资料已经传输完成),对预定数目的块统计数据包执行重构操作,以得到图像的统计数据。
可选地,所述数据包组还包括系统数据包和/或所述一个或多个图像数据块的图像数据包。
具体实现中,系统数据包可以为对系统数据进行打包后得到的数据包,系统数据可以为以下至少一种:Log数据、元数据(metadata,MD)等等,在此不作限定。
可选地,所述图像数据包包括原始图像数据和/或经处理后的图像数据。
其中,图像处理器还可以将图像的原始图像数据以及原始图像数据对应的处理后的图像数据中的至少一类数据进行打包,得到图像数据包,经处理后的图像数据可以为原始图像数据经过预设图像处理算法处理后的图像数据。
可选地,还可以包括如下步骤:
基于一个图像数据块的图像块统计数据生成统计数据包,并将所述统计数据包打包到所述数据包组中。
其中,图像处理器可以当接收到一个图像数据块的统计数据时,基于所接收的统计数据流生成统计数据包,将统计数据包添加入数据包组,并将该数据包发送给应用处理器,进而,可以保证数据实时传输。
可选地,还可以包括如下步骤:
基于Q个图像数据块的图像块统计数据生成统计数据包,将所述统计数据包打包到所述数据包组中,所述Q为大于1的数。
其中,图像处理器可以当累计接收到Q个图像数据块的统计数据时,基于所接收的Q个图像数据块的图像块统计数据生成统计数据包,并将统计数据包添加入数据包组,Q为大于1的整数。将该数据包发送给应用处理器,能够积累一定的数据再打包,可以降低设备功耗。
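"累积Q个图像数据块的统计数据再打包"的策略可示意如下(Q的取值、类名与回调方式均为示例性假设):

```python
class BatchPacker:
    """累积 Q 个图像数据块的块统计数据后,一次性生成统计数据包,
    降低打包/发送频度,从而降低功耗(流程示意)。"""
    def __init__(self, q=3):
        self.q = q
        self.buffer = []       # 尚未打包的块统计数据
        self.packets = []      # 已生成的统计数据包

    def on_block_stats(self, stats):
        self.buffer.append(stats)
        if len(self.buffer) >= self.q:         # 累积满 Q 个再打包
            self.packets.append(tuple(self.buffer))
            self.buffer = []

packer = BatchPacker(q=3)
for i in range(7):                             # 依次到达 7 个块统计数据
    packer.on_block_stats(f"stat{i}")
```

7个块统计数据在Q=3时产生2个统计数据包,最后1个仍留在缓冲中等待凑满。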
可选地,图像处理器还可以以一个数据包一个数据包的形式,将至少一个数据包发送给应用处理器,也可以在数据包数量积累到设定数量的时候,再将数据包集中发送给应用处理器,设定数量可以由用户自行设置或者系统默认。图像处理器可以通过虚拟通道或者数据通道实现数据包发送。另外,图像处理器可以利用MIPI长包及DataType传送调试日志信息给AP用于调试,无需调试口,还可以用MIPI协议预留短包实时传送中断源及中断信息,无需额外端口通信。
可选地,在所述至少一个数据包包括图像统计数据包时,上述步骤303,向应用处理器传送所述数据包组,可以按照如下方式实施:
通过预设虚拟通道将所述数据包组发送给所述应用处理器。
其中,预设虚拟通道(virtual channel)可以由用户自行设置或者系统默认,则图像处理器可以通过预设虚拟通道将数据包组发送给应用处理器。具体实现中,可以采用一个或者多个虚拟通道将数据包组发送给应用处理器,每一个虚拟通道可以对应一个线程或者进程。
可选地,还可以包括如下步骤:
当向所述应用处理器传送预定数目的统计数据包后,向所述应用处理器提供中断信息,以通知所述应用处理器执行对所述统计数据包的解包。
其中,中断信息的中断方式可以为通用输入输出(general purpose input output,GPIO)中断方式,中断信息可以用于通知应用处理器块统计资料已经传输完成,具体实现中,当向应用处理器传送预定数目图像数据块的统计数据包后,向应用处理器提供中断信息,以通知应用处理器块统计资料已经传输完成,进而,应用处理器知晓图像处理器已经完成数据传输。
进一步地,可选地,所述中断信息与所述统计数据包位于不同的数据包组中。
具体实现中,中断信息与统计数据包可以位于不同的数据包组中,进而,可以将中断信息设置于统计数据包之后,这样,统计数据包传输完成之后,则可以立即通知应用处理器,应用处理器可以快速知晓块统计资料已经传输完成。
可选地,上述步骤303,向应用处理器传送数据包组,可以包括如下步骤:
31、在发送所述数据包组中的图像块统计数据时,获取相应的统计数据包对应的目标属性信息;
32、按照预设的属性信息与通道之间的映射关系,确定所述目标属性信息对应的目标通道;
33、通过所述目标通道传输所述块统计数据包。
其中,本申请实施例中,属性信息可以为以下至少一种:数据包中的数据的数据类型、数据包中的数据的数据位数(数据长度)、数据包的类型(图像统计数据类型)等等,在此不做限定,其中,数据类型可以为以下至少一种:浮点型(单精度型、双精度型)、整型等等,在此不做限定,数据包的类型可以为以下至少一种:AE、AF、AWB、LSC、FLK等等,在此不做限定。如图3C所示,不同的图像统计数据包的类型可以对应不同的通道,或者,不同的数据长度可以对应不同的通道,例如,移动产业处理器接口(mobile industry processor interface,MIPI)通道可以包括3个图像数据接口(image data interface,IDI),其对应的通道可以为:IDI0、IDI1和IDI2,k1类型的图像统计数据可以对应IDI0,k2类型的图像统计数据可以对应IDI1。
具体实现中,电子设备的存储器中可以预先存储预设的属性信息与通道之间的映射关系。以数据包i为例,数据包i为图像统计数据包中的任一数据包,图像处理器可以获取数据包i对应的目标属性信息,则可以按照预设的属性信息与通道之间的映射关系,确定目标属性信息对应的目标通道,可以通过目标通道传输数据包i,如此,可以依据数据包的属性分配相应的通道,有助于提升数据传输效率。
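"按属性信息与通道之间的映射关系确定目标通道"可用一张查找表示意(通道名IDI0/IDI1/IDI2沿用上文,具体映射内容为示例性假设):

```python
# 假设的属性信息(图像统计数据类型)→ MIPI 图像数据接口通道映射
CHANNEL_MAP = {
    "AE": "IDI0",
    "AF": "IDI0",
    "AWB": "IDI1",
    "LSC": "IDI2",
    "FLK": "IDI2",
}

def select_channel(stat_type, default="IDI0"):
    """按预设映射为数据包选择目标通道,未知类型回退到默认通道。"""
    return CHANNEL_MAP.get(stat_type, default)

routed = {t: select_channel(t) for t in ("AE", "AWB", "LSC", "PD")}
```

如此,同类统计数据走固定通道,接收侧按通道即可分流,不依赖包到达顺序。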
进一步地,图像处理器可以将图像统计数据包的部分数据包通过预设虚拟通道发送给应用处理器,以及将图像统计数据包中的剩余部分数据包通过属性信息选择相应的通道的方式发送给应用处理器,该两种不同方式可以同步进行,也可以异步进行,例如,可以采用一个线程或者进程执行将图像统计数据包的部分数据包通过预设虚拟通道发送给应用处理器,而采用另一个线程或者进程执行将图像统计数据包中的剩余部分数据包通过属性信息选择相应的通道的方式发送给应用处理器。
可选地,在上述步骤303之后,还可以包括如下步骤:
若所发送的统计数据包中的指定类型图像统计数据的数据包数量达到预定阈值,则向所述应用处理器发送所述指定类型图像统计数据对应的索引信息,以使得所述应用处理器基于所述索引信息和所接收的所述指定类型图像统计数据的数据包来获取所述图像的显示数据,其中,所述索引信息用于表征所述指定类型图像统计数据的数据包与所述图像数据块之间的对应关系。
其中,预定阈值可以由用户自行设置或者系统默认。指定类型图像统计数据可以由用户自行设置或者系统默认。图像的显示数据可以为以下至少一种:显示亮度、像素值、显示颜色、分辨率、对比度、清晰度等等,在此不作限定。指定类型图像统计数据可以为图像中的至少一种类型图像统计数据。
具体实现中,若所发送的统计数据包中的指定类型图像统计数据的数据包数量达到预定阈值,则可以停止向应用处理器传输指定类型图像统计数据对应的剩余的统计数据包,在一定程度上,可以降低设备功耗。
另外,在原始图像数据中指定类型图像统计数据对应的数据包发送给应用处理器的数据包数量达到预定阈值时,还可以向应用处理器发送与指定类型图像统计信息对应的索引信息,该索引信息可以放在一个索引数据包里,该索引数据包中可以包含目标索引表,目标索引表中则记录着该指定类型图像统计信息对应的数据包的相关信息,相关信息可以包括以下至少一种:数据包的索引包标记、数据包中的图像统计数据的存储位置等等,在此不做限定。另外,指定类型图像统计数据的目标索引表在总的数据包(原始图像数据的所有数据包)中的偏移量可以预先保存在图像处理器中,在该指定类型图像统计信息发送完成时,便可以将该目标索引表对应的偏移量发送给应用处理器,应用处理器则可以获取指定类型图像统计信息的目标索引表,进而,可以从应用处理器已经接收到的多个数据包中获取与目标索引表对应的图像统计数据包,依据目标索引表的索引顺序对图像统计数据包进行解包操作,得到指定类型图像统计数据,即可以实现调取该类图像统计信息对应的算法,以实现相应的图像处理操作,如此,可以实现对数据包进行解包操作,并且可以针对任一类型图像统计数据对应的数据包进行解包,不必等所有的数据包均完成传输,有助于提升图像处理效率。
进一步地,可选地,在所述向所述应用处理器发送所述指定类型图像统计数据对应的索引信息之后,还可以包括如下步骤:
以预设中断方式向所述应用处理器发送通知消息,所述通知消息用于指示所述指定类型图像统计数据的数据包数量已经达到所述预定阈值。
其中,预设中断方式可以由用户自行设置或者系统默认。预设中断方式可以为通用输入输出中断方式或者移动产业处理器接口(mobile industry processor interface,MIPI)通道。例如,通过MIPI通道发送额外数据包给AP,通过该额外数据包以通知AP指定类型图像统计数据已完成传输任务。
具体实现中,在指定类型图像统计数据对应的数据包向应用处理器传输的数据包数量达到预定阈值时,图像处理器可以通过预设中断方式向应用处理器发送通知消息,该通知消息用于指示指定类型图像统计数据的数据包数量已经达到预定阈值。
举例说明下,如图3C所示,以3A图像统计数据为例,该3A图像统计数据可以包括:AE图像统计数据、AF图像统计数据和AWB图像统计数据,可以将该3类图像统计数据构成3A图像统计数据数据包,即得到至少一个数据包,每一数据包的每一类型的图像统计数据可以对应一个图像统计数据索引表,或者,每一帧图像的每一类型的图像统计数据可以对应一个图像统计数据索引表,图像统计数据索引表可以为AE图像统计数据索引表、AWB图像统计数据索引表或者AF图像统计数据索引表等等,在数据包传输过程中,可以依据图像统计数据索引表的索引顺序发送数据包。在每类图像统计数据发送完成之后,可以将该类图像统计数据对应的图像统计数据索引表发送给应用处理器,应用处理器可以依据该图像统计数据索引表进行解包操作,具体地,可以依据该图像统计数据索引表对应的索引顺序进行解包操作。在至少一个数据包包括多种类型图像统计数据时,可以乱序或者有序发送该至少一个数据包,乱序可以理解为这一刻发送一种类型图像统计数据的一个数据包,下一刻发送另一类型图像统计数据的一个数据包,有序可以理解为一个时间段发送一种类型图像统计数据的数据包,在该类型图像统计数据的数据包发送完成时,则可以发送另一种类型图像统计数据的数据包。
举例说明下,如图3D所示,图像处理器可以将统计数据包中一个一个数据包通过MIPI发送给应用处理器,在发送数据包的过程中,记录每一个数据包的索引,当指定类型的图像统计数据对应的达到预定阈值数量的最后一个数据包发送完成时,则可以设置GPIO中断,应用处理器则可以通过索引表,查找该指定类型的图像统计数据对应的索引,通过索引表可以从统计数据包中解析出指定类型的图像统计数据对应的统计数据,将其按照索引表的索引顺序进行排列,得到该指定类型的统计数据,则可以进一步调用该指定类型的统计数据相应的算法,即任一类型的统计数据提前结束时,应用处理器便可立即启动相应算法,而不必等所有图像统计数据接收完成,提升了图像处理效率。
进一步地,如图3E所示,在每一类型的图像统计数据传输完成时,则可以将该类型的图像统计数据对应的索引表的偏移量发送给应用处理器,应用处理器则可以依据偏移量定位索引表对相应的数据包进行解包,并将解包后的有效图像统计数据依据索引表的顺序进行排列,得到最终的该类型的图像统计数据,例如,AF类型的图像统计数据,其对应索引表包括f、j、n、q、t数据包,则可以依据该索引表解包相应的数据包。具体地,如图3F所示,当应用处理器收到某类型的图像统计数据完成的通知时,则可以读取该已经完成统计的数据包对应的索引包,例如,可以以32bits为单位依次读取索引包内容到索引位置index_n,读取打包数据从index_n开始的32bits作为包头标记(PH),解析PH内容,读取该索引对应的数据段的长度,拷贝统计数据片段到目标缓存中,检测索引包遍历是否结束,若是,则当前图像统计数据解包完成,否则,再次执行读取打包数据从index_n开始的32bits作为包头标记(PH)及其后续步骤,直到索引包遍历完成。
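图3F所述"读索引包取得index_n → 在index_n处解析包头(PH)→ 按长度拷贝统计数据片段"的解包循环,可用如下Python片段示意(包头沿用前述4字节大端布局的假设,数据内容为构造的示例):

```python
import struct

def unpack_by_index(packed, index_packet):
    """index_packet 为 32bit 偏移(index_n)列表;packed 中每个偏移处
    是 4 字节包头(标记/索引标记/长度),其后紧跟该长度的有效统计数据。
    按索引顺序拼接各片段,返回该类型的完整统计数据。"""
    out = bytearray()
    for (index_n,) in struct.iter_unpack(">I", index_packet):
        _marker, _idx_flag, length = struct.unpack_from(">BBH", packed, index_n)
        start = index_n + 4                   # 跳过包头
        out += packed[start:start + length]   # 拷贝统计数据片段到目标缓存
    return bytes(out)

# 构造两段统计数据:偏移 0 处长度 2,偏移 6 处长度 3
packed = (struct.pack(">BBH", 0xA1, 0, 2) + b"ab"
          + struct.pack(">BBH", 0xA1, 0, 3) + b"cde")
index_packet = struct.pack(">II", 0, 6)
data = unpack_by_index(packed, index_packet)
```

索引包遍历完成即解包完成,各片段按索引顺序排列成该类型的统计数据。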
进一步地,AP在收到GPIO中断后,通过安全数字输入输出卡(secure digital input and output,SDIO)来查询当前是哪个图像统计数据已经完成,同时获取这个图像统计数据的索引包的起始位置,随后AP侧凭借这个索引包可以快速定位每个类型图像统计数据,便于立即启动相应的算法,本申请实施例中,算法可以为以下至少一种:白平衡算法、图像增强算法、去模糊算法、图像分割算法、图像增强算法、插值算法等等,在此不做限定。
进一步地,本申请实施例中,图像处理器还可以为伴随芯片,如图3G所示,伴随芯片包括ISP、NPU,伴随芯片可以接收由摄像头传输的Raw图像数据,再将其传给ISP,还可以再由ISP传输给NPU进行处理,进而,可以对Raw图像数据、NPU处理后的数据以及图像的统计数据进行仲裁打包,得到乱序小包,再将其通过MIPI传输通道发送给应用处理器,应用处理器可以对接收到的数据进行解包,例如,Raw图像数据在Buf-Queue中进行存储,PD以及图像统计数据可以存放在DDR,NPU处理后的数据可以存储在ISP,还可以对DDR中的数据进行复原处理。进而,在伴随芯片侧,把统计决策流+Raw图像流和视频处理流分开,利用一路MIPITX的高带宽分时复用传送给AP,使得AP侧感觉不到伴随芯片的影响。
具体实现中,以5A统计数据为例,由于5A统计数据在视频流的前端早期生成,且生成的时机不定、数据量较大,为保证能够实时传送给AP侧,本申请实施例中,采用小包乱序的机制,在视频流生成统计数据时,不在伴随芯片侧存储,而是将统计数据实时加载在raw图像通道,利用MIPI传送给AP。
进一步地,正常视频流的传输是规律的一行行图像传送给AP侧,而本申请实施例,如图3H所示,视频流生成的统计值(AE,AWB,AF,LSC,FLK),在图像的不同位置不同时机陆续并发产生。本申请实施例中,可以将生成的各种统计数据以小包乱序的方式借用MIPITX的虚拟通道+数据通道(VC+DT)长包的形式送出,同时摄像头拍摄的图像的摄像点PD、芯片系统产生的Log以及Matedata,都可以用此方式送出。
具体实现中,每帧的图像统计数据用上述方式送给AP侧,同时伴随芯片可以利用MIPI协议预留的短包传送中断及中断信息,无需回问,且高速实时,且该短包(Int)可以在图3H的任意位置出现,进而,这些乱序小包在AP侧MIPIRX解包处分流去向ISP、Buf_Queue、DDR、PDAF,趋于有序。其中去往DDR的各类统计乱序小包将在CPU复原中实现整包复原。
基于上述本申请实施例,AP侧可以感觉不到伴随芯片的存在,把统计决策流跟视频图像处理流完全分开,AP能实时得到raw图像的自动白平衡统计值、自动曝光统计值、自动聚焦统计值、自动镜头校正统计值、自动水波纹统计值等信息,保证AP的决策时机,减少决策错误引起的震荡不收敛的概率;另外,无需增加额外大带宽接口硬件开销,比如PCIE、USB等接口,即可实现raw传输,保证AP侧实现零延迟拍照;再者,无需额外的中断线,无需邮箱询问伴随芯片中断信息,能够大幅提升AP侧中断响应时间,为AP侧在未拿到全部资料就可以开始工作提供了基础;最后,无需额外接口(如Trace、UART)给AP侧传送日志,便于调试。
可选地,上述步骤301,基于图像数据流包含的图像数据块生成图像块统计数据,可以包括如下步骤:
A11、获取目标拍摄数据;
A12、按照预设的拍摄数据与图像统计数据类型之间的映射关系,确定所述目标拍摄数据对应的第一目标图像统计数据类型;
A13、从所述图像的至少一个图像数据块中获取与所述第一目标图像统计数据类型对应的图像块统计数据。
其中,本申请实施例中,拍摄数据可以为以下至少一种:曝光时长、拍摄模式、感光度ISO、白平衡数据、焦距、焦点、感兴趣区域等等,在此不做限定。
具体实现中,电子设备的存储器中可以预先存储预设的拍摄数据与图像统计数据类型之间的映射关系,该映射关系如下表所示:
拍摄数据 图像统计数据类型
拍摄数据a1 图像统计数据类型A1
拍摄数据a2 图像统计数据类型A2
... ...
拍摄数据an 图像统计数据类型An
即不同的拍摄数据对应不同的图像统计数据类型。
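"按拍摄数据确定第一目标图像统计数据类型"的映射查询可示意如下(映射条目的具体内容为示例性假设):

```python
# 假设的拍摄数据 → 图像统计数据类型映射(对应上表的 an → An)
SHOOT_TO_STAT = {
    "夜景模式": ("AE", "AWB"),
    "人像模式": ("AF", "AWB"),
    "长曝光":   ("AE", "FLK"),
}

def stat_types_for(shoot_data):
    """按预设映射关系,确定目标拍摄数据对应的目标图像统计数据类型。"""
    return SHOOT_TO_STAT.get(shoot_data, ())

types = stat_types_for("夜景模式")
```

查得的类型集合即"第一目标图像统计数据类型",后续仅需从图像数据块中提取这些类型的图像块统计数据。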
进而,图像处理器可以获取目标拍摄数据,并且按照预设的拍摄数据与图像统计数据类型之间的映射关系,确定目标拍摄数据对应的第一目标图像统计数据类型,从该图像的至少一个图像数据块中获取与第一目标图像统计数据类型对应的图像块统计数据,如此,可以依据拍摄要求选取相应的图像统计数据。
可选地,上述步骤301,基于图像数据流包含的图像数据块生成图像块统计数据,可以包括如下步骤:
B11、获取目标环境数据;
B12、按照预设的环境数据与图像统计数据类型之间的映射关系,确定所述目标环境数据对应的第二目标图像统计数据类型;
B13、从所述图像的至少一个图像数据块中获取与所述第二目标图像统计数据类型对应的图像块统计数据。
其中,本申请实施例中,环境数据可以包括外部环境数据,和/或,内部环境数据,外部环境数据可以理解为客观存在的物理环境,即自然环境,外部环境数据可以为以下至少一种:环境温度、环境湿度、环境光亮度、大气压、地理位置、磁场干扰强度、抖动数据等等,在此不做限定。其中环境数据可以通过环境传感器进行采集,环境传感器可以为以下至少一种:温度传感器、湿度传感器、环境光传感器、气象传感器、定位传感器、磁场检测传感器。内部环境数据可以理解为电子设备的各个模块工作产生的环境数据,内部环境数据可以为以下至少一种:CPU温度、GPU温度、抖动数据、CPU核数等等,在此不做限定。
具体实现中,电子设备中可以包括存储器,该存储器中可以预先存储预设的环境数据与图像统计数据类型之间的映射关系,该映射关系如下表所示:
环境数据 图像统计数据类型
环境数据b1 图像统计数据类型B1
环境数据b2 图像统计数据类型B2
... ...
环境数据bn 图像统计数据类型Bn
即不同的环境数据对应不同的图像统计数据类型。
进而,图像处理器可以获取目标环境数据,并且按照该映射关系确定目标环境数据对应的第二目标图像统计数据类型,从该图像的至少一个图像数据块中获取与第二目标图像统计数据类型对应的图像块统计数据,如此,可以依据拍摄环境,获取相应的图像统计数据。
可选地,在上述步骤301,基于图像数据流包含的图像数据块生成图像块统计数据之前,还可以包括如下步骤:
C1、所述图像处理器获取第一原始图像数据,所述第一原始图像数据为当前处理图像帧的部分原始图像数据;
C2、所述图像处理器确定所述第一原始图像数据的目标图像质量评价值;
C3、所述图像处理器在所述目标图像质量评价值大于预设图像质量评价值时,执行步骤301。
其中,第一原始图像数据可以为原始图像数据未加载完成之前、当前处理图像帧的部分原始图像数据。预设图像质量评价值可以由用户自行设置或者系统默认。具体实现中,图像处理器可以获取第一原始图像数据,图像处理器可以采用至少一个图像质量评价指标对第一原始图像数据进行图像质量评价,得到目标图像质量评价值,图像质量评价指标可以为以下至少一种:信息熵、平均梯度、平均灰度、对比度等等,在此不做限定。在目标图像质量评价值大于预设图像质量评价值时,可以执行步骤301,否则,则可以调用摄像头重新拍摄。
进一步地,上述步骤C2,所述图像处理器确定所述第一原始图像数据的目标图像质量评价值,可以包括如下步骤:
C21、确定所述第一原始图像数据的目标特征点分布密度和目标信噪比;
C22、按照预设的特征点分布密度与图像质量评价值之间的映射关系,确定所述目标特征点分布密度对应的第一图像质量评价值;
C23、按照预设的信噪比与图像质量偏差值之间的映射关系,确定所述目标信噪比对应的目标图像质量偏差值;
C24、获取所述第一原始图像数据的第一拍摄数据;
C25、按照预设的拍摄数据与优化系数之间的映射关系,确定所述第一拍摄数据对应的目标优化系数;
C26、依据所述目标优化系数、所述目标图像质量偏差值对所述第一图像质量评价值进行调整,得到所述目标图像质量评价值。
具体实现中,电子设备中的存储器可以预先存储预设的特征点分布密度与图像质量评价值之间的映射关系、预设的信噪比与图像质量偏差值之间的映射关系、以及预设的拍摄数据与优化系数之间的映射关系,其中,图像质量评价值的取值范围可以为0~1,或者,也可以为0~100。图像质量偏差值可以为正实数,例如,0~1,或者,也可以大于1。优化系数的取值范围可以为-1~1之间,例如,优化系数可以为-0.1~0.1。本申请实施例中,拍摄数据可以为以下至少一种:曝光时长、拍摄模式、感光度ISO、白平衡数据、焦距、焦点、感兴趣区域等等,在此不做限定。
具体实现中,电子设备可以确定第一原始图像数据的目标特征点分布密度和目标信噪比,且按照预设的特征点分布密度与图像质量评价值之间的映射关系,确定目标特征点分布密度对应的第一图像质量评价值,特征点分布密度在一定程度上反映了图像质量,特征点分布密度可以理解为第一原始图像数据的特征点总数与该第一原始图像数据的图像面积之间的比值。进而,电子设备可以按照预设的信噪比与图像质量偏差值之间的映射关系,确定目标信噪比对应的目标图像质量偏差值,由于在生成图像的时候,由于外部(天气、光线、角度、抖动等)或者内部(系统、GPU)原因,产生一些噪声,这些噪声对图像质量会带来一些影响,因此,可以对图像质量进行一定程度调节,以保证对图像质量进行客观评价。
进一步地,电子设备还可以获取第一原始图像数据的第一拍摄数据,按照预设的拍摄数据与优化系数之间的映射关系,确定第一拍摄数据对应的目标优化系数,拍摄的数据设置也可能对图像质量评价带来一定的影响,因此,需要确定拍摄数据对图像质量的影响成分,最后,依据目标优化系数、目标图像质量偏差值对第一图像质量评价值进行调整,得到目标图像质量评价值,其中,目标图像质量评价值可以按照如下公式得到:
在图像质量评价值为百分制的情况下,具体计算公式如下:
目标图像质量评价值=(第一图像质量评价值+目标图像质量偏差值)*(1+目标优化系数)
在图像质量评价值为百分比的情况下,具体计算公式如下:
目标图像质量评价值=第一图像质量评价值*(1+目标图像质量偏差值)*(1+目标优化系数)
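上述两种计分制下的计算公式可直接落为代码(各参数取值仅为示例):

```python
def quality_score_100(first_score, deviation, coeff):
    """百分制:目标评价值 = (第一评价值 + 偏差值) * (1 + 优化系数)。"""
    return (first_score + deviation) * (1 + coeff)

def quality_score_ratio(first_score, deviation, coeff):
    """百分比制:目标评价值 = 第一评价值 * (1 + 偏差值) * (1 + 优化系数)。"""
    return first_score * (1 + deviation) * (1 + coeff)

# 示例:第一评价值 80 分/0.8,噪声偏差 5 分/0.05,拍摄数据优化系数 +0.1
s100 = quality_score_100(80.0, 5.0, 0.1)
sratio = quality_score_ratio(0.8, 0.05, 0.1)
```

可见偏差值用于补偿噪声对评价的影响,优化系数则按拍摄数据对结果整体缩放。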
如此,可以结合内部、外部环境因素以及拍摄设置因素等影响,对图像质量进行客观评价,有助于提升图像质量评价精准度。
可以看出,在本申请实施例中所描述的数据传输方法,基于图像数据流包含的图像数据块生成图像块统计数据;基于一个或多个图像数据块的图像块统计数据生成统计数据包;将一个或多个所述统计数据包打包到数据包组中;以及向应用处理器传送所述数据包组,一方面,能够实时传输图像统计数据,保证数据传输的可靠性,另一方面,用户感觉不到图像处理器的存在,提升了用户体验。
请参阅图4,图4是本申请实施例提供的一种数据传输方法的流程示意图,应用于如图1或者图2所示的电子设备或者该电子设备中的应用处理器,如图所示,本数据传输方法包括:
401、接收数据包组,其中,所述数据包组包括一个或多个统计数据包,所述统计数据包基于图像数据流中包含的一个或多个图像数据块的图像块统计数据而生成。
402、响应于中断信息,对所述统计数据包执行解包操作,以得到所述一个或多个图像数据块的图像块统计数据。
其中,上述图4对应的实施例中的所有步骤的具体描述可以参见图3A所描述的数据传输方法的相关描述,在此不再赘述。
可以看出,在本申请实施例中所描述的数据传输方法,一方面,能够实时传输图像统计数据,保证数据传输的可靠性,另一方面,用户感觉不到图像处理器的存在,提升了用户体验。
请参阅图5,图5是本申请实施例提供的一种数据传输方法的流程示意图,应用于电子设备,所述电子设备包括图像处理器和应用处理器,如图所示,本用于图像处理器的数据传输方法包括:
501、所述图像处理器基于图像数据流包含的图像数据块生成图像块统计数据;基于一个或多个图像数据块的图像块统计数据生成统计数据包;将一个或多个所述统计数据包打包到数据包组中;以及向所述应用处理器传送所述数据包组。
502、所述应用处理器接收数据包组;响应于中断信息,对所述统计数据包执行解包操作,以得到所述一个或多个图像数据块的图像块统计数据。
其中,上述步骤501-502的具体描述可以参见图3A所描述的数据传输方法的相关描述,在此不再赘述。
可以看出,在本申请实施例中所描述的数据传输方法,一方面,能够实时传输图像统计数据,保证数据传输的可靠性,另一方面,用户感觉不到图像处理器的存在,提升了用户体验。
与上述实施例一致地,请参阅图6,图6是本申请实施例提供的一种电子设备的结构示意图,如图所示,该电子设备包括应用处理器、图像处理器、存储器、通信接口以及一个或多个程序,其中,上述一个或多个程序被存储在上述存储器中,并且被配置由上述图像处理器执行,本申请实施例中,上述程序包括用于执行以下步骤的指令:
基于图像数据流包含的图像数据块生成图像块统计数据;
基于一个或多个图像数据块的图像块统计数据生成统计数据包;
将一个或多个所述统计数据包打包到数据包组中;以及
向应用处理器传送所述数据包组。
可选的,所述数据包组还包括系统数据包和/或所述一个或多个图像数据块的图像数据包。
可选的,所述图像数据包包括原始图像数据和/或经处理后的图像数据。
可选的,上述程序包括用于执行以下步骤的指令:
基于一个图像数据块的图像块统计数据生成统计数据包,并将所述统计数据包打包到所述数据包组中。
可选的,上述程序还包括用于执行以下步骤的指令:
基于Q个图像数据块的图像块统计数据生成统计数据包,将所述统计数据包打包到所述数据包组中,所述Q为大于1的数。
可选的,上述程序还包括用于执行以下步骤的指令:
当向所述应用处理器传送预定数目的统计数据包后,向所述应用处理器提供中断信息,以通知所述应用处理器执行对所述统计数据包的解包。
可选的,所述中断信息与所述统计数据包位于不同的数据包组中。
进一步地,上述一个或多个程序还可以被配置由上述应用处理器执行,本申请实施例中,上述程序包括用于执行以下步骤的指令:
接收数据包组,其中,所述数据包组包括一个或多个统计数据包,所述统计数据包基于图像数据流中包含的一个或多个图像数据块的图像块统计数据而生成;
响应于中断信息,对所述统计数据包执行解包操作,以得到所述一个或多个图像数据块的图像块统计数据。
可选地,所述图像处理器和所述应用处理器集成于同一芯片,或者,所述图像处理器和所述应用处理器分别为两个独立模块。
上述主要从方法侧执行过程的角度对本申请实施例的方案进行了介绍。可以理解的是,电子设备为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所提供的实施例描述的各示例的单元及算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请实施例可以根据上述方法示例对电子设备进行功能单元的划分,例如,可以对应各个功能划分各个功能单元,也可以将两个或两个以上的功能集成在一个处理单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。需要说明的是,本申请实施例中对单元的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
请参阅图7,图7是本申请实施例提供的一种电子设备700的结构示意图,所述电子设备700包括图像处理器701和应用处理器702,如图所示,其中,
所述图像处理器701,用于基于图像数据流包含的图像数据块生成图像块统计数据;基于一个或多个图像数据块的图像块统计数据生成统计数据包;将一个或多个所述统计数据包打包到数据包组中;以及向应用处理器702传送所述数据包组;
应用处理器702,其用于接收数据包组;响应于中断信息,对所述统计数据包执行解包操作,以得到所述一个或多个图像数据块的图像块统计数据,
其中,所述图像处理器701在向所述应用处理器702传送预定数目图像数据块的图像块统计数据后,向所述应用处理器702传送中断信息,以使得所述应用处理器702对所述预定数目图像数据块的统计数据包进行解包,得到所述图像的统计数据。
可以看出,在本申请实施例中所描述的电子设备,该电子设备包括图像处理器和应用处理器,一方面,能够实时传输图像统计数据,保证数据传输的可靠性,另一方面,用户感觉不到图像处理器的存在,提升了用户体验。
其中,图像处理器701和应用处理器702能够实现上述任一方法的功能或者步骤。
图8是本申请实施例中所涉及的图像处理装置800的功能单元组成框图。该图像处理装置800可以用于电子设备,所述电子设备还包括应用处理器,所述图像处理装置800包括:统计单元801、打包单元802和传送单元803,其中,
所述统计单元801,其用于基于图像数据流包含的图像数据块生成图像块统计数据;
所述打包单元802,用于基于一个或多个图像数据块的图像块统计数据生成统计数据包,并将一个或多个所述统计数据包打包到数据包组中;以及
所述传送单元803,用于向应用处理器传送所述数据包组。
可选的,所述数据包组还包括系统数据包和/或所述一个或多个图像数据块的图像数据包。
可选的,所述图像数据包包括原始图像数据和/或经处理后的图像数据。
可选的,所述打包单元802还用于:
基于一个图像数据块的图像块统计数据生成统计数据包,并将所述统计数据包打包到所述数据包组中。
可选的,所述打包单元802还用于:
基于Q个图像数据块的图像块统计数据生成统计数据包,并将所述统计数据包打包到所述数据包组中,所述Q为大于1的数。
可选的,所述传送单元803还用于:
当向所述应用处理器传送预定数目的统计数据包后,向所述应用处理器提供中断信息,以通知所述应用处理器执行对所述统计数据包的解包。
可选的,所述中断信息与所述统计数据包位于不同的数据包组中。
需要注意的是,本申请实施例所描述的装置是以功能单元的形式呈现。这里所使用的术语“单元”应当理解为尽可能最宽的含义,用于实现各个“单元”所描述功能的对象例如可以是集成电路ASIC,单个电路,用于执行一个或多个软件或固件程序的处理器(共享的、专用的或芯片组)和存储器,组合逻辑电路,和/或提供实现上述功能的其他合适的组件。
其中,统计单元801、打包单元802和传送单元803可以是图像处理器电路,基于上述单元模块能够实现上述任一方法的功能或者步骤。
图9是本申请实施例中所涉及的应用处理器900的功能单元组成框图。该应用处理器900应用于电子设备,所述电子设备还可以包括图像处理器,所述应用处理器900包括:接收单元901和解包单元902,其中,
所述接收单元901,用于接收数据包组,其中,所述数据包组包括一个或多个统计数据包,所述统计数据包基于图像数据流中包含的一个或多个图像数据块的图像块统计数据而生成;
所述解包单元902,响应于中断信息,对所述统计数据包执行解包操作,以得到所述一个或多个图像数据块的图像块统计数据。
其中,接收单元901和解包单元902可以是应用处理器,基于上述单元模块能够实现上述任一方法的功能或者步骤。
可以看出,在本申请实施例中所描述的用于数据传输装置,或者,电子设备,一方面,能够实时传输图像统计数据,保证数据传输的可靠性,另一方面,用户感觉不到图像处理器的存在,提升了用户体验。
另外,本申请实施例还提供了一种图像处理器,所述图像处理器用于执行如下操作:
基于图像数据流包含的图像数据块生成图像块统计数据;
基于一个或多个图像数据块的图像块统计数据生成统计数据包;
将一个或多个所述统计数据包打包到数据包组中;以及
向应用处理器传送所述数据包组。
以及,本申请实施例还提供了一种应用处理器,所述应用处理器用于:
接收数据包组,其中,所述数据包组包括一个或多个统计数据包,所述统计数据包基于图像数据流中包含的一个或多个图像数据块的图像块统计数据而生成;
响应于中断信息,对所述统计数据包执行解包操作,以得到所述一个或多个图像数据块的图像块统计数据。
本实施例还提供了一种计算机可读存储介质,其中,该计算机可读存储介质存储用于电子数据交换的计算机程序,其中,上述计算机程序使得计算机执行如本申请实施例,以用于实现上述实施例中的任一方法。
本实施例还提供了一种计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行上述相关步骤,以实现上述实施例中的任一方法。
另外,本申请的实施例还提供一种装置,这个装置具体可以是芯片,组件或模块,该装置可包括相连的处理器和存储器;其中,存储器用于存储计算机执行指令,当装置运行时,处理器可执行存储器存储的计算机执行指令,以使芯片执行上述各方法实施例中的任一方法。
其中,本实施例提供的电子设备、计算机存储介质、计算机程序产品或芯片均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
通过以上实施方式的描述,所属领域的技术人员可以了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以权利要求的保护范围为准。

Claims (19)

  1. 一种数据传输方法,其特征在于,包括:
    基于图像数据流包含的图像数据块生成图像块统计数据;
    基于一个或多个图像数据块的图像块统计数据生成统计数据包;
    将一个或多个所述统计数据包打包到数据包组中;以及
    向应用处理器传送所述数据包组。
  2. 根据权利要求1所述的方法,其特征在于,所述数据包组还包括系统数据包和/或所述一个或多个图像数据块的图像数据包。
  3. 根据权利要求2所述的方法,其特征在于,所述图像数据包包括原始图像数据和/或经处理后的图像数据。
  4. 根据权利要求1-3任一项所述的方法,其特征在于,所述方法还包括:
    基于一个图像数据块的图像块统计数据生成统计数据包,并将所述统计数据包打包到所述数据包组中。
  5. 根据权利要求1-3任一项所述的方法,其特征在于,所述方法还包括:
    基于Q个图像数据块的图像块统计数据生成统计数据包,将所述统计数据包打包到所述数据包组中,所述Q为大于1的数。
  6. 根据权利要求1-3任一项所述的方法,其特征在于,所述方法还包括:
    当向所述应用处理器传送预定数目的统计数据包后,向所述应用处理器提供中断信息,以通知所述应用处理器执行对所述统计数据包的解包。
  7. 根据权利要求6所述的方法,其特征在于,所述中断信息与所述统计数据包位于不同的数据包组中。
  8. 一种数据传输方法,其特征在于,包括:
    接收数据包组,其中,所述数据包组包括一个或多个统计数据包,所述统计数据包基于图像数据流中包含的一个或多个图像数据块的图像块统计数据而生成;
    响应于中断信息,对所述统计数据包执行解包操作,以得到所述一个或多个图像数据块的图像块统计数据。
  9. 一种图像处理装置,其特征在于,包括:
    统计单元,其用于基于图像数据流包含的图像数据块生成图像块统计数据;
    打包单元,用于基于一个或多个图像数据块的图像块统计数据生成统计数据包,并将一个或多个所述统计数据包打包到数据包组中;以及
    传送单元,用于向应用处理器传送所述数据包组。
  10. 根据权利要求9所述的图像处理装置,其特征在于,所述数据包组还包括系统数据包和/或所述一个或多个图像数据块的图像数据包。
  11. 根据权利要求10所述的图像处理装置,其特征在于,所述图像数据包包括原始图像数据和/或经处理后的图像数据。
  12. 根据权利要求9-11任一项所述的图像处理装置,其特征在于,所述打包单元还用于:
    基于一个图像数据块的图像块统计数据生成统计数据包,并将所述统计数据包打包到所述数据包组中。
  13. 根据权利要求9-11任一项所述的图像处理装置,其特征在于,所述打包单元还用于:
    基于Q个图像数据块的图像块统计数据生成统计数据包,并将所述统计数据包打包到所述数据包组中,所述Q为大于1的数。
  14. 根据权利要求9-11任一项所述的图像处理装置,其特征在于,所述传送单元还用于:
    当向所述应用处理器传送预定数目的统计数据包后,向所述应用处理器提供中断信息,以通知所述应用处理器执行对所述统计数据包的解包。
  15. 根据权利要求14所述的图像处理装置,其特征在于,所述中断信息与所述统计数据包位于不同的数据包组中。
  16. 一种应用处理器,其特征在于,包括:
    接收单元,用于接收数据包组,其中,所述数据包组包括一个或多个统计数据包,所述统计数据包基于图像数据流中包含的一个或多个图像数据块的图像块统计数据而生成;
    解包单元,响应于中断信息,对所述统计数据包执行解包操作,以得到所述一个或多个图像数据块的图像块统计数据。
  17. 一种电子设备,其特征在于,包括:
    图像处理装置,其用于基于图像数据流包含的图像数据块生成图像块统计数据,基于一个或多个图像数据块的图像块统计数据生成统计数据包,并将一个或多个所述统计数据包打包到数据包组中;
    应用处理器,其用于接收来自所述图像处理装置的数据包组;
    其中,所述图像处理装置在向所述应用处理器传送预定数目的统计数据包后,向所述应用处理器传送中断信息,以使得所述应用处理器对所述统计数据包进行解包,得到所述一个或多个图像数据块的图像块统计数据。
  18. 一种计算机可读存储介质,其特征在于,存储有计算机程序,其中,所述计算机程序在被执行时使得计算机执行如权利要求1-8任一项所述的方法。
  19. 一种计算机程序产品,其特征在于,所述计算机程序产品包括存储了计算机程序的非瞬时性计算机可读存储介质,所述计算机程序在被执行时使计算机执行如权利要求1-8任一项所述的方法。
PCT/CN2021/141257 2021-02-10 2021-12-24 数据传输方法、装置及存储介质 WO2022170866A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110187084.4 2021-02-10
CN202110187084.4A CN114945019B (zh) 2021-02-10 2021-02-10 数据传输方法、装置及存储介质

Publications (1)

Publication Number Publication Date
WO2022170866A1 true WO2022170866A1 (zh) 2022-08-18

Family

ID=82838245

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/141257 WO2022170866A1 (zh) 2021-02-10 2021-12-24 数据传输方法、装置及存储介质

Country Status (2)

Country Link
CN (1) CN114945019B (zh)
WO (1) WO2022170866A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116321289B (zh) * 2023-02-22 2023-10-17 北纬实捌(海口)科技有限公司 无线传输数据包长转换系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106610987A (zh) * 2015-10-22 2017-05-03 杭州海康威视数字技术股份有限公司 视频图像检索方法、装置及系统
US20190035048A1 (en) * 2017-07-26 2019-01-31 Altek Semiconductor Corp. Image processing chip and image processing system
US20190272625A1 (en) * 2018-03-05 2019-09-05 Samsung Electronics Co., Ltd. Electronic device and method for correcting images based on image feature information and image correction scheme
CN110276767A (zh) * 2019-06-28 2019-09-24 Oppo广东移动通信有限公司 图像处理方法和装置、电子设备、计算机可读存储介质
CN110300989A (zh) * 2017-05-15 2019-10-01 谷歌有限责任公司 可配置并且可编程的图像处理器单元


Also Published As

Publication number Publication date
CN114945019A (zh) 2022-08-26
CN114945019B (zh) 2023-11-21


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21925511

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21925511

Country of ref document: EP

Kind code of ref document: A1