WO2023016059A1 - Data transmission control method and related device - Google Patents

Data transmission control method and related device

Info

Publication number
WO2023016059A1
WO2023016059A1 (PCT application PCT/CN2022/095793)
Authority
WO
WIPO (PCT)
Prior art keywords
target
data
data packet
frame rate
virtual video
Prior art date
Application number
PCT/CN2022/095793
Other languages
English (en)
Chinese (zh)
Inventor
朱文波
Original Assignee
哲库科技(上海)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 哲库科技(上海)有限公司
Publication of WO2023016059A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402 Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/01 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present application relates to the field of electronic technology, and in particular to a data transmission control method and a related device.
  • Chip debugging methods are all designed for the development stage and are not suitable for the environment after productization.
  • Detection of the front-end module's working status still relies on a direct connection to a debugging line, which requires a special hardware interface; because the hardware environment changes after productization, it is impossible to capture the relevant front-end module work information for developers to analyze. How to efficiently capture the work information of the front-end module for developers to analyze is therefore a problem that urgently needs to be resolved.
  • The embodiments of the present application provide a data transmission control method and a related device, which can efficiently capture the work information of the front-end module for analysis by developers.
  • In a first aspect, an embodiment of the present application provides a data transmission control method for testing an electronic device, the method comprising:
  • setting a virtual video device, where the virtual video device is configured to encapsulate the operating data of the front-end module of the electronic device at a first frame rate;
  • determining whether the front-end module is abnormal, based at least in part on adjusting the values of the first frame rate and/or the second frame rate.
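As a rough illustration of this flow (encapsulate operating data at a first frame rate, repackage it at a second frame rate, then judge abnormality after adjusting the rates), the following is a minimal sketch. All names, interfaces, and the abnormality criterion (received packet count matching the expected rate) are assumptions for illustration, not the patent's actual implementation.

```python
# Hypothetical sketch of the claimed flow; the abnormality criterion
# (packet count vs. expected rate) is an assumption, not the patent's.
class VirtualVideoDevice:
    def __init__(self, first_frame_rate):
        self.first_frame_rate = first_frame_rate  # encapsulation rate (fps)

    def encapsulate(self, operating_data, duration_s):
        # Wrap the front-end module's operating data into per-frame records.
        n_frames = int(self.first_frame_rate * duration_s)
        return [{"frame": i, "data": operating_data} for i in range(n_frames)]

def package(frames, second_frame_rate, first_frame_rate):
    # Re-bundle frames into target data packets at the second frame rate:
    # roughly one packet per output frame interval.
    step = max(1, round(first_frame_rate / second_frame_rate))
    return [frames[i:i + step] for i in range(0, len(frames), step)]

def is_abnormal(packets, second_frame_rate, duration_s, tolerance=0.2):
    # Assumed criterion: after adjusting the frame rates, the number of
    # packets received should match what the host expects.
    expected = second_frame_rate * duration_s
    return abs(len(packets) - expected) > tolerance * expected
```

For example, encapsulating one second of data at 30 fps and repackaging at 10 fps yields 10 target packets; a shortfall in received packets would flag the front-end module as abnormal under this assumed criterion.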
  • In a second aspect, an embodiment of the present application provides an electronic device, including a front-end module and a processor, wherein:
  • the front-end module is configured to:
  • the processor is configured to:
  • Determining whether the front-end module is abnormal is based at least in part on adjusting values of the first frame rate and/or the second frame rate.
  • In a third aspect, an embodiment of the present application provides a data transmission control method applied to an electronic device, where the electronic device includes a front-end module and a processor, and the method includes:
  • the front-end module sets a virtual video device to encapsulate the operation data of the front-end module at a first frame rate
  • the processor is configured to:
  • Determining whether the front-end module is abnormal is based at least in part on adjusting values of the first frame rate and/or the second frame rate.
  • In a fourth aspect, an embodiment of the present application provides a data transmission control device applied to an electronic device, where the electronic device includes a front-end module and a processor, and the device includes a setting unit, a packaging unit, a sending unit, and an abnormality detection unit, wherein:
  • the setting unit is configured to set a virtual video device, and the virtual video device is configured to encapsulate the operation data of the front-end module of the electronic device at a first frame rate;
  • the packaging unit is configured to use the virtual video device to package the running data at a second frame rate to obtain a target data package;
  • the sending unit is configured to send the target data packet to a processor
  • the abnormality detection unit is configured to determine whether there is an abnormality in the front-end module based at least in part on adjusting the value of the first frame rate and/or the second frame rate.
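The actual structure of a target data packet is shown in FIG. 3C and is not spelled out in the text here, so the following on-wire layout (frame-rate fields plus a length-prefixed payload) is purely hypothetical, intended only to make the "target data packet" concrete.

```python
import struct

# Hypothetical header: first frame rate, second frame rate, payload length.
# The real layout is in FIG. 3C of the patent and may differ entirely.
HEADER = struct.Struct("<HHI")

def build_target_packet(first_rate: int, second_rate: int, payload: bytes) -> bytes:
    """Bundle operating data into one target data packet (sending side)."""
    return HEADER.pack(first_rate, second_rate, len(payload)) + payload

def parse_target_packet(packet: bytes):
    """Recover the frame rates and operating data (processor side)."""
    first_rate, second_rate, length = HEADER.unpack_from(packet)
    payload = packet[HEADER.size:HEADER.size + length]
    return first_rate, second_rate, payload
```

Carrying both frame rates in the header would let the receiving processor check them against the values it requested, which fits the abnormality-detection step described above.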
  • In a fifth aspect, an embodiment of the present application provides an electronic device, where the electronic device includes a front-end module, a processor, and a memory, the memory is configured to store one or more programs to be executed by the processor or the front-end module, and the programs include instructions for executing the steps in the method according to any one of the first aspect or the third aspect.
  • In a sixth aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute part or all of the steps described in the first aspect or the third aspect.
  • In a seventh aspect, an embodiment of the present application provides a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to execute part or all of the steps described in the first aspect or the third aspect.
  • the computer program product may be a software installation package.
  • FIG. 1 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a software structure of an electronic device provided by an embodiment of the present application.
  • FIG. 3A is a schematic flowchart of a data transmission control method provided in an embodiment of the present application.
  • FIG. 3B is a schematic diagram of data transmission between the front-end module and the application processor provided in an embodiment of the present application.
  • FIG. 3C is a schematic structural diagram of a data packet provided by an embodiment of the present application.
  • FIG. 3D is a schematic diagram of another data transmission between the front-end module and the application processor provided in an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of another data transmission control method provided by an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
  • Fig. 6 is a block diagram of functional units of a device for controlling data transmission provided by an embodiment of the present application.
  • Electronic devices can include a variety of devices with communication capabilities, such as smartphones, in-vehicle devices, wearable devices, charging devices (such as power banks), smart watches, smart glasses, wireless Bluetooth headsets, computing devices, and other processing devices connected to a wireless modem, as well as various forms of user equipment (UE), mobile stations (MS), virtual reality/augmented reality devices, terminal devices, and the like. The electronic device may also be a base station or a server, and may also be a test module composed of a front-end module and a processor (for example, an application processor).
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a compass 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), among others. Different processing units may be independent components, or may be integrated in one or more processors.
  • the electronic device 100 may also include one or more processors 110 .
  • the controller can generate an operation control signal according to the instruction operation code and the timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be set in the processor 110 for storing instructions and data.
  • the memory in the processor 110 may be a cache memory.
  • The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs such instructions or data again, it can call them directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency with which the electronic device 100 processes data or executes instructions.
  • The processor may also include an image processor, which may be an image preprocessor (Pre-ISP), i.e. a simplified ISP that can also perform some image processing operations.
  • processor 110 may include one or more interfaces.
  • The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM card interface, and/or a USB interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100 , and can also be used to transmit data between the electronic device 100 and peripheral devices.
  • the USB interface 130 can also be used to connect earphones to play audio through the earphones.
  • the interface connection relationship between the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 is charging the battery 142 , it can also provide power for electronic devices through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input of the battery 142 and/or the charging management module 140, and supplies power for the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193 and the wireless communication module 160, etc.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G/6G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini-LED, a micro-LED, a micro-OLED, quantum dot light-emitting diodes (QLED), or the like.
  • the electronic device 100 may include one or more display screens 194 .
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 , and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • Light is transmitted through the lens to the photosensitive element of the camera, where the optical signal is converted into an electrical signal; the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP for conversion into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • The DSP converts digital image signals into image signals in standard formats such as RGB and YUV.
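The YUV-to-RGB step mentioned above is, in one common convention, a fixed linear transform per pixel. The patent does not specify which variant the DSP uses; the sketch below assumes full-range BT.601 YCbCr purely for illustration.

```python
def ycbcr_to_rgb(y: float, cb: float, cr: float) -> tuple:
    """Full-range BT.601 YCbCr -> RGB for one pixel (all values 0..255).

    The choice of BT.601 full-range coefficients is an assumption; a real
    DSP pipeline may use BT.709, limited range, or another matrix.
    """
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    # Clamp each channel to the valid 8-bit range.
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)
```

A neutral pixel (Y=128, Cb=Cr=128) maps to mid-gray, since the chroma offsets vanish when Cb and Cr equal 128.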
  • the electronic device 100 may include one or more cameras 193 .
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • The electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, such as saving music, video, and other files in the external memory card.
  • the internal memory 121 may be used to store one or more computer programs including instructions.
  • the processor 110 may execute the above-mentioned instructions stored in the internal memory 121, so that the electronic device 100 executes the method for displaying page elements provided in some embodiments of the present application, as well as various applications and data processing.
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • The program storage area can store an operating system, and can also store one or more applications (such as a gallery, contacts, etc.).
  • the data storage area can store data (such as photos, contacts, etc.) created during the use of the electronic device 100 .
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more disk storage components, flash memory components, universal flash storage (universal flash storage, UFS) and the like.
  • The processor 110 may execute the instructions stored in the internal memory 121 and/or the instructions stored in the memory provided in the processor 110, so that the electronic device 100 executes the method for displaying page elements provided in the embodiments of the present application, as well as other applications and data processing.
  • The electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the pressure sensor 180A is used for sensing pressure signals, and can convert the pressure signals into electrical signals.
  • pressure sensor 180A may be disposed on display screen 194 .
  • There are many kinds of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • A capacitive pressure sensor may consist of at least two parallel plates of conductive material.
  • the electronic device 100 determines the intensity of pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view short messages is executed. When a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the icon of the short message application, the instruction of creating a new short message is executed.
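The short-message example above amounts to a simple dispatch on touch intensity. The threshold value, target name, and action names in this sketch are hypothetical; only the two-branch behavior comes from the text.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized intensity threshold

def dispatch_touch(target: str, intensity: float) -> str:
    """Map a touch on the short-message app icon to an instruction by pressure.

    Same position, different intensity -> different operation instruction,
    as described in the text; names here are illustrative only.
    """
    if target == "sms_icon":
        if intensity < FIRST_PRESSURE_THRESHOLD:
            return "view_messages"       # light press: view short messages
        return "create_new_message"      # press >= threshold: new short message
    return "open_app"
```

Note that the boundary case (intensity exactly equal to the threshold) falls into the "create new message" branch, matching the "greater than or equal to" wording above.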
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • The angular velocity of the electronic device 100 about three axes (i.e., the X, Y, and Z axes) can be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
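The horizontal/vertical screen-switching use just mentioned can be sketched as follows: with the device at rest, gravity (about 9.8 m/s²) dominates one axis, and which axis it is indicates the posture. The axis convention and thresholds below are assumptions for illustration.

```python
GRAVITY = 9.8  # m/s^2

def detect_posture(ax: float, ay: float, az: float) -> str:
    """Classify device posture from three-axis acceleration (device at rest).

    Assumed convention: Y runs along the long edge (portrait up), X along
    the short edge, Z out of the screen. Real devices calibrate and filter.
    """
    if abs(az) > 0.8 * GRAVITY:
        return "flat"        # gravity mostly on Z: lying on a table
    if abs(ay) >= abs(ax):
        return "portrait"    # gravity mostly on Y: held upright
    return "landscape"       # gravity mostly on X: held sideways
```

A pedometer would instead look at periodic fluctuations of the same three-axis signal rather than its static direction.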
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access to application locks, take pictures with fingerprints, answer incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect temperature.
  • The electronic device 100 uses the temperature detected by the temperature sensor 180J to implement a temperature treatment strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 may reduce the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • When the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to prevent the electronic device 100 from shutting down abnormally due to the low temperature.
  • When the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
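The temperature treatment strategy above reduces to a small threshold policy. The threshold values and action names in this sketch are assumptions; only the three behaviors (throttle when hot, heat the battery when cold, boost battery output voltage when very cold) come from the text.

```python
def thermal_policy(temp_c: float,
                   high: float = 45.0,      # hypothetical overheat threshold
                   low: float = 0.0,        # hypothetical battery-heating threshold
                   very_low: float = -10.0  # hypothetical voltage-boost threshold
                   ) -> list:
    """Return the actions the device takes for a reported temperature."""
    actions = []
    if temp_c > high:
        actions.append("reduce_processor_performance")  # thermal protection
    if temp_c < low:
        actions.append("heat_battery")  # avoid abnormal low-temperature shutdown
    if temp_c < very_low:
        actions.append("boost_battery_output_voltage")
    return actions
```

In the normal operating band no action is taken; near the extremes, multiple actions can apply at once (e.g. heating the battery while also boosting its output voltage).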
  • The touch sensor 180K is also known as a "touch panel".
  • The touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 together form a touch screen, also called a "touchscreen".
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
  • FIG. 2 shows a software structural block diagram of the electronic device 100 .
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • The Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
  • the application layer can consist of a series of application packages.
  • the application layer may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include window managers, content providers, view systems, phone managers, resource managers, notification managers, and so on.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Data can include videos, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the electronic device 100 . For example, the management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also be a notification that appears on the top status bar of the system in the form of a chart or scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window.
  • for example, prompting text information in the status bar, issuing a prompt sound, vibrating the electronic device, flashing the indicator light, etc.
  • Android Runtime includes core library and virtual machine. Android Runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules. For example: surface manager (surface manager), media library (media libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
  • the electronic device includes a front-end module and a processor, which can be used to implement the following functions:
  • the front-end module is configured to:
  • the processor is configured to:
  • Determining whether the front-end module is abnormal is based at least in part on adjusting values of the first frame rate and/or the second frame rate.
  • the electronic device described in the embodiment of the present application is used to test an electronic device: a virtual video device is set up and configured to encapsulate the operating data of the front-end module of the electronic device at the first frame rate; the virtual video device packs the running data according to the second frame rate to obtain the target data packet and sends the target data packet to the processor; and whether the front-end module is abnormal is determined based at least in part on adjusting the values of the first frame rate and/or the second frame rate. Since the virtual video device can perform virtual video recording processing on the running data, the running data can be output to an external device at a certain frame rate and size, so the working information of the front-end module can be captured efficiently for developers to analyze.
  • the front-end module is further configured to: acquire target pipeline configuration parameters;
  • the front-end module is specifically configured as:
  • the target packaging processing parameters include at least the second frame rate
  • the virtual video device packs the running data by using the target packing processing parameters to obtain the target data package.
  • the target data packet includes at least one data packet, and the data size of each data packet is less than or equal to the size of each cache register in a plurality of cache registers allocated by the USB service, and the cache register is used for A data packet of the at least one data packet is accommodated.
  • the front-end module is specifically configured as:
  • the target pipeline configuration parameters corresponding to the attribute information of the target debugging content are determined according to the preset mapping relationship between the attribute information of the debugging content and the pipeline configuration parameters.
  • the processor is also specifically configured to:
  • the processor receives the target data packet, unpacks the target data packet to obtain the operation data, and the operation data is used for data analysis.
  • the processor is further specifically configured to:
  • the processor is further specifically configured to:
  • the processor is also specifically configured to:
  • a rotation operation is performed on the target cache register, so that the target cache register after rotation continues to be used for accommodating data packets received by the processor.
  • the processor is further specifically configured to:
  • enabling the camera of the electronic device to call a hardware abstraction module, wherein the hardware abstraction module is configured to perform the function of unpacking the target data packet to obtain the running data.
  • FIG. 3A is a schematic flowchart of a data transmission control method provided by an embodiment of this application, which is applied to an electronic device, and the electronic device includes a front-end module and a processor.
  • the processor can be Application processor AP
  • the data transmission control method can be used to test electronic equipment, as shown in the figure
  • the data transmission control method includes:
  • the running data may be at least one of the following: working status data, image data, voice data, debugging log data, video data, etc. of each chip in the front-end module, which is not limited herein.
  • the working status data of each chip in the front-end module can be at least one of the following: working level, waveform, working voltage, working current, working power, working clock, etc., which are not limited here; the above-mentioned running data can be raw data or data obtained after preprocessing.
  • the preprocessing can be at least one of the following: sampling, compression, screening, etc., which is not limited here.
  • the front-end module may include: an image signal processor (ISP), a neural network processor (NPU), a memory (DDR), a master control module (TOP), a selector (MUX), MIPI, etc., are not limited here.
  • the selector is used to implement the selection operation on the running data, for example, selecting only the running data of the ISP. MIPI samples and packs the running data and sends the packed data to the AP through the camera serial interface (CSI); the AP transmits the running data to the external device through the USB service.
  • the front-end module can be an image preprocessor.
  • step 301 setting up a virtual video device, can be implemented in the following manner:
  • the virtual video device is set through the MIPI of the front-end module.
  • a virtual video device can be set through MIPI, specifically through the camera serial interface (CSI) of MIPI, and the above-mentioned virtual video device can be called a virtualized video recording (video) device.
  • step 301 before setting the virtual video device, the following steps may also be included:
  • step 301 using the virtual video device to pack the running data according to the second frame rate to obtain the target data package may include the following steps:
  • the target packaging processing parameters include at least the second frame rate
  • the pipeline configuration parameters may include at least one of the following: pipeline type, pipeline serial number, frame rate, resolution, etc., which are not limited here; the pipeline type can be understood as being used to process data of a certain data type.
  • the image data corresponds to the image data pipeline type
  • the voice data corresponds to the voice data pipeline type, etc.
  • the pipeline serial number can be understood as the serial number of the pipeline; pipeline serial numbers can be pre-defined, and pipelines with different serial numbers have different functions.
  • the packaging processing parameters include at least the frame rate, and the packaging processing parameters may also include at least one of the following: data packet size, packet header, packet trailer, etc., which are not limited here.
  • the electronic device may pre-store a mapping relationship between preset pipeline configuration parameters and packaging processing parameters, that is, different pipeline configuration parameters may correspond to different packaging processing parameters.
  • the target pipeline configuration parameters can be pre-configured; the electronic device can obtain the target pipeline configuration parameters from the hardware abstraction module (camera HAL), then determine the target packaging processing parameters corresponding to the target pipeline configuration parameters according to the preset mapping relationship between pipeline configuration parameters and packaging processing parameters, and use the target packaging processing parameters to pack the running data through the virtual video device to obtain a target data packet.
  • the target data packet can be at least one data packet, and the target packaging processing parameters include at least the second frame rate.
  • the amount of information generated by the front-end module may be small within a certain period of time, resulting in insufficient filling of the cache registers, resulting in waste of cache registers and transmission bandwidth.
  • data can be packed based on the "send when full" principle: the packed data can be sent to the AP for processing together with the next frame of data, or the packed data can be stamped with a timestamp and sent to the AP for processing separately, which saves transmission bandwidth.
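The "send when full" principle above can be sketched as a small accumulator. This is an illustrative sketch, not the patent's implementation: the class name, the callback signature, and the use of a wall-clock timestamp for partial chunks are all assumptions.

```python
import time

class PackWhenFullPacker:
    """Sketch of 'send when full' packing: running data accumulates until
    the cache-register capacity is reached, a full chunk is emitted right
    away, and a partially filled chunk can be flushed with a timestamp so
    the AP can still order it against the image frames."""

    def __init__(self, register_size, send):
        self.register_size = register_size
        self.send = send            # callback delivering one packed chunk
        self.buffer = bytearray()

    def feed(self, data: bytes):
        self.buffer.extend(data)
        while len(self.buffer) >= self.register_size:
            chunk = bytes(self.buffer[:self.register_size])
            del self.buffer[:self.register_size]
            self.send(chunk, None)  # full chunk: no timestamp needed

    def flush(self):
        if self.buffer:
            # partial chunk is stamped and sent separately
            self.send(bytes(self.buffer), time.time())
            self.buffer.clear()
```

A caller would `feed()` running data as it arrives and `flush()` at a frame boundary; only chunks that exactly fill a register go out unstamped.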
  • the target data packet includes at least one data packet, and the data size of each data packet is less than or equal to the size of each cache register in a plurality of cache registers allocated by the USB service, and the cache register is used for A data packet of the at least one data packet is accommodated.
  • the target data packet may include at least one data packet, and the data size of each data packet is less than or equal to the size of each cache register in a plurality of cache registers allocated by the USB service; the cache register is used to accommodate a data packet of the at least one data packet, which ensures that each cache register can hold its data packet.
  • step A1 obtaining target pipeline configuration parameters, may include the following steps:
  • A13 Determine the target pipeline configuration parameter corresponding to the attribute information of the target debugging content according to the preset mapping relationship between the attribute information of the debugging content and the pipeline configuration parameter.
  • the attribute information of the debugging content may be at least one of the following: data type, debugging purpose, parameter type of the debugging parameters, foreground application, etc., which are not limited here, and the mapping relationship between the attribute information of the debugging content and the pipeline configuration parameters may be pre-stored.
  • in the engineering mode, the electronic device can determine the target debugging content corresponding to the engineering mode; the debugging content can be set by the user or determined based on the user's operation. The attribute information corresponding to the target debugging content is then obtained, and the target pipeline configuration parameters corresponding to that attribute information are determined according to the preset mapping relationship between the attribute information of the debugging content and the pipeline configuration parameters, so that pipeline configuration parameters matching the depth of the debugging requirements can be obtained.
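The attribute-to-parameter lookup described above amounts to a preset table keyed by the debugging content's attribute information. A minimal sketch, assuming hypothetical attribute keys and parameter values (none of these names come from the patent):

```python
# Hypothetical preset mapping from debugging-content attribute information
# (here: data type and debugging purpose) to pipeline configuration
# parameters; all keys and values are illustrative.
PIPELINE_CONFIG_MAP = {
    ("image", "effect_debug"): {"pipeline_type": "image", "pipeline_id": 0,
                                "frame_rate": 30, "resolution": (1920, 1080)},
    ("voice", "noise_debug"): {"pipeline_type": "voice", "pipeline_id": 1,
                               "frame_rate": 50, "resolution": None},
}

def target_pipeline_config(data_type: str, debug_purpose: str) -> dict:
    """Return the pipeline configuration parameters matching the target
    debugging content's attribute information, per the preset mapping."""
    try:
        return PIPELINE_CONFIG_MAP[(data_type, debug_purpose)]
    except KeyError:
        raise ValueError(f"no pipeline configured for {(data_type, debug_purpose)!r}")
```

The same shape extends to other attributes (parameter type, foreground application) by widening the key tuple.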
  • step 301 before setting the virtual video device, the following steps may also be included:
  • the electronic device can detect whether it is in the engineering mode.
  • step 301 can be executed; otherwise, step 301 may not be executed.
  • the engineering mode can also be called the debugging mode (debug mode).
  • the engineering mode can be set by the user, or automatically enter the engineering mode when an abnormality occurs in the electronic device.
  • step 301 before setting the virtual video device, the following steps may also be included:
  • enabling the camera of the electronic device to call a hardware abstraction module, wherein the hardware abstraction module is configured to perform the function of unpacking the target data packet to obtain the running data.
  • the unpacking module can be used to implement the unpacking operation to obtain the running data.
  • the electronic device can obtain the operating data of at least one module in the front-end module and pack the operating data at the second frame rate through the virtual video device to obtain the target data packet; that is, the running data can be output at a certain frequency and size.
  • the first frame rate and the second frame rate may be the same or different.
  • the electronic device can virtualize the operating data through the virtual video device, so that it can transmit the operating data at a certain interval and size.
  • the front-end module packs the obtained operating data according to certain rules so that the packed data can be placed in the cache registers allocated by the back end (for the driver of the front-end module to store data), and then transmits the packed data to the application processor AP (back end) through MIPI; multiple buffer registers can also be allocated, and a buffer-register rotation operation is then implemented.
  • the number of cache registers can be set by the user, or determined by the bandwidth and/or the speed at which the running data is generated.
  • the operating data can be sent to an external device for data analysis, or the electronic device can also perform data analysis on the operating data to realize anomaly detection or detection of the working status of each chip in the front-end module.
  • the above-mentioned external device can be other electronic equipment, such as a host computer (PC).
  • the data analysis may be at least one of the following functions: abnormality detection, status detection of each chip in the front-end module, debugging data analysis, etc., which are not limited herein.
  • the electronic device can also display the operating data through the display screen.
  • the first frame rate and/or the second frame rate may be dynamically adjusted, thereby ensuring effective packaging and packet transmission of the running data.
  • whether the front-end module is abnormal may be determined based at least in part on adjusting the value of the first frame rate, or based at least in part on adjusting the value of the second frame rate, or based at least in part on adjusting the values of both the first frame rate and the second frame rate.
  • the adjustment makes it convenient for the front end to transmit the operating data in a rhythmic and efficient manner, and for the back end to quickly extract the operating data in a rhythmic manner.
  • the target data packet can be analyzed to obtain the operating data of the front-end module, and the operating data is used to determine whether the front-end module is abnormal.
  • the electronic device processes the abnormal information of the modules in the front-end module through the virtual video device, so that the abnormal information of different modules (including detection information at runtime) can be collected and packed at the front end at a certain rhythm and size for real-time output to the PC side; when developers need to debug, they can find the cause of a problem by analyzing the received monitoring data.
  • when the running data is image data, the processing effect of each frame (for example, the effect of certain areas) can be compared with a pre-stored standard rendering to obtain a judgment result for the processing effect of the current frame, and the processing parameters of the relevant algorithm can then be adaptively adjusted according to the judgment result so that the image processing of the camera system remains stable.
  • step 304 the following steps may also be included:
  • the preset neural network model may be at least one of the following: a convolutional neural network model, a fully connected neural network model, a recurrent neural network model, etc., which are not limited herein.
  • steps C1-C3 may be implemented by an electronic device, or may also be implemented by an external device, for example, a host computer PC.
  • the target feature set may include at least one feature, or the target feature set may include at least one type of feature.
  • the features may be at least one of the following: feature points, feature vectors, colors, pixels, feature lines, etc., which are not limited here.
  • the feature may be at least one of the following: frequency, amplitude, waveform, timbre, pitch, wavelength, etc., which is not limited here.
  • the running data is debugging data
  • the feature may be at least one of the following: level, waveform, voltage, current, power, log data, etc., which is not limited here.
  • before executing step 301 in the embodiment of the present application, the neural network model can be trained through a large amount of sample data and the labels corresponding to the samples, and after the neural network model converges, the preset neural network model is obtained.
  • the electronic device can perform feature extraction on the operating data to obtain the target feature set, and then input the target feature set into the preset neural network model to obtain the target calculation result, which can be at least one label and the corresponding probability value; an abnormality analysis report is then generated according to the target calculation result and used as the target abnormality information. Alternatively, the electronic device can pre-store the mapping relationship between calculation results and abnormality information and determine the target abnormality information corresponding to the target calculation result according to that mapping relationship. The abnormality information may be at least one of the following: abnormality cause, abnormality location, abnormality duration, etc., which are not limited here.
  • the front-end module can actively initiate anomaly detection on the system and images.
  • when an anomaly occurs, the front-end module can detect the anomaly in advance and obtain the relevant anomaly information; it can then quickly perform the adaptive processing corresponding to the anomaly without affecting the upper layer, and report the processed result and the anomaly information, so that the upper layer can further judge and process the adaptive processing result and the anomaly information in a timely manner. In this way, the embodiment of the present application can better improve the robustness of the system while reducing the impact of anomalies on the user experience.
  • any data packet in the target data packet can include a packet header PH, running data, and a packet tail PF; the packet header can be used to mark the start position of a data packet, and the packet tail can be used to represent the end position of the data packet.
  • the packet header may include a packet header mark, an index packet mark, and a packet data length; the packet header mark is used to represent the data type of the current data packet, the index packet mark is used to represent the independent index of the current data packet, and the packet data length is used to represent the data length of the current data packet.
  • Packet header structure (byte layout):
      packet header mark: Byte3
      index packet mark: Byte2
      packet data length: Byte1 + Byte0
  • the packet tail can include a packet tail mark, a data packet count, and a frame count; the packet tail mark is used to represent the position of the packet tail, the data packet count is used to represent the count (number) of data packets, and the frame count represents which frame the data packet comes from.
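Putting the header and tail descriptions together, a packet can be sketched with Python's `struct` module. The header word follows the layout above (Byte3 header mark, Byte2 index packet mark, Byte1+Byte0 as a 16-bit data length); the trailer byte widths and both mark values are assumptions, since the text does not fix them.

```python
import struct

HEADER_MARK = 0xA5  # illustrative value; the patent does not specify it
TAIL_MARK = 0x5A    # illustrative value

def build_packet(index, payload, packet_count, frame_count):
    """Pack one data packet: 4-byte header, payload, 4-byte trailer."""
    header = struct.pack(">BBH", HEADER_MARK, index & 0xFF, len(payload))
    trailer = struct.pack(">BBH", TAIL_MARK, packet_count & 0xFF, frame_count)
    return header + payload + trailer

def parse_packet(packet):
    """Reverse of the packing step: validate the marks and recover the
    payload, index, packet count, and frame count."""
    mark, index, length = struct.unpack(">BBH", packet[:4])
    if mark != HEADER_MARK:
        raise ValueError("bad packet header mark")
    payload = packet[4:4 + length]
    tail_mark, packet_count, frame_count = struct.unpack(
        ">BBH", packet[4 + length:8 + length])
    if tail_mark != TAIL_MARK:
        raise ValueError("bad packet tail mark")
    return index, payload, packet_count, frame_count
```

`parse_packet(build_packet(...))` round-trips, which is the property the AP-side unpacking relies on.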
  • the target data packet is received by the AP, and the target data packet is unpacked to obtain the operation data, and the operation data is used for data analysis.
  • the AP can receive the target data packet, and unpack the target data packet according to the first frame rate and/or the second frame rate to obtain the operation data, and can also implement data analysis through the operation data to achieve the purpose of abnormal detection .
  • step 302 unpacking the target data packet to obtain the operation data, may be implemented in the following manner:
  • the target data packet is unpacked by a hardware abstraction module to obtain the running data.
  • unpacking is the reverse operation of the packing action.
  • the hardware abstraction module can parse the target data packet according to flag bits or pre-agreed rules and then process the parsed data; for example, the storage order can be adjusted, or, when there are multiple channels of image data, the abnormal information can be packed into the image data of the sub-camera (the image data of the sub-camera is small, leaving space and bandwidth to transmit debugging information), while the cache register of the main camera only transfers image data.
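The flag-bit routing described above might look like the following sketch, which assumes a one-byte flag prefix per chunk; the flag values and the framing are illustrative, not specified by the patent.

```python
# Illustrative flag bytes; the patent only says parsing follows flag bits
# or pre-agreed rules, so these values are assumptions.
IMAGE_FLAG = 0x01
DEBUG_FLAG = 0x02

def split_streams(chunks):
    """Route each parsed chunk to the image stream or the debug stream
    according to its leading flag byte, as when abnormal information rides
    along with the sub-camera's image data."""
    image, debug = [], []
    for chunk in chunks:
        (debug if chunk[0] == DEBUG_FLAG else image).append(chunk[1:])
    return image, debug
```

After splitting, the image stream goes to the normal camera path while the debug stream is copied toward the USB service.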
  • the internal signal of the front-end module can be forwarded from the MIPI of the front-end module to the DDR on the AP side, and the AP side triggers an interrupt after receiving the data.
  • the front-end module packs the debugging data as a virtual camera device (virtual video device), it is necessary to start a special pipeline (pipeline) in the hardware abstraction module to serve the virtual video device.
  • the USB service needs to allocate relevant cache registers and give the cache registers to the hardware abstraction module.
  • the hardware abstraction module needs to copy the obtained debugging data to the cache registers allocated by the USB service in a frame-by-frame manner.
  • the data in the cache register does not need to be processed by the relevant algorithms of the ISP module on the AP side; therefore, the ISP-related algorithm modules need to be disabled to ensure that the data transmitted to the external device (PC) is raw data.
  • the USB service side can implement the rotation of the buffer registers by means of a callback function (callback).
  • the processing of the buffer register by the pipeline is different from the traditional camera processing module.
  • when the monitoring data and image data are packed and transmitted together, the data needs to be split and processed; therefore, the buffer registers of the raw data are parsed and reorganized in the pipeline. Parsing and reorganization can be performed according to flag bits or agreed rules, and the parsed data can then be processed separately.
  • step 302, unpacking the target data packet to obtain the operation data, may include the following steps:
  • the electronic device can copy the target data packet frame by frame to the target cache register allocated by the USB service through the hardware abstraction module, and unpack the data in the target cache register through the hardware abstraction module to obtain the running data; after the target cache register is occupied, cache-register rotation can be performed to provide the next cache register for storing the data copied by the hardware abstraction module.
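The cache-register rotation can be sketched as a fixed pool that rotates after each register is consumed. The deque-based pool below is illustrative; the patent only requires that a consumed register be handed back so the next copy finds a free one.

```python
from collections import deque

class RegisterPool:
    """Sketch of cache-register rotation: the USB service allocates a
    fixed pool of registers, and after the current register is filled and
    consumed it is rotated to the tail so the next copy gets a fresh one.
    The pool size and deque mechanics are illustrative."""

    def __init__(self, count: int, size: int):
        self.size = size
        self.pool = deque(bytearray(size) for _ in range(count))

    def current(self) -> bytearray:
        # register currently offered to the hardware abstraction module
        return self.pool[0]

    def rotate(self) -> bytearray:
        # analogous to the callback that hands a consumed register back
        self.pool.rotate(-1)
        return self.pool[0]
```

With three registers, three rotations bring the first register back to the front, which is the round-robin behavior the callback-based rotation achieves.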
  • the host computer can also implement data analysis.
  • the above-mentioned step 302 after unpacking the target data packet and obtaining the operating data, may also include the following steps:
  • a rotation operation is performed on the target cache register, so that the target cache register after rotation continues to be used for accommodating data packets received by the AP.
  • when the front-end module processes debugging data, the information can be divided and packed based on the interval between frames; for example, every 33 ms or 16.6 ms, the forward debugging data corresponding to that length of time is packed and processed. The index or timestamp of the data packet ensures that the packed debugging data can be matched with the image data (acquisition timing), and the AP can distinguish them on this basis when processing. It is also possible to pack the debugging data and the image data of the camera into the same buffer register; when the monitoring data and image data are packed and transmitted together, the data needs to be split and processed, so the cache registers of the raw data are parsed and reorganized in the pipeline.
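Grouping debugging records by the frame interval, as described above, can be sketched as bucketing timestamps into 33 ms (or 16.6 ms) windows so each package can be matched to its image frame by index; the `(timestamp, data)` record format is an assumption.

```python
FRAME_INTERVAL_MS = 33  # ~30 fps; 16.6 ms would correspond to ~60 fps

def group_by_frame(records):
    """Bucket (timestamp_ms, data) debugging records into per-frame
    packages keyed by frame index, so the AP can match each package to
    the image frame with the same acquisition timing."""
    frames = {}
    for ts, data in records:
        frames.setdefault(ts // FRAME_INTERVAL_MS, []).append(data)
    return frames
```

Records landing in the same window share a frame index, which plays the role of the data-packet index or timestamp mentioned above.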
  • regarding infringement detection, it is possible to identify whether the front-end module is virtualized into a video recording device by checking the working status of the camera during use; further, by judging the size and output speed (frame rate) of the data uploaded by the camera module, if the size is consistent within a certain period of time and the output speed is fixed, it can be basically confirmed that the implementation provided by the embodiment of the present application is used, and it can then be further confirmed whether infringement is constituted.
  • the file system can also be scanned on the platform side (mobile phone terminal) to check whether large-sized data is being generated in real time; if so, the data can be pulled through adb, and the data content can then be analyzed to confirm whether it is the debugging data packet transmitted by the front-end module.
  • the data transmission control method described in the embodiment of the present application is used for testing electronic equipment: a virtual video device is set up and configured to encapsulate the operating data of the front-end module of the electronic device at the first frame rate; the virtual video device packs the running data according to the second frame rate to obtain the target data packet and sends the target data packet to the processor; and whether the front-end module is abnormal is determined based at least in part on adjusting the values of the first frame rate and/or the second frame rate. Since the virtual video device can perform virtual video recording processing on the running data, the running data can be output to the external device at a certain frame rate and size, so the working information of the front-end module can be captured efficiently for developers to analyze.
  • Figure 4 is a schematic flowchart of a data transmission control method provided by an embodiment of this application, which is applied to an electronic device, and the electronic device includes a front-end module and an application processor AP, as shown in the figure , the data transmission control method includes:
  • step 401-step 409 reference may be made to the relevant description of the data transmission control method described in FIG. 3A , which will not be repeated here.
  • the front-end module also configures the packaging method according to the pipeline configuration parameters, that is, to ensure that the data size after packaging is consistent with the cache register allocated by the AP side;
  • each sub-module in the front-end module performs data acquisition and output of the module operation, the MIPI of the front-end module performs data packaging processing, and each module performs cache register rotation according to the set parameters;
  • the AP receives the data through the hardware abstraction module, and performs data analysis and processing according to the data packaging method, and performs a copy operation on the analyzed data, and copies it to the USB service in real time;
  • the USB service transmits the received data to the external device
  • the cache registers are rotated to ensure that the front-end monitoring data is packaged and transmitted to the back-end in real time for storage or transmission to the PC side.
  • MIPI can process the operating data with a virtual video device
  • the operating data can be output to external devices at a certain frame rate and size, and then, the working information of the front-end module can be efficiently captured for developers to analyze.
  • FIG. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the electronic device includes a front-end module, a processor, a memory, a communication interface, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the processor or the front-end module.
  • the above program includes instructions for performing the following steps:
  • the virtual video device is configured to encapsulate the operating data of the front-end module of the electronic device at a first frame rate
  • Determining whether the front-end module is abnormal is based at least in part on adjusting values of the first frame rate and/or the second frame rate.
  • the above program includes instructions for performing the following steps:
  • the virtual video device is set through the MIPI of the front-end module.
  • the above program also includes an instruction for performing the following steps:
  • the above program also includes instructions for performing the following steps:
  • the target packaging processing parameters include at least the second frame rate
  • the virtual video device packs the running data by using the target packing processing parameters to obtain the target data package.
  • the target data packet includes at least one data packet, and the data size of each data packet is less than or equal to the size of each cache register in a plurality of cache registers allocated by the USB service, and the cache register is used for A data packet of the at least one data packet is accommodated.
  • the above program includes instructions for performing the following steps:
  • the target pipeline configuration parameters corresponding to the attribute information of the target debugging content are determined according to the preset mapping relationship between the attribute information of the debugging content and the pipeline configuration parameters.
  • the above program also includes instructions for performing the following steps:
  • the processor receives the target data packet, unpacks the target data packet to obtain the operation data, and the operation data is used for data analysis.
  • the above program includes instructions for performing the following steps:
  • the above program further includes instructions for performing the following steps:
  • the above program also includes instructions for performing the following steps:
  • a rotation operation is performed on the target cache register, so that the target cache register after rotation continues to be used for accommodating data packets received by the processor.
  • the above program also includes instructions for performing the following steps:
  • the step of setting a virtual video device is performed.
  • the above program also includes instructions for performing the following steps:
  • Enabling the camera of the electronic device to call a hardware abstraction module, wherein the hardware abstraction module is configured to unpack the target data packet to obtain the operation data.
  • the electronic device includes hardware structures and/or software modules corresponding to each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each specific application, but such implementations should not be regarded as exceeding the scope of the present application.
  • the embodiment of the present application may divide the electronic device into functional units according to the above method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units. It should be noted that the division of units in the embodiment of the present application is schematic, and is only a logical function division, and there may be another division manner in actual implementation.
  • FIG. 6 is a block diagram of functional units of a data transmission control device 600 involved in the embodiment of the present application.
  • the data transmission control device 600 is applied to an electronic device, the electronic device includes a front-end module and a processor, and the device includes: a setting unit 601, a packaging unit 602, a sending unit 603 and an abnormality detection unit 604, wherein,
  • the setting unit 601 is configured to set a virtual video device, the virtual video device is configured to encapsulate the operation data of the front-end module of the electronic device at a first frame rate;
  • the packaging unit 602 is configured to use the virtual video device to package the running data at a second frame rate to obtain a target data package;
  • the sending unit 603 is configured to send the target data packet to a processor
  • the abnormality detection unit 604 is configured to determine whether there is an abnormality in the front-end module based at least in part on adjusting the value of the first frame rate and/or the second frame rate.
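  As a rough illustration of how adjusting the frame rates could surface an abnormality, one might compare the packet rate actually measured at the processor with the configured rate after each adjustment; the function name and the 10% tolerance below are assumptions of this sketch, not part of the claimed method:

```python
def front_end_abnormal(configured_fps: float, measured_fps: float,
                       tolerance: float = 0.1) -> bool:
    """Flag the front-end module as abnormal when the measured packet
    rate fails to track the configured frame rate within a tolerance."""
    return abs(measured_fps - configured_fps) > tolerance * configured_fps
```

  Repeating the check at several values of the first and/or second frame rate narrows down whether the front end keeps up only at some rates or fails at all of them.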
  • the setting unit 601 is specifically configured to:
  • the virtual video device is set through the MIPI of the front-end module.
  • the apparatus 600 is further specifically configured to:
  • the packing unit 602 is specifically used for:
  • the target packaging processing parameters include at least the second frame rate
  • the virtual video device packs the running data by using the target packing processing parameters to obtain the target data packet.
  • the target data packet includes at least one data packet, and the data size of each data packet is less than or equal to the size of each of a plurality of cache registers allocated by the USB service, each cache register being used to accommodate one data packet of the at least one data packet.
  • the device 600 is specifically used to:
  • the target pipeline configuration parameters corresponding to the attribute information of the target debugging content are determined according to the preset mapping relationship between the attribute information of the debugging content and the pipeline configuration parameters.
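  The preset mapping relationship can be pictured as a lookup table from debugging-content attribute information to pipeline configuration parameters; the attribute names and parameter values below are purely illustrative assumptions:

```python
# Hypothetical preset mapping from debugging-content attribute information
# to pipeline configuration parameters; all keys and values are invented
# for illustration only.
PIPELINE_CONFIG_MAP = {
    "image_quality": {"width": 1920, "height": 1080, "fps": 30},
    "power_consumption": {"width": 640, "height": 480, "fps": 15},
}

def target_pipeline_config(attribute: str) -> dict:
    """Look up the pipeline configuration for a debugging-content attribute."""
    if attribute not in PIPELINE_CONFIG_MAP:
        raise KeyError("no preset mapping for attribute: " + attribute)
    return PIPELINE_CONFIG_MAP[attribute]
```

  Keeping the mapping in one table means the target pipeline configuration follows directly from the attribute information of the target debugging content, with no per-case logic.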
  • the device 600 is also specifically used for:
  • the processor receives the target data packet, unpacks the target data packet to obtain the operation data, and the operation data is used for data analysis.
  • the device 600 is specifically used for:
  • the device 600 is further specifically configured to:
  • the device 600 is also specifically used for:
  • a rotation operation is performed on the target cache register, so that the rotated target cache register continues to be used to accommodate data packets received by the processor.
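  The rotation of cache registers reads like a ring-buffer discipline: once a register has been drained, it is rotated to the tail of the queue so it can receive later packets. A minimal sketch under that assumption (the class and method names are invented for this illustration):

```python
from collections import deque

class RegisterRing:
    """Hypothetical ring of cache registers: after a register is used it is
    rotated to the tail so it can accommodate later packets."""

    def __init__(self, count: int, size: int):
        self._ring = deque(bytearray(size) for _ in range(count))

    def next_register(self) -> bytearray:
        reg = self._ring[0]
        self._ring.rotate(-1)  # move the just-used register to the tail
        return reg
```

  Because registers are reused in a fixed cycle, the receiver never allocates new buffers while packets keep arriving.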
  • the apparatus 600 is further specifically configured to:
  • Enabling the camera of the electronic device to call a hardware abstraction module, wherein the hardware abstraction module is configured to unpack the target data packet to obtain the operation data.
  • each "unit" may be, for example, an application-specific integrated circuit (ASIC), a single circuit, a processor (shared, dedicated, or chipset) and memory for executing one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the functions described above.
  • the setting unit 601, the packaging unit 602, and the sending unit 603 may be front-end modules, and the abnormality detection unit 604 may be a processor. Based on the above unit modules, the functions or steps of any of the above methods can be realized.
  • This embodiment also provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute any method in the above embodiments of the present application.
  • This embodiment also provides a computer program product, which, when running on a computer, causes the computer to execute the above-mentioned related steps, so as to implement any method in the above-mentioned embodiments.
  • an embodiment of the present application also provides a device, which may specifically be a chip, a component or a module; the device may include a processor and a memory connected to each other, wherein the memory is used to store computer-executable instructions, and when the device is running, the processor can execute the computer-executable instructions stored in the memory, so that the chip executes any method in the above method embodiments.
  • the electronic device, computer storage medium, computer program product or chip provided in this embodiment is used to execute the corresponding method provided above; therefore, for the beneficial effects it can achieve, reference may be made to the beneficial effects of the corresponding method provided above, which will not be repeated here.
  • the disclosed devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • a unit described as a separate component may or may not be physically separated, and a component shown as a unit may be one physical unit or multiple physical units, which may be located in one place or distributed to multiple different places. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • if an integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions to cause a device (which may be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods in the various embodiments of the present application.
  • the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present application discloses a data transmission control method and a related apparatus, which are used for testing an electronic device. The method comprises: setting a virtual video device, the virtual video device being configured to encapsulate operation data of a front-end module of an electronic device at a first frame rate; packing the operation data at a second frame rate by using the virtual video device so as to obtain a target data packet; sending the target data packet to a processor; and determining, based at least in part on adjusting the value of the first frame rate and/or the second frame rate, whether the front-end module has an abnormality. By means of the embodiments of the present application, operation information of a front-end module can be effectively captured for analysis by a developer.
PCT/CN2022/095793 2021-08-09 2022-05-28 Data transmission control method and related apparatus WO2023016059A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110911104.8 2021-08-09
CN202110911104.8A CN115706769A (zh) Data transmission control method and related apparatus

Publications (1)

Publication Number Publication Date
WO2023016059A1 true WO2023016059A1 (fr) 2023-02-16

Family

ID=85180000

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/095793 WO2023016059A1 (fr) Data transmission control method and related apparatus

Country Status (2)

Country Link
CN (1) CN115706769A (fr)
WO (1) WO2023016059A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117156088A (zh) * 2023-04-21 2023-12-01 Honor Device Co., Ltd. Image processing method and related apparatus

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106713571A (zh) * 2015-08-20 2017-05-24 广州爱九游信息技术有限公司 一种测试游戏引擎应用的性能的移动终端及方法
CN110162451A (zh) * 2019-04-22 2019-08-23 腾讯科技(深圳)有限公司 一种性能分析方法、装置、服务器及存储介质
WO2020000489A1 (fr) * 2018-06-30 2020-01-02 华为技术有限公司 Procédé, appareil, dispositif et système d'envoi et de réception pcie
CN112230758A (zh) * 2020-11-09 2021-01-15 腾讯科技(深圳)有限公司 帧率调整方法、装置、设备及计算机可读存储介质

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117156088A (zh) * 2023-04-21 2023-12-01 Honor Device Co., Ltd. Image processing method and related apparatus
CN117156088B (zh) * 2023-04-21 2024-06-11 Honor Device Co., Ltd. Image processing method and related apparatus

Also Published As

Publication number Publication date
CN115706769A (zh) 2023-02-17


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22855045

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22855045

Country of ref document: EP

Kind code of ref document: A1