CN115706769A - Data transmission control method and related device - Google Patents


Info

Publication number
CN115706769A
CN115706769A
Authority
CN
China
Prior art keywords
data
target
data packet
processor
frame rate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110911104.8A
Other languages
Chinese (zh)
Inventor
朱文波 (Zhu Wenbo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zeku Technology Shanghai Corp Ltd
Original Assignee
Zeku Technology Shanghai Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zeku Technology Shanghai Corp Ltd filed Critical Zeku Technology Shanghai Corp Ltd
Priority to CN202110911104.8A
Priority to PCT/CN2022/095793 (published as WO2023016059A1)
Publication of CN115706769A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N 19/169 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/4402 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/01 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses a data transmission control method and a related device, used for testing an electronic device. The method includes: setting a virtual video device configured to encapsulate operating data of a front-end module of the electronic device at a first frame rate; packing the operating data with the virtual video device at a second frame rate to obtain a target data packet; sending the target data packet to a processor; and determining whether an anomaly exists in the front-end module based at least in part on adjusting the value of the first frame rate and/or the second frame rate. With the embodiments of the application, the operating information of the front-end module can be efficiently captured for developers to analyze.

Description

Data transmission control method and related device
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a data transmission control method and a related apparatus.
Background
With the widespread use of electronic devices (such as mobile phones, tablet computers, and smartwatches), electronic devices support more and more applications and increasingly powerful functions. They are developing in the direction of diversification and personalization, and have become indispensable electronic products in users' daily lives.
At present, taking the mobile phone as an example, chip debugging during the development stage relies on dedicated debug modes that are unsuitable for the post-commercialization environment, and detection of the front-end operating state is still performed over a directly connected debug line, which requires a dedicated hardware interface. After productization, changes in the hardware environment make it impossible to capture the relevant front-end module operating information for developers to analyze. How to efficiently capture front-end module operating information for developer analysis is therefore a problem in urgent need of a solution.
Disclosure of Invention
The embodiments of the application provide a data transmission control method and a related device, which can efficiently capture the operating information of a front-end module for developers to analyze.
In a first aspect, an embodiment of the present application provides a data transmission control method, configured to test an electronic device, where the method includes:
setting a virtual video device configured to encapsulate operating data of a front-end module of the electronic device at a first frame rate;
packing the operating data with the virtual video device at a second frame rate to obtain a target data packet;
sending the target data packet to a processor;
determining whether an anomaly exists in the front-end module based at least in part on adjusting the values of the first frame rate and/or the second frame rate.
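The four steps above can be sketched in code. This is an illustrative sketch only: the `Packet` layout, the list-of-frames input, the grouping rule (input frame rate divided by output frame rate frames per packet), and the `send_to_processor` stand-in are assumptions, not details taken from the claims.

```python
from dataclasses import dataclass


@dataclass
class Packet:
    """Hypothetical target data packet wrapping front-end operating data."""
    seq: int
    payload: bytes


class VirtualVideoDevice:
    """Sketch of the claimed virtual video device: operating data is
    produced at the first frame rate, and the device packs it into
    target data packets at the second frame rate."""

    def __init__(self, first_frame_rate: int, second_frame_rate: int):
        self.first_frame_rate = first_frame_rate
        self.second_frame_rate = second_frame_rate

    def pack(self, frames: list[bytes]) -> list[Packet]:
        # Assumed grouping rule: at 60 fps in and 30 fps out, two input
        # frames are joined into each outgoing target data packet.
        n = max(1, round(self.first_frame_rate / self.second_frame_rate))
        return [Packet(seq=i // n, payload=b"".join(frames[i:i + n]))
                for i in range(0, len(frames), n)]


def send_to_processor(packets: list[Packet]) -> int:
    """Stand-in for the 'send the target data packet to a processor' step;
    returns the number of packets handed over."""
    return len(packets)
```

For example, with a first frame rate of 60 and a second frame rate of 30, four input frames would be grouped into two target data packets.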
In a second aspect, an embodiment of the present application provides an electronic device, including a front-end module and a processor, wherein,
the front end module is configured to:
setting a virtual video device to package the operating data of the front-end module according to a first frame rate;
packing the operating data with the virtual video device at a second frame rate to obtain a target data packet; and
sending the target data packet to the processor;
the processor is configured to:
determining whether an anomaly exists in the front-end module based at least in part on adjusting the values of the first frame rate and/or the second frame rate.
In a third aspect, an embodiment of the present application provides a data transmission control method, which is applied to an electronic device, where the electronic device includes a front-end module and a processor, and the method includes:
setting, by the front-end module, a virtual video device to encapsulate the operating data of the front-end module at a first frame rate;
packing the operating data with the virtual video device at a second frame rate to obtain a target data packet; and
sending the target data packet to the processor;
the processor is configured to:
determining whether an anomaly exists in the front-end module based at least in part on adjusting the values of the first frame rate and/or the second frame rate.
In a fourth aspect, an embodiment of the present application provides a data transmission control apparatus, which is applied to an electronic device, where the electronic device includes a front-end module and a processor, and the apparatus includes: a setting unit, a packing unit, a sending unit and an abnormality detecting unit, wherein,
the setting unit is used for setting a virtual video device, and the virtual video device is configured to encapsulate the operation data of a front-end module of the electronic device according to a first frame rate;
the packing unit is used for packing the operating data with the virtual video device at a second frame rate to obtain a target data packet;
the sending unit is used for sending the target data packet to a processor;
the anomaly detection unit is used for determining whether the front-end module has an anomaly, based at least in part on adjusting the value of the first frame rate and/or the second frame rate.
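A hedged sketch of how the anomaly detection unit might use frame-rate adjustment: the processor tries several (first, second) frame-rate pairs and checks whether the observed packet rate tracks the configured second frame rate. The rate pairs, the tolerance, and the measurement callback are all illustrative assumptions rather than details from the application.

```python
from typing import Callable, Iterable


def front_end_has_anomaly(
    measure_packet_rate: Callable[[int, int], float],
    rate_pairs: Iterable[tuple[int, int]],
    tolerance: float = 0.1,
) -> bool:
    """Flags an anomaly when, for some configured (first, second)
    frame-rate pair, the packet rate observed at the processor deviates
    from the configured second frame rate by more than the tolerance."""
    for first_rate, second_rate in rate_pairs:
        observed = measure_packet_rate(first_rate, second_rate)
        if abs(observed - second_rate) > tolerance * second_rate:
            return True
    return False
```

A healthy front end would track each reconfigured second frame rate; a front end that keeps emitting packets at a fixed rate regardless of configuration would be flagged.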
In a fifth aspect, embodiments of the present application provide an electronic device, which includes a front-end module, a processor, and a memory, where the memory is configured to store one or more programs, the one or more programs are configured to be executed by the processor or the front-end module, and the programs include instructions for performing the steps in the method according to any one of the first aspect or the third aspect.
In a sixth aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program enables a computer to perform some or all of the steps described in the first aspect or the third aspect of the embodiment of the present application.
In a seventh aspect, embodiments of the present application provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps as described in the first aspect or the third aspect of embodiments of the present application. The computer program product may be a software installation package.
The embodiment of the application has the following beneficial effects:
It can be seen that the data transmission control method and related apparatus described in the embodiments of the present application are used for testing an electronic device. A virtual video device is set, where the virtual video device is configured to encapsulate operating data of a front-end module of the electronic device at a first frame rate; the operating data is packed with the virtual video device at a second frame rate to obtain a target data packet; the target data packet is sent to a processor; and whether the front-end module is abnormal is determined based at least in part on adjusting the value of the first frame rate and/or the second frame rate. In this way, the operating information of the front-end module can be efficiently captured for developers to analyze.
Drawings
To more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
fig. 3A is a schematic flowchart of a data transmission control method according to an embodiment of the present application;
fig. 3B is a schematic diagram illustrating data transmission between a front-end module and an application processor according to an embodiment of the present application;
fig. 3C is a schematic structural diagram of a data packet provided in an embodiment of the present application;
FIG. 3D is a schematic diagram illustrating another data transmission between a front-end module and an application processor according to an embodiment of the present application;
fig. 4 is a schematic flowchart of another data transmission control method according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 6 is a block diagram of functional units of a data transmission control apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
In order to better understand the scheme of the embodiments of the present application, the following first introduces the related terms and concepts that may be involved in the embodiments of the present application.
The electronic device may include various devices having communication functions, such as a smartphone, a vehicle-mounted device, a wearable device, a charging device (e.g., a charger), a smartwatch, smart glasses, a wireless Bluetooth headset, a computing device or other processing device connected to a wireless modem, and various forms of user equipment (UE), mobile stations (MS), virtual reality/augmented reality devices, and terminal devices. It may also be a base station or a server, or a test module composed of a front-end module and a processor (e.g., an application processor).
In a first section, the software and hardware operating environment of the technical solution disclosed in the present application is described as follows.
Fig. 1 shows a schematic structural diagram of an electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a compass 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (SIM) card interface 195, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be separate components or may be integrated in one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110. The controller can generate an operation control signal according to the instruction operation code and the timing signal to control instruction fetching and instruction execution. In other embodiments, a memory may also be provided in the processor 110 for storing instructions and data. Illustratively, the memory in the processor 110 may be a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to reuse the instructions or data, it can call them directly from this memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and thus increases the efficiency with which the electronic device 100 processes data or executes instructions. The processor may also include an image pre-processor (Pre-ISP), which can be understood as a simplified ISP that can also perform some image processing operations.
In some embodiments, processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM card interface, and/or a USB interface. The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. The USB interface 130 may also be used to connect a headset to play audio through the headset.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives an input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 141 may be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G/6G, etc. applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLAN) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 for radiation.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or more display screens 194.
The electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or more cameras 193.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
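As an illustration of the Fourier-transform step mentioned above, a naive discrete Fourier transform can compute the energy in a single frequency bin; the windowing, scaling, and bin-selection logic a real DSP would apply are omitted, and a real implementation would use an FFT.

```python
import math


def bin_energy(samples: list[float], k: int) -> float:
    """Squared magnitude of DFT bin k, computed directly from the
    definition, sketching the per-bin energy a DSP might evaluate."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * k * i / n)
             for i, s in enumerate(samples))
    im = -sum(s * math.sin(2 * math.pi * k * i / n)
              for i, s in enumerate(samples))
    return re * re + im * im
```

A pure tone at bin 1 of an 8-sample window concentrates its energy in that bin and contributes essentially nothing to the other bins.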
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
Internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may execute the above-mentioned instructions stored in the internal memory 121, so as to enable the electronic device 100 to execute the method for displaying page elements provided in some embodiments of the present application, as well as various applications and data processing. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store an operating system, and may also store one or more applications (e.g., gallery, contacts, etc.). The data storage area may store data (e.g., photos, contacts, etc.) created during use of the electronic device 100. Further, the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic disk storage components, flash memory components, Universal Flash Storage (UFS), and the like. In some embodiments, the processor 110 may cause the electronic device 100 to execute the method for displaying page elements provided in the embodiments of the present application, as well as other applications and data processing, by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110.

The electronic device 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like.
The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but have different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
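The two-threshold behaviour described for the messaging icon can be sketched as a simple dispatch on touch intensity; the threshold value and the action names are assumptions for illustration, since the text names no concrete values.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized touch intensity


def message_icon_action(intensity: float) -> str:
    """Maps touch intensity on the messaging icon to an instruction,
    mirroring the threshold rule described above: a light touch views
    the message, a firm touch creates a new one."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_message"
    return "new_message"
```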
The gyroscope sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the X, Y, and Z axes) may be determined by the gyroscope sensor 180B. The gyroscope sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyroscope sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation and somatosensory gaming scenarios.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can use the collected fingerprint characteristics to unlock the device, access the application lock, take a photo, answer an incoming call, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to prevent the low temperature from causing the electronic device 100 to shut down abnormally. In still other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting thereon or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
Fig. 2 shows a block diagram of a software structure of the electronic device 100. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom. The application layer may include a series of application packages.
As shown in fig. 2, the application layer may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether a status bar exists, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions for the electronic device 100, for example, management of the call status (including connected, hung up, and the like).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar; it can be used to convey notification-type messages that disappear automatically after a short stay without user interaction. For example, the notification manager is used to announce download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, for example a notification from an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example: prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, and so on.
The Android Runtime comprises a core library and a virtual machine. The Android Runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the function libraries that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media libraries (media libraries), three-dimensional graphics processing libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
Based on the electronic device described in fig. 1 or fig. 2, the electronic device includes a front-end module and a processor, and may be configured to implement the following functions:
the front end module is configured to:
setting a virtual video device, the virtual video device being used to encapsulate the operating data of the front-end module according to a first frame rate;
packaging the operating data according to a second frame rate by using the virtual video equipment to obtain a target data packet; and
sending the target data packet to the processor;
the processor is configured to:
determining whether an anomaly exists for the front-end module based at least in part on adjusting values of the first frame rate and/or the second frame rate.
It can be seen that, in the electronic device described in this embodiment of the present application, for testing the electronic device, a virtual video device is set, the virtual video device being configured to encapsulate operating data of a front-end module of the electronic device at a first frame rate; the virtual video device packs the operating data at a second frame rate to obtain a target data packet; the target data packet is sent to the processor; and whether an abnormality exists in the front-end module is determined based at least in part on adjusting the value of the first frame rate and/or the second frame rate.
Optionally, the front-end module is further configured to: acquiring target pipeline configuration parameters;
wherein, in said aspect of using said virtual video device to pack said operating data at a second frame rate to obtain a target data packet, said front-end module is specifically configured to:
determining a target packaging processing parameter corresponding to the target pipeline configuration parameter according to a mapping relation between preset pipeline configuration parameters and packaging processing parameters, wherein the target packaging processing parameter at least comprises the second frame rate;
and packing the running data by the virtual video equipment by adopting the target packing processing parameters to obtain the target data packet.
Optionally, the target data packet includes at least one data packet, and a data size of each data packet is smaller than or equal to a size of each buffer register in a plurality of buffer registers allocated by the USB service, where the buffer register is used to accommodate a data packet in the at least one data packet.
Optionally, in the aspect of obtaining the target pipeline configuration parameter, the front-end module is specifically configured to:
determining target debugging content;
acquiring attribute information corresponding to the target debugging content;
and determining the target pipeline configuration parameters corresponding to the attribute information of the target debugging content according to the mapping relation between the preset attribute information of the debugging content and the pipeline configuration parameters.
Optionally, the processor is further specifically configured to:
and receiving the target data packet through the processor, unpacking the target data packet to obtain the operating data, wherein the operating data is used for realizing data analysis.
Optionally, in the aspect of unpacking the target data packet to obtain the operation data, the processor is further specifically configured to:
copying the target data packet into a target cache register distributed by a USB service in a frame-by-frame mode;
unpacking the data in the target cache register to obtain the operating data.
Optionally, after the unpacking the target data packet to obtain the operation data, the processor is further specifically configured to:
and calling the USB service to upload the running data to external equipment through the target cache register so that the external equipment can analyze the running data.
Optionally, the processor is further specifically configured to:
and performing rotation operation on the target cache register, so that the target cache register after rotation is continuously used for accommodating the data packet received by the processor.
Optionally, before the setting of the virtual video device, the processor is further specifically configured to:
enabling a camera of the electronic device to invoke a hardware abstraction module, wherein the hardware abstraction module is configured to: and the function is used for executing the function of unpacking the target data packet to obtain the operating data.
In the second section, the data transmission control method and apparatus disclosed in the embodiments of the present application are described as follows.
As shown in fig. 3A, fig. 3A is a schematic flowchart of a data transmission control method provided in an embodiment of the present application, applied to an electronic device that includes a front-end module and a processor (for example, the processor may be an application processor AP). The data transmission control method may be used for testing the electronic device, and includes:
301. Setting a virtual video device, the virtual video device being configured to encapsulate operating data of a front-end module of the electronic device at a first frame rate.
The operating data may be at least one of the following: working state data of each chip in the front-end module, image data, voice data, debug log data, video data, and the like, which are not limited herein. The working state data of each chip in the front-end module can be at least one of the following: operating level, waveform, operating voltage, operating current, operating power, operating clock, and the like, without limitation. The operating data may be raw data or preprocessed raw data, and the preprocessing may be at least one of the following: sampling, compression, screening, and the like, without limitation.
As shown in fig. 3B, in this embodiment, the front-end module may include: an image signal processor (ISP), a neural network processor (NPU), a memory (DDR), a total control module (TOP), a selector (MUX), a MIPI, and the like, without limitation. The selector is used to implement the selection of the operating data; for example, only the operating data of the ISP is selected. The MIPI samples and packs the operating data, and the packed data is sent to the AP through a camera serial interface (CSI); the AP then transmits the operating data to the external device through the USB service. The front-end module may be an image preprocessor.
Optionally, in step 301, the virtual video device is set, which may be implemented as follows:
and setting the virtual video equipment through a Mobile Industry Processor Interface (MIPI) of the front-end module.
In a specific implementation, a virtual video device may be set through the MIPI, and specifically, the virtual video device may be set through a Camera Serial Interface (CSI) of the MIPI, and the virtual video device may also be referred to as a virtual video (video) device.
Optionally, before the step 301, setting the virtual video device, the following steps may be further included:
a1, obtaining target pipeline configuration parameters;
in step 301, the step of packaging the operation data by using the virtual video device at the second frame rate to obtain a target data packet may include the following steps:
31. determining a target packaging processing parameter corresponding to the target pipeline configuration parameter according to a mapping relation between preset pipeline configuration parameters and packaging processing parameters, wherein the target packaging processing parameter at least comprises the second frame rate;
32. and packing the running data by the virtual video equipment by adopting the target packing processing parameters to obtain the target data packet.
In this embodiment, the pipeline configuration parameter may include at least one of the following: a pipe type, a pipe serial number, and the like. The pipe type may be understood as a pipe for processing data of a certain data type; for example, image data corresponds to an image-data pipe type, and voice data corresponds to a voice-data pipe type. The pipe serial number may be understood as the serial number of the pipe; pipe serial numbers may be predefined, and pipes with different serial numbers have different functions. The packing processing parameter includes at least a frame rate, and may further include at least one of the following: packet size, packet header, packet trailer, and the like, which are not limited herein. The electronic device may pre-store a mapping relationship between preset pipeline configuration parameters and packing processing parameters; that is, different pipeline configuration parameters may correspond to different packing processing parameters.
In a specific implementation, the target pipeline configuration parameter may be preconfigured, and the electronic device may obtain the target pipeline configuration parameter from a hardware abstraction module (camera HAL). The electronic device then determines the target packing processing parameter corresponding to the target pipeline configuration parameter according to the preset mapping relationship between pipeline configuration parameters and packing processing parameters, and packs the operating data with the target packing processing parameter through the virtual video device to obtain the target data packet. The target data packet may be at least one data packet, and the target packing processing parameter at least includes the second frame rate.
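A minimal sketch of the lookup in steps 31 and 32 follows. The mapping keys (pipe type, pipe serial number) and the packing-parameter values are hypothetical illustrations; the real parameters are device-specific and not given in the application.

```python
# Step 31 sketch: a preset mapping from pipeline configuration parameters
# (here a hypothetical (pipe type, pipe serial number) key) to packing
# processing parameters, which include at least the second frame rate.
PRESET_MAPPING = {
    ("image", 0): {"second_frame_rate": 30, "packet_size": 4096},
    ("voice", 1): {"second_frame_rate": 50, "packet_size": 1024},
    ("debug", 2): {"second_frame_rate": 10, "packet_size": 512},
}

def target_packing_params(target_pipeline_cfg):
    """Look up the target packing processing parameters (step 31)."""
    return PRESET_MAPPING[target_pipeline_cfg]

params = target_packing_params(("debug", 2))
interval_ms = 1000 / params["second_frame_rate"]  # one packet every 100 ms
```

The same table-lookup shape applies to the other preset mappings the application mentions (debug-content attributes to pipeline parameters, operation results to anomaly information).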
For example, if the amount of data generated by the front-end module within a certain time is small, a buffer register may not be fully filled, which wastes the buffer register and the transmission bandwidth. Data may therefore also be packed on a fill-and-send-immediately basis: the packed data is transmitted to the AP together with the next frame of data for processing, or the packed data may be time-stamped and sent to the AP separately for processing, so as to save transmission bandwidth.
Optionally, the target data packet includes at least one data packet, and a data size of each data packet is smaller than or equal to a size of each buffer register in a plurality of buffer registers allocated by the USB service, where the buffer register is used to accommodate a data packet in the at least one data packet.
The target data packet may include at least one data packet, and the data size of each data packet is smaller than or equal to the size of each cache register in the plurality of cache registers allocated by the USB service, and the cache register is configured to accommodate the data packet in the at least one data packet, so that the cache register can accommodate the data packet conveniently.
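The size constraint above can be sketched as follows; the 1024-byte register size is an illustrative assumption, not a value from the application.

```python
BUFFER_REGISTER_SIZE = 1024  # hypothetical size of each USB buffer register

def split_into_packets(payload: bytes, max_size: int = BUFFER_REGISTER_SIZE):
    """Split the payload so each packet's data size fits one buffer register."""
    return [payload[i:i + max_size] for i in range(0, len(payload), max_size)]

packets = split_into_packets(b"\xab" * 2500)
# 2500 bytes -> packets of 1024, 1024 and 452 bytes; each fits one register
```

Because no packet exceeds the register size, every packet can be copied into a single buffer register without further fragmentation on the AP side.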
Optionally, in the step A1, obtaining the target pipeline configuration parameter may include the following steps:
a11, determining target debugging content corresponding to the engineering mode;
a12, obtaining attribute information corresponding to the target debugging content;
and A13, determining the target pipeline configuration parameters corresponding to the attribute information of the target debugging content according to the mapping relation between the preset attribute information of the debugging content and the pipeline configuration parameters.
In this embodiment of the present application, the attribute information of the debug content may be at least one of the following: the data type, the debugging purpose, the parameter type of the debugging parameter, the foreground application, and the like, which are not limited herein, and the mapping relationship between the preset attribute information of the debugging content and the pipeline configuration parameter may also be stored in the electronic device in advance.
In a specific implementation, in the engineering mode the electronic device may determine the target debugging content corresponding to the engineering mode; the debugging content may be set by the user, or may be determined based on a user operation. The electronic device then obtains the attribute information corresponding to the target debugging content, and determines the target pipeline configuration parameter corresponding to that attribute information according to the preset mapping relationship between attribute information of debugging content and pipeline configuration parameters. In this way, the corresponding pipeline configuration parameter can be determined from the attribute information of the debugging content, so that the pipeline configuration parameter closely meets the debugging requirements.
Optionally, before the virtual video device is set in step 301, the following steps may be further included:
b1, detecting whether the electronic equipment is in an engineering mode;
and B2, when the electronic equipment is in the engineering mode, executing the step of setting the virtual video equipment.
In a specific implementation, the electronic device may detect whether the electronic device is in an engineering mode, and may execute step 301 when the electronic device is in the engineering mode, otherwise, step 301 may not be executed. The engineering mode can be set by a user, or the electronic equipment automatically enters the engineering mode when the electronic equipment is abnormal.
Optionally, before the virtual video device is set in step 301, the following steps may be further included:
enabling a camera of the electronic device to invoke a hardware abstraction module, wherein the hardware abstraction module is configured to: and the function is used for executing the function of unpacking the target data packet to obtain the operating data.
In the embodiment of the application, only the camera function is enabled, but the camera itself is not started. Because the hardware abstraction module needs to be invoked after the camera function is enabled, the camera function must be enabled so that the hardware abstraction module can be invoked smoothly, and the hardware abstraction module can then be used to implement the unpacking operation to obtain the operating data.
302. Packaging the operating data according to a second frame rate by using the virtual video device to obtain a target data packet.
In a specific implementation, the electronic device may obtain the operating data of at least one module of the front-end module, and pack the operating data at the second frame rate through the virtual video device to obtain the target data packet; that is, the operating data can be output at a certain frequency and in packets of a certain size. The first frame rate and the second frame rate may be the same or different.
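As a sketch of how the two frame rates can interact when they differ, assuming the first (encapsulation) rate is an integer multiple of the second (packing) rate; the rates and the grouping rule are illustrative assumptions, not from the application.

```python
def repack(frames, first_frame_rate: int, second_frame_rate: int):
    """Group frames encapsulated at first_frame_rate into target packets
    emitted at second_frame_rate (first rate assumed a multiple of second)."""
    group = first_frame_rate // second_frame_rate
    return [frames[i:i + group] for i in range(0, len(frames), group)]

frames = [f"frame{i}" for i in range(6)]   # encapsulated at the first rate
packets = repack(frames, first_frame_rate=30, second_frame_rate=10)
# 30 fps in, 10 fps out -> every target packet carries 3 encapsulated frames
```

When the two rates are equal, each target packet carries exactly one encapsulated frame.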
303. Sending the target data packet to the processor.
Specifically, the electronic device can virtualize the operating data through the virtual video device, so that the operating data can be transmitted at certain intervals and in packets of a certain size. After the front-end module packs the obtained operating data according to certain rules, the packed data can be placed in a cache register allocated at the back end (used by the front-end module driver to store data), and the packed data is then transmitted to the application processor (AP, the back end) through the MIPI. A plurality of cache registers (buffer registers) can be allocated through the USB service, and rotation of the cache registers is then implemented. The number of cache registers may be set by the user, or may be determined by the bandwidth and/or the generation speed of the operating data.
304. Determining whether an anomaly exists in the front-end module based at least in part on adjusting the values of the first frame rate and/or the second frame rate.
In a specific implementation, the operating data may be sent to an external device for data analysis, or the electronic device may itself analyze the operating data to detect anomalies or the working state of each chip in the front-end module. The external device may be another electronic device, for example, an upper computer (PC). The data analysis may implement at least one of the following functions: anomaly detection, detection of the state of each chip in the front-end module, analysis of debug data, and the like, which are not limited herein. The electronic device may also display the operating data on the display screen. In the embodiment of the application, the first frame rate and/or the second frame rate can be dynamically adjusted, thereby guaranteeing effective encapsulation and packed transmission of the operating data.
Specifically, whether the front-end module is abnormal may be determined based at least in part on adjusting the value of the first frame rate, the value of the second frame rate, or both. That is, in a specific implementation, some or all of the operating data can be adjusted at the first frame rate or the second frame rate, so that the front end can transmit the operating data efficiently and rhythmically and the back end can extract the operating data quickly and rhythmically. During the adjustment, the target data packet may be parsed to obtain the operating data of the front-end module, and whether the front-end module is abnormal can be determined from the operating data.
For example, in the embodiment of the present application, the electronic device treats the abnormality-information collection module in the front-end module as a virtual video device, so that the abnormality information of different modules (including detection information gathered during running) can be collected and packed at the front end according to a certain rhythm and size and output to the PC side in real time. When a developer needs to debug, the cause of a problem can be found by analyzing the received monitoring data.
Further, for example, in the embodiment of the present application, when the operating data is image data, the processing effect of each frame (for example, the effect in certain regions) may be compared with a pre-stored standard effect map to obtain a judgment result for the processing effect of the current frame, and the processing parameters of the related algorithm may then be adaptively adjusted according to the judgment result, so that the image processing of the camera system runs stably.
Optionally, after step 304, the following steps may be further included:
c1, performing feature extraction on the operation data to obtain a target feature set;
c2, inputting the target feature set into a preset neural network model to obtain a target operation result;
and C3, determining target abnormal information according to the target operation result.
The preset neural network model may be at least one of the following: convolutional neural network models, fully-connected neural network models, recurrent neural network models, and the like, without limitation.
In the embodiment of the present application, the steps C1 to C3 may be implemented by an electronic device, or may also be implemented by an external device, for example, an upper computer PC. The set of target features may include at least one feature, or the set of target features may include at least one class of features. When the operational data is image data, the characteristic may be at least one of: feature points, feature vectors, colors, pixels, feature lines, and the like, which are not limited herein. When the operational data is voice data, the characteristic may be at least one of: frequency, amplitude, waveform, timbre, pitch, wavelength, etc., without limitation thereto. When the operational data is debug data, the characteristic may be at least one of: level, waveform, voltage, current, power, log data, etc., without limitation.
Before executing step 301 in the embodiment of the present application, a neural network model may be trained through a large amount of sample data and labels corresponding to the samples, and after the neural network model converges, a preset neural network model may be obtained.
In a specific implementation, the electronic device may perform feature extraction on the operating data to obtain a target feature set, and then input the target feature set into a preset neural network model to obtain a target operation result, where the operation result may be at least one label and its corresponding probability value. An anomaly analysis report may then be generated according to the target operation result and used as the target anomaly information; alternatively, a mapping relationship between operation results and anomaly information may be stored in the electronic device in advance, and the target anomaly information corresponding to the target operation result is determined according to that mapping relationship. The anomaly information may be at least one of the following: the cause of the anomaly, the location of the anomaly, the duration of the anomaly, and the like, which are not limited herein.
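Steps C1 to C3 can be sketched as follows. The threshold "model" stands in for the preset neural network model, and the feature names, threshold, and mapping table are illustrative assumptions, not from the application.

```python
def extract_features(samples):
    """C1: derive a small target feature set (mean and peak) from raw data."""
    return {
        "mean": sum(samples) / len(samples),
        "peak": max(abs(s) for s in samples),
    }

def run_model(features):
    """C2: stand-in for the preset model; returns a label and a fixed
    confidence value (a real deployment would run a trained network)."""
    label = "abnormal" if features["peak"] > 3.0 else "normal"
    return label, 0.9

ANOMALY_INFO = {  # C3: preset mapping from operation result to anomaly info
    "abnormal": "over-limit spike detected in front-end operating data",
    "normal": "no anomaly",
}

features = extract_features([0.1, 0.2, 5.0, 0.1])
label, confidence = run_model(features)
target_anomaly_info = ANOMALY_INFO[label]
```

The same three-stage shape (extract, classify, map to anomaly information) applies whether the steps run on the electronic device or on the upper computer PC.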
For example, according to the embodiment of the present application, anomaly detection may be actively initiated on the system and the images through the front-end module. When an anomaly occurs, it can be detected in advance and the relevant anomaly information obtained; then, without affecting the upper layer, that is, without affecting the user experience or the upper layer's invocation of the image preprocessor, the adaptive processing corresponding to the anomaly can be performed quickly, and the processed result and the anomaly information reported. The upper layer can then make further judgments and handle the situation in time according to the adaptive processing result and the anomaly information, which improves the robustness of the system, reduces the impact of the anomaly on the user experience, and thereby improves the user experience.
As shown in fig. 3C, any data packet in the target data packet may include a packet header PH, the operating data, and a packet trailer PF, where the packet header may be used to mark the start position of a data packet and the packet trailer may be used to indicate the end position of the data packet.
Further, the packet header may include: a packet header flag, an index packet flag, and a packet data length, where the packet header flag is used to indicate the data type of the current data packet, the index packet flag is used to indicate an independent index of the current data packet, and the packet data length is used to indicate the data length of the current data packet. The specific structure is shown in the following table:

Packet header structure    Length in bytes
Packet header flag         Byte3
Index packet flag          Byte2
Packet data length         Byte1+Byte0
Further, the packet trailer may include: a packet trailer flag indicating the end position of the packet, a packet count indicating the number of the current data packet, and a frame count indicating which frame the data packet came from, as shown in the following table:

Packet trailer structure    Length in bytes
Packet trailer flag         Byte3
Packet count                Byte2
Frame count                 Byte1+Byte0
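The two 4-byte layouts above can be sketched with Python's struct module. The flag values 0xA5 and 0x5A are illustrative assumptions; the application does not specify the flag encoding.

```python
import struct

def build_header(index: int, data_len: int) -> bytes:
    """PH: Byte3 = header flag, Byte2 = index packet flag,
    Byte1+Byte0 = 16-bit packet data length (big-endian)."""
    return struct.pack(">BBH", 0xA5, index, data_len)

def build_trailer(packet_count: int, frame_count: int) -> bytes:
    """PF: Byte3 = trailer flag, Byte2 = packet count,
    Byte1+Byte0 = 16-bit frame count (big-endian)."""
    return struct.pack(">BBH", 0x5A, packet_count, frame_count)

def parse(word: bytes):
    """Unpack either 4-byte structure into its three fields."""
    return struct.unpack(">BBH", word)

ph = build_header(index=7, data_len=512)
pf = build_trailer(packet_count=7, frame_count=3)
# parse(ph) -> (0xA5, 7, 512); parse(pf) -> (0x5A, 7, 3)
```

The receiver can scan for the flag byte, read the 16-bit length from the header, and use the trailer's frame count to reassociate packets with the frame they came from.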
Optionally, the method may further include the following steps:
and receiving the target data packet through the AP, unpacking the target data packet to obtain the operating data, wherein the operating data is used for realizing data analysis.
The AP may receive the target data packet, unpack the target data packet according to the first frame rate and/or the second frame rate, obtain operation data, and implement data analysis by using the operation data, so as to achieve the purpose of anomaly detection.
Optionally, in step 302, the target data packet is unpacked to obtain the operation data, which may be implemented as follows:
and unpacking the target data packet through a hardware abstraction module to obtain the operating data.
Unpacking is the reverse of the packing operation. The hardware abstraction module can parse the target data packet according to a flag bit or a predetermined rule and then process the parsed data. For example, when there are multiple channels of image data, the storage order can be adjusted, or the abnormality information can be packed into the image data of the secondary camera (the image data of the secondary camera is small, so its spare space and bandwidth can be used to transmit debug information), while the cache register of the primary camera transmits only image data.
For example, as shown in fig. 3D, taking debug data as an example, an internal signal of the front-end module may be forwarded from the MIPI of the front-end module to the AP-side DDR, and the AP side triggers an interrupt after receiving the data. Since the front-end module performs the packing of the debug data in the role of a virtual camera device (the virtual video device), a special pipeline needs to be started in the hardware abstraction module to serve the virtual video device. Furthermore, the USB service needs to allocate the relevant cache registers and send them to the hardware abstraction module, and the hardware abstraction module copies the obtained debug data frame by frame into the cache registers allocated by the USB service. As for the rotation of the cache registers, the USB service side may implement it in a callback function (callback).
In the embodiment of the application, when the AP customizes the engineering mode and the pipeline configuration parameters, the integrity and distinguishability of the monitoring information of the front-end module need to be ensured, so the data processing modules of the AP must be accessed selectively during customization; for example, modules that may alter the data content, such as a noise reduction module, may be bypassed to avoid tampering with the data. Furthermore, the pipeline handles the cache registers differently from a conventional camera processing module: because the monitoring data and the image data are packed together for transmission, the data needs to be split, so the cache register holding the raw data can be parsed and recombined in the pipeline. The parsing and recombination can be performed according to a flag bit or an agreed rule, and the parsed data is then processed separately.
Optionally, in step 302, unpacking the target data packet to obtain the operating data may include the following steps:
31. Copy the target data packet into a target cache register allocated by a USB service in a frame-by-frame manner;
32. Unpack the data in the target cache register to obtain the operating data.
In a specific implementation, the electronic device can copy the target data packet into the target cache register allocated by the USB service in a frame-by-frame manner through the hardware abstraction module, and unpack the data in the target cache register through the hardware abstraction module to obtain the operating data. After the target cache register is occupied, the cache registers can be rotated so that the next cache register is available to hold the data copied by the hardware abstraction module. Of course, after the operating data is uploaded to an upper computer (PC), the data analysis can be performed by the upper computer.
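The frame-by-frame copy into USB-allocated cache registers and their rotation can be sketched as follows; the class names and the pool mechanism are illustrative assumptions, not an implementation defined by this application:

```python
from collections import deque

class UsbBufferPool:
    """Illustrative ring of cache registers allocated by the USB service.

    acquire() hands the hardware abstraction module the next free buffer;
    rotate() returns a filled buffer to the pool once the USB service has
    uploaded its contents, so it can hold the next copied frame.
    """
    def __init__(self, count, size):
        self.free = deque(bytearray(size) for _ in range(count))
        self.filled = deque()

    def acquire(self):
        return self.free.popleft()               # next buffer for the copy

    def commit(self, buf):
        self.filled.append(buf)                  # buffer now holds one frame

    def rotate(self):
        self.free.append(self.filled.popleft())  # reuse after upload

def copy_frame(pool, frame):
    """Copy one target data packet, frame by frame, into a cache register."""
    buf = pool.acquire()
    buf[:len(frame)] = frame
    pool.commit(buf)
    return buf

pool = UsbBufferPool(count=3, size=16)
copy_frame(pool, b"frame-0")
pool.rotate()  # the register is freed and can hold the next frame
```

In practice the rotation would be driven from the USB service's callback, as described above, rather than called inline.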
Optionally, in step 302, after unpacking the target data packet to obtain the operating data, the method may further include the following steps:
invoking a USB service to upload the operating data to an external device through the target cache register, so that the external device can perform data analysis on the operating data.
Further, optionally, the method may further include the steps of:
performing a rotation operation on the target cache register, so that the rotated target cache register continues to be used to hold data packets received by the AP.
For example, when the front-end module packs the debugging data, the information may be divided and packed based on the interval between frames, for example every 33 ms or 16.6 ms, so that the debugging data of the corresponding preceding time span is packed together. This ensures that the packed debugging data can be matched to the image data (by acquisition time) according to the index or timestamp of the data packet, and the AP can distinguish them accordingly during processing. The debugging data and the image data of the camera can be packed into the same cache register; since the monitoring data and the image data are packed together for transmission, the data needs to be split, so the cache register holding the raw data can be parsed and recombined in the pipeline.
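Interval-based packing with a frame index and start time can be sketched as follows; the packet layout (a plain dict) and field names are illustrative assumptions used to show how the AP could match debugging data to image data by index or timestamp:

```python
FRAME_INTERVAL_MS = 33  # e.g. ~30 fps; 16.6 ms would correspond to ~60 fps

def pack_by_interval(samples, start_ms=0, interval_ms=FRAME_INTERVAL_MS):
    """Group (timestamp_ms, data) debug samples into per-frame packets.

    Each packet carries a frame index and the start of the interval it
    covers, so the AP can correlate debug data with image data captured
    in the same frame period.
    """
    packets = {}
    for ts, data in samples:
        index = (ts - start_ms) // interval_ms
        packets.setdefault(index, {"index": index,
                                   "t0": start_ms + index * interval_ms,
                                   "samples": []})
        packets[index]["samples"].append(data)
    return [packets[i] for i in sorted(packets)]

samples = [(5, "reg-a"), (12, "reg-b"), (40, "reg-c")]
for pkt in pack_by_interval(samples):
    print(pkt["index"], pkt["t0"], pkt["samples"])
```

Samples at 5 ms and 12 ms land in frame 0, while the 40 ms sample lands in frame 1 (starting at 33 ms), mirroring the per-interval grouping described above.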
For example, in practical applications, when infringement detection is required, whether the front-end module is virtualized into a video recording device can be identified by checking the working state of the camera during use. Furthermore, by determining the size of the data uploaded by the camera module and its output speed (frame rate), if the sizes are consistent within a certain period and the output keeps a fixed speed, it can basically be determined that the implementation method provided by the embodiment of the present application is in use, and whether infringement is constituted can then be further determined.
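The size-and-rate heuristic described here can be sketched as a small check; the tolerance value and function name are assumptions for illustration:

```python
def looks_like_virtual_video_device(packet_sizes, timestamps_ms,
                                    rate_tolerance_ms=2):
    """Heuristic from the text: if uploaded packets keep a constant size
    and arrive at a fixed rate over an observation window, the front-end
    module is likely being virtualized as a video device.
    """
    if len(packet_sizes) < 2:
        return False
    constant_size = len(set(packet_sizes)) == 1
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    fixed_rate = max(intervals) - min(intervals) <= rate_tolerance_ms
    return constant_size and fixed_rate

# Constant-size packets at ~33 ms spacing are flagged; varying sizes are not.
print(looks_like_virtual_video_device([4096] * 4, [0, 33, 66, 100]))   # True
print(looks_like_virtual_video_device([4096, 900, 4096], [0, 33, 70])) # False
```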
For example, in practical application, for the debugging data transmitted by the front-end module, because the data volume is large, files can be scanned on the platform side (the mobile phone side) to check whether large-size data is being generated in real time. When such data is detected, it can be pulled via adb and its content analyzed, so as to determine whether the data packet was transmitted by the front-end module as debugging data.
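One scan pass of that file check might look like the sketch below; the size threshold and function name are assumptions, and a real detector would run passes periodically before pulling candidates via adb:

```python
import os
import tempfile

LARGE_SIZE = 1 << 20  # 1 MiB threshold -- an illustrative assumption

def find_growing_large_files(directory, previous_sizes, threshold=LARGE_SIZE):
    """One scan pass: report files over the threshold that grew since the
    previous pass, i.e. large data being generated in real time."""
    grown = []
    for name in os.listdir(directory):
        size = os.path.getsize(os.path.join(directory, name))
        if size >= threshold and size > previous_sizes.get(name, 0):
            grown.append(name)
        previous_sizes[name] = size  # remember sizes for the next pass
    return grown

# Demo: a 2 MiB file appearing between passes is reported once.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "dump.bin"), "wb") as f:
        f.write(b"\0" * (2 << 20))
    print(find_growing_large_files(d, {}))  # ['dump.bin']
```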
It can be seen that the data transmission control method described in the embodiments of the present application is used for testing an electronic device. A virtual video device is set, the virtual video device being configured to encapsulate the operating data of a front-end module of the electronic device at a first frame rate; the virtual video device packs the operating data at a second frame rate to obtain a target data packet and sends the target data packet to a processor; and whether an abnormality exists in the front-end module is determined based at least in part on adjusting the value of the first frame rate and/or the second frame rate.
Referring to fig. 4, fig. 4 is a schematic flowchart of a data transmission control method provided in an embodiment of the present application, applied to an electronic device, where the electronic device includes a front-end module and an application processor (AP). The data transmission control method includes:
401. Turn on a camera function to invoke a hardware abstraction module, wherein the hardware abstraction module is configured to perform unpacking operations.
402. Acquire target pipeline configuration parameters.
403. Determine a target packing processing parameter corresponding to the target pipeline configuration parameter according to a mapping relation between preset pipeline configuration parameters and packing processing parameters.
404. Set a virtual video device configured to encapsulate operating data of a front-end module of the electronic device at a first frame rate.
405. Pack the operating data according to the target packing processing parameter by using the virtual video device to obtain a target data packet.
406. Send the target data packet to a processor.
407. Receive the target data packet through the AP, and copy the target data packet into a target cache register allocated by the USB service in a frame-by-frame manner through the hardware abstraction module.
408. Unpack the data in the target cache register through the hardware abstraction module to obtain the operating data.
409. Invoke the USB service to upload the operating data to an external device through the target cache register, so that the external device can perform data analysis on the operating data.
For the detailed description of the steps 401 to 409, reference may be made to the related description of the data transmission control method described in fig. 3A, and details are not repeated here.
For example, in a specific implementation, the following steps may be performed:
S1. Open the camera;
S2. Configure the relevant customized pipeline configuration parameters and perform resource allocation;
S3. The front-end module configures the packing mode according to the pipeline configuration parameters, i.e. ensures that the packed data size is consistent with the cache registers allocated on the AP side;
S4. Start the data stream: each sub-module in the front-end module collects and outputs its operating data, the MIPI of the front-end module packs the data, and each module rotates its cache registers according to the set parameters;
S5. The AP receives the data through the hardware abstraction module, parses the data according to the packing mode, copies the parsed data, and delivers the copies to the USB service in real time;
S6. The USB service transmits the received data to the external device;
S7. The cache registers are rotated on the basis of the above steps, ensuring that the front-end monitoring data is packed in real time and transmitted to the back end for storage or forwarded to the PC side.
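Steps S4 through S7 can be condensed into a toy end-to-end pass; the function name, the zero-padding scheme, and the single-register simplification are illustrative assumptions:

```python
def run_pipeline(frames, buffer_size):
    """Toy pass over S4-S7: the front-end packs each frame to the
    configured size, the hardware abstraction module copies it into a
    cache register, and the USB service forwards the copy onward."""
    uploaded = []
    buffer = bytearray(buffer_size)               # one AP-side cache register
    for frame in frames:
        assert len(frame) <= buffer_size          # S3: packed size fits the register
        packed = frame.ljust(buffer_size, b"\0")  # S4: pack to the agreed size
        buffer[:] = packed                        # S5: copy into the register
        uploaded.append(bytes(buffer))            # S6: USB service transmits
        # S7: the register is free again for the next frame (rotation)
    return uploaded

out = run_pipeline([b"frame-a", b"frame-b"], buffer_size=8)
print(len(out), len(out[0]))  # 2 8
```

Every uploaded packet has the fixed configured size, which is also what makes the constant-size detection heuristic described earlier possible.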
Furthermore, because the MIPI can process the operating data as a virtual video device, the operating data can be output to the external device at a certain frame rate and size, so that the operating information of the front-end module can be captured efficiently for analysis by developers.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in the figure, the electronic device includes a front-end module, a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor or the front-end module. In an embodiment of the present application, the programs include instructions for performing the following steps:
setting a virtual video device configured to encapsulate operating data of a front-end module of the electronic device at a first frame rate;
packaging the operating data by using the virtual video equipment according to a second frame rate to obtain a target data packet;
sending the target data packet to a processor;
determining whether an anomaly exists in the front-end module based at least in part on adjusting the values of the first frame rate and/or the second frame rate.
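The frame-rate probe in the last step can be sketched as a comparison between the adjusted rates and the rates the front-end module actually produces; the function name and tolerance are assumptions for illustration:

```python
def front_end_is_abnormal(adjusted_rates_fps, observed_rates_fps,
                          tolerance_fps=1.0):
    """Sketch of the probe: adjust the first/second frame rate, then check
    whether the front-end's observed output rate tracks each adjustment.
    A module that fails to follow any adjustment is flagged as abnormal."""
    return any(abs(expect - got) > tolerance_fps
               for expect, got in zip(adjusted_rates_fps, observed_rates_fps))

# Output tracks the adjustments -> normal; one step is missed -> abnormal.
print(front_end_is_abnormal([30, 60, 30], [29.8, 59.5, 30.2]))  # False
print(front_end_is_abnormal([30, 60, 30], [29.9, 31.0, 30.1]))  # True
```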
Optionally, in the aspect of setting up a virtual video device, the program includes instructions for performing the following steps:
and setting the virtual video equipment through a Mobile Industry Processor Interface (MIPI) of the front-end module.
Optionally, before the setting of the virtual video device, the program further includes instructions for performing the following steps:
acquiring target pipeline configuration parameters;
in the aspect of obtaining a target data packet by packaging the operating data at the second frame rate using the virtual video device, the program further includes instructions for performing the following steps:
determining a target packaging processing parameter corresponding to the target pipeline configuration parameter according to a mapping relation between preset pipeline configuration parameters and packaging processing parameters, wherein the target packaging processing parameter at least comprises the second frame rate;
and packing the running data by the virtual video equipment by adopting the target packing processing parameters to obtain the target data packet.
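The mapping from pipeline configuration parameters to packing processing parameters might be held as a simple lookup table; the keys, field names, and values below are illustrative assumptions, not values defined by this application:

```python
# Illustrative preset mapping between pipeline configuration parameters
# and packing processing parameters (including at least the second frame rate).
PACKING_BY_PIPELINE = {
    "debug_full":  {"second_frame_rate": 30, "packet_bytes": 4096},
    "debug_light": {"second_frame_rate": 60, "packet_bytes": 1024},
}

def target_packing_params(pipeline_config):
    """Look up the target packing processing parameters for a target
    pipeline configuration parameter."""
    try:
        return PACKING_BY_PIPELINE[pipeline_config]
    except KeyError:
        raise ValueError(f"no packing parameters for {pipeline_config!r}")

params = target_packing_params("debug_full")
print(params["second_frame_rate"])  # 30
```

Note that `packet_bytes` would be chosen to match the cache registers allocated by the USB service, consistent with the size constraint described below.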
Optionally, the target data packet includes at least one data packet, and a data size of each data packet is smaller than or equal to a size of each buffer register in a plurality of buffer registers allocated by the USB service, where the buffer register is used to accommodate a data packet in the at least one data packet.
Optionally, in the aspect of obtaining the target pipeline configuration parameter, the program includes instructions for performing the following steps:
determining target debugging content corresponding to the engineering mode;
acquiring attribute information corresponding to the target debugging content;
and determining the target pipeline configuration parameters corresponding to the attribute information of the target debugging content according to the mapping relation between the preset attribute information of the debugging content and the pipeline configuration parameters.
Optionally, the program further includes instructions for performing the following steps:
and receiving the target data packet through the processor, unpacking the target data packet to obtain the operating data, wherein the operating data is used for realizing data analysis.
Optionally, in the aspect of unpacking the target data packet to obtain the operating data, the program includes instructions for executing the following steps:
copying the target data packet into a target cache register distributed by a USB service in a frame-by-frame mode;
unpacking the data in the target cache register to obtain the operating data.
Optionally, after the unpacking is performed on the target data packet to obtain the operating data, the program further includes instructions for executing the following steps:
and calling a USB service to upload the running data to external equipment through the target cache register so that the external equipment can perform data analysis on the running data.
Optionally, the program further includes instructions for performing the following steps:
and performing rotation operation on the target cache register, so that the target cache register after rotation is continuously used for accommodating the data packet received by the processor.
Optionally, before the setting of the virtual video device, the program further includes instructions for performing the following steps:
detecting whether the electronic equipment is in an engineering mode;
and when the electronic equipment is in the engineering mode, executing the step of setting the virtual video equipment.
Optionally, before the setting of the virtual video device, the program further includes instructions for performing the following steps:
enabling a camera of the electronic device to invoke a hardware abstraction module, wherein the hardware abstraction module is configured to perform the function of unpacking the target data packet to obtain the operating data.
The above description has introduced the solution of the embodiment of the present application mainly from the perspective of the method-side implementation process. It is understood that the electronic device comprises corresponding hardware structures and/or software modules for performing the respective functions in order to realize the above-mentioned functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments provided herein may be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Fig. 6 is a block diagram showing functional units of a data transmission control apparatus 600 according to an embodiment of the present application. The data transmission control device 600 is applied to an electronic device including a front-end module and a processor, and includes: a setting unit 601, a packing unit 602, a sending unit 603, and an abnormality detecting unit 604, wherein,
the setting unit 601 is configured to set a virtual video device, where the virtual video device is configured to encapsulate operating data of a front-end module of the electronic device at a first frame rate;
the packing unit 602 is configured to pack the operating data at a second frame rate by using the virtual video device, so as to obtain a target data packet;
the sending unit 603 is configured to send the target data packet to a processor;
the anomaly detection unit 604 is configured to determine whether an anomaly exists in the front-end module based at least in part on adjusting the values of the first frame rate and/or the second frame rate.
Optionally, in terms of setting the virtual video device, the setting unit 601 is specifically configured to:
and setting the virtual video equipment through a Mobile Industry Processor Interface (MIPI) of the front-end module.
Optionally, before the setting of the virtual video device, the apparatus 600 is further specifically configured to:
acquiring target pipeline configuration parameters;
in the aspect that the virtual video device is used to package the operating data at the second frame rate to obtain the target data packet, the packaging unit 602 is specifically configured to:
determining a target packaging processing parameter corresponding to the target pipeline configuration parameter according to a mapping relation between preset pipeline configuration parameters and packaging processing parameters, wherein the target packaging processing parameter at least comprises the second frame rate;
and packing the running data by the virtual video equipment by adopting the target packing processing parameters to obtain the target data packet.
Optionally, the target data packet includes at least one data packet, and a data size of each data packet is smaller than or equal to a size of each buffer register in a plurality of buffer registers allocated by the USB service, where the buffer register is used to accommodate a data packet in the at least one data packet.
Optionally, in the aspect of obtaining the target pipeline configuration parameter, the apparatus 600 is specifically configured to:
determining target debugging content;
acquiring attribute information corresponding to the target debugging content;
and determining the target pipeline configuration parameters corresponding to the attribute information of the target debugging content according to the mapping relation between the preset attribute information of the debugging content and the pipeline configuration parameters.
Optionally, the apparatus 600 is further specifically configured to:
and receiving the target data packet through the processor, unpacking the target data packet to obtain the operating data, wherein the operating data is used for realizing data analysis.
Optionally, in the aspect of unpacking the target data packet to obtain the operation data, the apparatus 600 is specifically configured to:
copying the target data packet into a target cache register distributed by a USB service in a frame-by-frame mode;
unpacking the data in the target cache register to obtain the operating data.
Optionally, after the unpacking is performed on the target data packet to obtain the operating data, the apparatus 600 is further specifically configured to:
and calling the USB service to upload the running data to external equipment through the target cache register so that the external equipment can perform data analysis on the running data.
Optionally, the apparatus 600 is further specifically configured to:
and performing rotation operation on the target cache register, so that the target cache register after rotation is continuously used for accommodating the data packet received by the processor.
Optionally, before the setting of the virtual video device, the apparatus 600 is further specifically configured to:
enabling a camera of the electronic device to invoke a hardware abstraction module, wherein the hardware abstraction module is configured to perform the function of unpacking the target data packet to obtain the operating data.
It should be noted that the electronic device described in the embodiments of the present application is presented in the form of functional units. The term "unit" as used herein is to be understood in its broadest possible sense, and the objects used to implement the functions described by the respective "unit" may be, for example, an application-specific integrated circuit (ASIC), a single circuit, a processor (shared, dedicated, or group) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
The setting unit 601, the packing unit 602, and the sending unit 603 may be implemented by the front-end module, and the anomaly detection unit 604 may be implemented by the processor, on the basis of which the functions or steps of any of the above methods can be implemented.
The present embodiment also provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute any one of the methods in the above embodiments.
The present embodiment also provides a computer program product, which when run on a computer causes the computer to execute the relevant steps described above to implement any of the methods in the above embodiments.
In addition, an apparatus, which may be specifically a chip, a component or a module, may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory, so that the chip can execute any one of the methods in the above method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are all configured to execute the corresponding method provided above, so that the beneficial effects achieved by the electronic device, the computer storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the above embodiments, those skilled in the art will understand that, for convenience and simplicity of description, only the division of the above functional modules is used as an example, and in practical applications, the above function distribution may be completed by different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative; the division into modules or units is only a division by logical function, and other division manners are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented as a software functional unit and sold or used as a separate product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product, where the software product is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A data transmission control method for testing an electronic device, the method comprising:
setting a virtual video device configured to encapsulate operating data of a front-end module of the electronic device at a first frame rate;
packaging the operating data by using the virtual video equipment according to a second frame rate to obtain a target data packet;
sending the target data packet to a processor;
determining whether an anomaly exists in the front-end module based at least in part on adjusting the values of the first frame rate and/or the second frame rate.
2. The method of claim 1, wherein the setting up the virtual video device comprises:
and setting the virtual video equipment through a Mobile Industry Processor Interface (MIPI) of the front-end module.
3. The method of claim 1 or 2, wherein prior to said setting up the virtual video device, the method further comprises:
acquiring target pipeline configuration parameters;
wherein the packing the operating data by using the virtual video device at a second frame rate to obtain a target data packet comprises:
determining a target packaging processing parameter corresponding to the target pipeline configuration parameter according to a mapping relation between preset pipeline configuration parameters and packaging processing parameters, wherein the target packaging processing parameter at least comprises the second frame rate;
and packing the running data by the virtual video equipment by adopting the target packing processing parameters to obtain the target data packet.
4. The method of claim 3, wherein the destination packet comprises at least one packet, and wherein a data size of each packet is less than or equal to a size of each of a plurality of buffer registers allocated by a USB service for accommodating packets of the at least one packet.
5. The method of claim 3 or 4, wherein the obtaining target pipeline configuration parameters comprises:
determining target debugging content;
acquiring attribute information of the target debugging content;
and determining the target pipeline configuration parameters corresponding to the attribute information of the target debugging content according to the mapping relation between the preset attribute information of the debugging content and the pipeline configuration parameters.
6. The method according to any one of claims 1-5, further comprising:
and receiving the target data packet through the processor, unpacking the target data packet to obtain the operating data, wherein the operating data is used for realizing data analysis.
7. The method according to any one of claims 1 to 6, wherein the unpacking the target data packet to obtain the operation data comprises:
copying the target data packet into a target cache register distributed by a USB service in a frame-by-frame mode;
unpacking the data in the target cache register to obtain the operating data.
8. The method of claim 7, wherein after the unpacking the target data packet to obtain the operating data, the method further comprises:
and calling the USB service to upload the running data to external equipment through the target cache register so that the external equipment can perform data analysis on the running data.
9. The method of claim 8, further comprising:
and performing rotation operation on the target cache register, so that the target cache register after rotation is continuously used for accommodating the data packet received by the processor.
10. The method of any of claims 1-9, wherein prior to said setting up the virtual video device, the method further comprises:
enabling a camera of the electronic device to invoke a hardware abstraction module, wherein the hardware abstraction module is configured to perform the function of unpacking the target data packet to obtain the operating data.
11. An electronic device comprising a front-end module and a processor, characterized in that:
the front end module is configured to:
setting a virtual video device to package the operation data of the front-end module according to a first frame rate;
packaging the operating data by using the virtual video equipment according to a second frame rate to obtain a target data packet; and
sending the target data packet to the processor;
the processor is configured to:
determining whether an anomaly exists in the front-end module based at least in part on adjusting the values of the first frame rate and/or the second frame rate.
12. An electronic device, comprising a front-end module, a processor, and a memory, the memory being configured to store one or more programs, the one or more programs being configured to be executed by the processor or the front-end module, and the programs comprising instructions for performing the steps in the method of any one of claims 1-10.
13. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-10.
CN202110911104.8A 2021-08-09 2021-08-09 Data transmission control method and related device Pending CN115706769A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110911104.8A CN115706769A (en) 2021-08-09 2021-08-09 Data transmission control method and related device
PCT/CN2022/095793 WO2023016059A1 (en) 2021-08-09 2022-05-28 Data transmission control method and related apparatus


Publications (1)

Publication Number Publication Date
CN115706769A true CN115706769A (en) 2023-02-17

Family

ID=85180000

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110911104.8A Pending CN115706769A (en) 2021-08-09 2021-08-09 Data transmission control method and related device

Country Status (2)

Country Link
CN (1) CN115706769A (en)
WO (1) WO2023016059A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117156088A (en) * 2023-04-21 2023-12-01 荣耀终端有限公司 Image processing method and related device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106713571A (en) * 2015-08-20 2017-05-24 广州爱九游信息技术有限公司 Mobile terminal and method for testing performance of game engine application
WO2020000489A1 (en) * 2018-06-30 2020-01-02 华为技术有限公司 Pcie sending and receiving method, apparatus, device and system
CN110162451B (en) * 2019-04-22 2022-03-25 腾讯科技(深圳)有限公司 Performance analysis method, performance analysis device, server and storage medium
CN117331427A (en) * 2020-11-09 2024-01-02 腾讯科技(深圳)有限公司 Frame rate adjustment method, device, equipment and computer readable storage medium

Also Published As

Publication number Publication date
WO2023016059A1 (en) 2023-02-16

Similar Documents

Publication Publication Date Title
CN112130742B (en) Full screen display method and device of mobile terminal
CN113726950B (en) Image processing method and electronic equipment
JP7473100B2 (en) User interface layout method and electronic device - Patents.com
CN109559270B (en) Image processing method and electronic equipment
CN111669459B (en) Keyboard display method, electronic device and computer readable storage medium
CN111190681A (en) Display interface adaptation method, display interface adaptation design method and electronic equipment
WO2022042285A1 (en) Method for displaying interface of application program and electronic device
CN113986070B (en) Quick viewing method for application card and electronic equipment
WO2022001258A1 (en) Multi-screen display method and apparatus, terminal device, and storage medium
CN110636554B (en) Data transmission method and device
CN115756268A (en) Cross-device interaction method and device, screen projection system and terminal
WO2022100141A1 (en) Plug-in management method, system and apparatus
WO2023016059A1 (en) Data transmission control method and related apparatus
CN112437341B (en) Video stream processing method and electronic equipment
WO2022170866A1 (en) Data transmission method and apparatus, and storage medium
WO2022121988A1 (en) Display synchronization method, electronic device, and readable storage medium
WO2021238376A1 (en) Function pack loading method and apparatus, and server and electronic device
CN114630152A (en) Parameter transmission method and device for image processor and storage medium
CN115964231A (en) Load model-based assessment method and device
CN114816973A (en) Method and device for debugging codes, electronic equipment and readable storage medium
CN114003241A (en) Interface adaptation display method and system of application program, electronic device and medium
CN113495733A (en) Theme pack installation method and device, electronic equipment and computer readable storage medium
CN113542315B (en) Communication framework, business event processing method and device
WO2023241544A1 (en) Component preview method and electronic device
WO2024104095A1 (en) Data transmission method, apparatus and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination